Learn how to use real-time controllers to drive great results from two fantastic real-time 3D graphics systems—TouchDesigner and Unreal Engine.
- [Instructor] In this course, we're going to learn how to use interactive hardware controllers to drive real-time 3D scenes in both TouchDesigner and Unreal Engine. We'll start off in TouchDesigner, where we'll look at a MIDI controller with sliders that we'll use to drive an effects system that applies effects after a real-time rendering system. Then we'll move on to OSC, where we'll use TouchOSC to build a custom UI that we transfer to an iPad, which sends OSC messages that TouchDesigner can receive. Then we'll learn how to use those messages to move particles and objects around on screen.
Then we'll wrap up our TouchDesigner section by looking at how to get data off a Kinect sensor into TouchDesigner to control that same particle and object movement. Then we'll switch over to Unreal Engine. We'll start off with MIDI and look at how to build a blueprint in Unreal Engine to get MIDI flowing into our system. Then we'll show you how to build a post-process material, controlled from that MIDI data, that changes the stylization of our render in real time. Then we'll move on to OSC, where we'll use an iPad running TouchOSC to send OSC data into Unreal, and we'll learn how to control objects in Unreal, in real time, from this wireless hardware controller.
Finally, in Unreal, we'll get the data off a Kinect sensor and learn how to control object and particle movement from the data flowing in off the Kinect as we move around a room. These are two fantastic real-time 3D graphics systems with different strengths for different projects, and we're going to learn how to use real-time controllers to drive some great results in both of them.
This course focuses on using interactive hardware controllers to drive real-time 3D scenes in TouchDesigner and Unreal Engine. In TouchDesigner, explore a MIDI controller with sliders that you can use to drive the creation of effects in a real-time rendering system. Next, learn how to use TouchOSC to build a custom UI that you can transfer to an iPad, which sends OSC messages to TouchDesigner. Then, learn how to use those messages to move particles and objects around on screen. To wrap up, learn how to get data off of a Kinect sensor into TouchDesigner so you can control that same particle and object movement. The course also covers MIDI, OSC, and Kinect as they apply to Unreal Engine.
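The MIDI sliders described above send Control Change messages: three bytes carrying a status byte, a controller number, and a value from 0 to 127. As a rough illustration of what the course's controller data looks like on the wire, here is a minimal sketch that decodes one such message and normalizes the value to a 0.0–1.0 range; it assumes raw 3-byte messages and is not code from the course.

```python
def parse_midi_cc(message: bytes):
    """Decode a 3-byte MIDI Control Change message into
    (channel, controller, value normalized to 0.0-1.0).
    Returns None for non-CC messages (e.g. note on/off)."""
    status, controller, value = message
    if status & 0xF0 != 0xB0:       # 0xB0-0xBF = Control Change
        return None
    channel = status & 0x0F         # low nibble is the MIDI channel (0-15)
    return channel, controller, value / 127.0

# A slider mapped to CC 1 on channel 1, pushed to the top of its travel:
print(parse_midi_cc(bytes([0xB0, 1, 127])))  # (0, 1, 1.0)
```

The normalized 0.0–1.0 value is the convenient form for driving effect parameters, which is what both TouchDesigner's MIDI CHOPs and an Unreal MIDI blueprint ultimately give you access to.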
- Using TouchOSC to build a custom UI
- Transferring a custom UI to an iPad
- Controlling objects with OSC
- Setting up Kinect
- Getting data off of a Kinect sensor and into TouchDesigner
- Using OSC messages to move particles and objects around
- Installing Visual Studio and downloading the plugin
- Creating a C++ Unreal project and compiling a plugin
- Installing the Kinect 4 Unreal plugin
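The OSC workflow above (TouchOSC on an iPad sending messages that TouchDesigner or Unreal receives) rides on a simple binary format: a NUL-padded address pattern, a type-tag string, then the arguments. As a rough, hedged sketch of that wire format, here is a stdlib-only encoder for a message with a single float argument; the `/fader1` address is a made-up example, and the receive port is whatever you configure in TouchDesigner's OSC In CHOP or Unreal's OSC server.

```python
import struct

def osc_pad(raw: bytes) -> bytes:
    """Pad an OSC string to a multiple of 4 bytes with NULs (at least one)."""
    return raw + b"\x00" * (4 - len(raw) % 4)

def osc_message(address: str, value: float) -> bytes:
    """Encode a minimal OSC message carrying one big-endian float32 argument."""
    return (osc_pad(address.encode("ascii"))   # address pattern, e.g. "/fader1"
            + osc_pad(b",f")                   # type tags: one float argument
            + struct.pack(">f", value))        # the argument itself

# A TouchOSC-style fader at half travel, bound to a hypothetical address:
packet = osc_message("/fader1", 0.5)

# The packet would then go out over UDP to the listening app, e.g.:
#   import socket
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.sendto(packet, ("192.168.1.50", 10000))  # host/port are examples
```

In practice a library such as python-osc (or TouchOSC itself) handles this encoding for you; the sketch is only meant to show why an OSC address plus a normalized float is such a natural fit for driving particle and object parameters in both engines.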