Join Scott Pagano for an in-depth discussion in this video, Kinect setup, part of TouchDesigner & Unreal: Interactive Controllers.
- [Narrator] The third interactive controller we're going to take a look at to interface with TouchDesigner is a Kinect for Xbox One. These are pretty fantastic sensors that allow us to do motion tracking of people really easily. Alright, so we've got the sensor set up in the studio here, and we also have the Kinect Adapter for Windows PC, which lets the Kinect connect to a normal PC via USB 3. We've downloaded and installed the Kinect drivers, that's the Kinect for Windows Runtime 2.0, and we've also downloaded the Kinect for Windows SDK 2.0.
What the SDK gives us is some test applications, so we can view the output of the Kinect. Alright, so all of that is downloaded, installed, and set up here. So what we're going to do is navigate into the bin folder inside the Microsoft SDK folder that gets installed, where we'll see this BodyBasics-D2D application. I'm going to open that up. And this is really great, because what it does is give us a preview of the skeleton that's being tracked right now. I'm going to stand up and move my hands around, and you can see we get this really great, really easy, straight-up skeleton tracking going on.
And we can stream this data into TouchDesigner with ease. Okay, so now I'm going to close this file explorer and go over to TouchDesigner. We're just in an empty project here. I'm going to hit Tab, and in our CHOPs I'm going to type Kin. We're going to get a Kinect CHOP and lay that down. It's going to have a warning at first, but if I press Play, you can see all of a sudden we get all this data starting to stream in. And if I start moving around a little bit, you can see we get all this information. If I scroll in a bit, you can see we're getting hands, wrists, shoulders.
We're getting translation and rotation data for various joints. Okay, so what I'm going to do here is split my screen left and right. Let me just go up a level here so we can see this. I'm going to put the BodyBasics window up in the right corner, turn on the display over here, and bring up my little Kinect demo skeleton again. Alright, there we go. I just wanted these side by side so we can see: the Kinect demo app showing us our skeleton coming in, and all of that data arriving in TouchDesigner's channels. Next, we're going to use that data, the hand movement coming out of the Kinect, to control the particle system that we built in the OSC chapter.
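Before raw Kinect channel values can usefully drive a particle system, they typically get remapped from their native range (meters of joint translation) into a normalized 0–1 control range. TouchDesigner's scripting language is Python, so here's a minimal sketch of that remapping; the channel name and the assumed -1 to 1 meter input range are illustrative, not from the course:

```python
def normalize_channel(value, in_min=-1.0, in_max=1.0):
    """Remap a raw Kinect channel value (e.g. a hand tx position in
    meters, assumed here to span roughly -1..1) into a 0..1 range
    suitable for driving a parameter like particle force or position."""
    t = (value - in_min) / (in_max - in_min)
    # Clamp so tracking outliers don't make the control jump out of range.
    return max(0.0, min(1.0, t))
```

Inside TouchDesigner, something like this would live in a parameter expression or a Script CHOP, reading the hand channel off the Kinect CHOP (the exact channel name depends on your setup, e.g. something like `op('kinect1')['p1/hand_l:tx']`).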
This course focuses on using interactive hardware controllers to drive real-time 3D scenes in TouchDesigner and Unreal Engine. In TouchDesigner, first look at a MIDI controller with sliders that you can use to drive effects in a real-time rendering system. Next, learn how to use TouchOSC to build a custom UI, transfer it to an iPad, and send OSC messages to TouchDesigner. Then, learn how to use those messages to move particles and objects around on screen. To wrap up, learn how to get data off of a Kinect sensor into TouchDesigner so you can control that same particle and object movement. The course also covers MIDI, OSC, and Kinect as they apply to Unreal Engine.
- Using TouchOSC to build a custom UI
- Transferring a custom UI to an iPad
- Controlling objects with OSC
- Setting up Kinect
- Getting data off of a Kinect sensor and into TouchDesigner
- Using OSC messages to move particles and objects around
- Installing VS and downloading the plug-in
- Creating a C++ Unreal project and compiling a plugin
- Installing the Kinect 4 Unreal plugin