Creating Gesture-Based Controls for VDMX using the Gestrument Kinect MIDI controller app

Download the completed VDMX project file for this tutorial. 

Last week, via a feature on CreateDigitalMusic, we caught wind of Gestrument Kinect, a simple Mac app (currently in beta) that converts camera depth data from a Kinect into MIDI for controlling music or live VJ visuals. Since it sends standard MIDI, it only took a few seconds to connect it to VDMX for a quick demonstration of using its gestures to trigger events and adjust video FX parameters.

Gestrument Kinect provides four pieces of control data: an active signal, x and y blob position, and dynamics (weight). These values can be used together in a variety of ways within VDMX to set up controls that respond to actions such as waving a single hand from right to left.
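Within VDMX these values arrive as ordinary MIDI sources, but it can be useful to inspect them directly while troubleshooting. Below is a minimal Python sketch (not part of the VDMX setup) using the third-party mido library; the port name and CC numbers are assumptions for illustration only, so check the actual assignments in Gestrument Kinect's MIDI settings.

```python
# Sketch: print Gestrument Kinect's four control values as they arrive over MIDI.
# The port name and CC numbers below are placeholders, not confirmed values.
import mido

PORT_NAME = "Gestrument Kinect"                       # assumed virtual MIDI port name
CC_ACTIVE, CC_X, CC_Y, CC_DYNAMICS = 20, 21, 22, 23   # hypothetical CC assignments

with mido.open_input(PORT_NAME) as port:
    for msg in port:
        if msg.type != "control_change":
            continue
        value = msg.value / 127.0                     # normalize 0-127 to 0.0-1.0
        if msg.control == CC_ACTIVE:
            print("active:", value)
        elif msg.control == CC_X:
            print("x position:", value)
        elif msg.control == CC_Y:
            print("y position:", value)
        elif msg.control == CC_DYNAMICS:
            print("dynamics (weight):", value)
```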

First, to get the video signal, we can use the Window Inputs feature to capture and crop the on-screen display from the Gestrument Kinect application.

Next, we'll use a Control Surface plugin to create custom data-sources that represent different actions based on the incoming MIDI data, and use the results to advance to the next movie in a media bin.
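For reference, the gesture logic those custom data-sources approximate looks roughly like the Python sketch below. It assumes normalized x values (0.0 at the left edge, 1.0 at the right); the thresholds and timing window are arbitrary, and in VDMX itself the same behavior is assembled from UI items in the Control Surface plugin rather than written as code.

```python
# Sketch: fire a trigger when the x position sweeps from the right side of the
# frame to the left side within a short time window (a "right to left" wave).
import time

class RightToLeftWave:
    def __init__(self, high=0.8, low=0.2, window=0.75):
        self.high = high          # x value that counts as "right side"
        self.low = low            # x value that counts as "left side"
        self.window = window      # max seconds allowed for the sweep
        self._armed_at = None     # time the hand was last seen on the right

    def update(self, x, now=None):
        """Feed normalized x positions; returns True once when a sweep completes."""
        now = time.monotonic() if now is None else now
        if x >= self.high:
            self._armed_at = now                      # hand on the right: arm
        elif x <= self.low and self._armed_at is not None:
            fired = (now - self._armed_at) <= self.window
            self._armed_at = None                     # reset either way
            return fired
        return False
```

In the project file, the result of this kind of comparison is routed to the media bin's trigger for advancing to the next clip.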

Notes and next steps:

Once you've got this mastered, try controlling a video feedback loop or crossfading between four different layers (a simple layer-weighting sketch follows these notes).

Adjust the Depth range slider in Gestrument Kinect until only your hands are visible.

In the ‘Workspace Inspector’ under Vid In, enable ‘Window Video Inputs’.

Completed VDMX project with ‘Right to Left’ and ‘Bottom to Top’ hand wave gestures.
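As a conceptual aid for the four-layer crossfade suggested above: one simple mapping treats the four layers as the corners of the frame and derives an opacity for each from the hand's x/y position. This is an illustrative Python sketch only; in VDMX the equivalent would be assigning the x and y data-sources to each layer's opacity slider.

```python
# Sketch: bilinear weighting of four layers from a normalized x/y hand position.
def four_layer_weights(x, y):
    """x, y in [0, 1]; returns opacities for (top-left, top-right,
    bottom-left, bottom-right) that always sum to 1."""
    x = min(max(x, 0.0), 1.0)
    y = min(max(y, 0.0), 1.0)
    return (
        (1 - x) * (1 - y),  # top-left
        x * (1 - y),        # top-right
        (1 - x) * y,        # bottom-left
        x * y,              # bottom-right
    )

print(four_layer_weights(0.5, 0.5))  # hand in the center: all four layers at 0.25
```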