Perhaps you’ve seen the demo videos: people doing astounding things, moving their bodies in front of a Kinect camera to make music. Now, a set of Max for Live devices makes it reasonably easy to use your body as input inside Ableton Live.

FaceOSC Mapper

Pictured at top, this builds on Kinect superstar coder Kyle McDonald’s face-tracking tool and lets you use your face (its position and even individual facial movements) to control Ableton Live parameters.
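
If you’re curious what FaceOSC is actually sending before you wire it into the device, here’s a minimal sketch of listening to its OSC stream in Python. This assumes the python-osc package and FaceOSC’s usual default port (8338); the /gesture/mouth/height address comes from FaceOSC’s message list, but treat the pixel range used for scaling as a rough guess, not a spec:

```python
# Minimal sketch: receive FaceOSC messages and rescale one of them
# to a 0..1 parameter value. Assumes python-osc and FaceOSC's
# default port 8338 -- check your own setup.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_mouth(address, height):
    # FaceOSC reports mouth height in (roughly) pixels; the 1..8
    # range here is an illustrative guess for normalization.
    value = max(0.0, min(1.0, (height - 1.0) / 7.0))
    print(f"mouth height {height:.2f} -> parameter {value:.2f}")

dispatcher = Dispatcher()
dispatcher.map("/gesture/mouth/height", on_mouth)

server = BlockingOSCUDPServer(("127.0.0.1", 8338), dispatcher)
server.serve_forever()
```

Inside Live, the Max for Live device does this mapping for you; the sketch just shows how simple the underlying data really is.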

Kinect Camera

For use with the V-Module and vizzABLE systems, you can plug in one or more Kinect cameras and get motor tilt control, distance filtering (to remove backgrounds), depth maps, and RGB and IR modes.
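
That “distance filtering” trick is conceptually just thresholding the depth map between a near and a far plane. A toy sketch of the idea, using random numpy data in place of a real frame (with the libfreenect Python bindings, freenect.sync_get_depth() would supply an actual Kinect depth image):

```python
# Toy illustration of depth-based background removal: keep only
# pixels between a near and far plane, zero out everything else.
import numpy as np

# Fake 480x640 depth frame in millimeters, a Kinect-ish range.
depth = np.random.randint(400, 4000, size=(480, 640)).astype(np.uint16)

near_mm, far_mm = 800, 1500   # keep the performer, drop the room
mask = (depth >= near_mm) & (depth <= far_mm)
filtered = np.where(mask, depth, 0)

print(f"{mask.mean():.1%} of pixels kept after distance filtering")
```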

Kinect – OSCeleton

The home run: skeletal tracking for extremely precise human control of parameters, as seen in the video. It currently tracks only your left and right hands, but stay tuned for further developments. See also this example patch.
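
Under the hood, OSCeleton streams joint positions as plain OSC, which is what the Max for Live device picks up. A minimal sketch of reading that stream, again assuming python-osc; OSCeleton’s /joint messages carry (name, user_id, x, y, z), and 7110 is its usual default port, though your configuration may differ:

```python
# Minimal sketch: print OSCeleton hand-joint positions as they arrive.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_joint(address, name, user, x, y, z):
    # Only the hands are tracked by the current device anyway.
    if name in ("l_hand", "r_hand"):
        print(f"user {user} {name}: x={x:.2f} y={y:.2f} z={z:.2f}")

dispatcher = Dispatcher()
dispatcher.map("/joint", on_joint)

BlockingOSCUDPServer(("127.0.0.1", 7110), dispatcher).serve_forever()
```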

The example video shows how to “track hand positions and translate to volume and send levels.” Not impressed? Remember, it’s a proof of concept: you can assign the controls to other parameters, practice your movements, change the musical content, and even modify the patch to make it work better, as in the sketch below.
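
To make “modify the patch” a bit more concrete: the heart of any hand-to-fader mapping is just clamping, scaling, and a little smoothing to tame tracking jitter before the value hits something like a live.remote~ parameter mapping. This SmoothedFader class and its smoothing constant are illustrative, not taken from the actual device:

```python
# Sketch of the mapping step: turn a normalized hand height
# (0 = bottom of frame, 1 = top) into a smoothed 0..1 volume value.

class SmoothedFader:
    def __init__(self, smoothing=0.2):
        self.smoothing = smoothing  # 0 = frozen, 1 = no smoothing
        self.value = 0.0

    def update(self, hand_y):
        # Clamp, then move a fraction of the way toward the target
        # (a one-pole lowpass) so the fader doesn't jump every frame.
        target = max(0.0, min(1.0, hand_y))
        self.value += self.smoothing * (target - self.value)
        return self.value

fader = SmoothedFader()
for y in (0.1, 0.5, 0.9, 0.9, 0.9):   # a hand rising, then holding
    print(f"hand_y={y:.1f} -> volume {fader.update(y):.3f}")
```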

It’s Electronic Body Music! (Sorry, couldn’t resist. Kids, ask your … parents. Jeez, we’re getting old, aren’t we?)