Perhaps you’ve seen the demo videos: people doing astounding things, moving their bodies and using the Kinect camera to make music. Now, a set of Max for Live devices makes it reasonably easy to use your body as input inside Ableton Live.

FaceOSC Mapper

Pictured at top, this builds on Kinect superstar coder Kyle McDonald’s face-tracking tool and lets you use your face – position and even facial movements – to control Ableton Live parameters.

Kinect Camera

For use with the V-Module and vizzABLE systems, this device lets you plug in one or more Kinect cameras and get tilt control, distance filtering (to remove backgrounds), depth maps, and RGB and IR modes from your depth-sensing camera.

Kinect – OSCeleton

The home run: skeletal tracking for extremely precise human control of parameters, as seen in the video. It currently tracks only your left and right hands, but stay tuned for further developments. See also this example patch.
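Under the hood, OSCeleton delivers its skeleton data as plain OSC messages over UDP. As a rough illustrative sketch (not part of the devices themselves), here is how one such binary message could be decoded in Python; the `/joint` address and `,sifff` argument layout (joint name, user ID, x, y, z) are assumptions based on OSCeleton's commonly described defaults, so check your build.

```python
import struct

def _read_padded_string(data, offset):
    """Read a null-terminated OSC string, padded to a 4-byte boundary."""
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    offset = end + 1
    offset += (-offset) % 4  # skip padding to the next 4-byte boundary
    return s, offset

def parse_osc_message(data):
    """Parse one binary OSC message into (address, [arguments])."""
    address, offset = _read_padded_string(data, 0)
    typetags, offset = _read_padded_string(data, offset)
    args = []
    for tag in typetags.lstrip(","):
        if tag == "s":                 # OSC string
            val, offset = _read_padded_string(data, offset)
        elif tag == "i":               # 32-bit big-endian int
            (val,) = struct.unpack_from(">i", data, offset)
            offset += 4
        elif tag == "f":               # 32-bit big-endian float
            (val,) = struct.unpack_from(">f", data, offset)
            offset += 4
        else:
            raise ValueError("unhandled OSC type tag: " + tag)
        args.append(val)
    return address, args
```

In practice you would feed this whatever arrives on OSCeleton's UDP port, then route joint names like the hands to Live parameters inside Max for Live.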

The example video shows how to “track hand positions and translate to volume and send levels.” Not impressed? Remember, it’s a proof of concept: you can assign it to other parameters, practice your movements, change the musical content, and even modify the patch to make it work better.
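The mapping itself can be as simple as rescaling a normalized hand height to a fader value. A hypothetical sketch, assuming a hand-height input normalized to 0–1 and a Live-style volume parameter that also expects 0.0–1.0 (the dead-zone bounds here are made up for illustration):

```python
def hand_y_to_volume(y, lo=0.2, hi=0.8):
    """Map a normalized hand height y (0 = bottom of frame, 1 = top)
    to a 0..1 volume value, clamping outside the [lo, hi] dead zone.
    lo and hi are arbitrary example bounds, not values from the devices."""
    y = min(max(y, lo), hi)          # clamp into the active range
    return (y - lo) / (hi - lo)      # rescale to 0..1
```

Keeping a dead zone at the top and bottom makes the gesture easier to perform: you can fully close or fully open the fader without pixel-perfect tracking.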

It’s Electronic Body Music! (Sorry, couldn’t resist. Kids, ask your … parents jeez we’re getting old, aren’t we?)

  • Is it not just too much fun! Next, try playing with building a UI.

    Have you seen my kitchen mess-arounds?

    Big fan

  • This is huge.

  • digid

    Excellent stuff, will try FaceOSC tonight!

    … without giving any further explanation to my kids or my better half as to why I look like I am having a seizure in front of my Mac.

  • Mike Todd

    Thanks for sharing – if I had known it'd make it to CDM I'd have made a better demo video!!

    These are my first foray into M4L, so feedback is welcome. I will be making some tutorial videos with screen captures in the coming weeks.

  • Mike Todd

    ALSO – getting OpenNI and OSCeleton working takes some time, but the steps outlined in the link you posted seem to be the most reliable way on OS X. It is worth the effort, however, as it also lets you plug OSCeleton messages into things like Animata (the very fun puppet-animation tool that you've featured).

    I did a project using Animata with a very smart 3rd grade class to animate and act out their stories; it was a great time!

  • If you could control tempo and dynamics via the Kinect, you could pull off some great orchestral-style conducting.

  • Michael Coelho

    Must get Kinect!

  • To be clear – you don't need a Kinect for the FaceOSC stuff; any webcam will do.

  • Tim

    Hooray! Now I can perform my gigs pumping my fist in front of a laptop!!!