Matt Davis [namethemachine] is seen here with Microsoft’s Kinect computer vision / 3D camera controller, plus – stealing the show – lasers. The lasers in question are a rig by Henry Strange, which allows computer control of laser direction using the DMX protocol. (DMX is a protocol similar to MIDI – though actually a bit simpler, if you can believe that – generally associated with lighting and show control.)
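To make the DMX comparison concrete: like MIDI, DMX is just small integer values on numbered channels, except each "universe" carries up to 512 channels of one byte each, refreshed continuously. Here's a minimal Python sketch of packing a DMX512 frame; the pan/tilt channel assignments are hypothetical, not Henry Strange's actual patch.

```python
# Minimal sketch of a DMX512 frame: a start-code byte followed by up to
# 512 channel bytes (0-255 each). A moving laser head might map pan to
# one channel and tilt to another; the channel numbers below are
# illustrative assumptions only.

def dmx_frame(channels):
    """Pack a {channel: value} dict into a raw 513-byte DMX packet."""
    frame = bytearray(513)          # byte 0 is the start code (0x00)
    for chan, value in channels.items():
        if not 1 <= chan <= 512:
            raise ValueError(f"DMX channel out of range: {chan}")
        frame[chan] = max(0, min(255, value))   # clamp like a dimmer
    return bytes(frame)

# Point a (hypothetical) laser halfway along both axes:
packet = dmx_frame({1: 128, 2: 128})   # ch1 = pan, ch2 = tilt
```

The frame then gets shifted out over RS-485 by whatever DMX interface you have; the framing above is the protocol-level part.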

I could say more, but I’ll let you watch the video and ponder. The ingredients:
OpenNI, the not-for-profit "natural interaction" standards organization, whose framework provides drivers across multiple hardware devices (Kinect being the best-known)
Ableton Live (sound)
Max/MSP (I believe here just translating OpenNI control to MIDI and perhaps DMX, as well)
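That translation step in Max is conceptually simple: take a tracked joint position and scale it into MIDI range. A minimal Python sketch of the idea, where the normalized hand position is a stand-in for whatever the skeleton tracker reports, not the actual patch from the video:

```python
# Sketch of the OpenNI -> MIDI translation step: map a normalized hand
# position (0.0-1.0) onto a 3-byte MIDI control change message.

def hand_to_cc(x, controller=1, channel=0):
    """Build a MIDI CC message from a 0.0-1.0 position."""
    value = max(0, min(127, int(round(x * 127))))
    status = 0xB0 | (channel & 0x0F)     # 0xB0 = control change
    return bytes([status, controller & 0x7F, value])

msg = hand_to_cc(0.5)    # hand at mid-range -> CC value 64
```

Feed those bytes to any MIDI output and Ableton Live will see an ordinary controller, which is what makes this kind of rig so pluggable.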

The result: audiovisual control, and The Future. (Now, the only problem is, I’m not sure I’d want to watch an entire lineup of people doing these kinds of gestures while performing, but I could certainly see this alongside other alternative control schemes, from breath to good-old-fashioned tangible controllers.)

Thanks, Laura Escude, for the tip. (Laura has her own interface for futuristic electronic performance – she uses a violin!)

  • Ok… who wants to see a great breakdancer use this to control a live DJ remix? Maybe that Lil Buck guy?

  • Peter Kirn

    @Martin: Yeah, absolutely — this sort of thing means we all have to go practice *movement* / dancing as well as music. 

    Oh, and PS, should anyone decide to (cough) copy edit my headline, see:

    It's just a silly spelling.

  • deb

    ok, apparently i get to say it first:

  • I can't do Kinect, but I can stump up Arduino-controlled lasers, Ableton Live, Max/MSP, Java, Python, quadraphonic sound and a 300Kg running machine:

    Laser testing:

    At some stage I really should do a proper technical write-up. Our lasers are 100mW and I'm not 100% sure they're legal.

  • bar|none

    Hook a cat up to that and it's Lazer Cat for realz!

  • Random Chance

    Seems to work reasonably well for this kind of dubstep performance. With a great deal of fine tuning (for example, smooth the signals controlling the orientation of the lasers) this could be interesting. All this stuff about motion tracking makes me want to dig up my old stuff and finally try out the OpenCV objects for Max (although I guess I'd still be faster just using C as a substrate for image processing stuff).  It's somewhat humbling to live in a time where you can basically do anything without seriously pushing technical or financial boundaries. 
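The smoothing suggested above is usually just a one-pole low-pass (exponential moving average) on the joint coordinate before it drives the laser. A small sketch, with illustrative values rather than anything from the actual rig:

```python
# One-pole low-pass smoother for a jittery tracking signal. alpha near 0
# means heavy smoothing (more lag); alpha near 1 passes the raw signal.

class OnePole:
    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self.state = None

    def step(self, x):
        if self.state is None:       # initialize on the first sample
            self.state = x
        self.state += self.alpha * (x - self.state)
        return self.state

smooth = OnePole(alpha=0.2)
jittery = [0.0, 1.0, 0.0, 1.0, 1.0]
smoothed = [smooth.step(v) for v in jittery]   # far less jumpy output
```

The trade-off is exactly the latency pdx mentions further down: more smoothing means the beam lags the hand more.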

  • yip, it's great how toys and cellphones are equipped with technology of the future nowadays. I still haven't realized half of the crazy stuff I have in mind for the wiimote.

  • mat

    wow – what a really, really nice add-on for a live performance!
    Not for an entire set (though you could change the routing of the 4 axes between songs – there are 4, right?), and for sure in combination with other controllers (because 4 axes wouldn't be enough anyway).
    But damn, I can imagine how great the feedback of the audience will be! It is so important to see some action you can link to the audio you hear… (and clicking a mouse doesn't do it 😉)

  • yeah. lazors and all but eh. too bad the music sucks.


  • Peter Kirn

    It's a proof-of-concept demo, just a first run. I've done some "let's see if this patch is working" tests that sound utterly dreadful; I'd say this is just fine!

  • pdx

     i've been talking about "motion capture musical instruments" since i first saw Kinect in action, but have been disappointed by the ~100ms latency. you can see it in the lasers (first the hand moves, then the light follows); this makes developing playable instruments, such as a virtual drumkit, difficult. BUT you avoid the issue by using Ableton, which keeps the audio in sync, with Kinect more as an effects controller (where the latency is not so problematic). nice work, i'll be interested to see how it progresses …

    i've written these pieces on the subject:

       – pdx

  • pdx

    hmm, that comment was supposed to start off with a "thumbs up!" sign, because i really like what you're doing, but it must've been taken as bad html (d'oh! 🙂

  • As stated… excellent as a proof of concept, hooking up new kit and multi-control… more practice needed, and choice of music would be critical. In a live performance you would need to be careful with the lasers to avoid stationary beams (crowd scanning would NOT be recommended), so setting up zones may make the link-up less effective/entertaining… but hey, great idea!!

  • Rob

    Peter, I would love to talk to you about how you set this up. I don't have a Kinect yet, but the ability to create music with dance looks amazing. I have quite a few dancer/DJs who would love to help out in any way we can.

  • dtr

    Somewhat similar to the system I'm using in my upcoming performance project, just using projections in hazer fog instead of lasers:

    Setup = performer motion > Kinect > OSCeleton > Max/MSP/Jitter > 3 beamers & speakers
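For anyone curious what the OSCeleton link in that chain looks like on the wire: it broadcasts joint positions as OSC messages over UDP, reportedly in the shape `/joint <name> <userID> <x> <y> <z>`. A minimal stdlib parser for just that one message shape; the wire format here is an assumption about OSCeleton's output, not code from dtr's project.

```python
# Decode a raw OSC "/joint" datagram: padded address string, padded
# typetag string (",sifff"), padded joint-name string, then a big-endian
# int32 user ID and three float32 coordinates.
import struct

def _read_padded_string(data, offset):
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    # OSC strings are null-padded to a 4-byte boundary
    return s, (end + 4) & ~3

def parse_joint(packet):
    """Return (joint_name, user_id, x, y, z) from a /joint OSC message."""
    address, off = _read_padded_string(packet, 0)
    if address != "/joint":
        raise ValueError(f"unexpected address: {address}")
    typetags, off = _read_padded_string(packet, off)  # expect ",sifff"
    name, off = _read_padded_string(packet, off)
    user, x, y, z = struct.unpack(">ifff", packet[off:off + 16])
    return name, user, x, y, z
```

From there, Max/MSP/Jitter (or anything else listening on that UDP port) can route each joint to projections, sound, or both.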

  • Coolgchat

    hi~~ Could you teach us how you did it?
