Dancing, digitally, anyone? Capturing full-body motion has tended to be imprecise and primitive, expensive, complex, non-real-time, or some combination of those. Rapidly-paced open source development around Microsoft’s Kinect 3D depth camera is proving the future doesn’t have to be that way. The results, piping control data in real time to any visual, music, or other software, turn your full body into an input device. We can look through our own eyes at another human being and be aware of how their skeleton is moving through space. Now, our computers no longer have to be in the dark about that same awareness.
The video above transforms your skeleton into an OSC (OpenSoundControl) input for any software you want. Here’s what its creators say, to get you started — and some thoughts about how more open initiatives are transforming the landscape:
Want to do 2D or 3D animation but you find it hard and time-consuming?
Want to build games but the artwork is your bottleneck?
Fortunately we now have low-cost 3D cameras, thanks to Microsoft and PrimeSense.
OSCeleton is basically a DIY motion capture system.
It sends 3D tracked body skeletons through the OSC protocol so you can build anything easily. So…

Grab the drivers:
github.com/avin2/SensorKinect

Follow the install instructions:
ros.org/wiki/ni

Download OSCeleton and run the binary:
github.com/Sensebloom/OSCeleton
(use “OSCeleton -h” in the command line for more options)

…and the Processing examples in the video:
github.com/Sensebloom/OSCeleton-examples

Try Animata too:
animata.kibu.hu

… and enjoy 😉
Soundtrack:
KC & The Sunshine Band, Shake Your Booty
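To make the message format concrete, here is a minimal sketch of what the receiving end can look like in Processing with the oscP5 library. It assumes OSCeleton’s defaults (/joint messages carrying a joint name, a user ID, and x/y/z position floats, sent to port 7110), so check OSCeleton -h against your own build:

// Minimal OSCeleton receiver sketch (Processing + oscP5).
// Assumes the default message format: /joint <name> <userId> <x> <y> <z>
// on port 7110; verify against your OSCeleton build with OSCeleton -h.
import oscP5.*;
import netP5.*;

OscP5 osc;
PVector head = new PVector(0.5, 0.5, 0);  // last-seen head position (normalized)

void setup() {
  size(640, 480);
  osc = new OscP5(this, 7110);  // listen on OSCeleton's default port
}

void draw() {
  background(0);
  // Map normalized joint coordinates onto the sketch window.
  ellipse(head.x * width, head.y * height, 30, 30);
}

// oscP5 calls this for every incoming OSC message.
void oscEvent(OscMessage msg) {
  if (msg.checkAddrPattern("/joint")) {
    String joint = msg.get(0).stringValue();  // e.g. "head", "l_hand"
    int userId = msg.get(1).intValue();       // which tracked user sent it
    if (joint.equals("head")) {
      head.set(msg.get(2).floatValue(),
               msg.get(3).floatValue(),
               msg.get(4).floatValue());
    }
  }
}

From there, driving an Animata rig or any other OSC-aware app is just a matter of mapping those coordinates onto whatever parameters you like.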
Reader Steve Elbows shares additional insights into the video above, some of his own experiments, what this means, and where to begin. It’s a valuable enough email that I’m just going to reproduce it here verbatim:
Things are starting to progress in terms of using the Kinect sensor with OpenNI & NITE middleware skeleton tracking for VJing or music purposes.
https://github.com/Sensebloom/OSCeleton#readme
OSCeleton is my favourite so far, very low CPU use. Probably needs more choice of OSC message formats to work with the full spectrum of environments such as Quartz Composer, but for now I have it working with [game engine] Unity.
It’s mostly joint positions only at this stage; I think joint rotation is giving people some headaches in the maths department, although there is an OpenNI example that does joint rotation on an OGRE model.
NITE, which does the important skeletal tracking bit, isn’t available for Macs yet, but as OSCeleton uses so little CPU I’ve been able to run it on a virtualized Parallels desktop Windows install and have the OSC go over the virtual network between the real Mac and the virtualized Windows without too many problems.

Before I heard about these OSC apps I cobbled together one of my own, which worked OK on a Linux box, sending joint positions to a second machine running Quartz Composer. I only had time to throw together some very rushed demos to start with, but they give some glimpse into the potential, at least.
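That kind of machine-to-machine routing is one of OSC’s strengths: messages are just UDP packets aimed at a host and port. As a rough sketch (the address and ports here are placeholders, not Steve’s actual setup), a Processing relay that forwards OSCeleton’s messages to a second machine can be this small:

// Forward incoming OSC messages to another machine unchanged.
// The IP address and ports are illustrative placeholders; substitute
// the address of your second machine (or virtual network adapter).
import oscP5.*;
import netP5.*;

OscP5 osc;
NetAddress remote;

void setup() {
  osc = new OscP5(this, 7110);                   // port OSCeleton sends to
  remote = new NetAddress("192.168.1.20", 7110); // second machine's address
}

void draw() {
  // Nothing to render; the sketch just keeps running so oscEvent fires.
}

void oscEvent(OscMessage msg) {
  osc.send(msg, remote);  // relay the message as-is
}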
Someone also seems to have released something that will store skeletons as BVH files, though I haven’t tried it myself yet.
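For the curious: BVH is a plain-text motion capture format, with a HIERARCHY section declaring joints and their channels followed by a MOTION section listing one line of channel values per frame. As an illustration only (a real exporter would write the full skeleton with rotation channels, and this is not the format of any particular tool mentioned here), a Processing function that dumps recorded head positions as a single-joint BVH might look like:

// Illustrative only: write recorded positions as a minimal one-joint BVH.
// A real exporter would declare the whole skeleton hierarchy and rotation
// channels; this just shows the shape of the format.
ArrayList<PVector> headFrames = new ArrayList<PVector>();  // one entry per frame

void saveBvh(String filename, float frameTime) {
  PrintWriter bvh = createWriter(filename);
  bvh.println("HIERARCHY");
  bvh.println("ROOT head");
  bvh.println("{");
  bvh.println("  OFFSET 0.0 0.0 0.0");
  bvh.println("  CHANNELS 3 Xposition Yposition Zposition");
  bvh.println("  End Site");
  bvh.println("  {");
  bvh.println("    OFFSET 0.0 0.0 0.0");
  bvh.println("  }");
  bvh.println("}");
  bvh.println("MOTION");
  bvh.println("Frames: " + headFrames.size());
  bvh.println("Frame Time: " + frameTime);    // e.g. 0.0333333 for 30 fps
  for (PVector p : headFrames) {
    bvh.println(p.x + " " + p.y + " " + p.z); // one line of channel values per frame
  }
  bvh.flush();
  bvh.close();
}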
Open Interaction
The other recent breakthrough is the emergence of OpenNI, an initiative by which Kinect’s creators hope to standardize open and open source development. The group describes itself as “an industry-led, not-for-profit organization formed to certify and promote the compatibility and interoperability of Natural Interaction (NI) devices, applications and middleware.”
OpenNI is a promising sign on a number of levels:
- Standards for interoperability, beyond just Kinect (or any one tool). Open source development could be rudderless without some intervening organization that keeps things interoperable. And natural interaction shouldn’t have to be about just one platform.
- Readily available open source driver support for tools like Kinect. You can grab downloads from OpenNI that give you immediate access to Kinect. And they’re fully open source – a major departure from how gestural interaction has worked in the past. That helps ensure…
- Applications beyond gaming, and beyond proprietary platforms. You’ll already find tools for integrating with various software, demos, code, and more to come. That means that, far from reinventing the wheel or having to fight intellectual property complaints, artists, educators, and researchers should be free to experiment – and stand on each other’s work.
That last point is important, as this is about far more than any one technology. It may take that shared effort just to work out what the heck this stuff is for and how to use it, and to iterate over time to make things like skeletal detection more accurate, more human, and more economical (including with regard to system resources and latency).
We knew Microsoft’s little camera would be looking out at us. Now we get to look back at it. I can’t wait to see what happens next.