Dancing, digitally, anyone? Capturing full-body motion has tended to be imprecise, expensive, complex, non-real-time, or some combination of the above. Rapidly-paced open source development around Microsoft’s Kinect 3D depth camera is proving the future doesn’t have to be that way. The results, piping control data in real time to any visual, music, or other software, turn your full body into an input device. We can look through our own eyes at another human being and be aware of how their skeleton is moving through space. Now our computers no longer have to be in the dark about that same awareness.

The video above transforms your skeleton into an OSC (OpenSoundControl) input for any software you want. Here’s what its creators say, to get you started — and some thoughts about how more open initiatives are transforming the landscape:

Want to do 2D or 3D animation but you find it hard and time-consuming?
Want to build games but the artwork is your bottleneck?

Fortunately we now have low-cost 3D cameras, thanks to Microsoft and PrimeSense.

OSCeleton is basically a DIY motion capture system.
It sends 3D tracked body skeletons through the OSC protocol so you can build anything easily.
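To give a feel for what consuming those messages looks like: OSCeleton’s `/joint` messages carry a joint name, a user ID, and x/y/z coordinates, so a receiver mostly just files each message into a per-user dictionary. The handler below is a hypothetical minimal sketch of that idea; it assumes the message has already been decoded by an OSC library, and the names and structure are mine, not OSCeleton’s:

```python
# Hypothetical handler for OSCeleton-style /joint messages
# (args as an OSC library would deliver them: joint name, user id, x, y, z;
# coordinates are roughly normalized by default)
skeletons = {}

def handle_joint(address, args):
    """Store the latest position of each tracked joint, per user."""
    if address != "/joint":
        return
    name, user, x, y, z = args
    skeletons.setdefault(user, {})[name] = (x, y, z)

# Example messages for one tracked user
handle_joint("/joint", ("head", 1, 0.48, 0.21, 0.55))
handle_joint("/joint", ("l_hand", 1, 0.30, 0.62, 0.50))
print(skeletons[1]["head"])  # → (0.48, 0.21, 0.55)
```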


Grab the drivers:

Follow the install instructions:

Download OSCeleton and run the binary:
(use “OSCeleton -h” in the command line for more options)

and the Processing examples in the video:

Try Animata too:

… and enjoy 😉

KC & The Sunshine Band, Shake Your Booty


Reader Steve Elbows shares additional insights into the video above, some of his own experiments, what this means, and where to begin. It’s a valuable enough email that I’m just going to reproduce it here verbatim:

Things are starting to progress in using the Kinect sensor with OpenNI & NITE middleware skeleton tracking for VJing or music purposes.


OSCeleton is my favourite so far, with very low CPU use. It probably needs more choice of OSC message formats to work with the full spectrum of environments, such as Quartz Composer, but for now I have it working with [game engine] Unity.
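The “choice of message formats” point is easy to illustrate. Patch-based environments like Quartz Composer bind inputs to fixed OSC addresses, so a layout with one address per joint name is easier to hook up there than a single generic `/joint` address carrying the name as an argument. A sketch of both layouts (addresses are illustrative, not any project’s actual scheme):

```python
# Two candidate layouts for the same skeleton data (addresses illustrative):
#   1. one generic message per joint:  /joint "head" <user> x y z
#   2. one address per joint name:     /head x y z
# Layout 2 maps more directly onto patch-based receivers like Quartz Composer,
# which bind inputs to fixed OSC addresses.
def generic_layout(user, joints):
    return [("/joint", (name, user) + pos) for name, pos in joints.items()]

def per_name_layout(joints):
    return [("/" + name, pos) for name, pos in joints.items()]

joints = {"head": (0.5, 0.2, 1.0)}
```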

It’s mostly joint positions only at this stage; I think joint rotation is giving people some headaches in the maths department, although there is an OpenNI example that does joint rotation on an OGRE model.
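To illustrate why rotations are the hard part: with only joint positions, a rig typically has to derive each bone’s rotation from the direction between two joints, for example as an axis-angle pair relative to the bone’s rest direction. This is my own illustration of one common approach, not code from any of the projects mentioned:

```python
import math

def bone_rotation(rest_dir, parent, child):
    """Axis-angle rotation taking a bone's rest direction onto the measured
    parent->child direction (assumes parent and child are distinct points)."""
    dx = [c - p for c, p in zip(child, parent)]
    norm = math.sqrt(sum(d * d for d in dx))
    d = [v / norm for v in dx]
    # rotation axis = rest x d (cross product), angle = acos(rest . d)
    axis = [rest_dir[1] * d[2] - rest_dir[2] * d[1],
            rest_dir[2] * d[0] - rest_dir[0] * d[2],
            rest_dir[0] * d[1] - rest_dir[1] * d[0]]
    dot = max(-1.0, min(1.0, sum(r * v for r, v in zip(rest_dir, d))))
    return axis, math.acos(dot)

# Rest pose points down (-Y); the measured arm points along +X,
# so the bone must rotate 90 degrees about the Z axis
axis, angle = bone_rotation([0.0, -1.0, 0.0], [0, 0, 0], [1, 0, 0])
```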

NITE, which does the important skeletal tracking bit, isn’t available for Macs yet, but as OSCeleton uses so little CPU I’ve been able to run it on a virtualized Windows install under Parallels Desktop and send the OSC over the virtual network between the real Mac and the virtualized Windows without too many problems.

Before I heard about these OSC apps I cobbled together one of my own, which worked OK on a Linux box sending joint positions to a second machine running Quartz Composer. I only had time to throw together some very rushed demos to start with, but they give some glimpse of the potential, at least:
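For anyone wanting to roll a similar sender without an OSC library, the OSC wire format is simple enough to hand-encode: null-padded strings on 4-byte boundaries, a type-tag string, then big-endian float32 arguments, sent as a UDP datagram. A hedged sketch (the address and port here are made up; use whatever your receiving patch expects):

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, floats) -> bytes:
    # Build a single OSC message whose arguments are all float32
    type_tags = "," + "f" * len(floats)
    return (osc_pad(address.encode("ascii")) +
            osc_pad(type_tags.encode("ascii")) +
            b"".join(struct.pack(">f", f) for f in floats))

# One joint position per message, e.g. for a Quartz Composer OSC receiver
packet = osc_message("/head", [0.5, 0.25, 1.2])
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 9000))  # port 9000 is an arbitrary example
```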

Someone also seems to have released something that will store the skeleton as BVH files, though I haven’t tried it myself yet:
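BVH is a plain-text motion-capture format: a HIERARCHY section describing the skeleton, then a MOTION section of per-frame channel values. As a rough illustration of the MOTION half only (a hypothetical sketch of the format, not the tool mentioned above):

```python
# Hypothetical sketch: dumping recorded root-joint positions as the MOTION
# section of a BVH file (the HIERARCHY section, which defines the skeleton
# and its channels, is omitted here for brevity)
def motion_section(frames, fps=30):
    lines = ["MOTION",
             f"Frames: {len(frames)}",
             f"Frame Time: {1.0 / fps:.6f}"]
    for x, y, z in frames:
        lines.append(f"{x:.4f} {y:.4f} {z:.4f}")
    return "\n".join(lines)

dump = motion_section([(0.0, 1.0, 0.0), (0.1, 1.0, 0.0)])
print(dump)
```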


Open Interaction

The other recent breakthrough is the emergence of OpenNI, an initiative by which Kinect’s creators hope to standardize open and open source development. The group is “an industry-led, not-for-profit organization formed to certify and promote the compatibility and interoperability of Natural Interaction (NI) devices, applications and middleware.”


OpenNI is a promising sign on a number of levels:

  • Standards for interoperability, beyond just Kinect (or any one tool). Open source development could be rudderless without some intervening organization that keeps things interoperable. And natural interaction shouldn’t have to be about just one platform.
  • Readily-available, open source driver support for tools like Kinect. You can grab downloads from OpenNI that give you immediate access to Kinect. And they’re fully open source – a major departure from how gestural interaction has worked in the past. That helps ensure…
  • Applications beyond gaming, and beyond proprietary platforms. You’ll already find tools for integrating with various software, demos, code, and more to come. That means that, far from reinventing the wheel or having to fight intellectual property complaints, artists, educators, and researchers should be free to experiment – and stand on each other’s work.

That last point is important, as this is about far more than any one technology. It may take that shared effort just to work out what the heck this stuff is for and how to use it, and to iterate over time to make things like skeletal detection more accurate, more human, and more economical (including in regards to system resources and latency).

We knew Microsoft’s little camera would be looking out at us. Now we get to look back at it. I can’t wait to see what happens next.

Microsoft’s camera, and the productization of the technologies behind it, could be the start of something big. Photo (CC-BY-SA) Phillip Torrone.
  • Steve Elbows

    Thanks for featuring this stuff!

    Since I emailed you there have been further developments: there is now an official OpenNI wrapper for Unity!


    Like the other OpenNI stuff, we still need to wait for a Mac version of NITE before it’s of much use on OS X, and some people are still having trouble getting their heads round joint rotation issues for rigged characters, but I expect these issues to get cleared up pretty quickly.

    As I’m on a Mac I haven’t really used the above yet, but I have continued to dabble with OSC-based stuff in the meantime. Here are some basic tests of me interacting with the physics stuff that Unity provides:

    http://www.vimeo.com/18307795 http://www.vimeo.com/18305217

    I will be very interested to see what happens with the hardware in 2011. The OpenNI & NITE stuff has come from PrimeSense, the company which provided the technology to Microsoft that powers the Kinect (although Microsoft used their own body-tracking software rather than PrimeSense’s own, which is what we get with NITE). Their business model seems pretty focussed on supplying the hardware to other companies which will bring devices to market, with the Kinect being the first high-profile use of it. So it’s possible that some other company will manufacture their own sensors aimed at the mass PC consumer market.

    For whilst the Kinect sensor works on computers already, it’s a bit of a grey area whether, if for example I had some computer games ready to sell that use this technology, I would be able to market them without running into issues, especially if I dared to mention the word Kinect. (Technically speaking, Kinect refers to the complete Xbox 360-based solution, including all the software work Microsoft have done to make it a robust experience on that platform.)

  • Steve Elbows

    Also, it’s probably worth pointing out that the actual skeletal tracking stuff is not open source. The skeletal tracking and a bunch of other stuff is part of the NITE middleware, which PrimeSense have released as a binary (for Windows & Linux at this stage, but also OS X soon, I think). However, because the NITE middleware plugs into the OpenNI stuff, which is open source, a world of possibilities is still open to the wider community. But it does mean that we cannot tweak or improve the actual skeletal tracking algorithms themselves, at least not without starting this work from scratch.

  • nalleli

    hey the links are not working!

  • Joseph

    Unity stuff is very cool!

    also see http://vimeo.com/18008386

  • Steve Elbows

    NITE for OS X is now available.

    As I have not previously installed any of the components that make this stuff work on my mac, I am still battling with getting it all installed so I can test.

  • Steve Elbows

    Vague guide to installing this stuff on OS X; the following worked for me:

    1. Install MacPorts.
    2. Use MacPorts to install libusb-devel (for me this failed on first try, so I had to use MacPorts to install libtool first, and then it worked).
    3. Install OpenNI using the latest unstable-branch binaries that PrimeSense provide.
    4. Install NITE using the latest binaries from PrimeSense and the license key they give on their site.
    5. Install the driver – the one PrimeSense provide is for their reference hardware, not the Kinect, but there are suitable OpenNI Kinect drivers available, either from the ROS project or from here:

    I can then run the various OpenNI samples, including the User Tracking one that features skeletal tracking, which shows that the NITE stuff is working.
    I don’t think the Unity wrapper works on the Mac yet, but someone is looking at refactoring it so that it does.

  • Steve Elbows

    I still haven’t seen the Unity wrapper working on OS X, but OSCeleton is now available for OS X.


    There is also a google group for OSCeleton now.


    If anybody has any thoughts on OSCeleton supporting different OSC message formats, please check out the Google group, as I have started a discussion about this and what is needed for the OSC to work well in Quartz Composer.

    PrimeSense have also reiterated that the NITE middleware is staying free, including for commercial use, although they do seem to have some rules, such as no porn or drug promotion.

  • Steve Elbows

    OSCeleton now has a mode which sends OSC messages that are better suited to Quartz Composer, yay.

    As for the alternative Unity wrapper that works on Macs, it’s steadily improving but still has some quite bad issues at the moment, so I recommend going down the OSCeleton->OSC->Unity route for now if you are on OS X, although this could change any day.

  • Why have all the sites been pulled? O.o

  • Nevermind, some temporary bug… ?
