As an input device, Microsoft’s Kinect has its shortcomings – largely summed up by saying that you don’t always want to be waving your arms around through space just to accomplish a task.
But one place Kinect really shows the potential of mainstream computer vision is in its ability to track the body. And that means some serious possibilities for making human beings into dynamic projection surfaces, augmenting their bodies with projected imagery. I remember my own frustrating experiments with this during a residency a few years ago at Dance Theater Workshop.
Francois Wunschel writes in with an early test session of the technology, and it’s already looking compelling. Latency remains a bit of a challenge – fast movements mean your body gets ahead of the projection. But there are moments when the minimal, sketch-like visuals are simply sublime.
To prove the concept, they’re hacking together a lot of tools – Kinect with OSCeleton with Apple’s Quartz Composer with Cycling ’74’s Max for Live running in Ableton Live with projection-mapping tool MadMapper. But that can be a good way to get a prototype going quickly, and it should certainly be possible to do something with just one tool.
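The glue in a chain like this is usually OSC: OSCeleton reads skeleton data from the Kinect drivers and broadcasts each joint as an OSC message (its documented default is a `/joint` message carrying the joint name, a user ID, and x/y/z coordinates). As a rough sketch of what any tool downstream has to do, here’s a minimal, standard-library-only parser for that kind of packet – the `,sifff` type-tag layout is an assumption based on OSCeleton’s described format, so treat it as illustrative rather than a drop-in client:

```python
import struct

def _read_padded_string(buf, offset):
    """Read an OSC string: ASCII bytes, null-terminated,
    padded with nulls to a 4-byte boundary."""
    end = buf.index(b'\x00', offset)
    s = buf[offset:end].decode('ascii')
    # Skip the terminator and padding (next 4-byte boundary).
    return s, (end + 4) & ~3

def parse_joint_message(packet):
    """Parse an OSCeleton-style /joint message (assumed layout):
    address "/joint", type tags ",sifff" -> joint name (string),
    user id (int32), then x, y, z (float32, big-endian)."""
    address, off = _read_padded_string(packet, 0)
    tags, off = _read_padded_string(packet, off)
    if address != '/joint' or tags != ',sifff':
        return None  # not a joint message in the expected shape
    name, off = _read_padded_string(packet, off)
    user_id, x, y, z = struct.unpack_from('>ifff', packet, off)
    return name, user_id, (x, y, z)
```

From here, a patch (or script) would just map those normalized joint coordinates into projector space – which is exactly the part MadMapper and Quartz Composer are handling in their setup.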
More info on their blog:
They’ve also posted lovely long-exposures of the effect:
This isn’t becoming Create Kinect Motion, don’t worry, but it is fascinating to watch this evolve. And if you’re in New York this weekend, there’s a hacklab going on. I can’t make it – visiting my father in Florida – so if you can, take some photos and video and notes and report back!
And if you’re not in NYC, which of course the vast majority of you aren’t, you’ll be pleased to know this:
We will also be working on two new resources for the community: http://openvoxel.org and http://wiki.openvoxel.org, which we hope will come to address some of the needs of people and organizations working with this technology.