The work by Theo Watson and team has been one of those magical technological revelations that makes people say “oh – that’s what that’s for.”

Say “computer vision” or “tracking,” or show the typical demo of what these can do with interaction, and eyes glaze over. But make them work as puppetry, and somewhere deep inside the mechanisms by which we human beings interact with our world, something lights up.

With iteration, that first proof of concept just gets better. Theo writes to share that he and collaborator Emily Gobeille made a second version of the project. In “Puppet Parade,” the Interactive Puppet Prototype 2.0, the barrier between digital realm and human gesture gets a bit thinner.

But don’t just watch the edited demo – see what it looks like in action below, along with a brief visual look at how the system works. (Bonus: Theo wrote the tools on which the whole system was based – and shared them with a well-connected global community of hackers via his open source library.)

Description and credits:

Puppet Parade is an interactive installation that allows children to use their arms to puppeteer larger than life creatures projected on the wall in front of them. Children can also step into the environment and interact with the puppets directly, petting them or making food for them to eat. This dual interactive setup allows children to perform alongside the puppets, blurring the line between the ‘audience’ and the puppeteers and creating an endlessly playful dialogue between the children in the space and the children puppeteering the creatures.

Puppet Parade premiered at the 2011 Cinekid festival in Amsterdam. Puppet Parade is made with openFrameworks and the ofxKinect addon.

Project page: design-io.com/?p=15

Credits:
Project by Design I/O – design-io.com
Exhibited at Cinekid Media Lab 2011 – cinekid.nl
Sound Design by: m-ost.nl
Video by Thiago Glomer / Go Glo – thiagobrazil.blogspot.com

Here’s the live version, unedited, for a better feel of what this project is like in person.

Note the interaction on two planes. (Kyle McDonald, another superstar of the Kinect community, points to this element in comments on Vimeo – thanks, Kyle.)

And for a peek behind the curtain, you can also see the tracking that drives the interaction:
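The installation itself is built on openFrameworks and ofxKinect, but the core idea behind the two-plane interaction – segmenting a Kinect depth frame into a near zone (children in the space) and a far zone (the puppeteers’ arms) – can be sketched in a few lines. This is an illustrative Python/NumPy sketch with made-up threshold values, not Design I/O’s actual code:

```python
import numpy as np

def split_planes(depth_mm, near_max=1200, far_min=1800, far_max=3000):
    """Split a Kinect-style depth frame (millimetres) into two
    interaction planes: a 'near' mask for people stepping into the
    scene and a 'far' mask for the puppeteers' arms.
    Thresholds are hypothetical, chosen only for illustration."""
    near = (depth_mm > 0) & (depth_mm <= near_max)   # zero = no reading
    far = (depth_mm >= far_min) & (depth_mm <= far_max)
    return near, far

# Toy frame: left half at 1000 mm (near plane), right half at 2000 mm (far plane).
frame = np.zeros((4, 8), dtype=np.uint16)
frame[:, :4] = 1000
frame[:, 4:] = 2000

near_mask, far_mask = split_planes(frame)
print(near_mask.sum(), far_mask.sum())  # 16 16
```

In a real system, each mask would then feed a blob tracker to locate arms and bodies; keeping the two depth bands separate is what lets one group of children puppeteer while another interacts with the puppets directly.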

Filip at Creative Applications goes into greater technical detail, covering the libraries used and other specifics. Two particular tips: motor control allows the system to adjust to different heights, and the depth of field benefits from the use of two cameras.