“Augmented dancing” is a phrase we’ve gradually been slipping in to describe projection mapping directly onto dancers. The basic effect has always been easy enough to pull off, but now it’s easier than ever to mask your projection so it doesn’t also spill onto the wall behind the dancer – or even to go nuts and generate your visuals from intelligent tracking. Thank/blame the Kinect, and accessible tools for using it. MadMapper even has an experimental, very basic tool for creating the mask from Kinect input, though for more sophisticated control, you’ll want to look to open source environments like OpenFrameworks and Processing and their Kinect libraries.
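If you want to roll your own, the core masking trick is just a depth threshold. Here’s a minimal sketch of the idea in openFrameworks, assuming the ofxKinect addon – the threshold values are placeholders you’d tune to your stage, and this is my sketch of the general technique, not any particular tool’s implementation:

```cpp
// Sketch only: threshold the Kinect depth image into a dancer-shaped mask.
// Assumes the ofxKinect addon; threshold values are placeholders to tune.
#include "ofMain.h"
#include "ofxKinect.h"

class ofApp : public ofBaseApp {
public:
    ofxKinect kinect;
    ofImage mask;                      // white = dancer, black = backdrop
    unsigned char nearThreshold = 230; // 8-bit depth: nearer = brighter
    unsigned char farThreshold  = 70;  // dimmer than this = background

    void setup() {
        kinect.setRegistration(true);  // align depth with the RGB camera
        kinect.init();
        kinect.open();
        mask.allocate(kinect.getWidth(), kinect.getHeight(), OF_IMAGE_GRAYSCALE);
    }

    void update() {
        kinect.update();
        if (!kinect.isFrameNew()) return;
        // Keep only the depth band the dancer occupies, so the projection
        // doesn't spill onto whatever is behind them.
        ofPixels &depth = kinect.getDepthPixels();
        ofPixels &m     = mask.getPixels();
        for (size_t i = 0; i < depth.size(); i++) {
            m[i] = (depth[i] > farThreshold && depth[i] < nearThreshold) ? 255 : 0;
        }
        mask.update();
    }

    void draw() {
        mask.draw(0, 0); // stand-in: real visuals get multiplied by the mask
    }
};

int main() {
    ofSetupOpenGL(640, 480, OF_WINDOW);
    ofRunApp(new ofApp());
}
```

In a real rig, you’d feed that mask into whatever is generating your visuals – multiply in a shader, or use it as a luma matte in your mapping software – rather than drawing the mask itself.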

Before getting deeper into those technical details, let’s look at another example of what’s possible. Daniel Schwarz, whose work I’d planned to feature separately, has an interesting proof of concept in his piece AXIOM.3. For this work, he uses another great development tool: vvvv. Here’s how he describes the work (I’ve added a rough code sketch of the crank-to-tempo mapping after his notes):

AXIOM.3 is an interactive dance performance augmented with projection mapping. The whole performance is controlled by the audience with a self-built street organ.
The rotational speed of the street organ affects song tempo, dancing and visuals. The box on the left side visualizes the relative speed.
The projection mapping on the dancer is generated in realtime with a Kinect camera.

Everything in realtime.
All done in vvvv with three computers, three projectors, a Kinect, and a PS3 camera.

Music [top]: Orquestra Popular De Paio Pires – Kyoto Melody
myspace.com/orquestrapopulardepaiopires

Music [second from top]: Timatim Fitfit – No How
soundcloud.com/timatimfitfit
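Schwarz hasn’t published the patch, and vvvv is a visual environment rather than a textual one, but the crank-to-tempo mapping he describes is simple enough to sketch in code. Here’s a rough C++ version of the idea – the names, nominal speed, and smoothing are all my assumptions, not his implementation:

```cpp
// Hypothetical crank-to-tempo mapping, after the description above. The
// names, nominal speed, and smoothing factor are my assumptions, not
// anything pulled from Schwarz's actual vvvv patch.
#include <algorithm>

struct CrankToTempo {
    float smoothed = 0.0f; // low-pass filtered crank speed, rev/sec

    // Feed in the sensed crank speed once per frame; get back a relative
    // speed where 1.0 = nominal tempo and 0.0 = stopped. Multiply your
    // audio playback rate and animation clock by the result.
    float update(float revPerSec) {
        const float nominalRevPerSec = 1.0f; // crank speed for 100% tempo
        const float alpha = 0.1f;            // smoothing to tame hand jitter
        smoothed += alpha * (revPerSec - smoothed);
        return std::clamp(smoothed / nominalRevPerSec, 0.0f, 2.0f);
    }
};
```

The same smoothed value could also drive the “relative speed” readout he mentions on the box at left.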

Seen other dance work? We may shortly need to do a round-up, so send it in.

More on vvvv, which I simply pronounce “vvvvvvivvvvvvvvvvvvvv….. vvvvv” (I like to run it on my Eeeeeeeeeeeeeeeeeee PC):
http://vvvv.org/