Computer vision for tracking movement is cool. But add the ability to track actual objects, and you extend the possibilities for interfaces. We’ll be playing around with this at our upcoming tangible hackday. A big reason these experiments can proliferate is the availability of free frameworks that make the technology accessible to artists and designers. The tricky tracking work is already done, leaving you free to focus on where that tracking might actually be useful.
The other good news: while doing projected visual feedback or fancier tracking can get more complex and costly, if you just want to track some objects, all you need is a USB or FireWire camera and some printed stickers. Cost: $40 with a webcam, about $5 without.
Recently, one of the most popular of these libraries got a big update: reacTIVision 1.4. It’s the open-source, multi-platform framework that powers the reacTable, and was developed by Martin Kaltenbrunner and Ross Bencina at the Music Technology Group at the Universitat Pompeu Fabra in Barcelona, Spain.
Multi-platform really means multi-platform. It works with FireWire and USB cameras, Mac, Windows, and Linux, and has clients for C++, Java, C#, Processing, Pure Data, Max/MSP/Jitter, Quartz Composer, and Flash, plus a wide range of applications that support the OpenSoundControl-based TUIO protocol.
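Whatever the client language, consuming TUIO comes down to the same bookkeeping: each frame announces which session IDs are currently alive along with the state of any that changed, and the client diffs that against the previous frame to derive add, update, and remove events. Here’s a rough sketch of that logic in Python – the class and method names are mine for illustration, not part of any official TUIO client API:

```python
# Illustrative sketch of the per-frame bookkeeping a TUIO client performs.
# Each frame delivers "alive" (the session IDs currently tracked) and "set"
# messages (state for IDs that moved); diffing against the previous frame
# yields add/update/remove events for the application.

class TuioObjectTracker:
    def __init__(self):
        # session_id -> (fiducial_id, x, y, angle)
        self.objects = {}

    def on_frame(self, alive, updates):
        """alive: session IDs present this frame; updates: session_id -> state."""
        events = []
        for sid in alive:
            if sid not in self.objects:
                events.append(("add", sid))
            elif sid in updates and updates[sid] != self.objects[sid]:
                events.append(("update", sid))
        for sid in list(self.objects):
            if sid not in alive:
                # ID dropped out of the alive list: the object left the table
                events.append(("remove", sid))
                del self.objects[sid]
        for sid, state in updates.items():
            self.objects[sid] = state
        return events
```

Note that a stationary object keeps appearing in the alive list without any set message, so it produces no event – which is exactly the behavior you want for tangibles sitting on a table.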
You can grab the library at:
I’ll actually be testing both the tracker in reacTIVision and the Trackmate tracker from the LusidOSC project. The Trackmate software is built in OpenFrameworks. It does use a different protocol (LusidOSC), but that’s also based on OSC, and there’s even a tool that translates to TUIO.
For an example of what this all looks like when assembled – and some of the power of having a framework on which to build – here’s a tangible interface for a multiplayer game. It’s Pong with objects.
This game uses the reacTIVision software, along with Flash, to detect the movement of blocks marked with fiducial symbols. The game is played by moving these symbols on a table. Players can enter and exit the playing field at any time, and the game adapts to the number of players. The lower the score, the better; the first player to reach a score of 12 ends the game.
Music: Waterdrops by Yohan Shin http://www.geocities.com/cerup2
There’s more progress coming in reacTIVision world, too. First up: reacTIVision 1.5. Martin tells us:
After this release I am now implementing reacTIVision 1.5, which will improve the multi-touch tracking performance, and already implement the upcoming TUIO 1.1 blob tracking extensions, for the transmission of basic untagged object descriptors.
Following that, the next plan is for TUIO2, an expanded protocol that will address some of the shortcomings of the first version, to be released with a future update to reacTIVision. You can read the full specification for the new protocol, but Martin has kindly given us a Cliff Notes version:
To summarize, TUIO2 has a flat profile, which now includes symbol, cursor and blob descriptors in much more detail. Symbols now also can carry content info (e.g. datamatrix), cursors have additional properties such as type, pressure and region of influence, and blobs can be described in various incremental messages that describe the bounding, contour and skeleton for example.
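Reading that summary, you could picture the three descriptor families as simple data structures. The sketch below is just my own shorthand for what Martin describes – the field names are illustrative, not the actual TUIO2 message parameters, so check the specification before building against it:

```python
# Rough data-structure sketch of the three TUIO2 descriptor families
# described above (tokens, pointers/cursors, blobs). Field names are
# illustrative shorthand, not the official TUIO2 message parameters.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Token:
    """A tagged object, e.g. a fiducial-marked block."""
    session_id: int
    symbol_id: int
    x: float
    y: float
    angle: float
    content: Optional[str] = None  # e.g. a decoded datamatrix payload

@dataclass
class Pointer:
    """A cursor/touch, with the extra properties TUIO2 adds."""
    session_id: int
    x: float
    y: float
    pressure: float = 0.0
    region: float = 0.0  # region of influence

@dataclass
class Blob:
    """An untagged blob; TUIO2 can describe it incrementally."""
    session_id: int
    x: float
    y: float
    angle: float
    width: float
    height: float
    area: float
    # contour and skeleton would arrive as separate incremental messages
```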
The other important thing Martin is doing – and the reason for the wait – is synchronizing the implementation of TUIO across other key libraries and clients. That is what helps keep TUIO a standard for this kind of work. It’s not even really a full protocol – part of its beauty is that it builds on OSC.
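To make “builds on OSC” concrete: a TUIO 1.0 object update is just an ordinary OSC message to the /tuio/2Dobj address, carrying a “set” string followed by the session ID, fiducial ID, position, angle, and motion parameters. Here’s a minimal sketch of assembling one by hand in Python (zeroing the velocity and acceleration fields for simplicity) – a real application would of course use an OSC or TUIO library rather than packing bytes itself:

```python
# Minimal sketch of a TUIO 1.0 /tuio/2Dobj "set" message as raw OSC bytes.
# Shown only to illustrate that TUIO is plain OSC underneath; use a proper
# OSC/TUIO library in practice.

import struct

def osc_string(s):
    """OSC strings are null-terminated and padded to a 4-byte boundary."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def tuio_2dobj_set(session_id, fiducial_id, x, y, angle):
    """Build a 'set' message; velocity/acceleration parameters zeroed here."""
    # Arguments: "set", session (i), fiducial class (i),
    # then x, y, angle, velocities, accelerations as 8 floats
    typetags = ",siiffffffff"
    msg = osc_string("/tuio/2Dobj") + osc_string(typetags) + osc_string("set")
    msg += struct.pack(">ii", session_id, fiducial_id)
    msg += struct.pack(">8f", x, y, angle, 0.0, 0.0, 0.0, 0.0, 0.0)
    return msg
```

In a live session these messages travel bundled with “alive” and “fseq” messages over UDP (port 3333 by convention), which is why any OSC-capable environment – Pd, Max, Flash via a gateway – can speak TUIO.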
If this isn’t quite making sense yet, stay tuned and we’ll show some of the specific applications and get you started with your own projects.