With projection mapped to more than just the cinema-style plane, and with smarter, integrated sensing and vision tools like Microsoft's Kinect, all sorts of surfaces can come alive. Together, these transform projection from something cinematic into an interactive, dynamic part of the space.

So, while this project isn't new (both videos were released last year), it's worth revisiting the work of Augmented Engineering. Brett Jones and Rajinder Sodhi, researchers at the University of Illinois, have resumes that include Walt Disney Imagineering. And their work looks even more promising now, having seen this week's efforts to build cross-platform mapping and interaction frameworks and to use voxels to make dynamic, augmented 3D spaces.

And it all means this could soon be a full-blown artistic medium, not just a research project – particularly when a VJ can go to a local GameStop and buy a Kinect for less than the cost of a projector bulb.

In the video at top:

ISMAR 2010 – Best Student Paper Award (ismar10.org)

We present a novel way of interacting with everyday objects by representing content as interactive surface particles. Users can build their own physical world, map virtual content onto their physical construction and play directly with the surface using a stylus. A surface particle representation allows programmed content to be created independent of the display object and to be reused on many surfaces. We demonstrate this idea through a projector-camera system that acquires the object geometry and enables direct interaction through an IR tracked stylus. We present three motivating example applications, each displayed on three example surfaces. We discuss a set of interaction techniques that show possible avenues for structuring interaction on complicated everyday objects, such as Surface Adaptive GUIs for menu selection. Through an informal evaluation and interviews with end users, we demonstrate the potential of interacting with surface particles and identify improvements necessary to make this interaction practical on everyday surfaces.

Paper: brettrjones.com/data/SIE.pdf
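To make the surface-particle idea concrete, here's a minimal sketch of how content can be decoupled from the display object: particles are scattered over a scanned surface, each gets an object-independent content coordinate, and the content program is written only against that coordinate. The function names and the height-based parameterization here are hypothetical illustrations, not the paper's actual code.

```python
import numpy as np

def sample_particles(vertices, n=5000, seed=0):
    """Scatter surface particles over a scanned mesh's vertex cloud
    (crude sampling; a real system would sample faces uniformly)."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(vertices), size=min(n, len(vertices)), replace=False)
    points = vertices[idx]
    # Assign each particle an object-independent content coordinate,
    # here simply its normalized height on the object.
    z = points[:, 2]
    u = (z - z.min()) / (np.ptp(z) + 1e-9)
    return points, u

def ripple_content(u, t):
    """Example content program: an animated ripple written purely against
    the particle coordinate u, never against the object's geometry."""
    return 0.5 + 0.5 * np.sin(10.0 * u - 3.0 * t)  # per-particle brightness
```

Because the content function never touches the geometry directly, the same program could be remapped onto a cardboard box, a clay model, or any other scanned surface, which is exactly the reuse the abstract describes.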

As with the work we saw earlier this week, the team has also experimented with adding Kinect to the mix (relevant not only to Kinect but to any similar sensing/vision scenario):

A Kinect-projector visualization where distances from the Kinect sensor are visualized through colors (blue=close and red=far). The Kinect sensor and a standard projector are calibrated using the Projector-Camera Calibration Library (an open-source project to be released soon).
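That blue-to-red coloring is essentially a depth-to-color transfer function. Here's a minimal Python sketch, assuming a depth image in millimeters from the calibrated sensor; the near/far clipping values are placeholder assumptions, not settings from the calibration library.

```python
import numpy as np

def depth_to_color(depth_mm, near=500.0, far=4000.0):
    """Map a depth image (in millimeters) to a blue->red gradient:
    blue = close to the sensor, red = far, as in the video."""
    t = np.clip((depth_mm - near) / (far - near), 0.0, 1.0)
    rgb = np.zeros(depth_mm.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = (t * 255).astype(np.uint8)          # red grows with distance
    rgb[..., 2] = ((1.0 - t) * 255).astype(np.uint8)  # blue fades with distance
    return rgb
```

Once the projector and Kinect are calibrated, that RGB image can be warped into the projector's view and thrown back onto the scene.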

Via comments on the first video comes some more interesting work: mapping projections onto a clay model, in this case to model soil erosion, though that's hardly the only application.

Working with Tangible GIS
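A rough sketch of how an erosion visualization like this might work: scan the clay as a heightfield, diff it against a reference scan, and color the change for projection back onto the model. The helper name and the scale parameter are assumptions for illustration, not code from the Tangible GIS project.

```python
import numpy as np

def erosion_overlay(height_now, height_ref, scale=50.0):
    """Color the change between the current clay heightfield and a
    reference scan: red = material removed, green = material added.
    scale is the height change (mm) treated as full intensity."""
    d = np.clip((height_now - height_ref) / scale, -1.0, 1.0)
    rgb = np.zeros(height_now.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = (np.clip(-d, 0.0, 1.0) * 255).astype(np.uint8)  # loss -> red
    rgb[..., 1] = (np.clip(d, 0.0, 1.0) * 255).astype(np.uint8)   # gain -> green
    return rgb
```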

The blog hasn’t seen an update since January, but maybe we’ll get their attention:
http://augmentedengineering.wordpress.com/

And if all of this makes you want to start playing with toys, you’re in luck — I love this Kinect + LEGO research, as linked from the Augmented Engineering blog:

The work is a project of Intel Labs Seattle and the University of Washington. Combining experts in vision, robotics, and human-computer interaction, the team is investigating all kinds of good stuff. They describe the work thusly:

OASIS is a flexible software architecture that enables us to prototype applications that use depth cameras and underlying computer vision algorithms to recognize and track objects and gestures, combined with interactive projection. Here we show OASIS in an interactive Lego playing scenario.
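As a hedged illustration of one building block such an architecture needs (not Intel's actual OASIS code), here's how a depth camera can pick out objects sitting on a known surface by subtracting a captured background frame, using OpenCV; the thresholds are assumptions:

```python
import cv2
import numpy as np

def find_objects(depth_mm, background_mm, min_height=15.0, min_area=200):
    """Segment objects (e.g., LEGO bricks) on a tabletop by comparing the
    live depth frame against a background scan of the empty surface."""
    # Anything closer to the camera than the background by min_height mm
    # is a candidate object; clean up speckle with a morphological open.
    mask = ((background_mm - depth_mm) > min_height).astype(np.uint8) * 255
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    # Label 0 is the background; keep blobs large enough to be real objects.
    return [(tuple(centroids[i]), int(stats[i, cv2.CC_STAT_AREA]))
            for i in range(1, n) if stats[i, cv2.CC_STAT_AREA] >= min_area]
```

From there, recognized object positions can drive projected content back onto the bricks themselves.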