Visionary 3D scanning, computer vision, and digital media guru Kyle McDonald is back again with more tools that break down the boundary between the computer and the world. Kyle tells us he spent a great part of the fall in residence at the Yamaguchi Center for Arts and Media (YCAM) in Japan. He worked with experimental projector/camera rigs — and now we get to enjoy some fruits of those labors.

http://interlab.ycam.jp/en/projects/guestresearch/vol1

The software collection is free and open source (MIT License), built for openFrameworks, and already includes some documentation. (The ideal is to contribute the work back to the core of OF, adding to this artist-savvy C++ coding tool.)

Mapamok

A new experimental projection mapping tool called Mapamok is the major highlight for me. As Kyle tells CDM, “the idea is to get people away from clicking on so many points and drawing so many masks. Instead, you load a 3D model of the scene and then click on a small number of points (8 to 12 points), and the whole projection is automatically calibrated.”

Automatic mapping and calibration? Sounds good to me.
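To give a rough sense of how that style of calibration works (a generic sketch, not code from ProCamToolkit): each clicked point gives you a correspondence between a 3D model vertex and a 2D projector pixel, and from six or more correspondences you can solve for the projector's full 3×4 projection matrix with the classic Direct Linear Transform.

```python
import numpy as np

def calibrate_dlt(points_3d, points_2d):
    """Estimate a 3x4 projection matrix from >= 6 non-coplanar
    2D-3D point correspondences using the Direct Linear Transform."""
    rows = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # The projection matrix (up to scale) is the right singular vector
    # associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return Vt[-1].reshape(3, 4)

def project(P, points_3d):
    """Project 3D points through P, returning 2D pixel coordinates."""
    homogeneous = np.c_[points_3d, np.ones(len(points_3d))] @ P.T
    return homogeneous[:, :2] / homogeneous[:, 2:3]

# Synthetic check: build a known projector, "click" 8 cube corners,
# then recover the projection from the correspondences alone.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
Rt = np.hstack([np.eye(3), np.array([[0.5], [0.5], [5.0]])])
P_true = K @ Rt
cube = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1],
                 [1, 1, 0], [1, 0, 1], [0, 1, 1], [1, 1, 1]], dtype=float)
clicked = project(P_true, cube)
P_est = calibrate_dlt(cube, clicked)
reprojection_error = np.abs(project(P_est, cube) - clicked).max()
```

With noiseless correspondences the recovered matrix reprojects the points exactly (up to floating-point error); in practice mapamok's handful of clicked points plays the role of the synthetic cube corners here.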

The tool is completely free, as part of the ProCamToolkit. There’s both a download and an (English) tutorial guide:
http://github.com/YCAMInterlab/ProCamToolkit/downloads
http://github.com/YCAMInterlab/ProCamToolkit/wiki/mapamok-%28English%29

Video, at top, is described thusly:

Music: “Mass” by Vaetxh [Vimeo]
mapamok is an experimental projection mapping application that is part of the Projector Camera Toolkit developed during Guest Research Project v.1 at YCAM Interlab.
This video demonstrates a workflow for fast projection mapping involving: measuring and modelling the scene (5-30 minutes) followed by installing and calibrating the projector (5 minutes). Because mapamok is built with openFrameworks, everything is rendered in realtime. This allows for the technique to easily be extended to interactive applications. Near the end, an example of an interactive sound+visual mapping is shown.

But there’s more to this kit, which is cool:

3D Scanning Meets Projection

“The 3D scanning + projection stuff is something I hinted at being possible a long time ago, back when I got started with structured light,” Kyle explains. “Elliot Woods has also done some great work using the [Microsoft] Kinect for relighting.”

For more on 3D scanning and Kyle’s “structured light” concept – an effect we’ve seen in a number of music videos now – everything is at the Google Code site, from code to discussion:
http://code.google.com/p/structured-light/

Here’s what’s happening now. (Alas, poor Yorick, I knew him, a fellow of infinite jest, of most excellent fancy. Also, glad he wasn’t a dissected frog. That’s just cruel.)

Music: “Hoofbeat” by Dustmotes [SoundCloud]
This video demonstrates an experimental projection mapping application that is part of the Projector Camera Toolkit developed during Guest Research Project v.1 at YCAM Interlab.
First, the scene (the skull) is scanned using gray code structured light. This scan data is decoded into a 3D point cloud. This is possible because the camera and projector are calibrated and the position and orientation of each is known in advance. At :33 there is a brief demo of editing a GLSL shader in realtime, making a minor change that changes the width of the projected lines. Then the skull itself is shown again, with the projection mapped visuals.
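The gray code encoding mentioned there is easy to sketch in isolation (a toy Python illustration of pattern generation and decoding only; the toolkit itself is C++/openFrameworks): each projector column gets a binary-reflected Gray code, one black-and-white stripe pattern is projected per bit, and the on/off sequence a camera pixel observes decodes back to the column that illuminated it.

```python
import math

def gray_encode(n):
    """Binary-reflected Gray code of n (adjacent codes differ by 1 bit)."""
    return n ^ (n >> 1)

def gray_decode(g):
    """Invert the Gray code back to a plain binary index."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

def stripe_patterns(width):
    """One boolean stripe pattern per bit: patterns[k][x] is True when
    bit k of the Gray code of projector column x is lit."""
    bits = max(1, math.ceil(math.log2(width)))
    return [[(gray_encode(x) >> k) & 1 == 1 for x in range(width)]
            for k in range(bits)]

def decode_pixel(observed_bits):
    """Recover the projector column from the on/off sequence a single
    camera pixel observed across the projected pattern sequence."""
    g = 0
    for k, bit in enumerate(observed_bits):
        g |= int(bit) << k
    return gray_decode(g)

# Round trip: every column of a 16-pixel-wide projector survives
# encoding into stripes and decoding from the observed bits.
WIDTH = 16
patterns = stripe_patterns(WIDTH)
decoded = [decode_pixel([p[x] for p in patterns]) for x in range(WIDTH)]
```

Gray code is preferred over plain binary here because neighboring columns differ in only one bit, so a decoding error at a stripe boundary shifts the result by at most one column. Once each camera pixel is labeled with a projector column (and row, with a second pattern set), the calibrated camera/projector geometry triangulates the 3D point cloud.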

shadowplay

“The shadowplay work, I’m pretty sure it hasn’t been done like this before,” says Kyle. “It’s the first steps towards a bigger project I have in mind.” It’s a bit tough to follow, but the idea is aligning two projectors, then – using the white light they create together – manipulating shadow in a way that would normally be physically impossible. It’s clearly just a first step, but you can see some compelling potential here that could evolve along a different path than we’ve seen before.

That said, for me the best part of this is hearing some of Kyle’s music, which is also lovely. (Is there anything you don’t do in digital media, Kyle?)

Music: “Say It” by kylemcdonald [SoundCloud]
shadowplay shows an experimental interaction paradigm built with the Projector Camera Toolkit, developed during Guest Research Project v.1 at YCAM Interlab.
This video demonstrates some shadowplay experiments, built around the idea of using two projectors that are perfectly aligned. The projectors display the inverse images of each other, so that without interference the screen appears white. When an object enters the space, the structure hidden in the projection is revealed in the shadows. The technical challenge of aligning the projections and calibrating the colors of the projection is solved with the Projector Camera Toolkit, using structured light scanning and some novel feedback-based calibration techniques.
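The inverse-image idea itself is simple to sketch (a toy numpy illustration, not the toolkit’s code – the hard part, as the description notes, is the alignment and color calibration): split a hidden image into two complementary projections that sum to flat white wherever both beams land, so the structure only appears where one projector is shadowed.

```python
import numpy as np

def split_for_shadowplay(hidden, ceiling=255):
    """Split a hidden grayscale image into two complementary projector
    images whose pixelwise sum is uniform white (the ceiling value)."""
    a = hidden.astype(np.uint8)
    b = (ceiling - a).astype(np.uint8)
    return a, b

# A tiny hidden image: where both projectors reach the screen, the sum
# is flat white; in projector B's shadow, only A lands, revealing the image.
hidden = np.array([[0, 64], [128, 255]], dtype=np.uint8)
proj_a, proj_b = split_for_shadowplay(hidden)
overlap = proj_a.astype(int) + proj_b.astype(int)   # both beams land
in_shadow_of_b = proj_a                             # only A lands
```

The elegance is that the "display" is the shadow itself: the unobstructed screen carries no visible information, and an object entering the beams reveals one projector’s half of the split.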

http://github.com/YCAMInterlab/ProCamToolkit