Arguably, democratizing a technique is an excellent way to improve craft. (See: linear perspective in Renaissance painting.) So, why not do that with projection mapping?

Here’s the latest installment. Elliot Woods of Kimchi and Chips is one of the leading practitioners of the projection-mapping arts. He also gives away a lot of his tools (see GitHub, below). And now he’s giving away some of his knowledge, not only for projection mapping, but also for what he calls “3D projection mapping.” More on that in a moment, but first, let’s review his work:

An Augmented Tree, and Free Tools Power 3D Voxel Projection — on Leaves

Kinect-Augmented Reality, as Projection Mapping Meets Depth Sensing (Hint: It’s Awesome)

What’s this “3D” thing about? The idea is to scan the world in three dimensions, then use that three-dimensional model for mapping. That’s a big step from the distorted-plane techniques that generally constitute mapping. The result is an accurate mathematical model of what’s happening in 3D, a kind of genuinely digital trompe-l’œil.

Elliot is not only doing a workshop (see top), but has also assembled a detailed, step-by-step walkthrough of the technique using free tools and the superb visual environment vvvv. He explains:

Using ReconstructMe we rapidly scan a real world scene and export a 3D mesh file which can be used for projection mapping. This means that within 10-20 minutes we can go from having seen an environment for the first time, to augmenting it using 3D projection mapping.

What I’d describe as ‘3D projection mapping’ is the act of re-projecting a virtual 3D object onto its real world counterpart using a video projector*. Thereby all of the features of the real object which are visible from the point of view of the projector have an image projected onto them, and this image is ‘extracted’ from the corresponding surfaces of the virtual counterpart object. Augmenting the virtual object then results in an implicit augmentation of the real world object.

* thereby defining ‘2D projection mapping’ as lining up 2D shapes in a projector image with real world features in the projector’s line of sight.
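The core of what Elliot describes is standard projective geometry: treat the projector as an inverse pinhole camera, so that every vertex of the scanned virtual mesh maps to a pixel in the projector image. A minimal sketch of that mapping, with entirely hypothetical calibration values (in a real setup the intrinsics and pose come from a projector-calibration step, handled inside vvvv in his tutorial):

```python
import numpy as np

def project_vertex(vertex, K, R, t):
    """Project one world-space vertex into projector pixel coordinates,
    modelling the projector as an inverse pinhole camera."""
    cam = R @ vertex + t              # world space -> projector space
    u, v, w = K @ cam                 # apply projector intrinsics
    return np.array([u / w, v / w])   # perspective divide

# Hypothetical 1024x768 projector with a ~1500 px focal length.
K = np.array([[1500.0,    0.0, 512.0],
              [   0.0, 1500.0, 384.0],
              [   0.0,    0.0,   1.0]])
R = np.eye(3)                         # projector axis along +Z
t = np.array([0.0, 0.0, 0.0])

# A mesh vertex 2 m in front of the projector, 10 cm to the right,
# lands slightly right of image centre.
pixel = project_vertex(np.array([0.1, 0.0, 2.0]), K, R, t)
print(pixel)  # [587. 384.]
```

Render the whole virtual mesh through that transform and every visible surface of the real object receives exactly the image "extracted" from its virtual counterpart, which is what makes the augmentation line up.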

You can check out a walkthrough video below, then try the whole tutorial (complete with files) on the vvvv site:

Patch files at GitHub

For those of you in the UK, see the Manchester workshop details, and register your interest in a future installment.

Thanks, Elliot! And readers, we’re keen to see you do something with this, so don’t be shy if you post an experiment.