When photography meets generative visuals, and real-world techniques come together with digital imagery, some special things can happen. Take the surprisingly effective technique of 3D light painting.
If you don’t closely follow our sister music site, you may have missed NI’s Monark synth. But the teaser for it is worth revisiting for its technique. Combining the iPad as a convenient portable screen with generative visuals and a rig built from open-source 3D software and Arduino motor control, it produces a rich, subtle effect that could be worth adding to your arsenal.
Motion artist Mickael Le Goff shares his work with CDM. I’m just going to quote him in full, as this whole recipe is delicious.
The work is stunning and organic-looking, and the technique looks like a hell of a lot of fun.
For the release of the virtual-analog synth ‘Monark’ from Native Instruments, I made this teaser, mixing photographic and stop-motion animation techniques. The so-called ”3D light painting” was done in a similar way to the video “Making Future Magic” by mcgarryowen (below).
The principle of 3D light-painting animation is as follows:
1. You create slice-view videos of a 3D object in any 3D software.
2. In the dark, you take a long exposure picture of a moving screen (e.g. an iPad) on which the video is playing.
3. You repeat the process again and again, with a different camera position for each frame of the animation.
Around 3,000 pictures, with exposures of 5 to 10 seconds, were taken for this video.
Unlike the mcgarryowen video, I tried to achieve much more precise 3D detail in the pictures. You can find more information about how I created this video below.
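The principle can be sketched numerically. As a toy model (everything here is illustrative, not Mickael's actual pipeline), treat each video frame as one cross-section of a solid sphere; the long exposure then simply sums the light from every frame the screen shows during its pass:

```python
import numpy as np

def make_slices(n=32):
    """Generate cross-section frames of a sphere: frame k is the slice z = k."""
    coords = np.linspace(-1, 1, n)
    x, y, z = np.meshgrid(coords, coords, coords, indexing="ij")
    volume = (x**2 + y**2 + z**2 <= 1.0).astype(float)  # solid unit sphere
    return [volume[:, :, k] for k in range(n)]  # one 2D slice per video frame

def long_exposure(slices):
    """A long exposure integrates the light of every frame shown during the
    pass; the camera sees all slices overlaid, and depth comes from the
    screen's motion through space."""
    return np.sum(slices, axis=0)

frames = make_slices()
photo = long_exposure(frames)
print(photo.shape)  # (32, 32)
# The overlay recovers the sphere's silhouette: bright centre, dark corners.
print(photo.max() > photo[0, 0])  # True
```

The real effect depends on the camera sitting off the motion axis, so the overlaid slices read as a volume rather than a flat stack, but the accumulation idea is the same.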
Arduino & Lego rail:
Moving the iPad by hand is problematic, because it is almost impossible to move it at a constant speed. As a result, it is really difficult to obtain fine 3D details or even simple straight lines.
To solve this, I built a rail (slider) using Lego and an extra motor. The motor was connected to an Arduino board, which controlled the speed at which the iPad moved. I could therefore match the travelling speed of the iPad and faithfully recreate the 3D object in the light-painting pictures.
More info about Arduino: arduino.cc/
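The matching constraint Mickael describes boils down to simple arithmetic: the iPad must cover the object's full depth in exactly the time the slice video takes to play. A minimal sketch of that calculation (function name and numbers are my own, not taken from his Arduino code):

```python
def rail_speed_cm_per_s(object_depth_cm, n_frames, fps):
    """Speed at which the iPad must travel so that each video frame
    (one slice of the 3D model) lands at its correct physical depth.
    The whole video lasts n_frames / fps seconds, during which the
    rail must move the iPad through the object's full depth at a
    constant speed."""
    duration_s = n_frames / fps
    return object_depth_cm / duration_s

# Example: a 60 cm deep object rendered as 300 slices played at 30 fps
# gives 10 s of video, so the rail must move at a steady 6 cm/s.
speed = rail_speed_cm_per_s(60, 300, 30)
print(speed)  # 6.0
```

Note that the 10-second duration in this example lines up with the 5-to-10-second exposures mentioned above: one exposure per pass of the video.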
All the shots of the synthesizer’s interface and knobs were first modelled in Blender, and the slice-view videos were generated in Cycles using a nice trick from Efflam Le Bivic. It was these videos that, played on the moving iPad, created the depth in the long-exposure pictures.
More info about Blender: blender.org/
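The exact Cycles trick isn't spelled out, but one common way to get slice views is to move a thin rendering "slab" through the model, one step per frame, so each frame shows only a single cross-section. A sketch of the slab arithmetic (names and units are illustrative, not from the actual Blender setup):

```python
def slab_for_frame(frame, n_frames, depth, thickness):
    """Near/far bounds of the thin slab rendered on a given frame.
    The slab sweeps from the front of the model (0) to its back (depth)
    over the course of the video, one slice per frame."""
    near = frame * (depth / n_frames)
    return near, near + thickness

# Frame 0 of 300, for a model 3.0 units deep and a 0.01-unit slab:
print(slab_for_frame(0, 300, 3.0, 0.01))  # (0.0, 0.01)
print(slab_for_frame(150, 300, 3.0, 0.01))
```

In Blender this kind of sweep can be keyframed (e.g. by animating a clipping boundary), so exporting the frames yields exactly the slice video the iPad needs to play.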
The waveforms, digital cloud, and abstract shapes are the result of the same technique, but without pre-rendered videos. I used the programming environment Codea (a Lua-based app on the iPad) to create algorithms that generate the desired shapes. You can find another video made entirely with Codea and light painting below.
More info about Codea: twolivesleft.com/Codea/
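Mickael's Codea scripts aren't published here, but the idea of generating a shape algorithmically frame by frame can be sketched in Python (names and dimensions are illustrative, and the real version would draw live on the iPad in Lua). For a light-painted waveform, each frame lights the one pixel marking the wave's height at the current depth:

```python
import math

def waveform_column(frame, n_frames, height=16, amplitude=6.0):
    """One 'slice' of a light-painted sine waveform: for a given animation
    frame, light a single pixel at the wave's height at that depth.
    The camera's long exposure integrates these columns into a 3D wave."""
    phase = 2 * math.pi * frame / n_frames
    y = int(height / 2 + amplitude * math.sin(phase))
    column = [0] * height
    column[y] = 1  # the lit pixel the camera records during this frame
    return column

# Frame 0 lights the centre row of a 16-pixel column:
print(waveform_column(0, 64).index(1))  # 8
# A quarter of the way through, the wave is at its crest:
print(waveform_column(16, 64).index(1))  # 14
```

The same pattern (compute the shape as a function of frame number, draw only the current slice) covers the cloud and abstract shapes as well.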
More Blender goodness from Mickael, for good measure:
And more of his work: