Live visualists and electronic musicians still explore how to make the computer a live performance instrument. But going one step further, fused visuals and sound can become the real-time, dynamic substance of a performance.
Artist Idiron (aka Gilbert) writes to share his technique for live audiovisuals. It’s a ground-up, handcrafted process that begins with raw material and ends, after a great deal of computer manipulation, in baroque spectacle. Idiron works with CGI and footage in 3DS Max and After Effects, then uses Max for Live to glue together audio and MIDI control structures between the VJ app Resolume and Ableton Live. That’s ambitious – keeping two independent programs in sync is tough – but the results appear to work nicely.
None of this sauce is secret, though – Gilbert makes his tutorials, Max for Live and TouchOSC patches, and other information public. That’s a new-found spirit of sharing I find really encouraging. Aside from promoting community, I think it shows the growing confidence of artists in our sphere – it says, you know what, I’m me. I can give away the mechanics of what I do, and what you make will be different. But that’s just my own opinion, and naturally I’m biased toward getting more communication and information out there as we as artists work to promote the medium.
Our friend Surya Buchwald wrote up his own proposal for how to build a language of A/V communication. That focused on larger musical elements, whereas this is more detailed and idiosyncratic. But as a result, to me the ideas in the two are natural complements. I think we could be well on our way to seeing a growing number of artists develop the A/V route further, which after years of effort could finally help contextualize the work in this field.
Gilbert shares his own reflections on his process with CDM:
I entered into this project with little know-how about VJing, other than that, when playing live, more often than not the visuals had little relevance to the music. As the musician, I figured I was in a good position to create a bespoke audiovisual setup for integrating visuals with my Idiron Soundtrack live sets. Since the building blocks of any live electronic music performance or composition are MIDI and audio, I explored ways these could be used to control visual material in an improvisational context. Eventually this manifested in a series of Max for Live patches designed to flexibly route FFT data and note information into Resolume via OSC messages.
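For a sense of what those OSC messages look like on the wire, here is a minimal sketch in pure Python (standard library only – not Gilbert’s actual patches, which are built in Max for Live). It follows the OSC 1.0 message layout: a null-terminated, 4-byte-padded address string, a type tag string, then big-endian 32-bit float arguments. The address path and port here are placeholder assumptions – in practice you’d match them to whatever parameters Resolume exposes in its OSC preferences.

```python
import socket
import struct

def osc_pad(data: bytes) -> bytes:
    """Pad bytes with zeros to a 4-byte boundary, as OSC requires."""
    return data + b"\x00" * (-len(data) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Build a minimal OSC message carrying float arguments."""
    # Address pattern: null-terminated string, padded to a multiple of 4.
    msg = osc_pad(address.encode("ascii") + b"\x00")
    # Type tag string: ',' followed by one 'f' per float argument.
    msg += osc_pad(b"," + b"f" * len(floats) + b"\x00")
    # Arguments: 32-bit big-endian IEEE floats.
    for value in floats:
        msg += struct.pack(">f", value)
    return msg

# Send a hypothetical opacity value for a layer to a Resolume-style
# OSC listener over UDP. Address and port are assumptions; check your
# Resolume OSC preferences for the real values.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/layer1/video/opacity", 0.75), ("127.0.0.1", 7000))
```

The same bytes could just as easily carry a per-band FFT amplitude or a note velocity; the point is that once audio analysis is reduced to a stream of floats on OSC addresses, any visual parameter in the receiving app can be mapped to it.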
I also began developing ideas about how this data should be used to control visual elements, particularly with the limitations of computing power in mind. Realtime visual manipulation (e.g. generative graphics) is typically resource-intensive, so I created video clips designed specifically with the OSC data in mind, aiming for a degree of realtime visual sophistication that would otherwise be difficult to obtain on, say, integrated GPUs. In practice this meant creating video clips with 3DS Max, HD footage, and After Effects, with a general deep-sea, space-faring theme in keeping with my last EP, Turbo Kisu.
Finally, my research took me through methods of interaction. I felt it was important not to get too bogged down in the control of the visuals at the expense of the improvisation, or to end up stuck behind a laptop (‘playing Minesweeper’). Experimenting with both novel and practical methods, I settled on a combination of a Launchpad and a custom TouchOSC patch at the core of any setup, removing the need to interact with the computer (although a patch that will use the trackpad to resize and rotate video clips through multitouch gestures is in the works).
By releasing these patches and tutorials, I hope they can go some way toward creating more aesthetic symbiosis between musicians and VJs, and that in the future there will be a growing expectation for live visuals to be less nonsensical in relation to the music. Also, book me for your party.
Indeed — book him. Book more live visual artists. Oh, and while you’re at it, buy new projectors. (Hey, hope springs eternal; I have to try!)
Ready to try this yourself, if you’re using tools like Max for Live and Resolume (or other tools to which you might translate the same ideas)? Downloads, videos: