Live brushes up its VJ kung fu: The Karate Kid live remix at the CDM NAMM Party last month, as Ableton Live gets integrated into live visuals. Photo courtesy Robin Hunicke.

Audiovisual performance has a history stretching back through the decades — from the 90s Japan audiovisual scene to 60s Acid Tests and a whole heck of a lot of other places. Heck, I’m fairly certain people were shooting up on morphine or getting happy with the opium and chilling out to magic lanterns and colored lights at the end of the 19th Century. But there’s a new excitement brewing globally around live music and visuals. That’s important, because it could push the scene forward — a critical mass of performers could pressure more venues into better projection, from avant-garde to club, and raise the level of chops and artistry in the medium. And you won’t even need opium.

The growing interest in A/V performance was part of what made us so excited about Serato’s VIDEO-SL, as seen in our exclusive hands-on with dj rndm. It’s unquestionably the best (arguably the only) true, integrated DVJ tool in computer software form, certainly as far as digital vinyl control goes.

But curiously, one of the tools at the center of this movement isn’t really a DJ app in the traditional sense: it has no scratching capabilities for audio, let alone video, only limited video support, no live video triggering, and no projection support. It’d be as though, collectively, the world decided in 1965 that everyone was going to build flying moon buggies by first buying themselves Chevy Novas.

That’d make no sense whatsoever, except the app in question is Ableton Live.

And suddenly, it’s a natural choice: Live is a favorite tool for slicing and dicing sound live, so why not visuals — even if only by transmitting MIDI to a dedicated visual app? There are a number of approaches.

Feed MIDI Out of Live

On Create Digital Motion, Momo the Monster details his technique for cutting up sound in the plug-in Lucifer, then feeding that MIDI output to a separate visual app for the eye candy. To me, this is the most logical approach: there’s no reason you have to make Live do everything at once. If you’ve got more than one person playing, you could easily have one person on sound and another person on visuals. But even with one person, it’s increasingly feasible to run both a visual and sound app on the same machine (and I’ve played fairly regularly with two laptops).

VJ Kung Fu – AV Cutups using Lucifer and Ableton Live from momo_the_monster on Vimeo.

Full how-to details and lots of reader discussion at Create Digital Motion:

AV Cutup Secrets: Using Lucifer & Live
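The MIDI-out approach boils down to a simple idea: each audio slice you trigger becomes a MIDI note that a separate visual app can map to a clip or effect. Here’s a minimal sketch of that mapping in Python — the function name, base note, and channel are my own illustrative assumptions, not Lucifer’s actual output:

```python
def slice_to_note_on(slice_index, velocity=100, channel=0, base_note=36):
    """Build a raw 3-byte MIDI note-on message for a given audio slice.

    Each slice is offset from base_note, so slice 0 -> note 36, slice 1 -> 37,
    and so on. A visual app listening on the same MIDI port can map each note
    to a video clip or effect parameter.
    """
    if not 0 <= slice_index <= 127 - base_note:
        raise ValueError("slice index out of MIDI note range")
    status = 0x90 | (channel & 0x0F)  # 0x9n = note-on on channel n (0-15)
    return bytes([status, base_note + slice_index, velocity])

# Example: triggering slice 3 yields note-on for note 39 at velocity 100
msg = slice_to_note_on(3)
print(msg.hex())  # '902764'
```

In practice you’d hand these bytes to a MIDI library or a virtual MIDI port (IAC on the Mac, loopMIDI on Windows) rather than build them by hand — the point is just how little glue the sound-to-visuals link actually needs.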

Make Live an AV App


Users of other programs might see this as a stop sign. Live users see it as a challenge.

That hasn’t stopped people from trying to get Live to do it all, though — something a lot of users wondered about when they first saw Live’s ability to use video live, full screen. When I first saw Live’s video support, the Ableton crew had started forwarding around a set featuring clips of an ostrich poking its head forward. Uh — okay, you had to be there, but it was hilariously addictive. The trick was to use Location Markers in the Arrangement View to jump around in the video.

That’s all well and good, but the real appeal of Live is its clip-based Session View.

Just Add Music for Live 6 and 7 offers a solution: sync most of the major stuff Live does with sound to video. You get four simultaneous channels of media: video, pictures, and Flash files. Or at least, so it promises — for now, I couldn’t even get a download link to work; word is, JAM is very much in “alpha.” Here’s the site, though, and hopefully we’ll see an update soon:

Just Add Music for Ableton Live

I have another problem with JAM, I have to admit: it feels too much to me like a dumbed-down copy of VJing of the past. Read Create Digital Motion and you’ll see lots of people excited about gorgeous, abstract 3D visuals dancing in time to the music. Sure, chopping up video and syncing it to music is fun — but you’ll forever be in the shadow of the masters, like Emergency Broadcast Network. Now, maybe I’m overanalyzing; it’s still a huge leap forward from playing a DVD in the background for a lot of bands. But I think the ever-evolving state of live visual tools means that ultimately dedicated tools will eclipse the hacked versions of Live.

Of course, the hackier the hack gets, the more interesting things get, like the unique, organic frame-by-frame animation in Live we saw last week from Cousin Throckmorton:

Make Live Scratch

Maybe — as Momo did with Lucifer — the best option is to focus on hacking Live for better sonic capabilities, then sync that to visuals. Which brings us back to Live’s inability to scratch. DVJ crew Cobalt Scratch Bomb from Japan have hacked scratching into a Mac Audio Unit plug-in:

You can’t have this yet, but our friend Terry Church at Beatportal gets full credit for the scoop on the first preview:

Japanese DVJ crew invent Ableton plug-in

Industrious Max/MSP/Jitter users ought to be able to do things like this, too, exporting to a plug-in.

For full-blown DJ capabilities in Live, there’s also the possibility of running Image Line’s awesome new DJ tool Deckadance as a plug-in.

Watching A/V Performance Evolve

Of course, a lot of this is speculative. I remain, like many, interested in visuals for their own sake, in live visuals with music that aren’t synced, and in whatever crazy stuff evolves next. We’ll check in occasionally here on Create Digital Music, but if you want to keep up daily with live visuals, be sure to subscribe to RSS / bookmark Create Digital Motion.

And keep on hacking. Can’t wait to see what you do next.

  • the inevitable convergence of the audio visual world should indeed be creating a buzz. at the rate that technology is coming forth to us, we should pay attention to its great potential.

    the ability to have control over audio and visuals from the same machine, and the facilitation of a more intuitive relationship between the two, seems much closer than ever before..

    i'm still waiting to see what happens with the collaboration of max/msp and ableton…

  • Thanks for this!

  • nobbystylus

    it's all exciting stuff and there's no doubt that there is a gap for someone to make Live into the visual control app of choice.. for now it's about hacking stuff up in QC, MAX/MSP, custom plugs or syncing with VJ apps like VDMX, modul8 etc..

    published inputs for QC inside live would be my dream!! (imagine controlling particles with beat repeat – woohoo)..

  • Art Gillespie

    I've been toying with a QuartzComposer 'player' Audio Unit that maps a composition's inputs to plug-in parameters (and when used as an effect will route incoming audio to a QC's audio inputs), but it's fairly useless in the hosts that can actually load it (Logic, GB, etc.)… when/if Ableton adds support for Cocoas UIs in Live, there should be a lot of cool visualization AUs coming out on the OS X side of things.

  • don't forget the upcoming 1.5 update of m-audio's torq, which will send and receive midi clock – so I guess you will be able to control live sets and vj programs with your timecode vinyls.


  • DonSonic aka Donnie

    There should be a visualizer plug-in standard just like VST, AU, RTAS that allows your audio app to connect to visual [VJ] apps with midi, smpte and all the control signals needed for proper sync. It's slowly getting there. But, not fast enough for me. I would like to see a great Quartz Composer/Core Animation app that connects to apps like Logic, Live, Reason etc. Someone could even start simply with a visualizer app like the one in iTunes! Now, just get your sync signal directly from your audio production apps. Also have multi-channel accessibility features and have visuals connected to each audio output bus. One visual channel for drums, one for vocals, strings, backgrounds. You get the idea. Great times are ahead. Peace, Love and Blessings for each and everyone!

  • improvised visuals and music from the same computer has been my goal for a few years, and the best thing i could come up with was a combination in Max/MSP/Jitter of my "cptrl" patch (download from with 8 seqs and FM synths plus effects, with a video patch based on movie playback mixed with some effects objects and random parameter generators. the result is pretty "artificial" (hence the movie playback) both in sound and visuals, but pretty connected between the two media. the main drawback is that it is difficult to add more effects to make the results richer because of CPU limitations. the best examples of this kind of software that i have seen are all programmed directly in code (i do not code, and don't intend to). probably this is because a programmer can optimize the application's performance at its best.


  • Do you know Ms. Pinky?