Lucifer is a plug-in that does real-time audio slicing and repeats — as in for music. So what is this plug-in, running in Ableton Live as a host (hmm, music again), showing up on Create Digital Motion? Because our friend Momo used its MIDI output capabilities to trigger video — and got an unusual interaction between sound and visual as a result. Now, I’m in the camp that says Ableton Live should stay a music app; there are too many well-developed visual tools that Live would never equal. But this is the exception that proves that point: by thinking in a musical way when triggering visuals, you get a relationship between the two you wouldn’t otherwise. Momo shows us how in the latest VJ Kung Fu tutorial. -PK

If you’re not familiar with Lucifer, it’s a VST/AU plugin for real-time, beat-based cutups/repeats of audio. What you’re going to do is route the MIDI from Lucifer out to another program that will do video cutups. This is useful for more than just video – with the MIDI signals coming out of Lucifer, you can control and trigger any MIDI-capable software and hardware.
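As a rough illustration of the idea (not part of the original tutorial), here's how a receiving program might map Lucifer's incoming MIDI notes to video-slice indices. The function name and the base-note/slice-count defaults are hypothetical – a real setup would read the messages from a virtual MIDI port and fire the matching clip:

```python
def note_to_slice(msg_type, note, base_note=36, num_slices=16):
    """Map an incoming MIDI message to a video slice index.

    base_note and num_slices are made-up defaults: note 36 (C1)
    triggers slice 0, and notes wrap around after num_slices.
    Returns None for anything that isn't a note-on.
    """
    if msg_type != 'note_on':
        return None
    return (note - base_note) % num_slices

# Note 36 fires slice 0, note 40 fires slice 4, note-offs are ignored:
print(note_to_slice('note_on', 36))   # 0
print(note_to_slice('note_on', 40))   # 4
print(note_to_slice('note_off', 36))  # None
```

The same mapping works for hardware: anything that speaks MIDI notes can sit on the receiving end.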

We figured out a way to control video using the awesome Lucifer plugin while working on our Karate Kid AV Remix. In response to a few inquiries about just how we made this work, I put together a video tutorial showing how to set up Lucifer to output MIDI.

Karate Kid AV Remix from momo_the_monster on Vimeo.

While this particular implementation is specific to the Lucifer plug-in, it’s a thought-provoking approach to doing AV Cutups. You could build a similar method by creating MIDI clips that output common/useful triggering patterns, and trigger those instead of mashing buttons to directly start your videos.

Also, this method involves looking at the MIDI Sync information coming from Live and using that to figure out a proper loop-length for your video. This way, you can use a longish video by simply adjusting the ‘play start’ point rather than cutting your videos down to 8 or 16 proper-length versions.
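To make the loop-length idea concrete, here is one hedged sketch of the arithmetic (the function and its defaults are illustrative, not taken from the tutorial): MIDI clock runs at 24 ticks per quarter note, so a receiving app can convert the running tick count into a beat position, wrap it to the loop length, and use the result as the 'play start' offset in seconds:

```python
TICKS_PER_BEAT = 24  # standard MIDI clock resolution

def play_start_seconds(clock_ticks, bpm, loop_bars=8, beats_per_bar=4):
    """Offset (in seconds) into the movie for the current loop position.

    Wraps the running beat count to an 8-bar (by default) loop, so a
    long video just shifts its play-start point instead of being cut
    down into fixed-length segments.
    """
    beats = clock_ticks / TICKS_PER_BEAT
    beat_in_loop = beats % (loop_bars * beats_per_bar)
    return beat_in_loop * 60.0 / bpm

# At 120 BPM, one bar into the loop (4 beats = 96 ticks) is 2 seconds in:
print(play_start_seconds(96, 120))  # 2.0
```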

Hit VJ Kung Fu for the full article.

  • nobbystylus

    nice article momo!

    why do we need lucifer specifically? can't Live's midi clips with modulation do the same envelopes? is it a resolution issue ? I've done this kind of thing with Live and Quartz composer using pitch bend information and it works pretty well..

  • Absolutely – I think Peter hit the nail on the head with his explanation of taking musical approaches to video mixing.

    True, this article is specific to Lucifer because it's a step-by-step of how to set it up with that plugin – but you've gleaned the deeper impetus behind the tutorial, which is exploring meaningful automation of Audiovisual interactions. Perhaps you could explain with a little depth how you've set this up with Live and QC? A VJ Kung Fu tutorial in the making, no?

  • if only we could persuade ableton to write a plug-in that would allow proper a/v integration, so we can have clips controlled in ableton as if they were native audio samples but the clip is actually slaving the vj app to do the video part. it wouldn't be so hard, it's really a matter of the vj camp coming together with some kind of communication specification out of, say, osc or rewire. from ableton's perspective, the plug-in would be dead simple to write, and anybody doing a/v work would buy it: easy money! for the vj app developers, if they implement it, they gain a market too.

    i've got a stalled a/v cut-up piece from 2005 that hit too many buffers to be fun to work on any more (especially on a mac, not having vegas). there was no nice way of making rhythmic a/v loops, and tying an interesting musical production process based off the video's audio back into an a/v form proved somewhere between impossible and a nightmare. we ended up with something similar to lucifer – splicing an audio loop into a midi scale and then manipulating the midi pattern musically – but even then the results weren't worthy of the goal… i'm tired of seeing jittery rearranged frames; it's not the same as scratching records or stabbing audio samples.

    anyway, rant over. i've just realised the mp4 i had online of mangled max is offline, so i'll link back once i've got that back up there.

  • @Toby: well, I think what you're really describing is, more generally:
    * greater access to Live sets as far as what's happening internally — so input and output data beyond what's presently MIDI-assignable, and data that tells you the state of the set (did a clip finish? etc.)

    * customization within Live for how the host behaves — read: scripting

    I'd actually rather they NOT provide A/V-specific functionality. I'd rather they provide more control of this stuff generically, which would allow all kinds of things — A/V being just one of them. This would also provide a resolution to a lot of the music needs, as far as better hardware integration and more live customization.

    It isn't easy to build this sort of stuff into a host, though, and it hasn't really been done before. But I think Ableton are aware of these two needs.

    …and in the meantime, the hacked Live API is still going:

  • well, yep, its what is achievable and what fits with the ableton team's goals. the continued absence of osc support in live is kinda starting to look intentional rather than an oversight, so i wonder what they're thinking about it all. hence my proposal of something small and targeted, where there's even money in it for them: it fits their existing model of 'instruments'. i'd take either, and given the choice, the one more open ended and platform-like.

    mangled max is now back online –… – this version after two days' work; it got worse the more we worked on it after that. so hopefully this summer will see a redux – i can't wait to play it out and fulfill my naive "i'm a vj but if i hit this button the club reverberates to serious bass" fantasies…

    (and i got excited about the live api project last year, then realised it was only working on the pc, and the project was kinda dead. live 7?)

  • wes

    hey, i really liked this tutorial. I've been searching for a place with an archive of videos – old 80's, 70's clips of junk for mangling – but i've had no luck finding anything. Would anybody be able to provide some good sites with videos & clips of random stuff like this?

    Thanks in advance :).

  • ian

    this is nifty, but it could be misleading – this method could work in ANY vst host, right? although live is the app of choice nowadays it seems…
    why aren't there more VJ-oriented VSTs? can VST not handle video directly?
    anyone who's really interested in this, i say take the time to learn some max/msp (or the free alternative, puredata). using MIDI to rearrange video is cool, but when you have generative video effects at your disposal (GEM, for example) you can scale the midi data to do ALL sorts of crazy shit.
    and wes, maybe check out – not the best footage for VJ stuff but there seems to be a whole slew of oldschool (50s, 60s) training videos and informative videos. let us know if you come across other good (royalty-free) sites

  • < this method could work in ANY vst host, right?

    Yep – Lucifer's a VST plugin and should work in any host. Good clarification 🙂

    < why no VJ-VSTs?

    I suppose we really ought to ask a VST developer directly (or maybe someone wants to chime in?). I know that there's this divide – some people already feel that Ableton has wasted resources developing anything video-related for Live, while others (myself included) are excited about such prospects.

    Oh, and I second the recommendation for That's where I got all my Betty Boop for this tutorial. However, you're not likely to find more recent stuff since it hasn't passed into public domain. For that, I recommend your local thrift/second-hand store.

  • wes

    hey, thanks for the recommendation – checked it out and i'm finding some really odd but possibly useful stuff. i guess maybe i just don't really know what to type in for random 70s/80s junk videos lol. anyway, found these sites too for anyone else on the hunt.

  • wes

    hey, don't know if anyone has seen this little program add-on for Live, but apparently it's for creating automatically synced visuals in relation to Live's audio. can't really describe it, but give it a look – very interesting stuff.

  • wes

    k, i might seem like an addict here but i forgot to ask something in my last post lol.
    @Momo, about Lucifer and video chopping: would i be able to use Quartz Composer instead of Isadora? Does it have the same functions, like the MIDI CC Watcher & Real-Time Watcher, or anything to observe the MIDI clock info from Live and sync it?
    I just don't have Isadora…

    Thanks 🙂

  • nobbystylus

    It does work in Live and QC.
    I'll get a little Ableton Live and QC pack together explaining how it works. It's very, very simple, and my QC skills are low, but i'm sure that somebody will be able to expand upon my (lame) QC patch..

  • nobbystylus

    JAM is quite interesting and i've been testing it out, but its still very much in alpha stages of development..

  • @toby: I'm not sure what you mean as far as "intentional" versus "oversight" and OSC. I haven't seen a robust, current OSC implementation in any mainstream music app. (NI's apps tackled it at one point but that doesn't appear actively supported — note the lack of OSC in KORE, which would have been a logical place to put it.) The problem is, it's going to take more effort to get better OSC support in general. Visual apps are the exception; we're lucky enough to have better support.

    If you think OSC belongs in Live, you need to tell Ableton — and get five of your friends to tell Ableton, and five of their friends. I know Ableton are aware of it as I've talked to them about it. But developers usually work on only a fraction of the things they want to do, so for it to become a top priority, they need more support from users on this, and I think the implementation needs to be made easier and better documented.

    Live API I believe is still active, on Google Code.

    @wes: thanks for the reminder, been meaning to cover JAM!

  • nobbystylus

    I've put a quick explanation as to how i do the Live to QC movie time control up here :

    it's very basic, but it might help some people.. it's using Leopard and Live 7.02

  • peter – as in, by now ableton must have done a cost/benefit analysis for osc, and the answer wasn't the one we've been hoping for. i agree that can change with more people asking for it. i've certainly harped on about it whenever i've had anybody's ear… but yeah, there doesn't seem to be too much demand, last time i looked at the tumbleweed osc threads in their forum. this time around i was hoping that since liveapi was a reverse engineer of a python api, ableton themselves would expose it in live 7. maybe 8, here's hoping =]

  • The only downside I see to doing this in QC as opposed to Isadora is that you can't (currently) use the audio from clips in QC while controlling the movies externally. I know that in this example I didn't use the audio, but it's nice to have that option.

  • Koshi

    @nobbystylus just to let you know it works in Tiger too.

    I've been trying to make it work with jitter rewired to live and live's drum slices. Not much luck so far. I can't figure out the real time watcher or beat calculator in max.

    Thanks for the tutorial, momo – keep them coming!

  • ian

    here's hoping the ableton/cycling74 collab features OSC

  • Pingback: Create Digital Music » Ableton for the DVJ: Users Hack in Scratching, Live Video, and Visual Remixing()

  • wes

    @nobbystylus, thanks for that tutorial. looks interesting – i tried for about 10 mins and got kinda confused, couldn't find what i wanted to do, but I'll have to spend some more time looking around. I really wanted to do what Momo was doing in his vid, with a real-time watcher attached so it jumped around in the movie and synced it. Can this only be done with pitch bend data? because when i was trying, it seemed like all the pitch bend data was doing was stopping and restarting the movie. anyway, thanks again for the help.

  • nobbystylus

    Pitch bend data is higher resolution than CC data from Live – it's 14-bit, so it tends to give smoother playback. I had quite good results doing the same type of thing like this:
    i've dropped the movie clip into live and sliced it to a drum rack – 32 bars of individual hits – so i've got the audio mapped across 32 notes. Then i 'play' the movie (in QC) by converting the midi note information into pitch bend data in Plogue Bidule, and sending this pitch bend data into QC to control the Patch Time of the movie (via a Math patch to multiply the 0>1 data up to something that makes sense for the length). I can then granulate the movie and control the position in loads of funky ways by using an arpeggiator to play the notes in a more chaotic/interesting order, and twisting the gate/grid/steps and distance all with a controller. it makes for lovely glitchy stuff that can actually be done live with a midi controller..
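The note-to-pitch-bend-to-patch-time chain described in that comment can be sketched numerically. This is a guess at the arithmetic, not the actual Bidule/QC patch: 32 consecutive notes are spread across the 14-bit pitch bend range (0–16383), and the receiving patch scales the normalized bend value back up to the movie length. The function names and the base-note value are invented for the example:

```python
BEND_MAX = 16383  # top of the 14-bit pitch bend range

def note_to_bend(note, base_note=36, num_slices=32):
    """Spread num_slices consecutive notes evenly across 0..16383.

    base_note is an assumed value; num_slices=32 matches the
    32-note drum rack described in the comment above.
    """
    idx = max(0, min(num_slices - 1, note - base_note))
    return round(idx * BEND_MAX / (num_slices - 1))

def bend_to_patch_time(bend, movie_length_s):
    """Normalize the bend to 0..1, then scale up to seconds of movie."""
    return (bend / BEND_MAX) * movie_length_s

# The lowest note lands at the start of the movie, the highest at the end:
print(bend_to_patch_time(note_to_bend(36), 8.0))       # 0.0
print(bend_to_patch_time(note_to_bend(36 + 31), 8.0))  # 8.0
```

Running the notes through an arpeggiator then just means the bend values (and so the patch-time positions) arrive in whatever order the arpeggiator plays them.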

  • mememamo

    just a heads-up: it seems the lucifer plugin has been discontinued… all my attempts at getting a copy have failed – divine machines no longer even mention it on their site

  • @mememamo – it's available direct from the man who wrote it. Contact steve_at_duda_dot_org. He'll sell it to you for $99.

  • wes

    Hey i'd just like to know what the "Real-Time Watcher" patch is called in Quartz Composer, cuz i don't have Isadora and am just using QC. thanks

  • Pingback: Create Digital Motion » Ableton Live + Isadora: Slicing, Syncing Audiovisual Tutorials()


  • Pingback: Chris::Bamborough » Improvising with visuals.()