If you could control your music with all of your digits, and get interactive feedback on a display, what would your setup look like? Expert Lemur user and software engineer Bryant Place has one such answer. It shows off just how much the Lemur’s software has evolved over a series of revisions, and reveals a bit of what can go into performing with Ableton Live.

Photos/screens: Bryant Place. Used by permission. (Click for larger versions.)

Side note: for a look at live touch interfaces with Native Instruments’ Reaktor, see our story for our NI minisite. To really understand how touch is impacting live playing, I think it’s helpful to see what’s going on with different software platforms.

Multi-touch, Lemur, and Going Live

Part of the appeal of Ableton Live is that it behaves as a hybrid between arrangement software and musical instrument. Early versions even carried the tagline “Sequencing Instrument,” but that sums up the problem: instruments generally aren’t sequencers, and vice versa. To “play” your sequencer live is challenging enough, but added to that is the fundamental mouse-pointer interface that’s been in the marketplace for over twenty years. To really control Live, you need more direct access.

The Lemur multi-touch hardware promised just such control when unveiled. In an early review, I saw this as promising but cautioned that the custom software the Lemur runs was overly rigid. Since then, firmware updates have gradually added more custom features.

On a recent trip to Los Angeles, I got to watch as Bryant showed off a set of templates he’s been developing that exploit these features for deeper, more interactive control of Ableton Live. Bryant’s session was brief enough that you could blink and miss it, but the awed crowd of assembled Live gurus revealed that he’d shown something really special. It’s a dream multi-touch setup. He’s using the new v2 firmware for the Lemur, which – as a screenshot from Jazz Mutant shows – has also been used in their own template for Live. Not all the features come from the v2 firmware, but those tabs make a big difference, and I can imagine continuing to go hog-wild with envelopes and such.

The basic idea: set up effects for live performance and make them readily accessible from the futuristic-looking, multi-touch, colored Lemur control surface. With a few compact screens, and interface elements that respond dynamically to what’s happening in software, it’s possible to use touch gestures to control elaborate effects arrangements in ways very different from the results you could get from conventional knobs and faders.

Have a look at the pictures to really get a feel for what this means. I asked Bryant to tell us a little more about how it all works. He cautions he’s “more of an engineer than a writer.” (Add “Damnit, Jim” to the beginning of that line, Star Trek fans.) But he actually has quite a lot to say, and you can feel free to ask follow-up questions in the comments.

Behind the Scenes with Bryant

My Live set is designed to take complete songs (preferably electronic dance music) and remix and affect the sound in such a way that I can take an original mix and completely transform its sound and rhythm.

I’m using only Live’s [internal] effects for the following reasons: stability, [efficient use of] CPU resources, tempo changes. I am thinking of adding some Sugar Bytes and possibly Audio Damage – we’ll see. [Ed.: Yes, I have to at least observe that third-party plug-ins are often as stable and sometimes more CPU-efficient – depending on the specific application.]

Some notes and tips, as I have learned building this project:

  • Using the Lemur to control Live – which lets me work very quickly and naturally – has allowed me to discover the nature and quirks of some of Live’s effects.
  • Live is amazing at changing tempo – especially evident when there are quantized auto-filters.
  • Changing tempo while holding [instances of] Beat Repeat can cause some problems with the groove, as Beat Repeat uses a good amount of audio buffer (see the short arithmetic sketch after this list).
  • Playing fast songs (for example, 135 bpm) at a slow tempo (e.g., 75 bpm) usually sounds weird. This can be somewhat improved with the following procedure: use two copies of the exact same audio clip, one using the “Beats” warp algorithm and one using “Complex.” Together, they have a much better texture than you’d get using just one.
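
The buffer issue is easier to see with a little arithmetic. Here is a rough sketch – purely illustrative figures, not Ableton’s internals – of how many samples a one-bar repeat buffer spans at the two tempos Bryant mentions:

```python
# Illustrative arithmetic only (not Ableton's actual implementation):
# a Beat Repeat-style effect holding a fixed musical interval needs a
# buffer whose length in samples changes with tempo, which is one reason
# tempo moves can upset a held repeat.

SAMPLE_RATE = 44_100  # Hz

def samples_per_interval(bpm: float, beats: float = 1.0) -> int:
    """Return the buffer length in samples for `beats` beats at `bpm`."""
    seconds = (60.0 / bpm) * beats
    return round(seconds * SAMPLE_RATE)

for bpm in (75, 135):
    print(f"1 bar (4 beats) at {bpm} bpm = {samples_per_interval(bpm, 4):,} samples")
# 1 bar (4 beats) at 75 bpm = 141,120 samples
# 1 bar (4 beats) at 135 bpm = 78,400 samples
```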

The signal flow and layout (a rough code sketch of the routing follows the list):

  • Four Audio Tracks: I have four audio tracks for clips – two in set A and two in set B (A1, A2, B1, B2) – and I use the crossfader to fade between sets A and B. These four tracks are set to “Sends Only.”
  • Seven Sends, with Pre-Configured Routing: I have seven sends. A1 and A2 are sent to sends A (hi), B (mid), and C (low); B1 and B2 are sent to the hi/mid/low sends D, E, and F. The seventh send, G, is simply a dry track.
  • Effects Inserts: Sends set A (A, B, C) and sends set B (D, E, F) contain independent auto-filters, multi-band compression tuned to their specific frequency bands, and auto pan.
  • Effects in Performance: The effects are controlled by the Lemur in a very magical way. 🙂 (I spent a lot of time tuning the MIDI mapping.) This allows me to create a separate groove from the original song [using the resulting effects] – AND one that is frequency-independent. (I had to compensate for some things due to buffer limitations and CPU [utilization] on my MacBook Pro.)
  • Returns, and More Effects: Next I take the sends and route them back to specific audio tracks. I route A (hi) to X Hi, D (hi) to X Hi as well, and so on. This is where I add band-independent instances of [Ableton’s] Beat Repeat and Simple Delay. (By the way, these delays are far deeper than they seem on the surface.) I have full control of them using the Lemur – you can see the delay units in the images. Lastly, I use a Multiball object to control hi, mid, and low chorusing tuned to their respective frequency bands. (When used correctly and with taste, the effect is mind-blowing.)
  • Recording: Lastly, I have my FIRE track, which I use as a pre-master (X Hi, X Mid, and X Low are sent to FIRE) so I can record my performances. I also use some mastering plug-ins to finalize the sound. [Ed.: Interesting, though I’d be inclined to do that after recording!]
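
To summarize the routing above, here is a rough sketch of the signal flow as a plain data structure. The names follow Bryant’s description; anything not spelled out in the list (exact send levels, dry-path sources, device settings) is left out or marked as such.

```python
# A rough map of the LiveFIRE routing described above.
# This is just a readable summary of the signal flow, not a Live project file.

clip_tracks = {
    "A1": {"crossfade": "A", "output": "Sends Only"},
    "A2": {"crossfade": "A", "output": "Sends Only"},
    "B1": {"crossfade": "B", "output": "Sends Only"},
    "B2": {"crossfade": "B", "output": "Sends Only"},
}

sends = {
    # Set A: band-split sends fed by A1/A2, each carrying its own
    # auto-filter, multi-band compression, and auto pan.
    "A": {"band": "hi",  "fed_by": ["A1", "A2"]},
    "B": {"band": "mid", "fed_by": ["A1", "A2"]},
    "C": {"band": "low", "fed_by": ["A1", "A2"]},
    # Set B: the same band split for B1/B2.
    "D": {"band": "hi",  "fed_by": ["B1", "B2"]},
    "E": {"band": "mid", "fed_by": ["B1", "B2"]},
    "F": {"band": "low", "fed_by": ["B1", "B2"]},
    # G carries the dry signal (its sources aren't spelled out above).
    "G": {"band": "dry"},
}

returns = {
    # Per-band return tracks hosting Beat Repeat, Simple Delay, and chorusing.
    "X Hi":  {"fed_by": ["A", "D"]},
    "X Mid": {"fed_by": ["B", "E"]},
    "X Low": {"fed_by": ["C", "F"]},
}

pre_master = {
    "FIRE": {"fed_by": ["X Hi", "X Mid", "X Low"], "role": "recording / pre-master"},
}
```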

The result is called LiveFIRE. I am using v2 Lemur firmware but I haven’t used many new features – only the tabbed container object, color options, and other little tidbits. [Ed.: That may be, but having worked in the Lemur editor, sometimes having just that one object you need can make a huge difference. And if you saw an early revision, like the one I first tested, you’ll appreciate that many of these objects are themselves the result of a series of new features.]

Technical notes: I can’t use my Live set to its fullest capacity due to my MacBook Pro’s limitations with audio buffer. I have already scrapped my audio interface in favor of my integrated sound card, as it allows a larger audio buffer size. (This problem occurs only when I have audio on all four tracks playing at the same time.) [Ed.: I’m actually not sure about this detail; we’ll have to discuss it more. Switching to internal audio is usually the opposite of what’s necessary, so we’ll have to have a separate conversation about exactly what’s going on, what the symptoms are, and what the cause may be. An inability to get a sufficient audio buffer, or running out of CPU horsepower to complete the tasks, would be symptomatic of either trying to push the envelope a bit too far with the set or encountering some driver/OS/software issue. Then again, it sounds as though Bryant is intentionally modifying the buffer to get certain results – an interesting and unorthodox technique. We’ve kicked off the discussion, so we can look at this some more.]
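
For context on that tradeoff: a larger audio buffer gives the CPU more time to finish each block of processing (fewer dropouts), at the cost of added latency. A quick sketch with generic figures – not specific to Bryant’s interface or settings:

```python
# Generic buffer-size vs. latency arithmetic; not specific to any interface.
SAMPLE_RATE = 44_100  # Hz

def buffer_latency_ms(buffer_samples: int) -> float:
    """Duration of one buffer's worth of audio, in milliseconds."""
    return buffer_samples / SAMPLE_RATE * 1000.0

for size in (128, 256, 512, 1024, 2048):
    print(f"{size:>5} samples ≈ {buffer_latency_ms(size):5.1f} ms per buffer")
```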

Naturally, my future plans are to incorporate the LiveAPI, which will take some time and a lot of remapping. [Ed.: The Live API is a user-supported way of customizing functionality in Ableton Live – it’s a hack, and requires a bit of Python coding knowledge in order to make it your own, but it’s a very powerful outlet and well worth revisiting here later.]
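
For readers curious what that remapping could eventually look like, here is a purely hypothetical sketch in the spirit of the Python object model the LiveAPI exposes (song, tracks, mixer sends). The property names and paths here are assumptions for illustration, not code from Bryant’s set or a documented LiveAPI recipe:

```python
# Hypothetical sketch: nudging a send level and jumping the tempo from Python.
# The object model shape (song -> tracks -> mixer_device -> sends) is assumed
# here for illustration; check the LiveAPI documentation for the real paths.

def nudge_send(song, track_index, send_index, amount=0.05):
    """Increase one send level on one track by a small amount (0.0-1.0 range assumed)."""
    send = song.tracks[track_index].mixer_device.sends[send_index]
    send.value = min(1.0, send.value + amount)

def set_tempo(song, bpm):
    """Jump the whole set to a new tempo, e.g. the 135 -> 75 bpm moves described above."""
    song.tempo = bpm
```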

I really look forward to continuing this discussion. What would your ultimate touch controller look like for Ableton Live or other software? Or would you rather dump the touch and stick with tangible hardware control?