Can you hack it? Yes. Yes, you can. Screenshot (CC-BY) Hens Zimmerman / 37Hz.

Even before Max for Live was available, hackers had found a way of interacting with “secret” APIs inside Live for custom control, allowing them to customize Live’s behavior and make it work more seamlessly with hardware. That included providing something Ableton themselves had not: real, native control of Live via OSC, for more control than MIDI alone can provide. I was assured such hacks would continue to work, and sure enough, they have. Here’s how to get started.

You may wonder, of course, why bother at all now that Max for Live is available. Max for Live is a powerful environment for creating instruments, effects, sequencers, and other devices within Ableton Live, and via its access to the Live API, it can even be a tool for customizing how Live works. But it adds an additional layer of abstraction, it is somewhat limited in how much it can manipulate interaction with hardware, and anyone wanting to use your creations will need to own Max for Live, not just Ableton Live. On top of that, some people will simply prefer scripting in a language like Python to visual patching. (There’s still reason to consider M4L, too; see the link to its “API” for Live, below. But we do have multiple options.)

So, with that out of the way, here are the current solutions:

Make your own MIDI remote scripts.

Hanz Petrov has written an in-depth introduction to creating your own MIDI remote scripts in Python, using the new Framework classes:

Introduction to the Framework Classes
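To give a sense of what a remote script looks like, here is a minimal skeleton. This is a sketch, not working hardware support: the `_Framework` package only exists inside Live’s bundled Python interpreter, so it is stubbed below for illustration, and `MyController` is a hypothetical name.

```python
# Minimal MIDI remote script skeleton (sketch). Inside Live, this would live
# in its own folder under MIDI Remote Scripts as an __init__.py.
try:
    # _Framework ships inside Live's bundled Python; not importable elsewhere.
    from _Framework.ControlSurface import ControlSurface
except ImportError:
    # Stub so the sketch can be read and run outside Live.
    class ControlSurface(object):
        def __init__(self, c_instance):
            self._c_instance = c_instance

class MyController(ControlSurface):
    """Hypothetical controller; real scripts build mixer/session/device
    components in __init__, as Hanz's tutorial walks through."""
    def __init__(self, c_instance):
        ControlSurface.__init__(self, c_instance)

def create_instance(c_instance):
    # Live calls this factory when the script is selected in Preferences.
    return MyController(c_instance)
```

Drop a file like this into its own folder in Live’s MIDI Remote Scripts directory and it shows up as a control surface option.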

Use OSC, via the Live OSC API Hack (or MIDI)

Ableton doesn’t have native support for OSC — unfortunate, given that’s now a feature of major visual applications (Resolume, VDMX, GrandVJ, Modul8, and others). But while we keep bugging Ableton for OSC to be on equal footing with MIDI, you can make use of a special Python hack that provides an OSC API to Live.

If the above scripting seems intimidating – and I can certainly see why it might be – the LiveOSC API is refreshingly simple. Because you can simply send OSC messages directly, controlling Live with tools like iPhone apps or Processing sketches or even hardware could become comparatively simple – and yes, simpler than working in Max for Live. If you only have MIDI, there’s even a MIDI API, too. Here’s where to start:

Complete documentation of the LiveAPI project [assembla]

Why it’s nice: you can send something as simple as /live/play/clip (track, clip) and trigger a clip. That’s even more direct than the usual MIDI interface.
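Under the hood, that message is just a few bytes on a UDP socket: an address string, a type tag string, and big-endian arguments, each field padded to a four-byte boundary. Here is a minimal sketch in plain Python; port 9000 is an assumed LiveOSC default, so check your own configuration for the actual listen port.

```python
import struct

def _pad(data):
    # OSC strings are null-terminated, then padded to a 4-byte boundary.
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address, *ints):
    # Minimal OSC 1.0 encoder; handles int32 ('i') arguments only.
    msg = _pad(address.encode("ascii"))
    msg += _pad(("," + "i" * len(ints)).encode("ascii"))
    for value in ints:
        msg += struct.pack(">i", value)
    return msg

# Trigger clip 0 on track 1 (the LiveOSC address from above):
packet = osc_message("/live/play/clip", 1, 0)

# To actually fire it (import socket; 9000 is an assumed default port):
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(packet, ("127.0.0.1", 9000))
```

That is the whole protocol for simple cases, which is why an iPhone app or Processing sketch can drive Live this way with so little glue code.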

Most importantly, this now works with Live 8.1. See the video below for an example of this in action:

mlrV4live tutorial (&casio madness) from StevieRaySean on Vimeo.

Check out his Arduinome build documentation, too. (Arduinome is an authorized clone of the monome using readily-available parts.)

The Max for Live way: Live Object Model

Complete LOM documentation at Cycling ’74

And yes, it makes my head spin a little, too. (Or perhaps the word is “oscillate.”) Michael Chenetz has done a great job of making this a bit more manageable. In the video below, he explains how Max for Live and Live interact; there’s also a tutorial on sending messages to a control surface like the Launchpad. But note that some of this can actually be more complex, and more hardware-specific (APC/Launchpad-only), than the hacks above. It’s a case in which the hacked version actually works a little better than (cough) the official version.

Max For Live Paths, Objects, and Observers from Michael Chenetz on Vimeo.

My own challenge for myself: just make the Launchpad intelligently control device parameters, something it currently doesn’t do. I’ll let you know how it goes.

Thoughts on the merits of these different approaches? Projects you’ve made using one or another? We’d love to see them.

  • Mudo

    I love this game.

  • Great post Peter, I was just explaining to some of my peers that with Live OSC, end users get the fabled red box.. I remember there was a discussion on CDM a while back, about why Ableton has not enabled that feature for all controllers.. yes.. even if a person wants a 2×2 red box or an 8×8 they can get it now, including multiple session boxes. Now am I the only person wondering why there is no way to delete a clip remotely (python, midi, max4live) outside of the keyboard shortcut?

  • Well, I suspect some of these things are simply by coincidence and not by design, but then, that doesn't necessarily make for great design.

    I'm assuming the deletion, for instance, would be related to what happens inside the engine when a clip is deleted. But out of curiosity, why would you need to automate clip deletion? Just to be able to edit your sessions without getting on the keyboard/mouse?

  • Here is an example of why remote deletion can help in a work flow:

    Say you have a clip, or external source, playing… a user could record the output into slots with button presses, so a user can achieve an MLR-style effect (with or without quantization) while keeping forward time; this is crucial when working with, say, an MC. To make it more clear, let's say you have a track you're playing and you would like to re-sample and re-sequence elements for a short period.. then after that action is completed jump back into the track where it would be normally (like the roll function on a DJM vs using a loop as in Serato). Remote deletion would help in this area so you don't wind up with a mess or run out of buttons (assuming you don't have a launchpad).

  • Ah, yes, that makes sense. Well, I don't know what the specific reason is that it's not there….

  • deb

    i think the first comment on this thread is the best comment i've ever read. EVAR!

  • sb

    Yup, Mudo pretty much summed it up.

  • stevieraysean

    man, I was checking this to see what's new, didn't expect to see something I did 6 months ago. I sound like i've got a cold or a hangover hah. video is not the greatest either. anyway…

    I think that the mlrV4live mod of mlrV with liveosc still probably works better than the mlr ports I made in maxforlive. for me maxforlive just isn't cutting it. it's actually put me off using live since i tried playing a gig with it.

    liveosc is great though and offers heaps of extra control and it's free.

  • @Stevie: Sorry about the delay. 🙂 The video is old, but some of the information is new — like the 8.1 update for LiveOSC. (Okay, it's a month old, but better late than never… I was traveling last month and got behind)

    I hear you, for sure. Specifically, though, what was it about maxforlive that didn't work for you?

    LiveOSC is a lot closer to what I really want for control, which makes me wonder what it could be like if actually integrated with the UI as MIDI is.

  • Oh, and the video is still pretty good. 🙂 If you make more, let us know!

  • stevieraysean

    hey it's cool. happy to be on CDM 🙂

    not sure i've even tried mlrV4live with 8.1 yet. might need an update to the latest liveosc

    the main thing i've found with maxforlive is that it just doesn't seem very stable. I've had a live set crash about 5 times while setting up for a gig and it only had 1 m4l patch (mlr) and about 2 audio effects in it. could be something wrong with my patch i guess

    editing can be a real pain also as you have to freeze, save and also sometimes reload patches especially when using bpatchers etc. all very time consuming compared to editing a max patch and running it next to live.

    I do suppose that what i've really been trying to do is squash mlr, a max 4 patch, into maxforlive, when really a rewrite would probably be more suitable.

    If I had more time i'd probably be able to make mlr work a lot better in maxforlive. though i'd like to see better multi-channel support, like being able to send audio between patches without the laggy plugsend~ object. also, a way to get proper file paths from clips in the session, so you can automatically load them into a maxforlive object, would be handy.

    liveOSC integrated with the UI would be fantastic. having a window to define commands and bind them to objects like you would assign a midi controller would give great flexibility to setups.

  • Mudo

    Thanks guys!

    This is exciting because we could use pd patches "bridging" the gap…


  • It is curious that Ableton decided to use Python for internal scripting, yet went with Max/MSP as an add-on (probably for ease of marketing and selling more stuff) instead of freely integrating Pd into Live.. at least we can still use Pd as a VST with Live…

  • Also… I am surprised this post has not got more attention.. RED BOX for everyone!!!!!

  • griotspeak


    i don't know why, but i seem to be an evangelist for javascript. most of the stuff we want to do with the monome benefits from js.

    i HAVE noticed a lack of stability that i thought would be there, but i am going to give the benefit of the doubt and say it will go away.

    I am still fighting my urge to get the red box through LiveOSC with the hope that if we make enough noise, Ableton will supply the goods. debugging, for me, is already enough of a pain, i don't really want to incorporate non supported methods in.

  • stevieraysean

    @griotspeak you are right, i'm sure the stability issues in maxforlive will go away. i'm trying not to be a hater but it's just a time thing for me…i'd rather put more extra time into making music than developing tools to do it.

    as my frustration with maxforlive grows I end up basically making liveformax patches – building/recreating the things I use in live into max patches. like a looper-like effect from live built into mlrV is what i'm using at the moment and it seems to be doing the trick for what I wanna do.. still tweaking it and haven't released it yet but it's working and is kinda my direction at the moment.

    i'd love to learn js too as it seems to offer a whole other angle and can definitely see the benefits. again, i don't have the time yet. i guess collaborative projects could be an idea tho. everyone has their own idea of the 'perfect setup' though i guess.

  • griotspeak

    @stevie – well, shoot me an email or get at me on the boards sometime if there is something i can help with. i seem to be stuck on JS scripts anyway.

  • pukpuk

    This is huge.
    I can imagine some future.

  • fox232323

    Ableton have to officially support OSC NOW!
    not make a commercial locked link with max,
    or do both.

    im working on my V2 osc usine /ableton interface:

  • Man, this post is way over my head. Makin' my head 'oscillate' for real.

  • Useful post! I really needed a tutorial to get into Python control surfaces, thanks Peter!

  • ST8

    Thanks for the coverage Peter, it's quite hard to get everyone up to speed on where the LiveAPI is. Most people think that it doesn't work with Live 7/8 when in fact it does.

    I've been using the LiveAPI for some time to integrate an arduinome with pots/encoders/lcd displays with ableton live. If i ever get the logic boards for the 128 i'm building i'll post some photos/videos of the whole setup! There's a project page with some photos of the development setup. It'll be much better when it's on an arduinome 128 with two displays though 🙂

  • MusicLovr

    Nice work, Hanz. Thanks Peter.

  • This is such a great post. Thanks.

    I'm working on a custom script for the MPD32 with red box. So far I've managed to modify Hanz Petrov's sample code to get the red box and mixer, but am wondering how to make the knobs map automatically to devices. Right now I'm doing it through a 2nd User Remote Script, but I'd like to know how to do it in Python.

    Also, is there a way to make 2 automap scripts interact when the controllers are coming in from different ports? For example, make a faders controller map itself to the position of a Launchpad's red box. Is it possible through python scripting? If not, what are other options?

  • @Monosylabik: Grab the decompiled sources, and have a look at the VCM600 script (relatively simple) or the AxiomPro script (more complicated) – either of these should point you in the right direction (i.e. show you how to use the DeviceComponent Framework methods).

    For connecting two scripts, there's a connect_script_instances method (in the ControlSurface base class), but it might be easier to get the interaction you want via Max for Live (unless you're up for modding the Launchpad scripts ;).

  • fox232323

    yup thanks Peter for covering that.
    thanks a lot for your great work.
    I picked up your osc files and
    reimplemented colorblock and other infosblocks,
    but without you this would have been impossible.

    I hope you'll keep on rocking that way…

  • Stu

    Great article! Will this enable us to make a 'lock to surface' button for the APC? Something that is already available for many controllers and the custom user scripts.

  • ST8

    could you upload your updated code please? I'll merge the changes into the current release.


  • fox232323

    yup ok, will take a bit more time to recheck that all, and make a list of new functions.
    I don't know a word of python, have just copied/pasted/modified/tested for long hehe..
    will let you know on your page.
    cheers mate.

  • @Hanz Petrov: Thanks a lot for the help.

    How feasible is it to combine the midi output of both controllers using a MIDI utility like MIDI OX or MidiPipe and adding the mixer code to the Launchpad script? I wonder if it'd somehow break something in the Live to Launchpad connection.

  • @Stu: There are both set_lock_to_device and set_lock_button methods in the Framework DeviceComponent class, but for reasons unknown, the APC40 does not make use of them. You'd probably need to mod the APC40 script in order to get this going. Might also be possible using Max for Live, since you should be able to make calls to these functions and turn the lock on when you need it (the methods are inherited, so they're there – they're just not used in the script).

    @Monosylabik: Sounds like it could work. As long as the "secret handshake" is there, Live won't care where it came from. Adding the mixer code to the Launchpad script might be a challenge however 😉
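For anyone following along, here is a rough sketch of the lock calls Hanz mentions. The `_Framework` classes only exist inside Live, so a stand-in is stubbed below, and the method signatures are assumptions based on the decompiled Live 8 sources; verify them against your own copy.

```python
# Sketch of locking a DeviceComponent to a device, as in a modded APC40 script.
try:
    from _Framework.DeviceComponent import DeviceComponent  # only inside Live
except ImportError:
    class DeviceComponent(object):
        # Minimal stand-in so the sketch runs outside Live; assumed signatures.
        def __init__(self):
            self._locked_device = None
        def set_lock_button(self, button):
            self._lock_button = button  # would be a ButtonElement in Live
        def set_lock_to_device(self, lock, device):
            self._locked_device = device if lock else None

current_device = object()  # placeholder for a real Live device object
device_component = DeviceComponent()
# A real mod would also wire a hardware toggle via set_lock_button(...):
device_component.set_lock_to_device(True, current_device)
```

The point is simply that the lock machinery is inherited and sitting there unused; a script mod only has to call it.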

  • Andrej M.

    Well, well, well. I was just looking around for some Python libraries to play with, and after nothing sparked my interest I paid a visit to good ol' cdm. And now I find this wonderful tutorial for interfacing Live and Python. I'm gonna have some fun this week!

    Thanks Peter!

  • Pingback: Ableton Support » Create Digital Music » Hacking Ableton Live: Unofficial OSC …

  • Thanks, thanks, thanks … exactly what I need!

  • Pingback: realtimespacelab: LiveOSC, the OSC interface for Ableton Live

  • Pingback: KaossGuitarTouchControl – Progressive Factory

  • Pingback: KaossGuitarTouchControl | Progressive Factory Blog

  • Glenn Reuther

    Nearly 3 years on, how has this challenge gone? I’ve been making a custom Launchpad app, and to do it, I’ve been arriving at various incarnations using a combination of M4L, Python Remote Scripts, and Max 6, all with varying degrees of failure; so far. 🙂

  • Aaron Levitz

    In the years since this article went up, seems to have become… something else.
