TouchOSC makes an appearance as musicians hack control at our Handmade Music Open Lab in New York Saturday. Photo by Matos; used with permission. See his (not entirely safe for work) art portfolio.

TouchOSC has become something of a standard on iOS for touch control, thanks to desktop editor apps for custom layouts and high-contrast, Lemur-style controls. Last Thursday was all about wired MIDI on iPad, so it seems only fair to show what people are doing with wireless and OSC. I’ve got a few good selections from my recent inbox.

DJing with Traktor

Above, the latest version of Traktor Pro templates, for iPhone or iPad, from Milos:

It’s got some extensive functionality, and since Milos used Pure Data (Pd) to translate to MIDI, you can use it with both the Mac and Windows versions of Traktor. Milos doesn’t yet have an iPad, so he’s collecting money to invest in one.

Arovia has their own Traktor layout, aptly titled “nano,” as it fits into a small area.

From over the summer, here’s a different approach to using Traktor with touch, turning instead to one big wheel.

Ableton Live

Malaventura has assembled a “kitchen sink” approach to working with Ableton Live, with a do-everything Live template.

A TouchOSC layout for iPad that contains a step sequencer, monosynth & drum machine, an ambient generator, a psychedelic FX unit & Operator synth controller. All designed to work on the iPad with TouchOSC, OSCulator and Ableton Live on your computer. The layout and all the necessary files are zipped in this link:

It’s a really involved set of layouts; it’s not quite as sophisticated as something dedicated like Touchable, but then again, since you can run both, you may just give it a try and use it for certain editing workflows.

The one caveat – and this is a catch on a lot of these patches – is that you need OSCulator in order to use it. More on that gripe in a moment.

As seen on Synthtopia

Working with Hardware

Last week, I showed my preferred means of editing MIDI devices – using, you know, MIDI cables. But I can see the appeal of wireless control, too, in certain situations. Using The Missing Link wireless hardware adapter (see our detailed look at two wireless solutions last month), you can work with conventional hardware.

Via Matrixsynth, there’s a nice template for the Waldorf Pulse.

Palm Sounds points to a Yamaha DX7 editor, complete with SysEx. (Isn’t there supposed to be an actual link there somewhere, though?)

Some Friendly Criticism of the State of OSC Touch

I do see opportunity for progress in all of this, however. Constructive criticism, for all of us:

  • The lack of native OSC means way, way too many kludges. OSCulator is a cool little app, but you shouldn’t need it to do OSC; the whole point of OSC is that it’s a simple, universal networking protocol. We either need native support in apps like Ableton Live, or we need to use something else – period. Having to use go-between apps makes OSC, in these applications, a step backward from MIDI.
  • Why not edit on the device, or even generate layouts automatically? Part of the beauty of touch layouts is on-the-fly controls. There’s plenty to explore here, from layouts that generate automatically after an exchange of information over OSC to on-device editing. One of my criticisms of the original Lemur was having to use a dedicated editing app, and that was more than five years ago.
  • Why not use the browser? Wouldn’t it be great for editing and control to move seamlessly between desktop browser and mobile, or between mobile platforms?
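To underline how simple the protocol really is: here’s a minimal sketch, in Python with only the standard library, of encoding and sending one OSC float message over UDP. The address, host, and port are placeholder assumptions, not tied to any particular app.

```python
import socket
import struct

def osc_message(address, value):
    """Encode a single-float OSC message: null-padded address string,
    a ",f" type-tag string, then the big-endian 32-bit float."""
    def pad(b):
        b += b"\x00"           # OSC strings are null-terminated...
        while len(b) % 4:
            b += b"\x00"       # ...and padded to a 4-byte boundary
        return b
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

# Hypothetical target: anything listening for OSC on localhost:8000
msg = osc_message("/1/fader1", 0.75)
socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(msg, ("127.0.0.1", 8000))
```

That’s the whole wire format for a fader move – which is exactly why needing a separate translator app to receive it feels like overkill.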

I’m burying this in this article just because I’d rather spend time working on those things than complaining about them, but it’s worth saying, partly because I’m sure others are thinking the same way. (And developers thinking that way have the chops to do something about it.)

Also, in answer to everyone griping about a good Android solution, I’m personally waiting for a usable Android tablet and not just handhelds. That means I’m seriously bummed that the Motorola Xoom may cost US$800. Sorry, at that point, I spend money on synths instead.

Knobs rock.

All of that said, I do think there are some great solutions here, and they work right now. Looking forward, we can build the next generation even better.

In the meantime, go grab TouchOSC. It’s fantastic software, and supporting it means an increased likelihood of developer hexler continuing to iterate on his own great work. (He’s a really nice guy as well as a talented developer; I know he isn’t exactly getting rich on this thing, but sales really do support developers working on apps they care about.)

What layouts are you using? Got any you want to share? And what do you want to see in touch controllers?

  • There are tons of exciting things bubbling forth from the OSC massive, but having that intermediate layer of conversion overcomplicates the creation process and stifles the workflow. In a world where plug-and-play has become a mantra, plinking away at OSCulator, wondering why only your faders fade while your buttons sadly send out lost signals, is a fun puzzle to ponder but not how I want to spend my limited time. Now that many DAWs are reaching a level of maturity, the addition of OSC makes a delightful bullet point.

  • I actually think the latest Arovia layout for Traktor on the iPad is pretty slick. Check it out here:

  • imccnl

    I have to pitch in here to say that all is not lost in the world of native OSC support. The really wonderful open-source DAW Ardour (for GNU/Linux and OS X) comes right out of its virtual box with OSC support. I have to be honest and say that I haven't used the OSC support yet (there go my geek points!), as I have been using an Alphatrack for transport and track control; however, it does look like you can do pretty much everything you'd want to via OSC. Ardour is a great DAW tool and I urge everyone to support it!

  • Peter Kirn

    Oh, yeah, absolutely – native OSC support is good. Viva Ardour, Renoise, and (planned) Reaper, among others (Pd, Csound, Max, etc. all have long done it). I mean that in these particular examples, people are trying to talk to hosts that don't support it. 😉

  • onar

    By coincidence, I was researching the automatic generation of OSC layouts (as you call it) just yesterday, and came up with two existing initiatives toward this goal: Wright & Schmeder's "A query system for Open Sound Control" article from 2004, and the Minuit protocol:

    Using either of these, however, requires that all involved nodes implement the protocol, or we're back to square one!

    And there isn't really any other approach to solving the problem, either – it isn't that the above solutions go about it the wrong way…

    I see this being delayed many more years still, simply because there needs to be agreement on a standard, and adoption widespread enough to make it worth developers' effort to implement. In other words, it's the same catch-22 that has kept OSC from taking off to date, all over again.

    As you say yourself, most software doesn't even support OSC out of the box, let alone fancy discovery & query systems…

  • gio

    Check out these templates for controlling Reason and Record at

  • RichardL

    If you are listening Hexler, please fix the Android version of TouchOSC to scale to tablet resolutions.

  • Peter Kirn

    @onar: Ah, interesting you should bring that up — I was looking back on OSC papers and seeing a reference to query systems in *2002*. So … what's the deal? Why don't we have an agreed-upon way of doing those queries?
    I rather wonder if some sort of existing standard for querying / remote procedure calls makes sense. (JSON-RPC, for instance?) Time to look into it.

    Now, I think honestly part of the reason there isn't broader OSC support is because querying – while it's more work – is also an essential incentive to doing OSC in the first place. So it's really, really a chicken and egg problem — just on the side of software implementation, not, as many have argued, hardware. (Frankly, MIDI is just fine for a lot of hardware; OSC is useful for software talking to other software in situations in which MIDI doesn't make sense, especially because MIDI absolutely lacks that sort of query mechanism.)

  • Peter Kirn

    @RichardL: I wasn't going to bring up TouchOSC on Android, as I feel it's an entirely inadequate solution. Like I said, I'm ready to embrace something new. Pity the first tablet you'd want costs as much as a new laptop, cough, Moto.

  • RichardL

    There will be other Honeycomb tablets, but it's disappointing that Moto apparently isn't aiming for a broader market with a consumer-friendly price.

    Refurb MacBook Air for $850 for the win.

  • Peter Kirn

    Yeah, you begin to wonder if you really need touch all that badly. 😉 I'm sure we'll see better prices on Honeycomb tablets in general, and probably even the Xoom specifically, with various discounts and subsidies and who knows what else. And I imagine first-gen iPads will wind up showing up cheap when the new model of that shows up. In fairness, we're going by one – apparently unsubsidized – Best Buy ad. Heck, my first-gen Droid I think is something like $500-600 unlocked and unsubsidized.

    I'm now off topic on my own post, but I'm sure other people are thinking about this, as well. 😉

  • anechoic

    I'd love a hardware controller that was OSC – w/ knobs, faders and buttons

  • Thanks for mentioning my layout. I've noticed many more visits to my blog, and now I see why. 🙂

  • @peter: the query system is all implemented in OSC – there's not much point leaving it for some other protocol, I think.

    @onar: the query system by itself still doesn't address the absolute lack of any semantic standards. It's great that your OSC controller can discover that Ardour supports a command called /frob/baz/nicate and what arguments it needs. But what does that command do? And how do you make an arbitrary OSC-receiving system start its transport, or play a middle C, or …

    the moment you get into note data, you end up in one of the endless battles that have gone on for decades about how to describe notes.

    it doesn't look so great …

  • Peter Kirn

    Sorry, I meant to say that the query is there, but not the ability to query actual semantics – the problem you're describing. OSC clients and servers are a bit like a lot of our foreign-language knowledge.

    "Hello! How are you?"
    "I'm fine!"

    You need more of a vocabulary.

  • actually, a better representation of their conversation is:

    – Hi, who are you?
    + Fred
    – What words do you know?
    + Hello, Goodbye, Please, Thankyou
    – (… thinks …)
    – Hello, Please
    + what?
    – Thankyou
    + I didn't do anything
    – Please Goodbye
    + Goodbye
    – AARGH!

  • Insilico

    The app MIDI Touch is a strong contender to TouchOSC, and it solves two of your main gripes – editing is done on the iPad, and no third-party translator is needed. The developer mentioned to me that the next release will include support for SysEx, so I will finally be able to make an editor for my Blofeld! The Blofeld is class-compliant, too, so I can just hook it up straight to the iPad – no computer needed!

    I'm also considering making presets for the DSI Tetra, Virus TI and maybe some older Roland rack gear (MKS series, etc.). Any other requests?

  • Two things: First, and most constructively, here's a template for my Alpha Juno 2:
    Secondly, it seems to me that Peter is implying that in a perfect world, one would be able to connect any OSC input device to any OSC host app without any intermediary. I totally disagree with that. In fact, this decoupling is for me one of the greatest advantages that OSC has over MIDI. OSCulator and similar apps aren't a kludge; they are a necessary part of a new, free world. It would be nice if there were cross-platform free software that performed the same function, though.

    A bit more pedantically:

    The "language" analogy that you and Paul Davis were using is, I think, totally apt. In most electronic instruments, I generally break it down into two distinct parts – the interface and the mechanism that actually makes the sound (as an aside, this is different from all traditional instruments, with the exception of keyboards). The power of MIDI is really the power of separating the interface and the mechanism into distinct logical components. However, it's important to realize that the language the interface "speaks" (until recently: switches, knobs, sliders, etc.) is different from the language of the mechanism (pitch, cutoff, volume, etc. for an analog synth). I can think of four ways to successfully connect the instrument to the interface:
    – Have the interface learn the language of the mechanism
    – Have the mechanism learn the language of the interface
    – Have each piece speak its own language, and create an intermediate stage that converts between them.
    – Make up a new language and teach it to both
    MIDI is an implementation of this fourth method. The problem with making a language that is native to neither the interface nor the mechanism is the difficulty of predicting how the mechanism will respond to a given command. Even if MIDI were a human-readable format (another big problem), it wouldn't be saved from this basic flaw. What does "Note G#7 on with velocity 127" mean when you send it to a DAW? G#7 doesn't mean anything to a DAW, so who knows what it will do. Perhaps it will trigger some clip that is MIDI-mapped, or perhaps it will send the note to a VI, or perhaps it will start recording automation on channel 1 (as in the Mackie Control spec).
    There's a basic, fundamental difference between the language of interfaces and the language of mechanisms, and making up a small set of words in some arbitrary standard language, like MIDI, doesn't help that. Interfaces and mechanisms that don't fit into this narrow model aren't going to abandon all hope of connection; they're going to make up their own nonsensical language-within-a-language. If this wasn't obvious before the MIDI spec was released, there's plenty of proof for it now (see Mackie Control, Firmata, and plenty of other crazy things). It's better to allow the protocol more freedom, so you can be completely free of these kludgy languages-within-languages.
    In OSC, you're allowed to use any of the four methods above to connect mechanism with interface, using the best one for the given situation. For TouchOSC, where you are making a custom interface for a known mechanism, it makes sense to use the first one. If you have a simple interface with a set language, say a monome, you're free to write a custom mechanism in Pd or Max that speaks the language of the interface. If you want to connect an existing interface to an existing mechanism in a new and exciting way (i.e., monome to Live), you can use the third method. There are even exciting examples where the teaching is automated by a meta-language, like Mrmr.
    OSCulator is, obviously, an implementation of the third method. I think in most cases this will actually end up being the most useful of the four. Just from an engineering point of view, it makes more sense to have one tool for converting between languages than to make every interface and every mechanism multilingual.
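To make that third method concrete, here's a toy sketch of such a translation layer in Python. The OSC addresses and CC numbers in the table are invented for illustration, not taken from any real template or spec.

```python
# A minimal OSC-to-MIDI translation layer: map incoming OSC addresses
# to MIDI continuous-controller numbers. The table is purely illustrative.
OSC_TO_CC = {
    "/synth/filter/cutoff": 74,
    "/synth/filter/resonance": 71,
    "/mixer/volume": 7,
}

def translate(address, value, channel=0):
    """Turn a normalized OSC float (0.0-1.0) into a 3-byte MIDI CC message."""
    cc = OSC_TO_CC[address]
    return bytes([0xB0 | channel, cc, round(value * 127)])

print(translate("/synth/filter/cutoff", 0.5).hex())  # prints "b04a40"
```

The whole "language lesson" lives in one small mapping table, which is exactly why a dedicated translator app (rather than teaching every host and every controller) can make sense.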

  • Peter Kirn

    @Russell, good arguments… MIDI is open to criticism, but I do have to take issue with a few things here.

    Firstly, I can't possibly argue that OSC is an improvement over MIDI in cases in which you're using OSCulator as an app *to translate to MIDI*. If you can explain what you mean by that, I'm all ears. But there's no additional decoupling in OSC beyond what's possible in MIDI, as far as I can tell, in this particular case. We've always been able to use MIDI slicer-dicer utilities in between a controller and software, or between two pieces of hardware.

    Secondly, it's not entirely fair to say MIDI is not human-readable. OSC is more descriptive, but neither is human-readable once encoded for transmission. MIDI is human-readable once you're labeling the messages, just as OSC is.

    Nor is MIDI an arbitrary standard. It just happens to be incomplete for every scenario we might dream up for it, which makes sense for a 30+-year-old protocol that is being used far beyond its original application. No one argues that.

  • Well, I'd argue that even when the final instrument receives MIDI, it's more understandable to send OSC and have a translator, except in the special case where you're only sending note messages and pitch bends to a traditional synth (where the semantics of MIDI are clear).

    That's because, if everyone's speaking MIDI, it's an opaque system. You can't know what's going on unless someone tells you, you read the spec, or maybe you have the time and skill to reverse-engineer the protocol. If you build a "translator" patch (in OSCulator or otherwise), you only have to do it once, and then you can easily connect any interface to it, because the semantics are right there, written in plain English (if they're not, it's your fault for choosing bad OSC message names). If you stick with MIDI, you have to remember the usually undefined, undocumented, and weird semantics every time you build a new interface. I'm sure many readers of this site know how painful it is to build a SysEx editor for a synth, even if you've worked with it before. There's a real difference between /synth/oscillators/1/waveshape and f0 31 68 79 f7.

    Of course, it's better still for manufacturers themselves to do this for you and support OSC natively, but writing an OSC wrapper is hardly so distasteful that we should give up the intelligibility of OSC entirely.

    I should say that I have a ton of respect for MIDI – even just the "MIDI in" and "MIDI out" ports are a very insightful way to think about instruments.  And of course no one can retroactively fault the designers of MIDI for not foreseeing that "the masses" would one day want to be in control of their own instruments and their own interfaces.

    But I guess, even with all that said, the intended (although perhaps poorly emphasized) point of my last post wasn't really that MIDI is terrible, but rather that even in the best case world where everyone had OSC support, we'd still need translation apps like OSCulator.  It's not a bad thing, it's just the price you have to pay for having the electronic protocols more closely match the physical and logical protocols of the devices.

  • Peter Kirn

    I've never once used an OSC translation program. Every single time, I've dealt with the messages directly. So I'm obviously missing something. Of course, they can be useful, but the only time I've seen that they're a requirement is if the receiving party can't read OSC messages.

    And it's not just OSC. Music programs that aren't Csound, Pd, Max, SuperCollider, etc. are too dumb to deal with networking, period. No UDP, no TCP – no nothing.

    The IP suite and Ethernet predate MIDI, and even in their current form are about the same age. So maybe it's not nearly as impressive that MIDI has survived this long as it is that music software has managed to get through the entire Internet revolution so far as if networking never happened.

  • Well, I think you might be exaggerating a little bit.  As mentioned in previous comments, "mainstream" DAWs like Ardour, DP, Renoise, and even the latest version of Logic can all receive UDP OSC messages.  

    What happens when you want to connect a fixed interface like the monome to one of these programs?  The monome only sends simple OSC messages about what buttons were pressed.  You can't program it to send certain messages when certain buttons are pressed.  Also, why should Ardour, for example, have to know what messages the monome is sending?  To me, the only reasonable way to get something like the monome and Ardour to talk to each other is through a translation layer.  If you write a translation layer implementation that's seamless enough for anyone to use (I'm not saying that OSCulator qualifies here), there's no reason to add some "OSC learn" complexity to either the monome or Ardour.  I agree that user-programmable apps like touchOSC are a little bit different than the monome, in that the user should be able to program them to send messages that will be understood without translation by a certain host.  But in the general case, where you have an interface and an implementation that speak different languages, you really do need a translation layer.

  • Just to ask: does anyone know how to connect TouchOSC to Pure Data over USB rather than WiFi?
    I know it's possible with OSCulator, but I need it for Pure Data.

  • bilderbuchi

    @PK: What exactly do you mean by "embrace something new" w.r.t. TouchOSC + Android? There's no tablet you find adequate yet (btw, why not the Galaxy Tab?), so it doesn't matter that the Android version is…lacking? Or, instead of wanting to use TouchOSC with your Android phone (like so many did with the iPhone/iPod touch), will you just go back to physical knobs again? I'm confused…

  • Michael Mennell

    Hey there – first, I need to say that this is such a great site. This is all still very new to me, and the wealth of info here is incredible. So cool that there are so many well-versed people out there exploring and developing new tools for musicians. Just thought I'd mention that I've been using LiveControl on my iPhone (I know, don't say it), but for the small amount of MIDI-based music that I've begun to incorporate into my set, it really works OK. Don't know if I may have missed you bringing it up.

  • The reason I really think OSC is a fantastic step forward is not because of all the things it does right, as much as how many things it has avoided doing wrong!

    It manages being useful, without making assumptions about the intended context of use, that could potentially break its usefulness in other contexts.

    Most importantly, unlike MIDI, it doesn't impose a schema that you have to stick to in order to use it. That is because the developers rightly realized that there is no ONE schema everyone can agree on, as has since been exemplified by the numerous research articles and other publications that propose schemas, each making specific assumptions about its targeted context of use. Perhaps in the future there will be a convergence, where users agree on a set of schemas, each for the relevant context – e.g., everyone using a transport supports the transport schema, and so on.

    Similarly, that is perhaps also why it has no query system yet: maybe there is no one ideal query system for all applications?

  • KNS

    I will be all over this once I get the iPad.

  • KNS

    By the way it would be nice if you could interview the folks at retouchcontrol. What they have done is amazing.

  • cillianjohn

  • I've just tried out LiveControl, a pretty deep TouchOSC template for Ableton Live (surprisingly!).
    It's free to download and uses a MIDI script, so there's no need for OSCulator.

    There's also a monome version!

  • people in here get a life, i mean seriously

    android for trktr???

  • or i pad for scratcing?? keep makin useless crap 

  • @bilderbuchi TouchOSC doesn't scale to fill the screen on the Galaxy Tab. Also it's unfortunately way behind the features of the iPhone/iPad version — essentially the 1.0 version.

    Obviously the developers of TouchOSC have a lot on their plate. I'm sure they will address these things as fast as they can.

    I'm just voting.

  • Here's a link to that DX7 template:…

  • BTW: When we were developing the Missing Link, we were very surprised that there wasn't any kind of a standard already in place for wrapping a MIDI message within OSC. While the future indeed looks like native support of OSC, it's important that backward compatibility be built in so that we can continue using the last 30 years' worth of electronic music technology as we move forward.
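For what it's worth, the OSC 1.0 spec does define an optional 'm' type tag for carrying a raw 4-byte MIDI message (port ID, status, data 1, data 2), though receiver support for it varies widely – which may be the real gap. A sketch of wrapping a note-on that way (the /midi address is an assumption, not a standard):

```python
import struct

def pad(b):
    # OSC strings/data are null-terminated and padded to 4-byte boundaries
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def wrap_midi(status, data1, data2, port=0):
    """Wrap a 3-byte MIDI message in an OSC message using the
    optional 'm' type tag: bytes are port ID, status, data 1, data 2."""
    return pad(b"/midi") + pad(b",m") + bytes([port, status, data1, data2])

packet = wrap_midi(0x90, 60, 100)  # note-on, middle C, velocity 100
```

That handles transport; the unsolved part is agreeing on addresses and semantics, as the discussion above suggests.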

  • griotspeak

    @onar – I think devices like the monome are a completely different case than user-definable applications. App development for the monome is necessary because the grid is usually an abstraction: it can be faders, toggles, buttons, etc. Elements in TouchOSC don't typically 'need translation' – the slider in the template usually corresponds to its counterpart in the host. TouchOSC doesn't really 'think' about the information it is sending or receiving. Neither does OSCulator (when used with TouchOSC): it can send many messages from one, and scale them, but it doesn't have the same sense of state as a monome app. "press 0 0" can result in many different responses from one monome app depending on any number of factors.

    All that said, I don't think we can really manage native OSC without an arrangement akin to Max for Live in all of our hosts, BECAUSE devices like the monome need to be translated in ways that OSCulator is not meant to translate. Native OSC that didn't require anything other than the host and the controller would put complete trust in the developer of the host with regard to how I want to interact. I am frustrated by situations like TouchOSC/Logic, where developer-created documentation of the spec is lacking (missing!). As much as I and many other Max for Live users complain about limitations, at least Ableton and Cycling intend for us to plainly see what we CAN do as patchers. Using that, I work to make the interface do what I want.

  • griotspeak

    my line breaks were CLOBBERED

    how do i format?

  • griotspeak

    and there is my answer. 

    two line breaks.

  • griotspeak

    It has been February 7th for an awful long time in this thread, no?

  • "I rather wonder if some sort of existing standard for querying / remote procedure calls makes sense. (JSON-RPC, for instance?) Time to look into it."
    @peter – Definitely.  I'm a fan of OSC for transmitting instrument events (TUIO, slider movement, etc.) which need low latency and occur in high volume.  However, while I've implemented and actively used OSC for querying and general RPC, I have found it too cumbersome and awkward for those uses.  I've started using JSON-RPC for those kinds of things, and am much happier with it.
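As a sketch of what that might look like: a hypothetical JSON-RPC 2.0 exchange for querying an OSC namespace. The method name, parameter shape, and reply format are all invented here – no such standard exists, which is rather the point of the thread.

```python
import json

# A hypothetical discovery exchange in JSON-RPC 2.0 form:
# ask a host what parameters it exposes under a namespace.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "osc.query",              # invented method name
    "params": {"address": "/synth/filter"},
}

# What a reply might look like (entirely illustrative):
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": [
        {"address": "/synth/filter/cutoff", "type": "f", "range": [0.0, 1.0]},
        {"address": "/synth/filter/resonance", "type": "f", "range": [0.0, 1.0]},
    ],
}

wire = json.dumps(request)  # what would actually travel over TCP
```

Human-readable, transport-agnostic, and easy to implement – but, as the next comments note, it only helps if everyone agrees to speak it.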

  • @tim: OSC for RPC is definitely cumbersome and awkward – I'd go so far as to say almost unworkable (UDP as a transport protocol doesn't help here). But I'm not sure that requiring introspection/discovery of an OSC namespace to depend on JSON (or some other flavor of the day) makes much sense.

  • @paul: Yes, I agree, tying the two protocols together that tightly is questionable. It all depends on how much you want to depend on or design around the undependable. 🙂 I'm very happy and even relieved now to limit my OSC usage to things that are standardized or static and don't need discovery or reliability.

  • Ivan Smirnov

    Speaking of monome, imo TouchOSC needs to learn to receive (and maybe send too) blocks of packed data, then it would work way better with monome apps. As it is now, take Molar, press a bunch of bottom row buttons quickly and your UDP goes boom and TouchOSC goes laggy.

  • Check out my website for Traktor templates for Windows and Mac. Simple and effective, with total feedback to the iPad.

    Also see my Facebook page.

    Ableton DJ template coming soon!!!!

  • I despise minikeys, and I despise touchscreen keys. There's a reason for weighted keyboards. I also hate controllers where I have to remember which button, fader, or knob goes where in which config. That's where instant visual feedback is critical, and why I've yet to see a controller that really beat my laptop, a rack of knobs, and a real keyboard. That's what I want in a controller: a customizable touchscreen WITH ports, and maybe a passable keyboard. Since that already exists in its original form, and may not be any better in a new one, it makes me wonder if all this discussion about the ultimate controller is just a severe waste of time.

  • How about building interfaces with Max and simply mirroring them to the iPad? That's what this app does:

    It also supports 'pressure sensitivity' (using contact area) these days.

    (not sure if the link came out right)

  • Check out my latest layout. I promise it's one of the most intuitive 🙂

  • the best tutorials I’ve come across so far are on
    Gets you up and running in no time. The Ableton DJ Mixer is excellent too!!

  • Anders Hofsten

    i just got an iphone for cheap (3g) and i literally just got touch osc 5 minutes ago and i’ve already got a setup up and running. wahu,.

  • Check out my layout for Virtual DJ – it's ready for DJs, and it can be yours.

  • Michael

    What about Cubase? How do I make it work?