In case you missed it on Create Digital Motion, we’re now beta testing a version of open source OSC controller software for the iPhone and iPod touch 2.x firmware. (For something along the same lines, see also OSCemote.) It’s well worth reading that story, as mrmr’s creator Eric Redlinger talks eloquently about what this is about: controllers that can connect to a performance, even a stranger’s performance, as easily as an iPhone could dial up a webpage.

Mrmr iPhone 2.x Firmware Beta, and the Self-Configuring Touch Controller

But in case even that argument doesn’t make it clear, this isn’t an iPhone story. It’s an OpenSoundControl story. It’s about chickens, eggs, and the future. It’s just a glimpse of that future, but it could be a meaningful one. This is technology coming to far more than just Apple’s devices. Let me explain.

OpenSoundControl is an open network protocol for music, visual, and other control. It’s really not fair to call it a rival to MIDI, because the whole point is that it’s not MIDI: it’s at home on networks, you can easily route it over Wi-Fi, Ethernet, and Ethernet hubs, it handles high-resolution data, and it’s descriptive. It’s therefore good at a lot of things for which MIDI was never really intended, things that matter more in today’s computer- and software-centric landscape.
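To make the “descriptive” and “high-resolution” points concrete, here is a minimal sketch of what an OSC message looks like on the wire, using only Python’s standard library. The address is invented for illustration; real apps define their own namespaces.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are NUL-terminated and padded out to a 4-byte boundary
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    # An OSC message: address pattern, then a type-tag string
    # (one "f" per float32 argument), then the big-endian values
    type_tags = "," + "f" * len(args)
    packet = osc_pad(address.encode()) + osc_pad(type_tags.encode())
    for value in args:
        packet += struct.pack(">f", value)  # big-endian 32-bit float
    return packet

# A human-readable address plus a high-resolution value in one packet
packet = osc_message("/mixer/channel/1/fader", 0.75)
```

Compare that to a MIDI control change: the OSC message names what it controls in plain text, and the 32-bit float carries far more resolution than MIDI’s 7-bit values.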

Yet I hear people say OSC isn’t ready for primetime, because they’re judging it by the old technology. MIDI only caught on after it was adopted by companies like Roland; therefore, the thinking goes, OSC is waiting for some sort of hardware support before it’s worthwhile to use, and (a direct quote I hear a lot) “nothing supports OSC.”

About that “nothing” category. Nothing supports OSC, except:

  • Millions of iPhones worldwide and millions more iPod touches, via free or cheap apps available officially through Apple’s store (compare that to the very select number of Lemur/Dexter multi-touch music controllers sold just a year or two ago)
  • Via free or cheap software, Wii remotes for the tens of millions of Wii consoles sold
  • The open source Monome controller — fewer units sold, but easily one of the most sought-after controllers available today
  • Native Instruments Reaktor and Traktor, FAW’s soft synth Circle, Apple’s Quartz Composer (in the version in every copy of Leopard and later), Processing (the code environment featured several times recently at the Museum of Modern Art in New York and elsewhere), built-in support in Max/MSP/Jitter, Pd, and SuperCollider, and support available now or in the works for nearly every major VJ and live visual app in use

In other words, in key hardware and software categories, we’re talking either a dominant product category or popular hardware selling millions of units or both. “Millions” isn’t a number the music tech world even thinks about a lot of the time.

Before I come unhinged from reality, I should be clear about some very significant caveats. OSC still isn’t widely known. MIDI is still the logical way to deal with notes and simple controllers, and that’s fine. Some of the software OSC implementations are a really big stumbling block, because not everything is actively supported or documented or implements key features like “bundles.”

But that doesn’t change the fact that many users want to do live visuals or play around with a Wii remote or iPhone. OSC isn’t yet a standard, and there’s a long way to go. But it’s safe to say that those devices could be, well … the egg.

  • Bjorn

    Not a single word of the Lemur…

    I don't blame you though; the last significant thing that happened to JazzMutant is that Daft Punk used a few Lemurs as playback visual devices for their pyramid. They've been promising an update at the end of the year for the second time in a row now (almost two years without a single improvement).

    OSC is more readily available than people think. It's not much of a stretch to adapt HID to OSC. That old joystick you've got lying around? It's an OSC XYZ controller. What about those cheap HID ribbons?

    Powermate? Space Navigator?

    I just recently spent a few weeks writing an application that works around some limitations of the BCR. Even with all the controllers out there and all the DIY solutions, there still isn't anything that meets my needs. I've dabbled with stuff like the Lemur and just about every type of controller out there, but nothing really gives me what I'm after: essentially a hardware setup that lets me control software like it's a well-designed groovebox.

    If Ableton Live had a good implementation of OSC, you'd be able to call up all the parameter/clip names and values. (There's a proof of concept API hack out there)

    If you thought a Novation SL looked cool, imagine being able to stream info from Live to any of the hundreds of different types of LCD screens out there.

    At least the development on OSX based touch software is on a roll.

    This software isn't the egg, and neither is the iPhone.

    The egg (IMO) is a piece of hardware somewhere between the iPhone and the Lemur: within financial reach, usable as a wireless screen, ebook, tablet, sketchpad, input device, capture device, etc.

    You know… The computer of the 21st century…
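Bjorn’s HID-to-OSC point above can be sketched in a few lines: take a raw joystick axis reading, normalize it, and name it with an OSC-style address so any OSC-speaking app can route it. The address scheme and the 8-bit range are assumptions for illustration, not any particular driver’s API.

```python
def axis_to_osc(axis: int, raw: int, raw_max: int = 255):
    # Normalize an 8-bit HID axis reading to 0.0-1.0 and pair it with a
    # hypothetical OSC address; a real bridge would then send this as an
    # OSC message over UDP.
    value = raw / raw_max
    return f"/joystick/axis/{axis}", value

# e.g. a joystick pushed three-quarters of the way over on axis 0
address, value = axis_to_osc(0, 191)
```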

  • poorsod

    The moment OSC really takes off is the moment Ableton starts supporting it with the same function as its MIDI control.

  • Damon

    Question – Chicken or Egg?

    Answer – Chicken Egg!!!!

    Sorry (not really), had to sync it…



  • Actually, I mention the Lemur and Dexter above. My assumption here is that people paid attention to those because they were music-specific, but haven't fully realized the importance of these other devices, which are shipping in far greater quantity. Although, I should hasten to add, the Dexter didn't really focus on OSC because of some of the limitations of the software, so it's also a good example of control hardware regularly having to bend to the control limitations of current software.

  • Does this mean we could have a Monome-like controller with the iphone/ipod?

  • cobalt

    I think there are two related reasons why OSC hasn't taken off.

    1. It's an academic, open-source project, and to the best of my knowledge the full specification has never been completely worked out (although it certainly works).

    2. Mainstream hardware manufacturers have demonstrated no interest at all in adopting a new interface or data communication protocol.

    Ever since I was shown a prototype OSC audio interface at CNMAT a couple of years ago, I've wondered why people haven't demanded OSC-compatible hardware. The particular prototype I saw was not picked up by any manufacturer, and the person who showed it to me was still in disbelief about it.

    Like lots of other people, I've read countless posts about the limits of USB for transferring low-latency multi-channel audio and looked at all the different solutions (FW, PCMCIA, ExpressCard, modular systems, etc.). And then there are lots of people trying to set up multi-room recording studios, sometimes wiring tons of expensive cabling between rooms to carry multi-channel audio, getting wireless remote console control systems, setting up chains of MIDI cables, etc.

    But if someone would just build an OSC multi-channel audio interface, you would just have to stick an Ethernet cable in the port (something we've had for years) and all the audio channels and assignable control data channels would be accessible to any, and in fact all, computers on the same network. Likewise, any message or audio signal could be sent back the other way, in any direction. The cost of expanding the system is the cost of an 8-port wired router and some Ethernet cabling.

    Peter, I corresponded with you a bit shortly after my experience at CNMAT, and at the time you mentioned the lack of software support as a reason for the relative lack of interest in OSC. But in that case, we were just talking about control data. Adding software support for OSC audio data would be trivial, and so would adding backwards compatibility with MIDI et al.

    I think the fault really lies with the big hardware companies (Roland, Yamaha, etc.) not being forward-looking at all, at times trying to develop proprietary systems that of course no one will invest in (like that Yamaha system that was based on FW, except not really true FW). And part of it is user complacency and an unfortunate ignorance of OSC's potential. The fact that there is still a huge market for USB audio interfaces is a mystery to me. At the very most, you would need USB for power; every computer has an Ethernet port.

    The problem of course to going to a new networking standard is that one device manufacturer can't make it happen alone. No synth maker was going to add an ethernet port on a keyboard, for instance. What's interesting about the iPhone and Touch for OSC is that they're wireless and have sensor and control capabilities built in. It makes sense, given the audio hardware impasse, that OSC would really hit its potential when someone mass-produced a device with all those characteristics for some completely different purpose.

    I'm pleased OSC is finally coming up. PC games and MMORPGs have already done tons of work creating stable, massively parallel systems involving synchronous audio-visual display and interaction. This is going to be a good time. Yeah!

  • cobalt

    I also think details about the pin-out of the Apple iPod connector port will soon become interesting to a different group of people than iPod accessory makers.

  • Bjorn

    My bad, it was so briefly, I literally missed it.

  • The primary problem with OSC is that it defines a rather low-level protocol, not a "language" with real semantics the way MIDI does. It's one thing for apps X, Y, and Z to all "support" OSC. But that doesn't mean anything when there is no message you could send them all that has any predefined meaning. Trivial example: you have full control over Ardour's transport with OSC, but the messages you send Ardour to achieve that would almost certainly mean absolutely nothing to Traktor.

    Until enough people agree on a vocabulary and a grammar for OSC interactions, its use is going to continue to be "ghettoized": capable of absolutely awesome stuff on an app-by-app or hardware-by-hardware basis, but not exciting as a more general-purpose way of controlling and interacting.

    @cobalt: OSC's use of UDP makes it somewhat unsuitable as the basis for a realtime audio transfer protocol. There are several (read: way too many) attempts to define audio-over-Ethernet, some using parts of the TCP/IP stack, some using raw Ethernet, but the ones that are really reliable and adaptable tend not to use UDP. Granted, OSC doesn't *have* to use UDP, but it tends to. There are quite a few commercially available audio-over-Ethernet solutions, and guess what: none of them are interchangeable. I don't see how OSC can help with this.
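The vocabulary problem described in the comment above can be seen in a toy dispatcher: OSC delivers the message fine, but each app only understands its own address namespace. These addresses are invented for illustration, not Ardour’s or Traktor’s actual ones.

```python
# Two apps' handler tables, keyed by OSC address pattern. The addresses
# are hypothetical; the point is that the namespaces don't overlap, so
# there is no single message both apps would understand.
ardour_like = {"/transport/start": lambda: "rolling"}
traktor_like = {"/deck/a/play": lambda: "playing"}

def dispatch(handlers: dict, address: str):
    # An app simply ignores addresses it doesn't know
    handler = handlers.get(address)
    return handler() if handler else None

dispatch(ardour_like, "/transport/start")   # understood here...
dispatch(traktor_like, "/transport/start")  # ...meaningless there: returns None
```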

  • @cobalt: Good points, but —

    1. I'm not sure being an academic project is any better or worse than being an industry project; people use what they use.

    2. I don't think you need mainstream music hardware support; that's my whole point. There are, in fact, things Roland and Yamaha would need in order to adopt a protocol, things OSC may not provide. OSC may not fit into their business model. But millions of other hardware units suggest it might get adopted elsewhere first.

    @Paul, yes, I do tend to agree; that's the hurdle that remains. It's not a chicken-and-egg problem, it's actually a syntax and implementation problem. Then again, this may not need to be a standard so much as certain implementations that are useful to people. Not everything needs a big industry standards board (hello, MMA); I think it may be better if people just go out and do stuff with it.

    That said, this issue of whether or not software has *some* kind of OSC facility, and whether or not that facility supports bundles (as Reaktor, Quartz Composer, etc. do *not*), is a really big deal, and to me that's the remaining obstacle. There's certainly no lack of other hardware and software that could take advantage of it, and I don't think you need to constantly make the MIDI comparison. If you're personally using it, you don't care what other people are doing.

    And as Eric points out, you could use software to push out an entire interface to other people on their devices. Maybe my friend has a different protocol for controlling his live visual setup, but I can give him control of mine instantly by having him connect to a mrmr setup. If we start seeing touchscreen computers and not just iPhones, laptops could do the same thing in the near future. (Read Eric's article; he explains it really well.)

    But yes, I was talking control, not audio; that's a whole different can of fish. Kettle of worms. Um… you know, interchangeability. 😉

  • cobalt

    One nice thing about not having technical understanding is that one can make incredible claims about future technology relatively unhindered. The prototype I saw was in fact for a specialized purpose. What makes me upset about nothing like it being on the market is that if I wanted to create a functionally similar setup, I would need about 8 audio cables, an 8-channel audio interface, and an extra Arduino box.

    The last two comments make me think that one thing that's unusual about OSC on the iPhone and Touch in particular is that suddenly you have millions of OSC devices that all have virtually identical capabilities. In a traditional setup, you have sound generators (synth modules and instruments), controllers, and a centralized "sequencer" computer connected to all the interfaces and signal transduction mechanisms.

    Part of the reason I've been mystified by the lack of interest in iPhone apps (and the Touch as a way less expensive alternative to the iPhone) is that Android is coming next, and it has similar potential. And, it's open source with fewer SDK restrictions. So if people want to get their open source on, there's always Android.

    Anyway, great discussion. I'm excited to see how this all develops.

  • john dalton

    Assuming someone did want to build OSC support into their products, what would be the best cross-platform SDK to use for this?

  • @John: It's a pretty simple protocol; any development tools will work. For end user-facing tools that do the utility stuff, I'd like to see Java used, personally … but mainly for OSC utilities; Java's great for networking and fairly lousy for audio.

    Mostly what we need is to get people working on more fleshed-out syntax and implementation; I agree entirely with Paul Davis on that.

    But a basic implementation is pretty simple … existing SDKs are just fine, and the basic protocol is well documented at


  • @John: for POSIX-ish systems liblo (google it) is an excellent place to start.

    There are also implementations of OSC for languages like Python, Java, and Perl, but several seem to exist for each language and it's not trivial to determine which to choose (I have certainly been unable to do so).

    But understand that OSC is a very, very simple protocol at heart. The reference implementation is just a UDP connection between the two endpoints, with simple strings sent back and forth. All the libraries/SDKs offer are wrappers to make higher-level things a bit simpler.

    The one thing to do is never go near the original reference implementation. Read the specs that come with it, but stay away from the code itself. It's probably the worst C code I've ever read.
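The “just a UDP connection” description above can be shown end-to-end with nothing but the standard library. The payload here is a bare OSC address with an empty type-tag string, hand-padded to 4-byte boundaries; the port is whatever the OS assigns.

```python
import socket

def parse_address(packet: bytes) -> str:
    # An OSC message begins with its NUL-terminated address pattern
    return packet.split(b"\x00", 1)[0].decode()

receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))            # let the OS pick a free port
port = receiver.getsockname()[1]

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# "/ping" padded to 8 bytes, then "," (empty type-tag string) padded to 4
sender.sendto(b"/ping\x00\x00\x00,\x00\x00\x00", ("127.0.0.1", port))

packet, _ = receiver.recvfrom(4096)
print(parse_address(packet))               # prints "/ping"
```

That is essentially the whole transport; everything a library like liblo adds on top is convenience, not protocol.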

  • I have OSCemote on my iPhone already. I commend an open source effort, but unfortunately it won't be distributable through the App Store and will thus be relegated to iPhone-pwning folk.

    Someone complained that the Lemur wasn't mentioned in the OSC article, but alas, it was; look again.

    The most glaring omission in Peter's article was the silence about Max/MSP/Jitter. I would wager that 80-90% of all OSC use occurs in Max/MSP, and another 5-10% in SuperCollider and Pd. Max/MSP has tremendous market penetration now, so I was surprised it was missing.

  • Darren Landrum

    Wow, for once I'm in complete agreement with Paul Davis! 😉

    This is a proposal for a SYN OSC namespace:

    Theoretically, this could be the OSC syntax/implementation that replaces MIDI.

  • @Christopher: Ah, yes, good point; updated my list there. (I start getting a little, uh, flaky round about Friday evening!) And, of course, Max is now no longer even an external, so that's really a glaring omission on my part (not to mention, I use it)!

    As for OSCemote, the plan as I understand it is that mrmr will be *both* open source and distributed via the App Store. I don't think there's anything about the App Store that fundamentally precludes open source software; you just have to be careful about which libraries you link to and what you open source, and of course the Apple bits aren't open source. But the important thing Eric has done, which OSCemote does not do, is that mrmr is also a protocol, with a client/server model for actually pushing controller templates out to other devices.

    Anyway, I plan to run both and use them for different things.

  • Formal

    I don't mean to spam or anything, but I see this article is receiving more attention than the one on createdigitalmotion. I won't repost my entire question, but if you feel like helping me out with OSC-to-MIDI, please read my original post… . Thanks!

  • Patrick Delges
  • Patrick Delges

    Ooops, sorry for the <a> typo:

    makingthings' Make Controller

  • Hi Patrick,

    Oh yeah, good point….

    And I expect we ought to have a standard Arduino-style implementation that's Ethernet + OSC and is open source. Here's one:

  • Anyone looking for an OSC app for the iPhone that has configurable screens should definitely take a look at the 1.1 update of OSCemote. You can now do screens in HTML/JavaScript that both send *and* receive OSC. Probably not as easy to create new screens with as Mrmr, but OSCemote now seems WAY more open-ended.