Sync or swim, indeed. Synchronized swimming performance in Brighton, which itself had to sync with live music and cinema – check out the details; they're a perfect metaphor for this story. Photo: Greg Neate.

Laptop musicians are feeling out of sync — literally. But we can work together to help the situation.

Computer music making can be an isolating experience. But when users try to use their eminently mobile tools to play together in the same room, they often find that the technology resists. MIDI, as a serial protocol, isn’t designed for networked environments. Software interfaces are designed to be visible to only one user, sharing between users rarely figures into designs, and input points are made to be single-user only.

And most importantly, just getting a couple of computers to sync can be a Herculean task — one that seems to have gotten worse, not better, as computer software has advanced. In short, for all the technology we have today, we’ve actually regressed from the state of interoperability 20 years ago.

I’ve been hearing more and more frustration over sync, as people begin to collaborate with multiple computers as they would with a small ensemble of instruments. Ableton Live is the most frequent example, but it’s only one case – and I suspect part of the fault is that people are more likely to try to sync multiple copies of Live. When I spoke to Depeche Mode’s Martin Gore in the spring for Keyboard, Martin complained that they had trouble syncing his Apple Logic sessions with other band members using Pro Tools and Ableton. This weekend in Los Angeles at the DubSpot sessions, Glitch Mob’s Justin Boreta talked about the issues that group has had with multiple copies of Live.

Synchronization is, by definition, a tough thing to do. But musical engineering is replete with challenges; it’s no longer acceptable to simply say “live with it” and walk away. It seems we need both better shared knowledge about what sync is and how to make it work, and better engineering solutions on the software and protocol side to support the way users want to work. And yes, we need a new sync standard that goes beyond what’s presently available in MIDI alone.

Helpfully, an essay just landed in my inbox that I think focuses the issue. I will try to speak to Ableton’s engineers about the matter, but this isn’t really about Ableton alone, so I’m posting it here first. We could use more data about how you’re working with various software and hardware, what techniques you’ve developed, and what frustrations you’ve had. We have a wide community here of users and developers (and a whole lot of you are both).

Mark Kunoff writes:

I’m writing to you today about an issue which I believe has been a sore spot for many Ableton Live users – *reliable* syncing of two or more computers – particularly for those of us who are attempting to sync for the purposes of *live performance*.

My musical partner Patrick Petro and I (together we perform as “Othership”) have been struggling with this issue for several years now. At present, we have a decent solution using MIDI Time Code. Initially we attempted to use MIDI clock, but our friend Steve Duda (partner of Deadmau5 in BSOD) informed us, “using MIDI clock is as reliable as syncing to a boat motor.” He told us that in BSOD, he and Deadmau5 have reliable sync between their two laptops using MTC, although the main drawback is the inability to fluctuate tempo – you must run at a consistent tempo the whole time. (You may be aware of this already, but Steve is the person responsible for ‘Molar,’ the incredible step and loop sequencer for the Monome; was a programmer for Devine Machine; and has worked for many renowned artists in the music industry, such as Trent Reznor. We are very fortunate to benefit from his consultation!)

Currently we are both using MacBooks and syncing via Ethernet, with Audiofile Engineering’s “Backline” app to generate MTC. This method has been about 95% reliable, but after reading an article on Ableton Tweets – and our response – we are going to acquire a dedicated external device to generate MTC, such as a MOTU Timepiece.

I feel strongly that Ableton has not addressed these issues sufficiently and could do a better job of educating their user base about the challenges that face performers in achieving reliable sync. I’m not expecting a walk in the park, but as yet Ableton has not provided comprehensive documentation regarding these issues, placing most of the responsibility on users to figure it out for themselves. We are (and have been) perfectly willing to educate ourselves, but for the most part this issue remains elusive to the majority of Ableton Live users.

The Ableton Live forum posts regarding sync are fraught with dissension and are excruciating to read, to say the least. I empathize with the complexities of programming audio applications, but in my estimation Ableton tech support’s explanations on this issue have been mostly open-ended. Many users report these issues only to find that Ableton’s tech support doesn’t respond; I have experienced this as well. Certainly there are enough customers who want a better solution.

I feel it’s time to launch a concerted effort to organize users and demand that Ableton address this issue once and for all. Perhaps the solution wouldn’t involve MIDI at all. Ideally it would be an open protocol such as OSC, but I wouldn’t be opposed to a proprietary solution – just as long as there is a reliable one.

The main purpose of this correspondence is to seek your and CDM’s assistance in sponsoring an effort to encourage Ableton to address this issue once and for all. I feel CDM could be quite helpful in garnering leverage toward this effort (a simple blog post, or ideally a dedicated section) to organize users and to demand better sync between two (or even multiple) laptops running Live – even from unlike computer manufacturers. I’m sure you know artists with valuable expertise in this area.

Even if the issues regarding sync via MIDI are insurmountable, there must be CDM readers who have developed reliable methods for two or more people performing with Ableton Live, and it would be great to have one centralized portal where discussions of working methods can be discovered.

Thank you for your time.

Laptop music making can feel a bit… isolating. Body-Hardware Interface photo (CC) its creator, Becky Stern.

Again, my personal intention is not to single out Ableton — I’ve heard similar complaints about other scenarios, and moreover, I think the “open-ended” tech support response occurs when there isn’t an easy solution. Tech support alone often can’t deal with something as multi-faceted as sync, so it’s time to engage other users in this, as well.

I’ve also spoken to Owen Vallis and other folks about how sync could be executed more effectively over network protocols, and specifically how the time stamp feature in OpenSoundControl might be used in conjunction with MIDI clock messages.
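
To make that idea concrete, here’s a rough sketch of what stamping MIDI-style clock ticks with OSC timetags over UDP could look like. This is hand-rolled Python purely for illustration — the `/clock/tick` address, the 50 ms scheduling lead, and the port number are my own assumptions, not any existing protocol:

```python
import socket
import struct
import time

NTP_EPOCH_OFFSET = 2208988800  # seconds between the NTP (1900) and Unix (1970) epochs

def ntp_timetag(unix_time):
    """Convert a Unix timestamp to the 64-bit fixed-point timetag OSC bundles use."""
    seconds = int(unix_time) + NTP_EPOCH_OFFSET
    fraction = int((unix_time % 1.0) * (1 << 32)) & 0xFFFFFFFF
    return struct.pack(">II", seconds, fraction)

def osc_message(address, *ints):
    """Encode a minimal OSC message carrying int32 arguments only."""
    def pad(b):
        return b + b"\x00" * (-len(b) % 4)
    out = pad(address.encode() + b"\x00")
    out += pad(("," + "i" * len(ints)).encode() + b"\x00")
    for i in ints:
        out += struct.pack(">i", i)
    return out

def clock_bundle(tick_number, when):
    """Wrap one clock tick in an OSC bundle stamped with the moment it should
    take effect, so receivers can schedule around network jitter."""
    msg = osc_message("/clock/tick", tick_number)
    return b"#bundle\x00" + ntp_timetag(when) + struct.pack(">i", len(msg)) + msg

if __name__ == "__main__":
    # Send one beat's worth of ticks (24 per quarter note, at 120 BPM),
    # each stamped 50 ms in the future.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    interval = 60.0 / 120 / 24
    for n in range(24):
        sock.sendto(clock_bundle(n, time.time() + 0.05), ("127.0.0.1", 9000))
        time.sleep(interval)
```

The point of the timetag is exactly the tolerance question: a receiver that schedules each tick for its stamped time can absorb a late or early packet, at the cost of a little added latency.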

To kick things off, let’s comment here, but I’m also setting up a special Noisepages group for users to share experiences and tips:

Sync or Swim Group [noisepages]

(Incidentally, CDM contributor Matt Ganucheau is joining me Saturday at a WordPress developer intensive here in New York, so we’ll be picking up development techniques to work on the Noisepages community, too.)

Jump in, say hello, and let’s talk about how we can make sync work in real-life musical situations.

I’ll also be talking to more artists and developers about their experiences and suggestions, and will pass along your feedback, so expect a report back. In the meantime:

1. Are you routinely trying to sync multiple musicians?
2. What software (and hardware) tools do you use?
3. What have been some frustrations?
4. What techniques have worked, or what have you learned that you might want to pass along to other users at various skill levels?

  • Armando

    I was actually discussing this with a buddy of mine – how I could capture his tempo out of Traktor to stay in sync with Ableton, so I can play live on top of his DJ set. Unfortunately I have not found a solid solution; it seems like everyone is in the same boat.

    But since you mentioned using network protocols, and considering a lot of controllers are starting to show up with network hookups, would it even be possible to run a router hub and pass all that information through one port?

    Thanks for tackling this Peter, I really feel like a fish in a lonely pond with no company really listening…

  • I think it should absolutely be possible to do this over a network. You'd probably want to use UDP instead of TCP/IP, and perhaps make use of OSC's timestamps. The main thing would likely be to build good tolerances for what happens if the sync signal is temporarily lost / late / etc. That might not allow for quick tempo changes, but could make the sync perform more reliably at a fixed tempo.

  • salamanderanagram

    syncing live to live or to reason has never been a problem for me, i think it took a half hour to figure out one time, but i've never had it fail on me… i think i just sent midi clock thru my midi hub into my friends computer.

    either way i used to jam with this kid and we would play for quite a while. and i checked our respective clocks a few times and they were always identical, which i think is slightly better than syncing to a boat motor 😉 maybe i'm missing something but midi clock has always been enough for me.

  • enomis

    A few years back I was working with another musician and we needed to sync our laptops. At the time we were only running max/msp patches. They were not always the same patches on both laptops but regardless we did need to have them synced. We tried MIDI and it was a lost cause.

    The most reliable sync we found was establishing a master/slave relationship between the two machines and running a phasor~ object through lightpipe from the master to the slave. Our audio interfaces were also synced via word clock.

    This setup was ideal because the master computer could change tempo by speeding up or slowing down the phasor and the slave machine would follow.

    Nothing I have tried has come anywhere close to being this rock solid.

  • I've had really big problems syncing Live with a Korg Electribe and a Virus TI – so I finally sold all the gear except the laptop and Ableton, because it's more versatile. But if I could get hardware and software working together correctly on stage, I'd never have sold my synths.

    Well, I'm waiting for universal, all-brands support for the OSC protocol. But it looks like it's too hard for Korg, Yamaha, etc. to forget about the "revolutionary" MIDI interface, which is already more than 20 years old (sic!)

  • Ooh. Big can of worms…

    Hundreds of tests, thousands of posts – and in Ableton's case they even wrote a six-page paper on the subject. It's in the Live 8 PDF, pages 500 to 507.

    Sign me up for the team working on atomic OSC sync in M4L.

  • Great post! There is too much voodoo around sync; every time I go play with other friends I feel scared about the moment I'll hear the beat going to hell. The bad thing is that it seems completely RANDOM. No matter whether the laptops are Mac or PC, the sync can be solid for a day, or jump every two minutes without any warning.

  • Actually, I should make this very clear — OSC is not in any way a "cure-all" for sync issues; quite the opposite. Think of switching from one spoken language to another; you still have to work out what you're going to say. 😉

    I do think that a networked environment more closely resembles the context in which musicians are working today. That is, you have a group of people with a group of machines trying to interconnect them.

    But the idea that Ethernet can't work isn't correct; Ethernet is just the pipe. TCP is unlikely to be a good solution, because its retransmission and in-order delivery introduce unpredictable latency; UDP might be possible.

    This is a huge can of worms I'm opening, I know, but it's an appropriate can of worms for CDM.

  • i've tried syncing two laptops for our live shows as big hair since Live 2.0, and it has always been unreliable. with every new version we give it another try – one as slave, one as master, or a third machine as master to two slaves – and then abandon the idea shortly after, following several hours of frustration…

    We stay in sync by old-school methods: checking tempos by eye and hitting the space bar to launch by hand. If we slip for whatever reason, we can always nudge the tempo.

  • Here's one of the tricks I learned at a monome meetup in Princeton a while back; it's described in this forum thread. For two Macs, which is what I have done, use the network session built into Audio MIDI Setup on OS X for tighter sync than two MIDI interfaces will provide.

  • Hi Peter

    That's a really good post, just in time. In the near future I will be working on a live set that includes the MD and Ableton, which is why I read your post with great interest.

    As for previous experiences: I tried syncing three Macs, each running Ableton, over the network, and it was unstable. Live kept crashing on one machine or the other. That was version 7; I don't even attempt it in 8.

    I generally prefer a MIDI cable.

    The most reliable setup so far was two Elektron machines going into a MOTU Ultralite, with Live used as a mixer – no MIDI to the computer = no problems 🙂

    Another setup uses OSC wirelessly between two MacBooks, over UDP, with no problems at all. One laptop runs a visual score and sends it to the other. I use OSC just for the score and for altering it live, so no MIDI sync there. I get MIDI from drum pads (live percussionist), convert and scale the MIDI in Max, and send it to the Machinedrum, where it changes synthesis parameters. The percussionist plays along to the Machinedrum and the loops, with headphones connected straight to the MD to avoid latency.

    I had a few bad experiences syncing the MD to Ableton before, but I was using arrangement view, and pressing stop on the MD made Live jump back to the very beginning of the piece. They were going out of sync sometimes, so I needed to make Ableton the master and press stop and start on the MD to bring it back into sync.

  • @Kempton: interesting, but — the network function Apple built is one of the specific cases I'd heard people were *unhappy* with. It was working for you? Not to mention, whatever Apple is doing, it should be possible across platforms anyway. But first, need to figure out why people weren't finding that a satisfactory solution.

    @Radek: yeah, agreed! Well, you can pipe MIDI over OSC, so it seems like the best common-denominator solution may indeed be OSC over UDP. With a MIDI proxy on either side, you could use whatever tools you want on whatever platform you want.

    I should add — folks, we need specifics. What hardware interface are you using? (So far, hearing good things about MOTU, RME, and — sorry Avid — bad things about M-Audio, which should be telegraphed to the M-Audio support people if we can get some particulars.)

    Then again, going to network interfaces could be a good way to remove these additional variables.
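
As a sketch of what such a proxy's wire format could look like (these `/midi` blob messages are my own assumption for illustration, not an established convention), raw MIDI bytes can be wrapped in and unwrapped from an OSC message in a few lines:

```python
import struct

def encode_midi_blob(midi_bytes):
    """Wrap raw MIDI bytes (e.g. the clock tick status byte 0xF8) in a
    minimal OSC '/midi' message carrying a single blob argument."""
    blob = struct.pack(">i", len(midi_bytes)) + midi_bytes
    blob += b"\x00" * (-len(blob) % 4)           # blobs pad to 4 bytes
    return b"/midi\x00\x00\x00" + b",b\x00\x00" + blob

def decode_midi_blob(packet):
    """Recover the raw MIDI bytes from a '/midi' message built above; a real
    proxy would hand them to the local MIDI driver (CoreMIDI, ALSA, etc.)."""
    assert packet.startswith(b"/midi\x00")
    offset = 8                                   # "/midi" + null + padding
    assert packet[offset:offset + 4] == b",b\x00\x00"
    offset += 4
    size = struct.unpack(">i", packet[offset:offset + 4])[0]
    return packet[offset + 4:offset + 4 + size]

print(decode_midi_blob(encode_midi_blob(b"\xf8")))  # → b'\xf8'
```

Sent over UDP, messages like these would travel between any two platforms; the proxy on each end only needs a local virtual MIDI port.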

  • Should be a fun thing to test when I get my new laptop in a few weeks: try to measure which goes faster and has the least jitter.

    MIDI traveling from one USB interface to another, in combination with a host's built-in sync feature.

    Or host-to-host communication over UDP, with sync performed under API control (maybe at audio rates).

    I don't think I can agree with how the OSC timestamps (1.0) are defined. If I read it correctly, you'd deal with 64 bits counting from 1900. I'm no mathematician, but that seems rather silly for syncing two computers that are definitely in the same time zone.

    And even if they weren't, you'd only need the atomic time of minutes and lower, because you wouldn't sync a computer with an hour delay.

    I'm glad you raised the subject, and sorry for going slightly off topic with this OSC stuff. But I must have missed some documentation and conversations where people discuss their solutions.

    The solution seems so obvious that I dare not post it out of fear of getting pwnd. 🙂

    I mean, I'm not the brightest nut on the block. Surely there must be some obvious problem I have missed that has prevented watertight syncing using modern networking techniques.

  • William

    Not that this will help anyone trying to achieve this type of syncing…

    … but have you thought about why you need to sync in the first place?

    My current project, while using a good amount of technology, from Max/MSP to Quartz Composer, is strictly anti-clicktrack/anti-machine time. We all keep time together, just like musicians have done forever.

    As all of you have found, and anyone who has attempted any sort of beat tracking algorithms, computers are not very good at playing in time with each other or with humans.

    Humans, on the other hand, are insanely good at keeping time with one another, allowing for a collaborative and dynamic expression of tempo.

  • @William – I totally agree, although for that, see my other complaints at the top of the article. 😉 There are many, many things people are doing between multiple machines, of which sync is only one.

    I think the reason sync is important is that people *aren't* always playing together in the same place — sometimes what they're really doing is *sequencing* together. That's a new possibility that's rather exciting. It's like having collaborative composition.

  • @Bjorn: OSC is absolutely on-topic for this conversation. I need to look again at time stamps. There's some discussion of how to evolve time stamps in the recent paper Adrian did, too.

    I think it is possible you'd want to exchange timecode across timezones, but… well, that illustrates these are different scenarios, and your point is exactly right. I can't imagine when you'd ever actually use that. 🙂

  • timecode vinyl –> midi –> master tempo

    I have no idea how to convert timecode to MIDI sync, though. I'm sure I've seen a post about it on here before. Also, this would only work if you can beat-match records!

  • tonnu

    im not even trying to sync two laptops

    syncing ableton with my monomachine (as a slave) is, like someone had said before, temperamental to say the least.

    i use live 7 on macbook with motu ultralite

    it's a real pain (when it SHOULD NOT be)… god

  • The NetClock protocol which uses OSC is meant to address this.

    Right now it's implemented for SuperCollider, with Haskell, Perl, Impromptu and ChucK in progress; not sure what the Pure Data status is, though there was some interest. This is primarily meant for livecoding, but I don't see why Ableton Live couldn't use it as well.

    The trick is that you need a time-server on your LAN; from there on you can use OSC timestamps, and network jitter needn't matter (within reason). One of the nice things is that you don't need a single master with the rest as slaves; everyone can send time-stamped BPM and time-signature change messages. This also means a single laptop crash won't stop the music.

    Most of the time I play by ear anyway; just set the BPM your partner uses and hit "start" when you hear the "1". That does of course demand a system that doesn't drift; I'm not sure how Live is drift-wise, as I haven't used it for years.

    Frankly, I don't see why we should need sync cables between musicians at all; you don't need sync cables between turntables to beat-match, and turntables are incredibly primitive compared to laptops. What would you do if you had to fall in with a bunch of acoustic musicians who had already started playing? Always start with a click track from the laptop, and always have the laptop lead tempo-wise? While dependable sync would be very nice to have, I think the question is much larger than that.

  • When we first started using computers (and computer tempos), Mike and I would declare "I am Clock" and state the BPM. We had pretty much given up on syncing electronically. Luckily the software has become more flexible, with me on Live and him on Reason.

    Kassin is right in that tempo control comes with mastering your instrument (hard or soft) and with mastery comes the flexibility to play at or within tempo.

  • Sorry, that should be KassEn.

  • Zulu Company

    @Peter: So I'm running basically the identical BSOD setup: using Backline on computer A to send MTC to the Mac's IAC bus, then selecting that bus as the incoming sync on that computer, and also sending it over Ethernet to computer B. The sync appears to fluctuate by maybe a few hundredths of a BPM, but it's hardly noticeable, and it has been running smoothly for hours on end. I'm curious about the argument in favor of using hardware to generate the MTC… I'm using a MOTU Ultralite as the soundcard for computer A, with its main outs running into channel 4 of a Xone:4D mixer, which is also acting as the soundcard for computer B.

    So the suggestion is to use something like the Timepiece, but I have found that I can use the Ultralite to generate SMPTE, which a third-party app called SMPTE Reador (found it somewhere online, freeware) then translates to MTC and sends on to the Abletons. So my question, basically: is this comparable to what both Mark and the Ableton guys are suggesting? Is it any better than using Backline to generate MTC? (Just generating MTC with Backline seems simpler than generating SMPTE and translating – or am I missing something?) Am I still running into the same problem sending the MTC from the IAC bus over IP to the second computer? Either way, both of these methods seem to supply a stable option, given no BPM changes. LET'S JUMP ON THE BANDWAGON NOW – SAMPLE-ACCURATE SYNC, ABLETON, PUH PUH PLEEEEEEASE!!

  • Rock wrote; "tempo control comes with mastering your instrument (hard or soft) and with mastery comes the flexibility to play at or within tempo."

    Yes, I think so, and that should also mean the instruments provide for it. The last version of Live I upgraded to was 4.1.4, but back then I think all it had was a simple tap tempo. I think MIDI clock worked quite well in Live at that time, BTW – I can't remember any complaints. It was no Atari, but at least good enough for serviceable sync.

  • AdamDriveEngine

    On separate occasions I have experienced similar frustrations attempting to sync Ableton with a Monomachine, a Triton, and even other copies of Live on separate machines.

    Aside from the occasional voodoo of basic MIDI sync, the other problem lies in the fact that particular MIDI settings should really be per track in clip mode, rather than a global parameter. (If we are really to consider clip mode an "instrument", then settings should really be per clip.)

  • Martin

    i have several analog synths synced over an AMT-8 with a Korg KMS-30; there's also a Machinedrum and a Monomachine connected, a Korg Prophecy, and some other instruments.

    sync with Ableton as master works pretty well. only when i have a loop running inside Ableton do i experience latency, which is normal and can be solved with Live's External Instrument plug-in and a little MIDI delay overall.

  • I have been performing live with my group, dirtRAID, for the past year. In our sets, we sync a TR-606/TR-707/MC-09/SH-09/plug-ins, Ableton/SP-404, and Serato to Ableton Live. We are using a UM-880, a Kenton Pro Solo, and a USB-to-DIN-sync box. The only reason we even use Ableton is that it allows us to sync up all these boxes and adjust offsets. We also get a sync signal back to two sets of MPD32s. I think problems can be solved with a little research; MIDI suits our needs just fine. Although you would think in 2010 the process could be a little more streamlined… but then again, if a person is using this type of technology, it never hurts to RTFM and to know your kit inside and out.

  • I have already begun to create an M4L patch that you drop into two different Ableton sessions, allowing you to sync between the two over OSC. Additionally, it lets you trigger clips from either laptop, and soon it will be able to perform other functions.

    Since I am an enterprise network consultant as well as a musician, I have been thinking of some other options that could prioritize the packets using QoS tagging. This way you can use a small managed switch to connect your devices and prioritize the control packets used for syncing and live performance over regular network congestion. Right now I was just thinking about computer-to-computer packets.

    Honestly, I have not seen an issue yet with performance. But I would like to see performances in the future with multiple computers in sync, without an issue.

  • Mark Kunoff

    First off – Thank you so much for honoring my query and a super big thank you for establishing the dedicated noisepage! I'll be visiting and contributing as much as possible.

    I totally respect the comments regarding keeping time with good ol' musicianship. In fact, Pat and I come from a traditional rock/jazz background. But for some artists, there is an abundance of complex things going on in our setups (certainly evident in the unique and increasingly complex approaches featured here) and we'd rather not be burdened with nudging tempo periodically. Furthermore, from a purely production oriented viewpoint, this sort of functionality is a very practical consideration.

    I also understand that these issues aren't simple by any means, but I am elated that CDM has agreed to offer a centralized location to consolidate discussions regarding this subject. CDM rules!!!

  • Jaime Munarriz

    Kassen is right, you can just play following your partner, and it works. Drifts can be interesting, and you can always stop, listen and get into it again.

  • Jaime Munarriz

    Has anybody tried MIDIoverLAN or ipMIDI?

  • excited to read more. @Mark Kunoff thanks for starting this mess! @Peter, thanks for being game.

  • The beauty of OSC is that this becomes a non-issue. In PLOrk, we routinely sync 20+ laptops using OSC and a single Apple AirPort. The following is re-posted from a recent discussion on the monome forums.

    I'd like to use this as an opportunity to explain how we sync ~30 laptops together in PLOrk using OSC. This will probably not be immediately applicable to most people here, but keep in mind that it wouldn't be difficult to turn these OSC events into MIDI.

    First off, the advantage of using OSC sync is that there are very few dependencies or fundamental limitations. The only current limitation I can think of is dropped packets on a wireless network, but someone here has finished a packet-redundant version of the ChucK OSC objects that has already proven to eliminate this problem. Granted, we only started dropping packets in PLOrk when we hit 20+ computers talking to one AirPort.

    Unlike MIDI time-code, you can change tempo as fast/slow as you like…because every machine is receiving a "tick" over the network from the server. The client computers simply react to the "tick", they have no internal sense of what the tempo "should be".

    The disadvantage of using OSC is that very few programs support it. And I only know of code in ChucK that handles this kind of thing responsibly.

    How it works: The server computer is usually hard-wired into the Apple Airport with an ethernet cable. The server code is started first. Usually, the client code is packed into another set of (music making) code, or modified in some way for the specific piece. When the players launch the program, their machines are automatically registered with the server. Each client broadcasts its presence every second, so that the server knows if they drop off the network. The server sends out a "tick" to every machine at some periodic interval (tempo). Each client can then use this event information to sync with the server.

    Somehow I've had a difficult time finding the server/client demos on our public PLOrk sites, so I've re-posted them to my personal site.

    OSC Client -
    OSC Server –
    PLOrk Repository –
    OSC in Chuck –
    Video –
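
For readers who don't use ChucK, the register-then-tick scheme described above is easy to mock up in any language with UDP sockets. This Python sketch is my own simplification of the idea, not the PLOrk code – clients announce themselves with a single datagram, and the server pushes a bare tick at the current tempo:

```python
import socket
import threading
import time

def run_server(port=9000, bpm=600.0, beats=4):
    """Collect client registrations between beats, then send each client a tick."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", port))
    sock.settimeout(0.01)
    clients = set()
    deadline = time.time()
    for _ in range(beats):
        deadline += 60.0 / bpm
        while time.time() < deadline:
            try:                           # any datagram registers the sender
                _, addr = sock.recvfrom(64)
                clients.add(addr)
            except socket.timeout:
                pass
        for addr in clients:
            sock.sendto(b"/tick", addr)

def run_client(server=("127.0.0.1", 9000), port=9001, expect=4):
    """React to ticks from the server; no internal sense of what the tempo is."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", port))
    sock.settimeout(2.0)
    sock.sendto(b"/hello", server)         # register with the server
    ticks = 0
    while ticks < expect:
        data, _ = sock.recvfrom(64)
        if data == b"/tick":
            ticks += 1                     # trigger the next beat here
    return ticks

# Demo: a fast (600 BPM) server and one client on the same machine.
threading.Thread(target=run_server, daemon=True).start()
time.sleep(0.02)                           # let the server bind first
ticks = run_client()
print(ticks)
```

Because the clients only react, a tempo change is nothing more than the server altering its interval between ticks – the property that distinguishes this approach from fixed-rate MTC.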

  • flunky

    (as an aside, the Sequentix P3 sequencer has a cool function where, if you press Run while it is running, it re-sends MIDI clock 'start' at the next global bar – great for getting all the other hardware running again if you are jamming as a group with other sequencers, without having to stop/start)

  • @altitude sickness

    Cool stuff. In terms of 802.11b/g wireless: a single radio can really only handle about 25 connections, and optimally you really only want 12 connections per AP; you should probably start to think about multiple APs at that point. If you are using standalone APs, you generally want to place them so the signals have 15% overlap at -65 dBm. With a wireless LAN controller, the controller can take care of it.

    I would think that a wired network is definitely a better bet. You don't have to deal with wireless interference and things like that.

    I am definitely going to check out what you did. It sounds like you just provided a single reference to which everyone syncs.

  • Just wanted to pop my head in and leave my 2cents 🙂

    1) Although it is possible to play two Live laptops without sync (pressing the space bar), the clocks' minute differences will drive the two sets out of sync over time. Having DJ'd on turntables for years, in another lifetime :), I can attest that syncing two or three beats is no big deal… but syncing ambient sets? Polyrhythms? More than two Ableton sets? Very hard, to say the least.

    The other issue is that I don't want to focus on that part (the syncing of beats) anymore, I want to focus on the musical dialogue with the other computer musicians. I think that this aspect is a whole different discussion in and of itself.

    2) Due to many different factors, the MIDI clock coming into a machine might have a little jitter introduced. If you then calculate BPM based on MIDI clock, you will have small fluctuations (or large, depending on the jitter). These small fluctuations make a mess of plug-ins that use a dynamic buffer size; things like delay lines and reverbs sound like crap. Try syncing Ableton and then check out the reverb on the slave machine – all sorts of un-interpolated nastiness. Delays that can handle the fluctuating tempo through interpolation will still have a pitched component. Basically it's a disaster for certain types of effects.

    3) Lastly, I think it sounds like people are already looking into some great ideas. As far as I can tell, our problem is similar to NTP issues for servers and clients. This is a very difficult problem to solve – much harder than I had thought it would be. I look forward to seeing what everyone comes up with. 🙂

    There's lots of other stuff too, but this is just a comment, so I'll stop here.
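
Matt's jitter point is easy to demonstrate numerically. In this sketch (simulated arrival times, not real MIDI input; the ±2 ms jitter figure is an assumption for illustration), per-tick BPM estimates swing by several BPM, while averaging over a beat's worth of ticks steadies the estimate – at the cost of tracking real tempo changes a beat late:

```python
import random

random.seed(1)

TICKS_PER_BEAT = 24                      # MIDI clock resolution
TRUE_BPM = 120.0
TICK = 60.0 / TRUE_BPM / TICKS_PER_BEAT  # ideal tick spacing, ~20.8 ms

# Simulate 8 beats of clock ticks arriving with +/-2 ms of transport jitter.
arrivals, t = [], 0.0
for _ in range(TICKS_PER_BEAT * 8):
    t += TICK + random.uniform(-0.002, 0.002)
    arrivals.append(t)

def naive_bpm(prev, cur):
    """Tempo estimated from a single tick interval: very noisy."""
    return 60.0 / ((cur - prev) * TICKS_PER_BEAT)

def smoothed_bpm(times, window=TICKS_PER_BEAT):
    """Tempo averaged over a beat's worth of ticks: far steadier, but it
    responds to *real* tempo changes only after a whole beat."""
    span = times[-1] - times[-window - 1]
    return 60.0 / (span / window * TICKS_PER_BEAT)

raw = [naive_bpm(a, b) for a, b in zip(arrivals, arrivals[1:])]
print(round(min(raw), 1), round(max(raw), 1))  # swings of several BPM
print(round(smoothed_bpm(arrivals), 2))        # stays close to 120
```

Any plug-in that re-sizes buffers from the per-tick estimate sees the noisy numbers on the first line, which is exactly the "un-interpolated nastiness" described above.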

  • Actually I should have said 25 devices and 12 devices are optimal, in the above comment… I should have used the word devices, not connections.

  • Wulliamz

    I'm interested in these issues and glad to scoop up any ideas. However, I also concur with the other commenters about the "just press play" solution – a few years ago I made some tracks with a friend, he on his MPC, me on my laptop, and it surprised me how easy it was to sync manually, and that we did not drift out of time (to any noticeable degree). Also, having made music in the glory days of MIDI and hardware of various kinds, in my experience the kind of timing you get in a computer (rigidly rock solid) really does take a bit of the life out of music… luckily I have ways and means of reintroducing tiny amounts of timing jitter.

  • aws

    Open Sound Control timestamps enable deadline scheduling, atomic events (multiple parameter updates to occur simultaneously), accurate calculation of inter-event time (for example, estimating the rate of change of a parameter), and recovery from transport jitter (by means of adding a small bit of extra delay).

    These are things that you do WITH a system that has been synchronized by some other method. Some of the details are described in my paper "Implementation and Applications of Open Sound Control Timestamps" (ICMC 2008).

    Moreover, the sync problem extends not only to multiple computers in a performance, but to EVERY device you are using, including the gesture controller, which is typically a computer in its own right (usually an embedded processor of some flavor). In the above paper I show an analysis concluding that typical noise induced by transport jitter (~2 msec dispersion) is probably degrading most continuous gesture controllers to the point where the effective headroom of the channel is only about 24 dB (that is, down to 4 bits; ouch).

    Why does OSC use a 64-bit absolute time format? Well, why not? There are some arguments concerned with efficiency, but those were explicitly set aside, since the original design criteria for the format assume a 10 Mbit or faster network is available, so the large format has negligible impact. That is, OSC anticipates Moore's law and decides pre-emptively that putting lots of effort into saving bits is less important than making something that works and is easy to explain.
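
    The layout itself is simple to sketch; this is the standard NTP-style 32.32 fixed-point format that OSC adopted (the helper names here are mine, not from any OSC library):

```python
NTP_EPOCH_OFFSET = 2208988800  # seconds from 1900-01-01 (NTP epoch) to 1970-01-01 (Unix)

def unix_to_osc_timetag(unix_seconds):
    """Pack a Unix timestamp into OSC's 64-bit time tag: the upper 32 bits
    are whole seconds since 1900, the lower 32 bits a binary fraction of
    a second (resolution ~233 picoseconds)."""
    whole = int(unix_seconds) + NTP_EPOCH_OFFSET
    frac = int((unix_seconds % 1.0) * (1 << 32))
    return (whole << 32) | frac

def osc_timetag_to_unix(tag):
    return (tag >> 32) - NTP_EPOCH_OFFSET + (tag & 0xFFFFFFFF) / (1 << 32)

t = 1234567890.5
tag = unix_to_osc_timetag(t)
print(osc_timetag_to_unix(tag) == t)  # exact round trip for this value
```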

    OSC does not deal with the actual synchronization, this is left to NTP or some other method. NTP gives accuracy of about 1msec in the best case, but you will want a Stratum-1 or Stratum-2 server nearby (for example on most campus networks you will find one).

    You can roll your own client-server network sync protocol and get to about 0.1 msec accuracy pretty easily; I have done this between computers and embedded devices, for example.
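
    The core of such a roll-your-own scheme fits in a few lines. The sketch below simulates it with a fixed offset (unknown to the client) and random network delays; the estimate from one request/response exchange is off by half the delay asymmetry, so keeping the exchange with the smallest round trip filters most of that out:

```python
import random

random.seed(1)
TRUE_OFFSET = 0.250  # server clock runs 250 ms ahead of the client

def server_clock(wall):
    return wall + TRUE_OFFSET

def sync_exchange(start):
    """One NTP-style request/response. Returns (offset estimate, round trip)."""
    t1 = start                            # client timestamps the request
    up = random.uniform(0.001, 0.005)     # simulated one-way network delays
    down = random.uniform(0.001, 0.005)
    t2 = server_clock(t1 + up)            # server timestamps arrival...
    t3 = t2                               # ...and replies immediately
    t4 = t1 + up + down                   # client timestamps the reply
    return ((t2 - t1) + (t3 - t4)) / 2.0, t4 - t1

# The estimate's error is (up - down) / 2, so the exchange with the
# smallest round trip is the one least polluted by delay asymmetry.
offset, rtt = min((sync_exchange(float(i)) for i in range(50)), key=lambda p: p[1])
print(abs(offset - TRUE_OFFSET))  # typically well under a millisecond here
```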

    The state of the art in sync protocols is IEEE 1588, or PTP (Precision Time Protocol), which achieves sync down to well under a microsecond. It can be propagated over Ethernet with hardware support. This is how industrial embedded developers are dealing with the problem, and eventually that tech should trickle down to the rest of us… until then, hope for the best. 🙂

    Hope this helps—Andy.

  • well…

    i'm going to be the one arguing for out of sync artefacts between two machines…

    got two laptops…

    both Gateways, one a P4 running XP, the other a Centrino Duo running Vista…

    both got Live 7 on them… both got 1 gig of RAM… the XP machine's got an Evolution, the Vista machine's got Kore…

    the XP machine is master (USB to MIDI out)

    honestly… the two machines hardly ever sync, but that creates aliases and offbeat artifacts that add a unique accent to the music.

    anybody else appreciate this?

    Well, I said it at the beginning of the story, and in comments, but I'll say it again: sync is just one of a range of issues involved in making instruments communicate with each other and letting musicians play together. I absolutely agree that playing together does not always need to mean syncing up *at all*, let alone with some of these degrees of precision.

    That said, I think people want the *option* of having it work properly. I mean, if you're a fan of complex polyrhythms, you don't necessarily seek out a terrible drummer to help you achieve them. 😉 You usually want to do it intentionally.

    Hello, looks like most of the activity is here in the comments section of the post, but I also posted this at the noisepages group. Looks like PTP could be the answer for getting clock events to and from computers, but Jordan and I were attempting to drive Ableton from a locally generated MIDI Clock, from both C++ and ChucK, and still see about 3 BPM of variance in either language. Got a link to this work with system clocks that might help us out a bit?

    Also, what should we work in? C++, ChucK, Java? I would imagine C++ (openFrameworks?) but I'm not the most knowledgeable, so…?

    Stoked to figure this out 🙂

  • @ Andy: Great paper btw 🙂

  • aws

    Here is a direct link to the paper I mentioned (sorry I forgot this):

    Also I am quite fond of this paper by Fons Adriaensen, "Filtering Time with a DLL" (delay locked loop). This shows how to align physical time with sample time by removing the jitter in the audio callback with a 2nd order IIR filter:

    You can use this technique to make oscillators that are in phase globally up to the precision of the system clock.

    I gather that some form of this functionality has already been incorporated into the Jack Audio Server.

  • Gustavo

    I've been syncing multiple laptops since the late 90's for live performances, from Reason 1.0 to Ableton Live version 1 to the current version.

    Software clocks just don't have the stability that you need. There are workarounds and rare exceptions, but the moment you switch to slaving off something like an MPC or an Electribe, it's a noticeable difference. I've been using a tiny MAM arpeggiator box for years, split using an MX8 MIDI patchbay or a small Midiman or Roland 1-to-4 MIDI thru.

    Your MIDI interface of course has some bearing. I've had very good luck using "dumb" interfaces like the M-Audio 2×2 and the smaller Edirol / Roland USB interfaces. I've had terrible luck with anything that supports MIDI time-stamping, like the old Emagic interfaces.

    From what I've seen, there's some hardware, like the Elektron boxes, that just doesn't have good sync. Also, a lot of devices like the KP3 or the Pioneer DJ mixers don't generate start/stop messages, just tempo.

    So the short of it: use a piece of hardware to generate your clock, use a "dumb" MIDI interface, and make all your software slave to the hardware MIDI clock.

    The funny thing is that tight MIDI clock sync was pretty common with a lot of hardware in the 80s/90s, but that technology just didn't get ported over to the virtual world. 🙂

  • mo-seph

    I've been playing a lot with people where we sync Live (or Radial) using the built-in OS X MIDI-over-wireless protocol. It's always been fine!

    Nowadays, I'm much less worried about the sync, and I think there's a lot of more interesting things to do with timing than make everyone's beats the same. I guess it depends what kind of music you want to make…

  • GMM

    Great initiative Peter!

    I have torn out my hair over this issue many a time. I usually have two or three laptops in sync during live shows, running Live. My current sync solution is:

    -One laptop is master

    -This laptop runs a closed Airport network

    -Other laptops connect to this network

    -Master laptop runs a network MIDI session

    -Other laptops connect their own MIDI sessions to this MIDI session via AirPort

    -Master laptop runs MIDI clock sync out on the MIDI session (Live can't run as MTC master)

    -Master laptop runs in ONE TEMPO ONLY (60 bpm)

    -Slaves listen to and follow the master

    Any tempo-based effects or timings on either machine are happening in manually customized fractions of master tempo, which is a pain, but flexible tempo is more pain.

    This method is usable for linear sets, where events are scripted to happen at specific points in time.

    My method is not foolproof and every time I set it up for a show, something somewhere is flaky and quite often just needs to be turned off/on. But usually I get everything stable just before the show starts.

    I absolutely wish syncing up would be more stable and hassle-free.

  • Jordaan

    I've had limited experience syncing multiple computers, but I do work with a number of hardware units. Specifically, I do a lot of my main composition on an Akai MPC and have tried numerous times to sync it to Ableton with MIDI Clock. When I am playing a set and have to change sample banks, I need to stop the MPC playback, load the samples, and then stop Ableton and re-sync the two.

    Another thing I noticed was that when I record material into Ableton as the slave, the recorded audio is riddled with timing artifacts and the response timing for playback is delayed significantly. It is hard to describe, but it only happens when Ableton is the slave to MIDI Clock. So… let's just say I've learned to work with the various idiosyncrasies of my equipment. After reading the comments on this post I've gotten some other ideas for ways to connect the two. I had resigned myself to the fact that I needed to record my parts in a wave editor and then import them separately into Ableton.

    I've been working on a synced live set with a friend, and MIDI clock seems to work all right for us. He is running Logic, which sends MIDI clock into Max. We have a little Max patch which sends the MIDI clock stream over TCP/IP (this is MUCH more reliable than the other network protocol implemented in Max). Our laptops are connected with a single Ethernet cable. I have a similar Max patch which receives the MIDI clock and then sends it into Live via MIDI Yoke. The timing is fine, and because we are using MIDI clock rather than timecode, it can follow tempo changes. Our only problem is that Logic keeps crashing on my friend's computer!

    The aspects of our setup that have made it work reliably are connecting via an Ethernet cable instead of a wireless network, and using TCP/IP to send the MIDI clock instead of the other protocol we had been using.

  • Bodhi

    I wonder if it will be possible with Max for Live to rig up some kind of system using phasor~ and lightpipe, like enomis described in an earlier comment. Or encode timing messages in the ADAT audio that are decoded at the other end; you could probably get pretty close to sample-accurate sync. I vaguely remember Cubase being able to do this several years ago?

    I spent an awful lot of time tracking down timing problems in mrlV (a monome app built in Max) and trying to get it to sync to external MIDI clock. It seems that the 'recommended' way in Max to sync to MIDI ends up running late after a while. (But then my Max trial ran out 😉 )

  • 1. Are you routinely trying to sync multiple musicians?

    Yes, for three years now, trying to synchronize two identical MacBook Pros.

    2. What software (and hardware) tools do you use?

    Two MacBook Pros / Ableton Live (6, 7, 8…) via MIDI using MOTU UltraLites (both the same), or via AirPort, among other trials.

    3. What have been some frustrations?

    Works quite OK with small live sets; gets out of sync with huge sets; the tempo jitters on the slave computer.

    4. What techniques have worked, or what have you learned you might want to pass along to other users at various skill levels?

    No idea; I'd just like something that works. OSC, maybe?

  • Mark Kunoff

    There are some incredible insights in these comments. This is exactly what I was hoping for! Some insanely brilliant minds frequent this blog and I have no doubt that these discussions will bring more light to this elusive subject. Now, head on over to the noisepage!

    there seems to be quite a bit of confusion here between positional and other kinds of "sync". the PT manual does a good job of differentiating the kind of sync that allows participants to determine where they are on a timeline from the kind that allows determining how fast you are going. so, for example, MTC provides positional ("where") sync, but MIDI Clock provides speed ("how fast") sync. Word clock provides "speed", but at the sample/hardware level. although it's possible to use one to generate the other when coupled with some assumptions, they are really not equivalent, and if you don't handle them separately, then all kinds of design kludges follow.

    one of the issues i have not seen addressed in the comments so far is shared tempo maps. if you only work with a single tempo per piece, this is not such an issue. as soon as there are tempo changes to contend with, it's not simple. what you really want is a way for everyone to access the tempo map (preferably read and write, but read-only is OK for some things). that goes far beyond providing some kind of steady clock signal. however, it allows all the sync participants to know about changes before they happen and thus provides proper support for multi-meter/multi-tempo compositions. it's not trivial to do, however, because it means agreeing on the representation of tempo and meter across the whole system. given some discussions on the coreaudio API mailing list recently about beat counting, and older discussions i've had online, this kind of consensus could be very difficult to achieve.

  • ben

    As Paul noted, there is a difference between trying to figure out the 'when' (or 'where', as Paul puts it) and the 'how fast'.

    To deal with the 'when should everything happen', I think at the heart of it, you'd like to know how the clock on computer #1 (which reads T1) relates to the clock on computer #2 (which reads T2). Remember, these clocks are probably not just offset (i.e. T1 and T2 are different), but they're probably running at slightly different speeds, so the offset between them will drift over, er, time.

    There's a hardware based solution that's completely non-standard in our typical musical systems, but that I've used at work to synchronize different data collection systems over a CAN bus (CAN is a network bus standard that's designed for inter-microcontroller comms, often found in cars and other vehicles).

    I'm not proposing that we all ditch our ethernet and switch to CAN, but if there is a local sync issue and you're willing to chuck a couple of (say) arduinos at the problem, you could replicate this kind of thing in the comfort of your own home.

    Without the aid of diagrams, I might just confuse everyone, but I'll try my best to describe how we made things work.

    On our CAN bus network we had a bunch of PCs, each equipped with CAN cards. Each card had two inputs, and a real-time clock (in essence, a counter that starts at zero when the device is powered on, and increments every microsecond, perhaps even more finely-grained than that). At any point in time, this clock can be queried by the host computer. One of the inputs is used for message data. The other input is dedicated to a hardware sync pulse. This is just a very simple electrical pulse signal that is sent from a central source (for example) every 2 seconds. The important thing about this sync signal is that in itself it encodes nothing about time, it's just a pulse, and over short lengths of wire, it will arrive practically simultaneously at every device connected to it. Even when this pulse happens isn't important, so if the timing on your pulse generator jitters that's ok too.

    How the synchronization happens is as follows:

    Every time a pulse is received on the hardware sync input of a device, the current value of the hardware time at that instant is latched to local memory. The device then sends out a message to every other device on the network that includes the value of the clock when the sync pulse was received.

    Every time a message is sent on the data bus, the sending device adds the current value of its clock in the header of that message.

    This message will be received at every other device on the network. As an example, if you have two computers, in response to every pulse on the sync circuit the following happens:

    Computer #1: Latches its local counter (T1) and sends a message to everyone saying 'hey, my clock read T1 when I got that last sync pulse'.

    Computer #2: Latches its local counter (T2) and sends a message to everyone saying 'hey, my clock read T2 when I got that last sync pulse'.

    Computer #1: Receives the message from computer #2, and says: OK, when my clock was at T1, computer #2's clock was at T2. So the current offset between our clocks is T2-T1. Whenever I receive a message from computer #2, I need to look at its timestamp and add a correction of T2-T1 in order to put that message in proper context against my own clock.

    Computer #2: Receives the message from computer #1, and says: OK, when my clock was at T2, computer #1's clock was at T1. So the current offset between our clocks is T1-T2. Whenever I receive a message from computer #1, I need to look at its timestamp and add a correction of T1-T2 in order to put that message in proper context against my own clock.

    (if you have more than two computers, it's a similar deal – every computer on the network just has to maintain a list of offsets between its own hardware counter and the counters on the other networked computers)

    The host computer also has access to the hardware counter on its local CAN device, so you can figure out the relationship between the hardware counter and the computer's clock. After doing that, each time you receive a message, you know it is timestamped correctly with respect to the hardware device clock, and you know how to relate that clock to the computer clock, and everything hangs together.

    As long as the hardware pulses on the sync bus arrive at a rate that is 'often enough' when compared to the rate at which the two hardware counters drift you should be fine. In our setup we ran the sync every two seconds, which was way overkill for the hardware we were using; the pulses could have been hours apart before we started seeing drift that was significant for our system timing.

    This solution was effective for allowing us to synchronize the 'when/where'-ness of the clocks on the various machines.
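
    Ben's scheme reduces to a small amount of bookkeeping per device. Here's a toy version with two simulated counters and no CAN hardware (all class and variable names are mine, and the gap between latching the pulse and broadcasting the latched value is idealized away):

```python
class Device:
    """A device with a free-running counter, as on ben's CAN cards."""
    def __init__(self, name, boot_time_us):
        self.name = name
        self.boot = boot_time_us  # when this device powered on (wall time)
        self.offsets = {}         # peer name -> correction for their stamps

    def counter(self, wall_us):
        return wall_us - self.boot  # local counter: microseconds since boot

    def on_sync_pulse(self, wall_us, peers):
        latched = self.counter(wall_us)  # latch local time at the pulse
        for p in peers:                  # broadcast "my clock read <latched>"
            p.receive_latch(self.name, latched, wall_us)

    def receive_latch(self, sender, their_latched, wall_us):
        mine = self.counter(wall_us)     # my latched value for the same pulse
        self.offsets[sender] = mine - their_latched

    def to_local(self, sender, their_stamp):
        # Translate a peer's timestamp into this device's timebase
        return their_stamp + self.offsets[sender]

a = Device("A", boot_time_us=1_000_000)
b = Device("B", boot_time_us=4_500_000)

pulse_at = 10_000_000  # the shared pulse reaches every device at once
a.on_sync_pulse(pulse_at, [b])
b.on_sync_pulse(pulse_at, [a])

# An event B stamps at its local time 7_000_000 maps onto A's timeline:
event_wall = 7_000_000 + b.boot
print(a.to_local("B", 7_000_000) == a.counter(event_wall))  # True
```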

  • ben

    hmm. that was a long post. sorry about that.

    Owen wrote: "1) Although it is possible to play two live laptops out of sync (pressing the space bar), the clock’s minute differences will drive the two sets out of sync over time."

    Really? We all run multi-GHz laptops; I don't think there is any excuse for audio programs on that kind of system to show perceivable drift over realistic performance lengths. It's not rocket science. Of course there will always be rounding errors, such is the nature of digital systems, but those don't need to accumulate. If my DAW drifted like that in 2009, I'd want my money back.

  • Jaime Munarriz

    I think we've hit a really interesting spot. Someone (we?) should start an open project implementing this: OSC, perfect timing on the network, and some clients to help in Pd, Max, etc.

    I have a 3-man Ableton ensemble @StaticGrooves where we do a setup similar to Glitch Mob's. MTC is completely unworkable; it will hang on you without question. The best thing we've used is:

    ipMIDI from

    It is indeed a MIDI-over-Ethernet solution, but we have not had the crashing issues or clock-gone-haywire issues some have reported. We can change tempos and stay locked, but not without some caveats.

    Nobody should be connected via WiFi (even though their website says it can work, I've had nothing but issues), or have an internet connection at all, or any other network devices enabled other than the one being used by ipMIDI. A dedicated, short-cabled 4-port Ethernet hub is essential gear. If you absolutely HAVE to have more than one network interface enabled, you may have to do some command-line route add statements to get it tight:

    route -n add -net 225.0.0 -interface

    Once the software is installed, you have a new virtual MIDI device in Ableton that you enable in preferences just like anything else. Then you set your clock to internal or external depending on your role. One person is the master and everyone else slaves, so the master, and nobody else, gets to map a MIDI controller knob to tempo. Slaves need to make sure no clips are set to start, or have their audio levels turned down, since the start event from the master starts the clock on all systems.

    If you do get a glitch, the slaves need to unsync, resync, and then jump back in. Of course, from a musical perspective, you need to juggle clip jobs around in the group before doing anything radical, to avoid nasty sounds emanating from your gear.

    Other general caveat: ipMIDI is free for Mac OS X but costs $$$ for the Windows version.

    It's not perfect, but in our experience it beats MTC over MIDI cables by a long shot.

    When our sets are less choreographed across setups, we forgo the ipMIDI, set the same BPM, and use the spacebar start to beat match each other low-tech style, which sadly is better than most syncing solutions.

    The problem, as I see it, is that "syncing" is a state of mind. How do you do it? How would I do it? The main issue is that a lot of people are trying to copy techniques that don't mix well with their personal workflow. I have synced laptops with hardware… Live with Live… Live with Traktor, no problem… rock solid. The thing is, you have to learn to use your devices within their limitations and proactively think about what you're trying to accomplish first. We have come to an age when our first move is to blame software or hardware if we can't get something to work.

    1. Eliminate useless or redundant gear in your setup. No need for that extra synth for one sound… sample it.

    2. Dedicate devices, i.e., use older beat machines as MIDI master: MPCs, older Korg rhythm boxes.

    3. Work in steps… get one thing working first, then add others.

    4. READ MANUALS… a lot of "features" can be used for other things.

    5. Use your equipment wisely… think outside the box… don't look on the net for solutions first… and keep making changes until it works!

    Experimenting is the building block of creativity!

    Hey, check out Innerclock Systems from Sydney, Australia. Their Sync Lock solves a lot of sync problems. It bases its clock on audio files that play in time with your DAW. The clock is perfect because it is derived from the exact same source as the audio that's playing: the clock itself is audio. Their website explains it well. Single-handedly, it will solve all my sync problems with new and old tech.
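
    The underlying idea is easy to demonstrate from first principles (this sketch is my own reconstruction, not Innerclock's actual design): if the clock pulses are themselves audio samples, they are locked to the audio stream by construction, and the receiving end can recover them with sample accuracy:

```python
SR = 44100       # sample rate
BPM = 120.0
PPQN = 24        # MIDI-Clock-style resolution: 24 pulses per quarter note

samples_per_pulse = 60.0 * SR / (BPM * PPQN)  # 918.75 samples at 120 BPM

# "Encoder": render one bar of clock as audio, one single-sample spike per pulse
n_pulses = PPQN * 4
buf = [0.0] * int(round(samples_per_pulse * n_pulses))
for k in range(n_pulses):
    buf[int(round(k * samples_per_pulse))] = 1.0

# "Decoder" on the receiving machine: simple threshold detection of onsets
onsets = [i for i, s in enumerate(buf) if s > 0.5]
gaps = [later - earlier for earlier, later in zip(onsets, onsets[1:])]
print(len(onsets), min(gaps), max(gaps))  # 96 pulses, gaps of 918 or 919 samples
```

    Since the pulse positions are defined in samples, any drift between machines is exactly the drift of their audio interfaces, which word clock (or simply sharing one interface) already solves.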

  • This is hilarious only because Ableton assured me there were no MIDI sync issues over 5 years ago when I was doing software QA at M-Audio.

    And while MTC really should be the solution, it is completely useless in Ableton. The problem really is Ableton: it is the worst-syncing software ever created.

    Pretty much all other software can be made to lock with MTC — for example, running Vegas on a PC and editing music for said video on a MacBook, or running Pro Tools in Vista and Acid on a netbook. There is drift and lag, but after a few dozen frames the timing works out.

    * Gustavo's post about a solid external clock rings true. The only way I ever got two copies of Ableton (or Ableton + Reason) to sync on two notebooks was with an independent clock source. Technically, the clock source should always be independent, for very profound reasons. If you don't like that, find another universe. That's how time works >here<.

    Anybody tried using JACK for this?

  • fred johnsen

    maybe people should learn how to play instruments 1st by themselves & then w other people…

    I would have to agree with DJ URBSTAR. I am not sure exactly why there seems to be such a problem for so many people. For example, the reason we use the UM-880 is because it is solid kit. We use the Kenton Pro Solo for the same reason. I think some people might be using less-than-ideal gear and may not be fully informed. Yes, sometimes you run into things like the Machinedrum… it just does not like to run as slave… but for the most part, people had MPC 3000s and Cakewalk in sync, TB-303s and TR-909s in sync. Software clocks used today can be just as stable, but just as with good A/D, you need good MIDI hardware. Ableton helps get the job done with its timing offset for each MIDI port. It took many hours of work making our setup solid, but in the end many problems were resolved by making sure external gear was running on external power when sending MIDI clocks (the USB on my D820 powers MIDI controllers just fine, but when it has to send a MIDI clock to real gear, it is not solid from USB). The UM-880 has an ICE cable. I hope this helps out.

  • Mark Kunoff

    I can only speak for ourselves, but we have spent countless hours researching and educating ourselves as much as possible regarding this issue, so be careful not to assume that we haven't. This also goes for comments like "learn how to play instruments 1st by themselves & then w other people." Um, yeah – that's exactly where we come from, and NO, we'd rather have tight sync between our gear by CHOICE.

    The main motivation for my letter to Peter was to gather all these incredible viewpoints, so I'm very happy with the result. Generally speaking, this subject IS elusive. There's some great info in these comments which no one else, including Ableton, has steered us to. A big "Thank You!!" to everyone who has contributed to this discussion; I'm hopeful you'll join us over at the dedicated noisepage to continue it further.

    @ Nath Null Object – Thanks for the info on Innerclock Systems. I'm gonna explore that one in depth. Despite my greatest efforts to scour for innovative solutions, this one never came up in my search.

    Finally, Ableton might decide to continue to ignore this, but given the Max/MSP integration, the users could potentially solve this problem themselves. My jaded expectation is that this is exactly how it will go down. See you all at the 'sync-or-swim' noisepage!

  • Alex Falk

    1. Establish an agreed-upon bpm.

    2. Press play on laptops.

    3. Make phase corrections using "tempo bend" (Live 8) or tempo nudging (older versions)

    4. Share and enjoy!

    i think i/we (choking sun) are in the same "boat" as @William. having been at this now for over 15 years, i've/we've found that the only time sync is required is when recording across multiple devices at the same time. the MOTU MIDI Express XT USB did a fantastic job between Cubase and an Akai DR16. other than that, tap tempo and adjustments on the fly have always worked. now, granted, our musical style and approach is much different from most digital musicians'.

    @m-clis: what problems are you seeing when syncing hardware and software? the only problem we've ever had was with MIDI clock to multiple devices. the main issue was that the amount of data going to the devices was causing all sorts of drops (lots of MIDI data at the same time). so we ditched it.

    i think what the topic is really bringing up is our growing reliance on technology for performance. with the migration from hardware to software (though laptops are hardware, just not in the traditional sense us gear heads use), more is being left to the machines to handle. while i completely disagree with the sentiment that djs and other digital "technicians" aren't musicians, i think there is something to be said about the loss of some traditional aspects of musicianship. if i can't keep time with another person regardless of what i'm using, then there is something wrong. we can blame technology's shortcomings as much as we want, but keeping time with others is core to performing with others. it's become such a big issue because we're letting the machines control our sync instead of ourselves.

    just my two cents of course.

    Mosquito wrote: "more is being left to the machines to handle."

    Yes. One of the big questions I have been struggling with is where that leaves intonation. How do we link automation to intonation?

  • Hi there to you all…

    I'm amazed that no one has even mentioned AudioMulch (AM)… I think it was one of the first apps that allowed simple connection over a LAN. Really geared to live performance!!

    I actually tried it with a friend of mine and it worked flawlessly… unfortunately AM has its own idiosyncrasies, and the lack of MIDI features at the time of V1 really bummed us out.

    Actually, now with Launchpad and matrix routing in AM… hmmm getting some ideas 😉

    Nevertheless, I think that Ross Bencina could well have a word on the subject, as he programmed the monster!!

    @Peter Kirn, have you talked with him on this?

    Just my 5 cents… Hope it helps!!

    PS: Congrats to CDM for its role in keeping e-musicians educated and up-to-date!! I really appreciate what you all have accomplished.

  • Amos

    AudioMulch 2.0 is out, and it runs on both Mac and PC. It offers a much better MIDI implementation than AudioMulch 1.0, and sends and receives MIDI sync *AND* Ethernet sync. Also, it's an incredibly awesome live sound-manipulating tool. Definitely worth checking out!

  • Amos

    footnote to Zevinhill: I think a lot of the folks griping about sync issues have stricter requirements than your live style might demand… as in, utterly precise synchronization with under 5 ms of jitter between events that are supposed to happen at the same instant. This is beyond overkill for live jams, but it's a critical component of the "feel" of a lot of mechanistic electronic styles… for me, it's to the point that I can't even write techno for mixed hardware and software instruments until I sort out the sync and timing issues in my studio… if the timing is flabby, the groove just dies.

  • Amos

    sorry for triple-comment: last footnote was to dj mosquito, not zevinhill. end of transmission….

  • I've been talking about this for a long time. This is something that needs to be built into the operating systems and gear. The harsh reality is that we need to work with MIDI sync, and we need something else, that's beyond MIDI to really solve the MIDI sync issues between two computers. I think the problem is pretty easy to solve if we can assume a LAN connection.

    Check my post from around a year ago:

    And Peter, you made a mistake about TCP vs. UDP. UDP is packet-based, which is why we all use it for time-critical applications. If the slave misses a UDP packet, one can just assume that the tempo hasn't changed; the next time a UDP packet containing the absolute timestamp arrives, the DAW can lock to that. A simple method for determining latency is just sending a UDP packet, waiting for the response packet, and dividing the round-trip time by two. Then we know that we need to pre-delay the outbound timecode stream by that latency, and the machine which receives the timecode just acts upon it as if said timecode were the truth. This assumes, of course, that UDP packet handling is simple, real-time (as in, lives in a 'real-time' thread), and predictable (which goes along with the real-time thread part).

    I really think that sync is the main problem yet to be solved in electronic music. We have the technology, we just need to build a workable model that is easy for both hardware and software manufacturers to adopt… which is why OSC (which can be sent over TCP/IP, UDP, USB, Firewire or serial) is possibly our great hope. Some of those interfaces actually have pretty stable latency characteristics… some (like every one of the networking interfaces) do not.

    Once people adopt a good scheme for OSC sync and we have hardware that converts from OSC sync to MIDI sync and DIN sync, then I can jam in peace.
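
    The round-trip measurement described above is easy to sketch with a local UDP echo thread standing in for the slave machine (a real setup would average repeated measurements and reject outliers, which this omits):

```python
import socket
import threading
import time

# Echo "server" on localhost, standing in for the slave machine
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))
port = server.getsockname()[1]

def echo_once():
    data, addr = server.recvfrom(64)
    server.sendto(data, addr)  # bounce the packet straight back

threading.Thread(target=echo_once, daemon=True).start()

# "Master": ping once and estimate one-way latency as half the round trip
client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.settimeout(2.0)
sent = time.monotonic()
client.sendto(b"ping", ("127.0.0.1", port))
client.recvfrom(64)
rtt = time.monotonic() - sent
one_way = rtt / 2.0  # the pre-delay to apply to the outbound clock stream

print("RTT %.3f ms, pre-delay %.3f ms" % (rtt * 1000, one_way * 1000))
client.close()
server.close()
```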

    Just a note. After some experiments, it looks like we can't even get stable sync internally. This is not Ableton's fault, btw. It seems to be very difficult to get a continuous, uninterrupted clock source to gauge the MIDI Clock messages against. There seems to be roughly ±1.5 BPM of variance when sending a locally created MIDI Clock source (from, say, ChucK or openFrameworks) over IAC into Live.

    In addition to worrying about network timing issues, we should also solve the local timing problems. Very hard indeed. Interesting side note: an engineer was telling me the other day that a lot of astronomers have abandoned Windows as a viable clock source. It sort of makes sense, as a chip can only do one thing at a time, so I imagine the clock must be a low-priority thread, constantly being interrupted. Maybe a clock "buffer", similar to how audio deals with this type of thing? Or even querying an audio buffer as a clock source? Fun times 🙂

  • Oh, and to address the myriad of comments saying that sync is not an issue 😛 This issue also affects your plug-ins. Any slave computer whose BPM wanders slightly back and forth, while using a delay effect, will have bizarre pitch artifacts at the very least, or un-interpolated nastiness at worst.

  • Owen: I mostly set delays using a gesture. I detect the speed of the (manual) gesture, then base the delay time on that, not on the BPM.

    I recognise the importance of shared clocks, but I'm at least as interested in working towards an instrument/interface that wouldn't benefit from one.

  • Ok here we go.

    Ableton Live v. 7.0.2 has a stable slave midi clock.

    It has been the only version I have found so far that has a stable clock on the receiving end.

    I've been working on a band project for a couple of years now, in which the 6 members all use Ableton Live. We have 4 or 5 laptops running Live, one of which acts as Midi Clock host.

    The 'master' laptop runs a ± 60 minute arrangement, with quite a few BPM changes. The other laptops use the Midi Clock sync from the master for BPM as well as song position. There is no way we could do this with only BPM (osc) or only song position (MTC).

    We have tried about all soundcards and midi interfaces on the market. In about all hardware configurations, including clean os installs, network solutions, etc. etc.

    Nothing works. The fluctuating BPM results in VST/AU effects glitching, total sync loss, warped clips running berserk, Live 8's Looper recording out of sync, etc., etc.

    The only solution we've recently discovered that works is reverting back to Live 7.0.2 for the receiving laptops. It seems that version does very nice rounding off of the incoming BPM.

    It seems strange that Ableton has 'forgotten' this version exists. When I recently called support, they acknowledged the fluctuating BPM but said it was expected behaviour, due to flaws in the MIDI protocol, not because of any issue with Live (sic!). Any trouble with third-party plug-ins should be referred to those third-party developers, not Ableton. Any problems arising from the sync issues with Ableton software should be posted as a bug on the Abe forum.

    Of course this response is completely ridiculous, but it is the situation at the moment…
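
    [The "very nice rounding off of the incoming BPM" that Live 7.0.2 seemed to do sounds like tempo smoothing: rather than chasing every jittery MIDI Clock reading, low-pass filter the tempo estimate. A naive exponential-moving-average sketch, with a made-up smoothing factor — not Ableton's actual algorithm:]

```python
# Sketch: smooth a jittery incoming tempo stream with an exponential
# moving average, so the slave's effective BPM drifts gently instead
# of jumping with every clock-interval measurement.
def smooth_tempo(readings, alpha=0.1):
    smoothed = readings[0]
    out = []
    for bpm in readings:
        # Blend each new reading into the running estimate.
        smoothed = alpha * bpm + (1 - alpha) * smoothed
        out.append(round(smoothed, 2))
    return out

jittery = [120.0, 121.4, 118.7, 120.9, 119.2, 120.5]
print(smooth_tempo(jittery))
```

    [The catch is the usual filtering trade-off: heavier smoothing steadies the BPM (good for delays and warped clips) but makes the slave respond sluggishly to genuine tempo changes in the master's arrangement.]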

  • @Amos: you are right that we don't need as tight a sync with our style; that I know. However, reading about the number of machines, etc., that many people are trying to sync, it's been an issue on at least some level for years, and it isn't isolated to just newer software. I will say that I find Cubase to be a far better program for avoiding many of the sync issues, but the point here is directed at the problems in Live. MTC is about the only way you stand a chance, and you're locked in with one BPM.

    I still think a lot of people are relying more and more on their equipment to handle things we as musicians are supposed to be capable of doing.

  • Has anything come of this topic yet?

    It has mostly been nodding and shaking fists, with a bit of OSC geekery. But has anybody gotten scientific about the actual problem yet?

    I'll join the testing as soon as I get my second rig. Which is a lame excuse, I know.

    But there are so many factors, it's a bit unfair to generalize it all. There's PDC, audio interface delays, MIDI interface delays, MIDI drivers, virtual MIDI ports, MIDI over TCP.

    Do MIDI Sync signals get priority over audio signals?

    How do you account for all these details and come up with reliable tests and results?

    I often hear phrases like "My old Atari could sync like a…". Which makes perfect sense: it didn't have anything else to process. And correct me if I'm wrong, but even when it did make sound, it was still using dedicated chips to do so.

    My point is, it makes sense that the only task the software needed to perform worked flawlessly.

    And I guess there weren't too many background processes.

    How many processes are running on a computer even when you do nothing?

    Sorry to come off a little ranty, but I've watched this topic being discussed for years, only to see the discussion die down, just like it's doing now.

    My last syncing venture was trying to sync up Live and an MC-909 many years ago. Ever since then I've always opted for an external clock.

    I'm getting my second rig soon and will probably try to figure it out, out loud.

    So hopefully this topic is still going.

    But in the interest of understanding the problem, we could use some facts. Generalizations just won't do. You can say software X syncs better than software Y, but can you prove it?

    – How do you measure the latency of a USB/Firewire MIDI interface? Which may or may not also be sending audio over the same cable at the same time.

    – How do you measure the latency between 2 computers, their collective PDC and jitter?

    The only reliable method I can see is recording into a latency-free hardware multitrack recorder, because recording on a computer may skew the results. But I'm not sure exactly how you would perform the test.

    I could be way out of my league with this stuff, although I do enjoy geeking out.

    Sorry for the lengthy post, but it's one of those topics I feel we should not have to discuss in 2009. Where's my damn jetpack?
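
    [One concrete way to get the "facts" this comment asks for: record the slave machine's metronome clicks on a multitrack recorder, then turn the inter-click intervals into an instantaneous-BPM series and look at the spread. The onset times below are fabricated for illustration:]

```python
# Sketch: quantify sync jitter from recorded click onset times.
# Each pair of consecutive clicks yields one instantaneous tempo reading.
def bpm_series(onsets_seconds):
    intervals = [b - a for a, b in zip(onsets_seconds, onsets_seconds[1:])]
    return [60.0 / iv for iv in intervals]

# Made-up click times around 120 BPM (0.5 s apart) with a little jitter.
onsets = [0.000, 0.502, 0.998, 1.503, 1.999]
bpms = bpm_series(onsets)
print([round(b, 2) for b in bpms])
print("spread:", round(max(bpms) - min(bpms), 2), "BPM")
```

    [Run against master and slave recordings of the same passage, the difference between the two BPM series — and its spread — is exactly the jitter figure these discussions usually lack.]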

  • insilico

    i'm in an electronic band and we've been trying to sort out sync issues ever since we started jamming.

    we bought a midi splitter/thru box (1 in 8 out)

    we run a drum machine as master (boss dr. rhythm) sending midi clock, and then 3 laptops as slaves.

    this setup is ok, but if a computer crashes then we have to restart the drum machine so that it sends another midi clock start signal.

    we've also tried midi syncing using software midi clocks as master, but the same problem prevails.

    at the moment we just sync manually. (1,2,3 GO!) and use the nudge tempo buttons in ableton to sync.

    another problem when using ableton as a slave is the audio buffer. because ableton is tracking the tempo moment to moment, there is little time for the buffer, so effects play out of time or stutter and our synths wig out when trying to play arps.

    we're also running 2 pc's and a mac, so we don't have the option to use the specy midi + LAN capabilities built into osx.

  • MysteryFlavor

    Three years later… the trail of tears stretching to the horizon… still jittering my way through a groove-less wasteland, out of sync. Please God (or Live 9)… save me.

    • al .

      3 years even later, and getting MIDI sync to work is still a pain in Live 9; often it basically doesn't work :p
