So, you’ve got a plug-in or a hardware synth – and you want to control part of the sound with a physical knob or some iPad modulation. One clever iPad app and an open source schema could make that connection happen faster.
Early 1980s MIDI still gets the job done in a lot of ways. But then you hit this problem of mapping. Let’s say you’re an app developer, and you want to support a whole lot of different synths. (You can see the problem already.)
Users, of course, have the same issue – from controllers to desktop software to apps, we often find ourselves having to manually create templates.
Developer Eokuwwy Development (aka Steven Connelly) faced this challenge with the app MIDI Mod. MIDI Mod is clever stuff, and worth a separate article – it gives you a ton of modulation options you can use to control gear, and then the ability to modulate the modulators internally (routing an LFO to the modulation that’s then routed to your synth). So you can get a bunch of elaborate changing, morphing sounds on whatever you choose.
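To picture what “modulating the modulators” means, here’s a toy sketch – to be clear, this is not MIDI Mod’s actual code, just the general shape of the idea – in which one slow LFO controls the depth of a faster one, and the result gets quantized to a MIDI CC value you could send to a synth:

```swift
import Foundation

// Illustrative only: one LFO modulates the depth of another,
// and the combined output becomes a 7-bit MIDI CC value.

struct LFO {
    var frequency: Double  // in Hz
    func value(at time: Double) -> Double {
        // Plain sine LFO, output range -1...1
        sin(2.0 * .pi * frequency * time)
    }
}

let slowLFO = LFO(frequency: 0.1)  // controls the depth below
let fastLFO = LFO(frequency: 2.0)  // the audible wobble

for step in 0..<8 {
    let t = Double(step) * 0.25  // sample every 250 ms
    // Depth swings 0...1 under control of the slow LFO
    let depth = (slowLFO.value(at: t) + 1.0) / 2.0
    // Modulated output, still within -1...1
    let out = fastLFO.value(at: t) * depth
    // Map to a 7-bit MIDI CC value (0...127)
    let cc = UInt8(((out + 1.0) / 2.0 * 127.0).rounded())
    print("t=\(t)s  CC value: \(cc)")
}
```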
The breakthrough from Mr. Connelly was to establish a standard schema for defining all those parameters to control. Got a Roland System-8? A Behringer Neutron? Yamaha Reface? BigSky reverb pedal? Moog Minitaur? Some KORG or other? Whatever the device, the idea is to describe every parameter it can respond to in a single, machine-readable document.
Other developers have done things like this before. (Native Instruments’ Maschine, for one, had similar mappings – though unfortunately, the engineers working on this support were, to my knowledge, included in the layoffs last month.)
This developer is going one step further, by releasing the entire schema on GitHub for manufacturers and developers. And it could be relevant to anyone – someone making a hardware synth, a Web-based tool, an iOS app, desktop software, whatever.
As a user, you may not necessarily need to know how this works – only that it allows makers of software and hardware to make more stuff compatible, and work more consistently, faster. But the basic idea is this: it defines a single, consistent way of describing what a device can do, in a format both humans and machines can read.
Got a synth you want supported? Make the document once, and then – once they provide support for this schema – other tools will be able to work with your gear, check for errors, and even generate code and documentation. It’s a JSON schema, plus a whole bunch of useful examples. iOS developers should be able to get going really fast – even using Swift – but it’s pretty clear to everyone else, too.
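To make that concrete, here’s a minimal, hypothetical sketch of what consuming one of these documents could look like in Swift, using the standard Codable machinery. The field names below are illustrative only – the real property names live in the schema repo linked at the end of this post:

```swift
import Foundation

// Hypothetical device description; see the actual JSON schema in
// the open-midi-rtc-schema repo for the real structure.
struct DeviceDescription: Codable {
    let displayName: String
    let manufacturer: String
    let controls: [Control]
}

struct Control: Codable {
    let name: String     // e.g. "Filter Cutoff"
    let ccNumber: UInt8  // MIDI CC assignment
    let minValue: UInt8
    let maxValue: UInt8
}

// A made-up example document, inline for demonstration
let json = """
{
  "displayName": "Some Synth",
  "manufacturer": "Some Maker",
  "controls": [
    { "name": "Filter Cutoff", "ccNumber": 74, "minValue": 0, "maxValue": 127 }
  ]
}
""".data(using: .utf8)!

do {
    let device = try JSONDecoder().decode(DeviceDescription.self, from: json)
    print("\(device.displayName): \(device.controls.count) control(s)")
} catch {
    print("Failed to decode: \(error)")
}
```

And because it’s a proper JSON schema, the same document can be validated automatically – that’s the “check for errors” part.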
I remember this conversation going on for at least a decade, even specifically talking about “wouldn’t it be nice if there were a JSON schema” for this. The reason is, Web developers do this sort of work all the time. It’s just that these were in the form of APIs for Web applications that … uh, stole all your data from a weird online survey that then sold that data to foreign spies or whatever the heck has been going on for the intervening time. I’m kidding, mostly – okay, most of this sort of JavaScript work is more like boring day job stuff.
Isn’t it about time that we applied that intelligence to music?
I don’t know that this particular implementation is perfect, but it is open source, and it has everything I (and others I’d talked to) wanted from such a thing – so it seems time to put it out there.
(Yeah, maybe like minijack MIDI, we can all talk about this now, rather than wind up with two competing formats. Just a thought.)
I know there have been similar discussions to add this sort of functionality to a future version of MIDI. But this particular kind of schema doesn’t require anything in the MIDI spec itself – it’s only built around it. So this is something that works with MIDI 1.0.
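Since everything rides on plain MIDI 1.0, actually driving a documented parameter just means sending ordinary MIDI messages. A minimal sketch, assuming the parameter in question is mapped to a control change (CC):

```swift
import Foundation

// Build a standard MIDI 1.0 Control Change message (three bytes).
func controlChangeBytes(channel: UInt8, controller: UInt8, value: UInt8) -> [UInt8] {
    // Status byte 0xB0 = Control Change; low nibble is the channel (0-15).
    // Data bytes are masked to 7 bits, as MIDI 1.0 requires.
    [0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F]
}

// e.g. filter cutoff on CC 74, channel 1 (index 0), value 100
let bytes = controlChangeBytes(channel: 0, controller: 74, value: 100)
print(bytes.map { String(format: "0x%02X", $0) }.joined(separator: " "))
// -> 0xB0 0x4A 0x64
```

The point being: nothing new is needed from the MIDI spec itself, just a shared description of which messages do what.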
Developers, have a look and let us know what you think. Maybe you can add to that list of apps supporting this.
Users, well, you don’t have to wait – you can check out MIDI Mod now, if you have an iPad. (And I better take the opportunity to make some docs for all our MeeBlip synths.)
https://github.com/eokuwwy/open-midi-rtc-schema
https://github.com/eokuwwy/open-midi-rtc-specs
Developer site and a lot more info: https://eokuwwy.blogspot.com/
It seems to me, I’ve heard that song before
Okay, addendum – let’s talk about what this isn’t.
This isn’t MIDI-CI (MIDI Capability Inquiry). I wrote about that when it was first announced, along with other upcoming additions to the MIDI spec in MIDI 2.0:
MIDI evolves, adding more expressiveness and easier configuration
You can also read a more technical explanation on the MIDI Manufacturers Association site:
https://www.midi.org/articles-old/details-about-midi-2-0-midi-ci-profiles-and-property-exchange
But MIDI-CI both provides different capabilities and requires different devices. Basically, MIDI-CI and Open MIDI RTC Schema could well complement one another: they accomplish a related task for the user, but don’t overlap much beyond that. MIDI-CI handles protocol negotiation, profiles, and property exchange as a live conversation between two connected devices; this schema is a static document describing a device, and anyone can write one.
Another point – Open MIDI RTC Schema you can use right now, and you can roll your own schema docs even if the manufacturer doesn’t bother. MIDI-CI isn’t here yet at all, and it won’t help your existing MIDI 1.0 devices when it arrives anyway. (It won’t break backwards compatibility with them; they just won’t be able to do the MIDI-CI tricks.)
Now, this MIDI schema is actually very much like OSCQueryProtocol from Vidvox. The difference, of course, is right there in the names: one works with OpenSoundControl (used more often in visual software), the other with MIDI (still the standard for most music hardware since the 80s, more or less).
But as the working method is related, these two will also complement one another nicely. I wrote about this early on, but it continues to develop.
Other feedback is welcome, of course.