From the Lab: Airborne Beats from Oblong Industries on Vimeo.

With hand gestures recalling those that first reached the mainstream in Minority Report, “Airborne Beats” lets you make music just by moving your hands and fingers in mid-air. You can drag audio samples around and use gestures for control, covering both production and performance. Coming from the labs at Oblong, it’s the latest etude in a long series of this kind of interface (see below). Oblong points out that the same approach could work with any time-based interface. And because the system can recognize more than one pair of hands, it also makes those tasks collaborative.

Technical details:

If you’re wondering how the app was built, Airborne Beats was programmed in C++ using g-speak and the creative coding library Cinder. As for hardware, we pulled together one screen equipped with an IR sensor, a “commodity” computer (with g-speak installed), and speakers. For the most part, it was designed and developed by a single person. Although Airborne Beats is currently a demo, the users of this application could be composers, DJs, or perhaps even children or educators. And the ability to recognize multiple hands opens up some unique collaborative possibilities (guest DJ, anyone?).
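For the curious, here’s a rough idea of what the core grab-a-sample interaction might look like in code. This is purely a hypothetical sketch, not Oblong’s code, and it deliberately avoids the g-speak and Cinder APIs; it simply assumes the IR sensor delivers a normalized hand position plus a pinch flag, and shows how a pinch near a sample in the pool could become a grab-and-drop.

```cpp
// Hypothetical sketch only: mapping a tracked hand to "grab a sample from the pool".
// Not Oblong's code, and no g-speak/Cinder calls; hand data is assumed to arrive
// as normalized (0..1) screen coordinates plus a pinch flag from the IR sensor.
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <string>
#include <utility>
#include <vector>

struct Hand {
    float x, y;      // normalized screen position
    bool  pinching;  // e.g. thumb and forefinger closed
};

struct Sample {
    std::string name;
    float x, y;      // position of the sample in the on-screen pool
};

class SamplePool {
public:
    explicit SamplePool(std::vector<Sample> samples) : samples_(std::move(samples)) {}

    // Index of the sample nearest the hand, or -1 if none is within reach.
    int pick(const Hand& hand, float reach = 0.05f) const {
        int best = -1;
        float bestDist = reach;
        for (std::size_t i = 0; i < samples_.size(); ++i) {
            float dx = samples_[i].x - hand.x;
            float dy = samples_[i].y - hand.y;
            float d  = std::sqrt(dx * dx + dy * dy);
            if (d < bestDist) { bestDist = d; best = static_cast<int>(i); }
        }
        return best;
    }

    const Sample& at(int i) const { return samples_[static_cast<std::size_t>(i)]; }

private:
    std::vector<Sample> samples_;
};

int main() {
    SamplePool pool({{"kick", 0.2f, 0.5f}, {"snare", 0.5f, 0.5f}, {"hat", 0.8f, 0.5f}});

    // Simulated frames: the hand closes near the snare, drags it, then opens again.
    std::vector<Hand> frames = {
        {0.49f, 0.52f, false}, {0.49f, 0.52f, true},
        {0.60f, 0.40f, true},  {0.60f, 0.40f, false},
    };

    int held = -1;
    for (const Hand& hand : frames) {
        if (hand.pinching && held < 0) {
            held = pool.pick(hand);                      // pinch near a sample: grab it
            if (held >= 0) std::printf("grabbed %s\n", pool.at(held).name.c_str());
        } else if (!hand.pinching && held >= 0) {
            std::printf("dropped %s at (%.2f, %.2f)\n",  // open hand: drop it in place
                        pool.at(held).name.c_str(), hand.x, hand.y);
            held = -1;
        }
    }
    return 0;
}
```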

Now, it’s clear a lot of work and talent went into the app. But I can’t help but notice that the results are, frankly, a bit awkward. (Of course, that’s why testing and experimentation are so valuable: there’s no substitute for trying things out.) There’s some really clever stuff in there, including the overlay of envelopes atop waveforms and the way the interactions work, particularly grabbing audio from a pool. But while it shows potential, it’s also hard to see many advantages over conventional input for the same interface.
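And since I mentioned the envelopes: under the hood, drawing an envelope atop a waveform really just means building a per-sample gain curve. Here’s a hypothetical sketch of that idea, again not the app’s actual code: a handful of breakpoints, linearly interpolated, scaling an audio buffer.

```cpp
// Hypothetical sketch: applying a breakpoint envelope to an audio buffer --
// the basic operation behind an envelope drawn over a waveform.
// Example data is made up; this is not Oblong's implementation.
#include <cstddef>
#include <cstdio>
#include <utility>
#include <vector>

// Linearly interpolate the envelope gain at a normalized position (0..1).
// Breakpoints are (position, gain) pairs sorted by position.
float envelopeAt(const std::vector<std::pair<float, float>>& points, float pos) {
    if (pos <= points.front().first) return points.front().second;
    if (pos >= points.back().first)  return points.back().second;
    for (std::size_t i = 1; i < points.size(); ++i) {
        if (pos <= points[i].first) {
            float t = (pos - points[i - 1].first) /
                      (points[i].first - points[i - 1].first);
            return points[i - 1].second + t * (points[i].second - points[i - 1].second);
        }
    }
    return points.back().second;
}

int main() {
    // A tiny "sample" of eight full-scale values, just to keep the output readable.
    std::vector<float> buffer(8, 1.0f);

    // Fade in, hold, fade out.
    std::vector<std::pair<float, float>> env = {
        {0.0f, 0.0f}, {0.25f, 1.0f}, {0.75f, 1.0f}, {1.0f, 0.0f},
    };

    for (std::size_t i = 0; i < buffer.size(); ++i) {
        float pos = static_cast<float>(i) / static_cast<float>(buffer.size() - 1);
        buffer[i] *= envelopeAt(env, pos);
        std::printf("sample %zu -> %.2f\n", i, buffer[i]);
    }
    return 0;
}
```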

In fact, it seems what this otherwise-impressive demo needs is to somehow pair up not with a GarageBand-style timeline, but with the likes of AudioGL, the 3D interface we saw earlier today. That on-screen interface must in turn deal with the fact that the mouse was never intended as a 3D input device. (To be fair, that hasn’t stopped gamers from making lightning-quick work of using it, but it still seems worth revisiting.)
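To make that 2D-versus-3D point concrete: picking an object in a 3D scene with a mouse means building a ray from the camera through the cursor and testing it against the scene, whereas a tracked hand gives you a 3D point you can test against objects directly. The sketch below is simplified geometry of my own, not anything taken from AudioGL or g-speak.

```cpp
// Hypothetical sketch of the difference between 2D (mouse) and 3D (hand) picking.
// Simplified geometry for illustration; not code from AudioGL or g-speak.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Mouse picking: does a ray from `origin` along normalized `dir` hit a sphere?
bool rayHitsSphere(Vec3 origin, Vec3 dir, Vec3 center, float radius) {
    Vec3 oc = sub(origin, center);
    float b = 2.0f * dot(oc, dir);
    float c = dot(oc, oc) - radius * radius;
    return b * b - 4.0f * c >= 0.0f;   // quadratic discriminant
}

// Hand picking: is the tracked 3D hand position inside the sphere?
bool handTouchesSphere(Vec3 hand, Vec3 center, float radius) {
    Vec3 d = sub(hand, center);
    return std::sqrt(dot(d, d)) <= radius;
}

int main() {
    Vec3 node    = {0.0f, 0.0f, -5.0f};   // an object floating in the 3D scene
    float radius = 0.5f;

    // Mouse: all we really have is a 2D cursor, unprojected here into a ray
    // from a camera at the origin pointing straight down -z.
    Vec3 camera = {0.0f, 0.0f, 0.0f};
    Vec3 ray    = {0.0f, 0.0f, -1.0f};
    std::printf("mouse ray hits node: %s\n",
                rayHitsSphere(camera, ray, node, radius) ? "yes" : "no");

    // Hand: the sensor reports a full 3D position, so no unprojection is needed.
    Vec3 hand = {0.1f, 0.0f, -4.8f};
    std::printf("hand touches node:   %s\n",
                handTouchesSphere(hand, node, radius) ? "yes" : "no");
    return 0;
}
```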

More:
http://oblong.com/what-we-do/labs

And here’s some of the other work they’re doing. As you can see, some of these experiments built around the gestural interface suggest more effective possibilities.

Oblong Labs from Oblong Industries on Vimeo.

In fact, this all segues nicely into an insightful post by Chris Randall today at Analog Industries.

Chris makes an impassioned argument for taking a fresh approach to designing interfaces for music software, rather than just copying existing music gear. Viewed in this context, that becomes a necessity: you can’t devise a novel musical operation or physical interaction and expect it to match up well with a copied-and-pasted UI. As Chris puts it, in regard to AudioGL:

What I really want to talk about is how this shoehorns in to my latest flight of fancy. What I like about this app is that Jonathan has, for the most part, ignored the standard conventions that the music tech industry relies on. (COMMANDMENT ONE: THOU SHALT MAKE ALL COMPRESSORS LOOK LIKE A BEAT-UP 1176! ETC.) Instead, he’s just made it look cool and logical.

Now, perhaps Hollywood Tom Cruise should stick to the hand gestures. But approaching the problem of designing an interface afresh ought to be just that: a genuinely fresh approach.