Today, we note that Control, a WebKit-based touch interface already available on iOS, has arrived on Android.

For visualists and interactive designers, it’s worth paying attention to one feature in particular: dynamic interface creation. Perhaps biased by the musicians who have embraced them, touch interfaces have tended to rely on the static layouts favored by physical knobs and faders. That’s arguably the worst of both worlds: you lose the tactile feedback of physical controls, but you don’t gain any of the flexibility of a display.

Control is an open-source application rendered in HTML5, powered by JavaScript and JSON, so it’s capable of anything you can imagine. But Charlie Roberts has already demonstrated how a dynamic interface could work. Using OSC, you can make control layouts on the fly. That could lead to more sophisticated software integration for visual and musical performance, new chances for collaboration and live rigs, and the ability to make an interface on someone’s device in an interactive situation.

We saw the last of these scenarios in the case of the iOS app mrmr, developed by Eric Redlinger. As a proof of concept, I and others put together a gallery show using mrmr, at which interactive pieces were able to build interfaces on the fly on users’ iPhones and iPads. With Control, those horizons expand: no longer constrained to proprietary UI widgets on one platform (like iOS), interfaces can be cross-platform, Web-based, and dynamic.

The video above, I think, does a good job of scratching the surface of what’s possible. More on that here:
Control 1.3: Dynamic Interfaces, jQuery integration & more

But dynamic layouts could go in many, many directions. Since this is especially relevant to visual performance, perhaps in modes of interaction not really possible in music, I’d love to hear what readers imagine. And do try Charlie’s app, whether on iOS, Android, or both:

— and if you’re really ambitious, have a look at the source!