Having looked at two examples of what the Lemur multi-touch hardware can do, we can see in the videos above exactly what I’m talking about when I describe two different approaches. Metrognome is an insanely-talented guru in Reaktor, the modular instrument- and effects-building environment. He’s working to build new live performance tools that meld live arrangement, remixing, and DJing with a kind of computer meta-instrument. It’s a great illustration of how software can become a live instrument, and it represents one of two paths in thinking about what touch can do for live music performance.

1. Multi-touch as virtual controller: The Lemur’s design assumes that what you want to do is create virtual hardware, using a stock set of knobs, faders, gestural controllers, envelope editors, and the like. The advantage is, these interfaces are modular and consistent. The disadvantage: you’re limited to pre-built screens and pre-built widgets, so you can’t do anything outside what’s given.

2. Screen as direct controller: The difference with the Reaktor examples is that there’s no intermediary. Whatever is on your computer screen is the interface. The downside: that includes all the usual UI clutter, and the open-ended possibilities could be overwhelming. The upside: as Metrognome artfully demonstrates, you can imagine any interface, build it, and immediately control it – including things the Lemur may not do. The other, not insignificant advantage: you don’t have to buy another piece of hardware, making this route much cheaper. Your screen or projection simply becomes the touch controller surface. Multi-touch isn’t quite ready for prime time on computers yet, but it could be soon.

I’m not saying one is better than the other. In fact, I suspect some people will prefer the Lemur approach even if it means spending additional money, because they want something that has some of the flexibility of a screen, but still behaves more or less like a dedicated controller. But I think it’d be a mistake to miss that we have two very different angles on touch here.

Of course, none of this stops you from building or buying a $50 or $100 knob box and being perfectly happy with that.

For more details on what Metrognome is doing (including an up-close shot of that beautiful ensemble), see our Kore minisite – and expect some more details on this soon over on that site, thanks to our Reaktor contributor Peter Dines:

Reaktor + Touchscreen = Touch Grains, Touch Performances, Wild UIs [Kore@CDM]

  • yay for all this touch screen coverage.
    I've recently made the leap to a touch screen laptop for my live rig.

    I've tried out the monotouch live,
    but it's ridiculously inflexible.

    Makes me wonder if I couldn't build something in a modular environment that would do the job.

    Touch screen + live is ridiculous:
    the buttons are too damned small.
    I've had to revert to using the pen when playing live,
    which is not nearly as cool as touching or dragging.

  • poorsod

    Multi-touch won't ever work on an interface designed for the mouse, for the simple reason that the mouse is single-touch.
    This means either OSs are going to have to start directly supporting multi-touch (multi-mouse, whatever), or developers of the screens and the software are going to have to come up with ways of going behind the OS's back (which would of course limit its possible applications severely).

  • gbsr

    that second video is just plain beautiful.
    I've been eyeing touchscreens (big/small/ridiculously expensive ones like the Lemur/homebrews and the like) for quite some time now. I think it's about time to at least get myself an iPod touch or something similar.

  • @poorsod: You're right, but I don't think this is as big an issue as you might think. Thing is, you only need to make the *application* aware of additional input. For instance, if you use a tablet-aware application, it's already receiving data that the mouse doesn't provide; that isn't all done through mouse emulation. You just need software to be aware. And this doesn't have to be at the OS level — just the app you happen to be using. Of course, OS-level support is nice, and makes devs' jobs easier, but that's already been promised for Windows 7, which isn't far off, and I imagine Linux and Mac will also have support soon.
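To make the point above concrete: an application can track multiple simultaneous touch points itself, keyed by a per-touch id delivered by the input device, without any OS-level multi-cursor support. This is a minimal sketch only; the event names and per-touch ids are illustrative assumptions, not any specific touchscreen API.

```python
# Sketch of app-level multi-touch tracking. The OS mouse pointer is
# single-point, but if the touch hardware delivers (id, x, y) events
# directly to the application, the app can track each finger on its own.
# All names here are hypothetical, for illustration.

class TouchTracker:
    def __init__(self):
        self.points = {}  # touch id -> (x, y), one entry per active finger

    def touch_down(self, touch_id, x, y):
        # A new finger arrives with its own id; it doesn't displace others.
        self.points[touch_id] = (x, y)

    def touch_move(self, touch_id, x, y):
        # Update only the finger that moved.
        if touch_id in self.points:
            self.points[touch_id] = (x, y)

    def touch_up(self, touch_id):
        self.points.pop(touch_id, None)

    def active(self):
        return len(self.points)

# Two fingers down at once -- impossible to represent with a single
# mouse cursor, trivial once the app sees per-touch ids.
tracker = TouchTracker()
tracker.touch_down(1, 0.2, 0.5)
tracker.touch_down(2, 0.8, 0.5)
print(tracker.active())  # 2
tracker.touch_up(1)
print(tracker.active())  # 1
```

The design choice is exactly the one the comment describes: the dictionary of id-keyed points lives in the application, so no OS "multi-mouse" concept is required.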

  • […]Via KORE@CDM and Create Digital Music

    Building instruments and effects in Reaktor is a lot like building your own hardware. But ever wished those fabulous UIs you’ve seen could be used via something other than … ugh … your mouse?[…]

  • Reaktor "…all the clutter" –wtf? designing a UI is an issue in Reaktor? even a novice can choose what is/isn't visible and drag the UI elements around to someplace convenient.

    Native Instruments frequently shows their products with touchscreen displays. I remember 'finger painting' envelopes at their booth at NAMM 2003.

  • @Nonplus: I'm not saying anything is wrong with the Reaktor UI. The point is, if you're on your *computer display*, you have all the associated UI widgets, which are designed for the mouse + keyboard interaction. I can absolutely see the advantage of making a controller that doesn't have those. If Reaktor had a full-screen mode, of course, that'd help. And any additional widget/UI flexibility they gave us would mean that much more adaptability to other inputs. Obviously, it's working in these Reaktor examples, so I'm not questioning that — but there is room for improvement, and it's reasonable to expect some attention in that area as other input devices become more plentiful.

  • lilith

    #1 it's a Toughbook! <3

  • Pingback: Metro the G[e]nome « rightClique()