The quickest route to expressing an idea remains the gesture of a hand. That gesture may be crudely interpreted through today’s touch displays, but the immediacy remains. Presumably because of some of the device’s limitations, a lot of the experiments with the iPad have involved controllers that operate independently from sound software, like a remote control. Those interfaces, while useful, largely simulate existing hardware controls in a more flexible form, rather than introduce new ideas. But it seems the long-term potential for touch devices is in designs that unite touch, graphic, and sound in a single piece of software, exploring new paradigms for interaction along the way.

Usine is one of music creation's most surprising secrets: it's powerful sound software that incorporates creative touch interfaces as a core design principle. And in the video above, it's running on a relatively cheap dual-touch PC display from Packard Bell. Nay-Seven, one of the founders of the Usine community, lectures internationally and has been pushing the Usine software to its limits.

Here, he tells us about some of his latest experiments, and the potential they hold.

Always looking for a way to use the computer as a real musical instrument, my latest works try to combine graphics and music using a touchscreen interface. The software Usine from sensomusic gives me the freedom to build my own interfaces. Some examples:

Drawing pitch and pan

Here [at top], the idea is to draw pitch information directly onto the waveform display of a sample. I've also added an LFO [low frequency oscillator, for modulation]; this way, the drawing can move slowly according to different speed presets.
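
To make the idea concrete, here's a rough Python sketch of a drawn pitch curve being nudged by an LFO. The function names, the curve format (a list of semitone offsets sampled across the waveform), and the preset values are my own illustrative assumptions, not Usine's actual internals:

```python
import math

def lfo_offset(t, rate_hz, depth_semitones):
    """Sine LFO: a slow periodic offset added to the drawn pitch."""
    return depth_semitones * math.sin(2 * math.pi * rate_hz * t)

def modulated_pitch(drawn_curve, t, rate_hz=0.25, depth=2.0):
    """Look up the drawn pitch at playback position t (0..1),
    then let the LFO drift it up and down."""
    idx = min(int(t * len(drawn_curve)), len(drawn_curve) - 1)
    return drawn_curve[idx] + lfo_offset(t, rate_hz, depth)
```

Swapping the `rate_hz` preset changes how slowly the drawing "moves" without touching the drawing itself — which is roughly the effect described above.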

[At bottom], I play with pan and volume: the x position of the black ball on the lines gives the pan, and y gives the volume. As I'm working with a dual-touch screen, I can quickly draw some speed changes. Note that this panel isn't only for pan and volume; I can also send this drawing to other parameters, like delays and filters, here with the << button.

Geometry… or not

This workspace is also dedicated to drawing. I’ve built four layers, each one with its own color and its own sound. The XY position gives the pitch value of the notes and other parameters, like velocity or pan. The geometry provides sequences; lines give a kind of glissando.
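
As a sketch of the "lines give a kind of glissando" idea: a drawn line segment can be read as a run of pitches interpolated between its endpoints. Everything here — the normalized 0..1 y coordinates, the MIDI pitch range, the function name — is an assumption for illustration, not Usine's real mapping:

```python
def line_to_glissando(y_start, y_end, steps, pitch_range=(36, 84)):
    """Interpret a drawn line as a glissando: interpolate its endpoint
    heights (0..1, bottom to top) into a run of MIDI note numbers."""
    lo, hi = pitch_range
    notes = []
    for i in range(steps):
        frac = i / (steps - 1) if steps > 1 else 0.0
        y = y_start + (y_end - y_start) * frac
        notes.append(round(lo + y * (hi - lo)))
    return notes
```

Each layer could run the same mapping with its own pitch range and sound, matching the four-layer workspace described above.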

Vertical sequencers and pads

Using the new Matrix module (thanks to Martin Fleurent), I've built this vertical sequencer [seen at top]. I like the idea that notes fly under my hands this way. [At bottom], I've built pads for surfing the tablet in "iPad" mode, also adding a drone option.
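
The core of a matrix-style sequencer is simple enough to sketch. The Matrix module is real, but this grid-of-toggles Python class is only a conceptual stand-in for it — the class and note mapping are my own assumptions:

```python
class MatrixSequencer:
    """A rows-by-steps grid of toggles; each clock tick advances one
    column and returns the notes whose cells are switched on."""
    def __init__(self, rows, steps, base_note=48):
        self.grid = [[False] * steps for _ in range(rows)]
        self.base_note = base_note
        self.steps = steps
        self.step = 0

    def toggle(self, row, col):
        """A touch on a cell flips it on or off."""
        self.grid[row][col] = not self.grid[row][col]

    def tick(self):
        """Advance the playhead and collect the active notes in this column."""
        notes = [self.base_note + r
                 for r, row in enumerate(self.grid) if row[self.step]]
        self.step = (self.step + 1) % self.steps
        return notes
```

Rotate the grid vertically and drive `toggle` from touch events, and you get the "notes flying under my hands" behavior described above.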

Multitouch gestures

On the same idea of movement, here are two screenshots from a video illustrating a new patch by Olivier Sens (the Usine developer). This patch provides multitouch gesture recognition, opening new doors to the ways we use our computers and touchscreens. We can easily imagine new symbols or alphabets, and new forms of interaction in our musical practice. You draw a 'V,' you play with volume; you draw a 'P,' you play with pitch…
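
One common way to build this kind of recognizer is template matching: record a stroke per symbol, then, when a new stroke is released, compare it against the recorded templates within a tolerance. The Python below is a minimal sketch of that general technique, not Olivier Sens's actual patch; the resampling scheme, distance metric, and tolerance value are all assumptions:

```python
import math

def resample(points, n=16):
    """Normalize a stroke to n evenly spaced points so strokes drawn at
    different speeds and with different point counts can be compared."""
    out = []
    for i in range(n):
        idx = i * (len(points) - 1) / (n - 1)
        lo = int(idx)
        frac = idx - lo
        x0, y0 = points[lo]
        x1, y1 = points[min(lo + 1, len(points) - 1)]
        out.append((x0 + (x1 - x0) * frac, y0 + (y1 - y0) * frac))
    return out

def recognize(stroke, templates, tolerance=0.25):
    """On release, compare the stroke against recorded templates; return
    the name of the closest one within tolerance (mean point distance)."""
    best_name, best_dist = None, float('inf')
    s = resample(stroke)
    for name, tmpl in templates.items():
        d = sum(math.dist(a, b) for a, b in zip(s, resample(tmpl))) / len(s)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= tolerance else None
```

Map each recognized name to an action ('V' → volume control, 'P' → pitch control) and you have the alphabet-of-gestures idea in miniature.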

For more on the display, check out the Packard Bell Viseo 200T. It was previewed by Engadget last year and carried a street price – impressively – of only about US$300, all for a 20-inch screen and low latency. I gather either something happened to it or it was re-branded for distribution outside the UK; anyone with more information, let us know in comments and I'll update the story.

More on nay-seven’s Flickr:

All screen images courtesy nay-seven. Used by permission.

  • mat

    wow – amazing!
    I especially like the drawing sequencer and the waveform handling… very intuitive.

    I guess when it comes to "how can I create touchscreen controls for music," the discussion between Lemur and TouchOSC (which I also raised in some comments) clearly missed that other French niche product: sensomusic.
    They offer the most freedom in objects. Damn, I wish I had that waveform display on the Lemur… or even that simple drawing method…

    OK, is it the hardware that matters? Only dual-touch? Well, as I've worked with the Lemur for a long time, I have to say that dual-touch is enough for 90% of interactions. You can't concentrate on four different aspects and handle them in parallel anyway 😉

    Great work! I'd like to see more Usine stuff!

  • fx23

    My sweet Usine, I looove it. Nay makes really
    cool patches.

    Check out the powerful new Usine Matrix module, too,
    with an Ableton Live session brought to the touchscreen.

    Yup, mat, we're all waiting for good true-multitouch hardware, but two touches are enough in most cases, and
    really cheap for the freedom of a Lemur…

  • i love double middle finger touch screen controlling.

  • @mapmap: Yeah, not all gestures are entirely universal. UK users would need four-fingered touch to fully support that one. 😉

  • did anyone else spot a naughty image? :]

  • TechLo

    @bedroom blog

    You mean the demo dude giving the finger to Steve Jobs throughout? 🙂

  • It seems it might have been re-released in laptop form as the Butterfly? I'm not sure, but I really want to find out! I've been searching all morning…

  • Nice! I always wondered what a face sounds like!

    This is a very cool idea. I love touch expression. I really want to incorporate touch into my live sets; I just haven't found the right hardware solution yet… the new Droid stuff is really promising, though.

  • @TechLo
    hehe. 😀

  • Thanks, guys, for the comments, and yep, I didn't realize I was using those fingers when I made this video, ha ha. Hope no one is offended… ;-)

  • Androidmonk

    Very impressive!

    Musician/developer Olivier Sens has done an absolutely amazing job with Usine, and it's great to see power users like Nay-Seven showing us great demonstrations and making cool music with it!


  • Nice work! I like the range this sort of input/control can allow the user, as well as the amazing color schemes used. Does anyone know how well/if Usine works with simple single touch input from a mouse or Wacom tablet? I will have to look into this app environment! Thank you.

    I have a friend who is physically disabled and has slight mobility problems with his hands as well as much more severe lack of use of his legs, and he decided when I was showing him the NDS homebrew app AX piped through VST filters (I am trying to get him into DS composition) to start writing in words to thus produce the musical variation. I cannot repeat the foul language here, but I had never thought to use word and phrase input into my Wacom or the DS to control the settings for apps with an XY control array or whatever. I will have to keep my eyes on this whole front as prices drop to give him something else to play with when we jam in some amateur fashion. He has an iPhone but it is boring him to death.

  • fx23

    @ dragon

    Usine can receive just about anything as input:
    mouse, tablet, keyboard, MIDI, OSC, Wiimote, gamepad, USB device bytes, and even video capture from a camera — all of which you can transform into whatever control you want.
    The recognition system is a script that records
    any moves from those inputs to memory. When the mouse (or other input) is released, it checks whether a previously
    recorded move matches within a tolerance; if so, it triggers the action. That means you can build pretty much any sign language you want, with any input, to any output.

  • Thank you for that feedback, fx23. I enjoy triggering sounds using my misc gamepads/Wacom/DS/MIDI/pc keyboards now, so this is right up my alley. 😀 The camera support sounds pretty snazzy. Thanks for mentioning Usine again, Peter.

  • strunkdts

    INSANE!!!! this is really really cool.
    And it makes my beloved AudioMulch look like a typewriter.

  • This is definitely one of my favorite hidden gems. I just started playing around with Usine, but it's already my go-to piece of software for designing interfaces and multitouch.