Here’s how interfaces normally break down. You’ve got your conventional, tactile interfaces, like a knob. You’ve got your touch interfaces, which lack tactile feedback (you touch them, but they don’t push back). And you’ve got your gestural interfaces, which have you waving your hands in the air without touching anything at all. (Those are generally the most challenging, because your brain gets no physical feedback about what your hands are doing.)

Syntact creates an entirely new category. It’s a gestural interface, of the “waving your hands around in the air” sort. But even though your hand is in mid-air and isn’t touching anything, the interface provides tactile feedback: it pushes back as you move your hand around. The way it pulls that off: sound. An array of 121 ultrasonic transducers focuses sound on a single point in mid-air, so you feel pressure against your hand as you move.
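
How do you focus sound on a point like that? Syntact’s own signal processing isn’t published, but the textbook technique is phased-array focusing: drive each transducer with a phase offset that compensates for its distance to the focal point, so all 121 waves arrive in phase and the pressure adds up right there. Here’s a minimal Python sketch of that idea (the 40 kHz frequency, the array geometry, and the function names are my assumptions, not specs from the product):

    import numpy as np

    C = 343.0    # speed of sound in air, m/s
    F_C = 40e3   # carrier frequency, Hz -- an assumption; 40 kHz is typical for airborne ultrasound

    def focus_phases(elements, focus):
        """Phase offset (radians) per transducer so that every wave
        arrives in phase at `focus`, creating a pressure maximum there."""
        elements = np.asarray(elements, dtype=float)   # (N, 3) positions, meters
        focus = np.asarray(focus, dtype=float)         # (3,) focal point, meters
        dists = np.linalg.norm(elements - focus, axis=1)
        wavelength = C / F_C                           # ~8.6 mm at 40 kHz
        # Advance each element by the phase its wave accumulates in flight.
        return (2 * np.pi * dists / wavelength) % (2 * np.pi)

    # Hypothetical 11 x 11 grid (121 elements, matching Syntact's count),
    # focusing 20 cm above the center of the array.
    xs = np.linspace(-0.05, 0.05, 11)
    grid = np.array([(x, y, 0.0) for x in xs for y in xs])
    phases = focus_phases(grid, (0.0, 0.0, 0.20))

Sweep that focal point around and you get a tactile spot that can track the hand, which is presumably where the camera-driven side of the device comes in.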

You can see a bit of what this means in the new video, above. I’m hoping to get a hands-on (erm, hands-off) demo soon from the designer. The basic specs:

  • Optical gesture analysis, via a USB camera built into the interface
  • MIDI control, for use with any live performance or music-making rig (or other media); see the sketch after this list
  • A control panel for selecting different sonic images and adjusting scaling
  • Built-in music software that visualizes sound and makes it easier to map gestures to your own MIDI setup
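
On the MIDI point above: whatever the tracking layer, a controller like this ultimately boils down to scaling tracked hand positions into MIDI messages. A rough Python sketch using the mido library (the port name, height range, and CC number are illustrative assumptions; Syntact handles scaling in its own control panel):

    import mido

    def height_to_cc(height_m, lo=0.05, hi=0.30):
        """Scale hand height above the array (meters) to a 0-127 CC value."""
        t = (height_m - lo) / (hi - lo)
        return max(0, min(127, round(t * 127)))

    # 'Syntact' is a hypothetical port name; list real ones with mido.get_output_names().
    with mido.open_output('Syntact') as port:
        for height in (0.08, 0.15, 0.27):   # stand-ins for camera-tracked positions
            port.send(mido.Message('control_change', control=1, value=height_to_cc(height)))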

More information:
http://www.ultrasonic-audio.com/products/syntact.html

Also well worth checking out: the directional speaker tech from these Slovenia-based developers. Directional sound is another huge area of innovation.
http://www.ultrasonic-audio.com/products/acouspade.html

If you want to try this in person, it’ll be at the Beam Festival in London in late June.

Side note: Yes, I’m looking into that LEAP thing for more gestures, albeit without tactile feedback. Stay tuned.