From the time we’re kids, we use gestures to make music – shaking, tapping, moving our bodies around, and connecting physical movement to sound. The idea of using these kinds of gestures to control digital music is something researchers have worked on for many years. But with increasingly capable smartphones, equipped with mics, tilt and acceleration sensors, cameras, and other inputs, it’s now possible to actually deliver these tools to everyday users.

The latest entry in the field is ZooZBeat. Its life as a mobile app spans just a matter of months, but the research behind it involves years of work at Georgia Tech (which recently opened the Georgia Tech Center for Music Technology). The work comes from Gil Weinberg and co-designers/programmers Andrew Beck and Mark Godfrey. We’ve followed Gil’s work with smart music apps for some time. I got the chance to talk to him about ZooZBeat.

ZooZBeat Website

Georgia Tech Center for Music Technology

Shake it Like a Polaroid

The idea behind ZooZBeat is to use gestures to build up musical ideas. Shakes and tilts, touchscreen taps, and (Nokia) keypad presses add rhythmic and melodic lines, as seen in the video. Now, if this seems to lack some of the precision of a musical instrument, it’s not just you: the early apps are built primarily to be friendly to novices.
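To give a concrete picture of how a gesture might become a rhythmic line, here’s a minimal sketch of my own (not ZooZBeat’s actual code) that snaps incoming gesture hits onto a looping beat grid – the kind of quantization that makes novice input sound in time:

    # Hypothetical sketch: snap gesture hits onto a looping 16-step beat grid.
    # Assumes a fixed tempo and a sequencer that replays stored steps each bar.

    STEPS_PER_BAR = 16
    TEMPO_BPM = 120
    SECONDS_PER_BAR = 4 * 60.0 / TEMPO_BPM   # one 4/4 bar at 120 BPM = 2.0 s

    def quantize_hit(time_in_bar_s, instrument, pattern):
        """Round a gesture's timestamp to the nearest grid step and store it."""
        step_length = SECONDS_PER_BAR / STEPS_PER_BAR
        step = int(round(time_in_bar_s / step_length)) % STEPS_PER_BAR
        pattern.setdefault(step, []).append(instrument)
        return step

    pattern = {}
    quantize_hit(0.26, "snare", pattern)   # a tap landing near step 2
    quantize_hit(1.02, "kick", pattern)    # a shake landing near step 8
    print(pattern)                         # {2: ['snare'], 8: ['kick']}

However the real app maps gestures to sounds, some forgiving grid like this is what lets a loose shake still land on the beat.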

“You can go and you can practice and be much better,” says Weinberg. “But … it helps you get started, even if you’re a novice.”

The free ZooZBeat Lite version already lets you play on your own with up to 2 beats running in the background and 10 instrument sounds; the full version adds voice recording (not on the iPod touch), song saving, more customization, and more sounds. A “Pro” version is coming, too, for more serious use.

If you have an iPhone, an iPod touch, or a Nokia N95, you can try this out for yourself. (Interestingly, the Symbian-based N95 actually trumps the iPhone when it comes to wireless sharing.) The Apple-platform app is available now, with the Nokia app coming within the next few days.

Lowering the Floor, Raising the Ceiling

I talked to Gil about the development process and the ideas behind the project.

“The main issue is how to create low floor and high ceiling — how to allow everyone, kids to [older people] to make music they like and have a meaningful beginning,” says Gil. “People try a cello and it sounds terrible and they drop it. I’m trying to make it easier [to] connect to sound.”

That idea is a familiar one, of course, and something that comes up regularly in new digital instrument design. (In fact, one might wonder whether it leads designers to neglect instruments intended for more depth.) But the interesting thing is always how you go about it. Gil says this is the culmination of about ten years of research. For ZooZBeat, that meant extensive testing and development, including interviews, surveys, and user testing.

“Sometimes I did it with musicians, but with the cellphones we focused on novices,” says Gil. “We have kids — friends of my kids from school, a group of them played with [the instrument], and also students at Georgia Tech. Observations were very useful, just watching as people used it.”

And the idea wasn’t just to focus on making the design novice-friendly. “The low floor is easy if you just care about the low floor,” Gil observes. “The trick is how to make a high ceiling — once you start, you can also grow up in the house, become better musically.”

As it happens, working with testing and allowing novices and kids to try the instrument yielded some surprises. “The way I played it was tapping. I took it with one hand and tapped on the other hand, the way I thought it would be expressive. Kids came and preferred to shake it.”

With shaking as the primary interface, the question of how to accurately measure shakes becomes important. I note some of the challenges of using this as an input, as witnessed by early game development on the Nintendo Wii; recently Nintendo even announced additional hardware to make the Wii remote more accurate. Gil answers that Georgia Tech is working with providers that may be able to add additional data.
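For the curious, shake detection on these phones typically comes down to watching the accelerometer for spikes. Here’s a generic threshold-based sketch of that idea – my own illustration, with made-up tuning values, not the ZooZBeat team’s actual algorithm:

    import math

    # Hypothetical sketch of threshold-based shake detection from accelerometer
    # samples. The threshold and refractory window are assumed values; real
    # devices need tuning and filtering to reject slow tilts and sensor noise.

    SHAKE_THRESHOLD_G = 1.8    # total acceleration (in g) that counts as a shake
    REFRACTORY_S = 0.15        # ignore new shakes for this long after one fires

    _last_shake_time = -1.0

    def detect_shake(x, y, z, t):
        """Return True when the acceleration magnitude spikes above threshold."""
        global _last_shake_time
        magnitude = math.sqrt(x * x + y * y + z * z)
        if magnitude > SHAKE_THRESHOLD_G and (t - _last_shake_time) > REFRACTORY_S:
            _last_shake_time = t
            return True
        return False

The refractory window matters as much as the threshold: without it, a single vigorous shake reads as a burst of hits rather than one.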

Buzz around the iPhone aside, Gil had a lot of success working cross-platform. Both apps share a common engine for gesture recognition. Building specifics for the platforms wasn’t such a major challenge, thanks to the work both Apple and Nokia have done. “We did it pretty quickly,” says Gil. “We started with the Nokia, believe it or not.” After Apple released the 2.1 SDK for its iPhone and iPod touch, Gil says the team got the work done in under a couple of months. They’re examining other platforms, as well. (By the way, another reason to be interested in Nokia as a development platform: Nokia Labs has already completed a Symbian mobile library for computer vision applications — read, easy camera analysis. Hear that, Gil and programmers?)
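That “common engine, thin platform layer” split is worth dwelling on for a second. Conceptually (and again, this is my own sketch, not the team’s code), the shared part only ever sees normalized gesture events, while each platform’s input code translates its own sensors into those events:

    # Hypothetical sketch of the shared-engine idea: platform code produces
    # normalized GestureEvents; the engine that turns them into notes never
    # touches iPhone- or Symbian-specific APIs.

    from dataclasses import dataclass

    @dataclass
    class GestureEvent:
        kind: str        # "shake", "tap", "tilt", "keypress"
        strength: float  # normalized 0.0-1.0
        timestamp: float

    class GestureEngine:
        def __init__(self):
            self.events = []

        def feed(self, event: GestureEvent):
            """Platform-specific input layers call this with normalized events."""
            self.events.append(event)
            # ...map the event onto the beat grid, pick a sound, and so on.

    # Each platform then needs only a thin adapter, e.g.:
    #   iPhone layer: accelerometer callback -> GestureEvent("shake", ...)
    #   Nokia layer:  keypad handler         -> GestureEvent("keypress", ...)

Keeping the platform-specific code that thin is what makes porting to yet another phone a matter of weeks rather than a rewrite.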

Gil promises more developments soon, including that Pro app. We’ll be watching – and it’ll be interesting to hear your feedback.

Previous Research

Mobile software is one delivery platform, but it’s worth looking at some of Gil’s previous research to see where this came from. I suspect some people may actually prefer the tangible objects to mobile phones.

For an overview of what Gil has done:

Music Shapers: Squeezable balls that act as soft, squishable musical inputs

Beatbugs: Networked physical objects for kids, the Beatbugs are intelligent “rhythm computers” – handheld percussion for the digital age

iltur: Inventing is one thing – at some point, composition and performance matter, actually using those inventions. iltur is a series of compositions realizing musical applications of the Beatbugs.

Obviously, this is not a comprehensive guide to gestural music research, just Gil’s own contributions. Doing that kind of round-up wouldn’t be a bad idea, so if you have suggestions, I’m all ears (or squeeze-ready fingers).

Stay tuned; more soon.