Musical Applications for Multi-Touch Interfaces from BricK Table on Vimeo.

Across a series of colored bars, sounds warp and mutate. Vines entangle as organic threads of music. Fingers and objects traverse sonic landscapes in surprising, mysterious ways. Welcome to the worlds of BricK, the musical table interface by Jordan Hochenbaum and Owen Vallis, which, powered by software from Dimitri Diakopoulos, Jim Murphy, and Memo Akten, explores new musical frontiers. The system combines open-source tools for tracking fingers and objects on a table, then feeds that tracking data into sound and music environments.

Just following the landmark, long-awaited release of Processing 1.0, BricK demonstrates the expressive potential of the open-source platform. Processing allows quick and elegant development of stunning visual interfaces, while other tools (ChucK and Reaktor, for instance) serve as sonic engines. Sometimes the sounds themselves are not revolutionary, but by simply replacing the visuals and interaction – just as with changing the look of a score – the music is transformed, too. (At top: experiments with different interfaces for music using the platform they’ve built.)

CDM got to talk to Owen and Jordan about the projects. And now’s a perfect time – the gorgeous Roots is looking for a home, in case we have any curators / galleries / other interested parties in our audience. First, a review of what these platforms are:

Spaces, Multi-Touch Music


Spaces Multi-Touch Music Environment from BricK Table on Vimeo.

Jordan tells CDM about Spaces, their latest creation, which premiered alongside a performance by Daedelus in LA:

Spaces is the latest interactive multi-touch musical application for the Brick Table. Designed as a minimalist interface to free musicians from traditional compositional markers such as frets and keys, the environment enables musicians to compose intuitively through immediate visual and sonic feedback.

In this video, Spaces mediates a spontaneous composition and performance of a slow-moving ambient soundscape.

Spaces was developed by Jordan Hochenbaum, Owen Vallis, Dimitri Diakopoulos, and Jim Murphy. It was recently used in a performance at the REDCAT lounge at the Walt Disney Concert Hall, Los Angeles, and further developments are currently underway.

Roots, an Organic Installation


Roots Multi Touch Tangible Installation Teaser from BricK Table on Vimeo.

Roots has been impressive in Web videos, but it’s looking to make the transition to the real world, after a shipping mishap prevented what was supposed to be its premiere showing at New York’s Minitek Festival earlier this fall:

“Roots” is an interactive installation for the Brick Table tangible and multi-touch interface, where multiple people can collaborate in making music in a dynamic and visually responsive environment. Performers use their fingers and tangible objects to create and interact with virtual branch-like vines that move around the screen, allowing them to create entirely generative, semi-generative, or pseudo-composed arrangements and compositions.

Roots is truly a unique and expressive interactive installation which came together through an internet collaboration between Brick Table’s creators (Jordan Hochenbaum and Owen Vallis), and the super-talented London-based designer/developer Memo Akten.  It was recently selected as a featured Processing Exhibition on Processing.org and we feel it is time to release Roots into the wild…

So! We are calling out to all of you lovely CDM readers out there to get Roots out and into the public.

For more information on how Roots works, please see What is Roots?

Please use the contact on the BricK Table website if you are interested.

Behind the Scenes

CDM: How do the visuals relate to the sound?

BricK: The nature of the vine-like branches in Roots lends itself to creating music with what is — in our opinion — an organic and open feeling. We felt that the music should both sound and feel as if it is coming out of the visuals, and vice-versa, and so we did our best to stay true to this relationship in the overall musical aesthetic of the sounds produced.

The Spaces environment expands on the theme of unconventional visual representations of sound manipulation. Each column is an open space connecting an idea with a musical parameter. Combined with the visual feedback, this led us to decide that Spaces would work best with slow-moving ambient soundscapes, although it is certainly possible to experiment with other musical styles.

What sorts of relationships did you experiment with before settling on something you liked?

BricK: With Roots, we first worked with Memo to develop the visual elements before even attempting the musical side of things. We discussed various approaches to its visual and musical relationships. Did we want it to be completely generative? Did we want a more direct and repeatable relationship between your finger and the resulting sound? We really liked both ideas, and so we made it all-inclusive, able to create completely generative, semi-generative, or directly manipulated/composed musical outcomes through finger presses, slides, and tangible object interaction. This really makes Roots unique in comparison to other environments that enable generative musical arrangements. Each performer can exert as much or as little control over the relationship between physical, visual, and musical interaction as they want at any given moment.

In Spaces, we discussed a few different ideas about the layout and design of the interface. Ultimately, we decided that Spaces should control four different instruments, each with four parameters (volume, plus three others). We toyed with different methods for visually representing the value of each column without turning it into a traditional slider. We felt the cool-to-hot color morph in each column was fitting: the user has to rely on the sonic result rather than an exact value, veering away from traditional musical interface paradigms.
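Spaces itself drives Reaktor, but the idea of a column as “an open space connecting an idea with a musical parameter” is easy to sketch. Here is a hypothetical ChucK fragment (none of these mappings come from the actual Spaces patch) that maps one normalized column value to a filter cutoff:

    // one Spaces-style 'column': a normalized 0..1 value drives a cutoff
    Noise n => LPF f => dac;
    0.2 => n.gain;
    2.0 => f.Q;

    fun void setColumn(float v)
    {
        // exponential map: roughly 100 Hz at the bottom, 12.8 kHz at the top
        100.0 * Math.pow(2.0, v * 7.0) => f.freq;
    }

    // sweep the column as a slow finger drag might
    for (0 => int i; i <= 100; i++)
    {
        setColumn(i / 100.0);
        20::ms => now;
    }

An exponential curve like this keeps equal finger movements sounding perceptually equal, which suits an interface that shows no exact values.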

How did you deal with timing relative to the visuals?

BricK: In Roots, it was necessary to have the generative data play in a relatively synchronized manner to maintain a degree of musicality. As the vines move around the environment, the musical outcomes are quantized to various beats. [Ed.: The quantization all happens in ChucK.] That being said, continuous finger movement scrubs audio in a direct 1:1 relationship that gives the user the feeling of direct manipulation when that is wanted.
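That quantization is easy to illustrate in ChucK, whose strongly-timed model lets a shred simply advance time to the next grid point. A minimal sketch, not the Roots source (the grid size and pitch range are invented):

    // quantize arbitrarily timed triggers to a sixteenth-note grid
    SinOsc s => ADSR env => dac;
    env.set(10::ms, 100::ms, 0.0, 50::ms);

    120.0 => float bpm;
    (60.0 / bpm)::second => dur beat;
    beat / 4 => dur grid;

    while (true)
    {
        // stand-in for a vine event: pick a random pitch
        Std.mtof(Math.random2(48, 72)) => s.freq;

        // advance time to the next grid point, then sound the note
        grid - (now % grid) => now;
        env.keyOn();
        grid => now;
        env.keyOff();
    }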

Spaces has no generative movement (at the moment), which means timing is always completely synchronous with finger movement, both sonically and visually. We tried to make sure that the way the colors morph feels as free and smooth as the slowly evolving musical outcome.

Can you talk a bit about how the sound is generated?

BricK: Roots uses audio buffers as its underlying sound source (although the musical outcome is VERY different from the original material). Each vine gets assigned an audio buffer, which is then ‘scrubbed’ through as the vine generatively maneuvers around the screen. The audio and buffer manipulation is done using the ChucK audio programming language. By simply changing the source material, Roots will produce vastly different musical results.
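That scrubbing maps naturally onto ChucK’s SndBuf, whose playhead can be jumped to any sample position. A minimal sketch of the idea, with a hypothetical file name and a sine ramp standing in for a vine’s position:

    // 'scrub' through a sound file by jumping the playhead around
    SndBuf buf => dac;
    me.dir() + "source.wav" => buf.read;   // substitute any audio file

    while (true)
    {
        // stand-in for a vine's normalized x position (0..1)
        (Math.sin(now / second) + 1.0) / 2.0 => float pos;

        // direct 1:1 mapping from position to playhead sample
        (pos * buf.samples()) $ int => buf.pos;

        10::ms => now;   // re-scrub roughly 100 times a second
    }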

Spaces generates sounds in a number of different ways, all using Reaktor. Each of the four instruments employs a selection of synthesis methods. Some columns control pitch; others control combinations of filters and effects. The clicky percussive sounds are generated from an audio loop that is granulated and re-synthesized with altered delay rates and other parameters.
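Reaktor ensembles are patched graphically rather than written as code, but the granular idea translates to a rough ChucK approximation (again hypothetical, with a made-up loop file): short enveloped grains read from random positions in the loop, with slight rate variation and irregular spacing:

    // clicky granular texture: short grains from random loop positions
    SndBuf grain => ADSR env => dac;
    me.dir() + "loop.wav" => grain.read;   // substitute any loop
    env.set(2::ms, 30::ms, 0.0, 5::ms);

    while (true)
    {
        Math.random2(0, grain.samples() - 1) => grain.pos;  // grain start
        Math.random2f(0.9, 1.1) => grain.rate;              // slight detune
        env.keyOn();
        30::ms => now;
        env.keyOff();
        Math.random2f(20.0, 120.0) => float gap;            // varied spacing
        gap::ms => now;
    }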

What are your future plans for these pieces?

BricK: Roots is ready to go, but in our free time, Owen and I play with using it as a sequencing device in other ways — using movement and vine location to pluck notes, control effects and filters, etc.

Exploring Roots along these other avenues will probably create the need for a new GUI, which means perhaps Roots will have a new little cousin sometime in the future. That being said, we are really happy with Roots as is (we reached our specific goals), and we are more interested in giving it the proper debut it deserves than in changing the way it works. We had a great time working with Memo, whose work I actually first came across here on CDM, and we would love to work with him again in the future.

Spaces was developed in a very short timeframe for a performance at the REDCAT Lounge at the Walt Disney Concert Hall in downtown Los Angeles, so we are absolutely looking to expand what the Spaces interface is capable of. First, we would like to increase the number of instruments that can be performed. Second, we would like the interface to be “physics”-enabled, for example using a flick motion to send a bouncing ball down a column to automate a parameter while the user concentrates on other instruments.

The Software

Just to review, here’s the software powering BricK:

tbeta (“The Beta”): finger tracking. tbeta is an open-source, cross-platform computer vision and multi-touch sensing platform. It’s the successor to the former touchlib, which wasn’t as cross-platform or quite as awesome. Like reacTIVision below, it reports touches via the TUIO protocol over OSC; see the receiver sketch after this list. More on tbeta on Create Digital Motion.

reacTIVision: fiducial marker tracking for objects. (Fiducial markers are the funny, cellular-looking patterns that allow you to track specific objects manipulated on the table. reacTIVision is the open-source library developed by the folks who did the reactable. Sounds as though we might get fiducial tracking in the other library, too.)

ChucK: a strongly-timed, quick-to-code sound and synthesis language. It’s elegant enough that it’s used for real-time programming – as in, onstage, in laptop ensembles like PLOrk and SLOrk (its West Coast descendant, which we just saw here on CDM).

Native Instruments Reaktor: The modular sequencer, instrument, and effect builder, which we cover regularly on our Kore minisite. It’s the only commercial / non-open-source choice here, though it may actually replace ChucK on Roots in the future.
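Both trackers speak the same wire format (TUIO messages over OSC, conventionally on port 3333), which is what lets a sound engine listen to either one. As a sketch rather than production code, here is a minimal ChucK receiver for finger (“2Dcur”) ‘set’ updates:

    // listen for TUIO cursor updates from tbeta or reacTIVision
    OscRecv recv;
    3333 => recv.port;
    recv.listen();

    // 'set' messages: command, session id, x, y, x-vel, y-vel, accel
    recv.event("/tuio/2Dcur, s i f f f f f") @=> OscEvent cur;

    while (true)
    {
        cur => now;                          // wait for incoming messages
        while (cur.nextMsg() != 0)
        {
            cur.getString() => string cmd;   // always "set" with this typemask
            cur.getInt() => int id;          // session id of the touch
            cur.getFloat() => float x;       // normalized 0..1 position
            cur.getFloat() => float y;
            <<< cmd, id, x, y >>>;
            // remaining args (velocity, acceleration) left unread here
        }
    }

Fiducial updates from reacTIVision arrive the same way on /tuio/2Dobj, with extra arguments for the marker’s class ID and rotation angle.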

More Info

BricK Table website