The step sequencer. The sixteen-pad drum machine. The piano roll. The step-sequencing piano roll. The waveform editor. The multi-track recorder. Live music is a dynamic and changing phenomenon, but much of our technology assumes fairly predictable interfaces with time. Elysium, which we saw earlier this week, breaks out of that mold by defining generative systems that live on a hexagonal grid or “honeycomb.” There’s lots of great reader feedback on that story, and Elysium’s creator wrote in to talk a bit about what influenced him.

I want to highlight two sequencers that you play as if they’re games. (Just don’t play a Vulcan – they always win.)

Robots on a Grid

Al-Jazari is named for a 13th-century scholar and musician who apparently invented an entire band of water-powered robotic musicians with more than fifty facial and body movements per song. (Okay, that clearly deserves a separate post later. Our Western education is so eager to overlook the achievements of Arabs that we skipped over the fact that he basically invented Disneyland in the Middle Ages.)

Al-Jazari, in its 21st-century iteration, takes the idea of robotic agents and builds a sequencer around them. Creator Dave built a grid on which you give the robots symbolic instructions (like up, right, down, left), selected with a gamepad. Each grid square represents a note, with pitch modulated by moving bricks up and down. Like Elysium, the music is generated as events are triggered on the grid. And like Microsoft Research’s (non-musical) game Kodu, the gamepad and a set of symbols make what is essentially scripting easy and transparent. (Few would likely call this “programming” because it doesn’t look scary, but that’s what it actually is.)
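To make the mechanics concrete, here’s a minimal, hypothetical sketch of the idea in Python – not Al-Jazari’s actual code, which is Scheme running on Fluxus, just an illustration of robots walking a grid of pitched cells one symbolic instruction at a time, triggering a note in every cell they land on. The grid size, pitch map, and robot programs are all invented for the example.

```python
# Hypothetical sketch of a robot-agent grid sequencer (not Al-Jazari's code).
# Robots walk a square grid of pitched cells, reading one symbolic instruction
# per step; every cell a robot lands on triggers a note.

GRID_SIZE = 8
MOVES = {"up": (0, -1), "right": (1, 0), "down": (0, 1), "left": (-1, 0)}

# Invented pitch map: each cell holds a MIDI note number.
pitch_grid = [[36 + x + 2 * y for x in range(GRID_SIZE)] for y in range(GRID_SIZE)]

class Robot:
    def __init__(self, x, y, program):
        self.x, self.y = x, y
        self.program = program        # e.g. ["right", "right", "up", "left"]
        self.step = 0

    def tick(self):
        """Advance one instruction and return the note of the new cell."""
        dx, dy = MOVES[self.program[self.step % len(self.program)]]
        self.x = (self.x + dx) % GRID_SIZE    # wrap around at the grid edges
        self.y = (self.y + dy) % GRID_SIZE
        self.step += 1
        return pitch_grid[self.y][self.x]

# Two robots with different instruction loops produce interlocking patterns.
robots = [Robot(0, 0, ["right", "right", "up"]),
          Robot(4, 4, ["down", "left"])]

for beat in range(8):
    print(f"beat {beat}: play MIDI notes {[robot.tick() for robot in robots]}")
```

In the sketch, the musical interest comes entirely from how the instruction loops interact with the pitch layout – the kind of relationship a gamepad-and-symbols interface lets you tinker with without it ever feeling like code.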

Al-Jazari is open source, built in the elegant coding language Scheme (a Lisp dialect) atop a game engine called Fluxus. Dave has extensive documentation of its development, and shares not only the code but even the textures and models. You can use this yourself on Mac and Linux, though it’ll require some messy compiling. (Thanks for this link, MattH – this is layered with things that blow my mind!)

Al Jazari [pawful.org]

reacTogon

Mark Burton’s reacTogon was the influence for Elysium. It’s a “chain reactive performance arpeggiator” – that is, it takes the usual, static, repeating patterns of an arpeggiator and turns them into something altogether different, by allowing events to transform dynamically in two dimensions across a hexagonal grid. The interface is a multi-touch controller with physical objects, so there’s a tangible element, as well.
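We don’t have details of the implementation, but the chain-reactive idea is easy to sketch. Here’s a hypothetical version in Python using axial hex coordinates: each occupied cell plays its note, then forwards the pulse to the neighbor its token points at. The board layout and note assignments below are my own invention for illustration, not reacTogon’s actual behavior.

```python
# Hypothetical sketch of chain-reactive propagation on a hexagonal grid.
# Axial (q, r) coordinates; each occupied cell plays a note and hands the
# pulse to the neighbor in the direction its token points.

# The six hexagonal neighbor offsets in axial coordinates.
HEX_DIRECTIONS = [(1, 0), (1, -1), (0, -1), (-1, 0), (-1, 1), (0, 1)]

# Invented board: cell -> (MIDI note, index of the direction to forward to).
board = {
    (0, 0): (60, 0),   # C4, pulse continues east
    (1, 0): (64, 1),   # E4, pulse bends north-east
    (2, -1): (67, 5),  # G4, pulse bends south-east
    (2, 0): (72, 3),   # C5, pulse reflects back west
}

def tick(active_cells):
    """Play every active cell, then return the cells the pulses move on to."""
    next_cells = set()
    for q, r in active_cells:
        if (q, r) not in board:
            continue                  # the pulse dies off the edge of the layout
        note, direction = board[(q, r)]
        print(f"play MIDI note {note} at hex ({q}, {r})")
        dq, dr = HEX_DIRECTIONS[direction]
        next_cells.add((q + dq, r + dr))
    return next_cells

# Start a single pulse at the origin and let the chain reaction unfold.
pulses = {(0, 0)}
for step in range(8):
    pulses = tick(pulses)
```

Because each cell can redirect (or, in a fuller version, split) the pulse, a single trigger unfolds into a pattern that keeps evolving across the surface rather than looping left to right.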

Looking at reacTogon alongside Al-Jazari really demonstrates some of the advantages of a hexagonal grid versus the more traditional square grid. (And if you think about musical applications in general, most of what we have is relatively non-dynamic, right-angle grids. There’s movement, but only left to right, with start/stop or loop points. One exception: Follow Actions in Ableton Live.)

Al-Jazari requires movement only to tiles with adjacent edges. reacTogon, since it tiles hexagons, has six adjacent tiles instead of four. It can also map a harmonic table, as other musical hexagonal grids do. Now, that’s not to say reacTogon is better than Al-Jazari – on the contrary, the pair demonstrates that just one choice – a grid of squares or a grid of hexagons – can create very different musical possibilities. So even if you’re not musically impressed by these examples just yet, think about the potential here. We’re still early in software design and musical interface, so early that something as simple as a geometric pattern can become an entire composition.
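The harmonic-table point is worth spelling out. On hexagonal layouts of that kind – and I’m assuming reacTogon’s mapping is in the same family – pitch can be assigned so that the six neighbors of any cell sit a major third, minor third, or perfect fifth away. A tiny sketch, with an arbitrary base note:

```python
# Sketch of a harmonic-table pitch mapping on a hex grid (assumed layout,
# not necessarily reacTogon's): every neighbor is a consonant interval away.

BASE_NOTE = 48  # C3, an arbitrary starting pitch for the illustration

def pitch(q, r):
    """Axial hex coordinates to MIDI note: +7 semitones per q, +4 per r."""
    return BASE_NOTE + 7 * q + 4 * r

# The six neighbors of any cell in axial coordinates.
NEIGHBORS = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]

for dq, dr in NEIGHBORS:
    interval = pitch(dq, dr) - pitch(0, 0)
    print(f"neighbor ({dq:+d}, {dr:+d}): {interval:+d} semitones")
# Prints +7/-7 (perfect fifth), +4/-4 (major third), +3/-3 (minor third):
# a spread of consonant intervals a four-neighbor square grid can't offer.
```

On a grid like that, every step a pulse takes is, by construction, a musically sensible move – a big part of why the choice of tiling matters.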

That’s something to ponder on the eve of the music manufacturers’ trade show.

(If anyone has more documentation on Mark or his creation, let me know.)