Gestural Music Sequencer from Unearthed Music on Vimeo.
Something as simple as remapping a single knob can give you new musical ideas. So expand that to entire gestures and live video input, and you can push your performance in new directions and out of old habits. That’s why it’s always great to see projects like the Gestural Music Sequencer.
Built entirely in free tools – tools fairly friendly even to non-coders – the GMS lets composer and musician John Keston explore new ideas through gestures captured in a video stream. It’s easier to see than to talk about, so check out the just-completed documentary short by Josh Klos, made with the aid of Julie Kistler and Brian Smith. (And yes, documentation makes a huge difference; we’d love to see more of this stuff!)
The ingredients:
- Processing, the free, multiplatform coding environment [site | cdmu tag | cdmo tag]
- controlP5, a lovely, light, quick-and-dirty library for UI controls
- Ableton Live – though you could substitute other software via MIDI, Live makes a nice, familiar interactive music engine
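The actual mapping lives in John’s Processing sketch, but the core idea – analyze each video frame and turn what you find into notes – is simple enough to sketch. Here’s a minimal, hypothetical version in Python: it assumes a brightest-pixel mapping (horizontal position picks a pitch from a scale, brightness sets velocity), which may differ from what the GMS actually does. Function names and the tiny “frame” are made up for illustration.

```python
# Hypothetical sketch of a video-to-note mapping in the spirit of the GMS:
# find the brightest pixel in a grayscale frame, map its x position to a
# pitch in a scale, and map its brightness to MIDI velocity.

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # scale degrees, in semitones from the root

def brightest_pixel(frame):
    """Return (x, y, value) of the brightest pixel in a 2D list of 0-255 values."""
    best = (0, 0, -1)
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value > best[2]:
                best = (x, y, value)
    return best

def frame_to_note(frame, root=60, octaves=2):
    """Map the brightest pixel to a (pitch, velocity) pair."""
    x, y, value = brightest_pixel(frame)
    width = len(frame[0])
    steps = len(C_MAJOR) * octaves
    degree = x * steps // width           # horizontal position -> scale step
    pitch = (root + 12 * (degree // len(C_MAJOR))
             + C_MAJOR[degree % len(C_MAJOR)])
    velocity = max(1, value // 2)         # brightness 0-255 -> velocity 1-127
    return pitch, velocity

# A tiny 4x4 "frame" with one bright spot right of center:
frame = [[10, 10, 10, 10],
         [10, 10, 200, 10],
         [10, 10, 10, 10],
         [10, 10, 10, 10]]
print(frame_to_note(frame))  # -> (72, 100): C5, moderately loud
```

From there it’s just a matter of sending the (pitch, velocity) pair out as a MIDI note to Live, once per analysis frame.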
Lots more information on John Keston’s wonderful Audio Cookbook blog, which is fast becoming one of my favorite reads:
http://audiocookbook.org/category/gms/
And here’s a really lovely video that demonstrates what you can do with an unconventional source – in this case, a string of lights in a jar. Yes, in a way, it’s almost like having a very focused random generator, but I think there’s nothing wrong with that. There’s an almost analog pleasure in seeing the source, and using it to create music organically.
GMS: Chromatic Currents Part II from Unearthed Music on Vimeo.
I have to observe that, while this works reasonably well with MIDI, it shows why standardizing on networked communication, as OSC does, makes more sense. In a world of software, “controller” can really mean anything you like. Control is increasingly about software talking to software – including when devices are involved, since they generally have a software layer of their own. Also, because sometimes it’s easier to code this sort of thing in Processing than in Max, I can see some powerful uses for the Python-based Live API, which we expect to mature later this year. (Yes, the project called Live API seems to be in a holding pattern, but we may be able to work up a more complete, Live 8-ready alternative.)
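To make the MIDI-versus-OSC point concrete: a MIDI note-on is three fixed 7-bit fields, while an OSC message names its destination and carries arbitrary typed arguments, packed per the OSC 1.0 spec. Here’s a minimal sketch using only Python’s standard library; the `/gms/note` address is made up for illustration, not anything the GMS actually sends.

```python
import struct

def osc_message(address, *args):
    """Pack an OSC message per the OSC 1.0 spec: a padded address string,
    a padded type-tag string, then big-endian 32-bit arguments."""
    def pad(b):
        # OSC strings are null-terminated, then padded to a 4-byte boundary
        return b + b"\x00" * (4 - len(b) % 4)
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)
        else:
            tags += "s"
            payload += pad(a.encode())
    return pad(address.encode()) + pad(tags.encode()) + payload

# MIDI note-on, channel 1: three bytes, each value capped at 7 bits.
midi_note_on = bytes([0x90, 60, 100])  # status, pitch, velocity

# The OSC equivalent: a named address plus int, float, and string arguments.
packet = osc_message("/gms/note", 60, 0.787, "pad")
print(len(midi_note_on), len(packet))  # -> 3 32
```

The trade-off is obvious in the byte counts: MIDI is terse but rigid; OSC spends bytes on self-description, which is exactly what lets software talk to software without a fixed vocabulary.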
By the way, our goal is to make noisepages a platform and collection of tools for people doing this sort of work (or anything creative with music and motion), even if you host your blog elsewhere. Stay tuned for the details on that.