The Monome, a minimalist, elegant open source hardware controller conceived as an array of light-up buttons, has already made a big splash in the music world. But because it’s fundamentally a controller / LED display, it could be used for anything. And the Monome is now starting to realize that potential as an increasingly cool visual device.

First, here’s the Monome becoming virtually visual, overlaid with generative drawings. (If nothing else, this shows you the kind of love people feel for this open source, community-supported gadget.)


Monome Step : Generative Drawing from formalplay on Vimeo.

This is my first test of having the Monome control some generative drawing along with the audio, through this step sequencer I made in Flash, similar to the original 64Steps.
I especially like where the step sequencing begins to break down because of how deep the drawing gets.
It’s the same drawing engine I used to generate this season’s Thank You cards (although for the Monome version I removed the Type Flakes).
This season’s cards in collab with [M]:
ilikegravity.com/real/archives/2007/01/generative_gratitude_collaboration.php

This lovely work is the product of Detroit-area-based formalplay.

The Monome can also be a powerful tool for controlling visuals. Here it is manipulating a set of photos on a computer, also by formalplay. You can imagine the potential for live VJing – and, Microsoft Surface, eat your heart out!


Monome_NL_PhotoGallery.swf from formalplay on Vimeo.

But, wait, there’s more…

The Monome is, by its very design, essentially a display – an array of controllable lights, easily manipulated via software over USB. That’s prompted some to start using it as a display.
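If you’re curious what “easily manipulated via software” looks like in practice, here’s a rough sketch of treating the grid as a tiny framebuffer and turning it into LED messages. The OSC path follows the serialosc convention (/grid/led/set x y state); that path, the /monome prefix, and the 8x8 size are assumptions for illustration – older MonomeSerial setups used different message names.

```python
# Sketch: treat an 8x8 Monome grid as a framebuffer of 0/1 pixels,
# then serialize it to one LED-set message per lit pixel.
# The "/monome/grid/led/set" path is an assumption (serialosc-style).

def frame_to_messages(frame, prefix="/monome"):
    """Turn an 8x8 frame of 0/1 values into (path, args) tuples."""
    messages = []
    for y, row in enumerate(frame):
        for x, state in enumerate(row):
            if state:
                messages.append((prefix + "/grid/led/set", (x, y, 1)))
    return messages

# Light a diagonal line across the grid:
frame = [[1 if x == y else 0 for x in range(8)] for y in range(8)]
msgs = frame_to_messages(frame)
```

From there it’s just a matter of handing those messages to whatever OSC sender your environment provides.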


shado on the monome 128 from cassiel on Vimeo.

Nick Rothwell has written a complete “compositing and sprite” library called Shado. The basic idea is to facilitate easier scripting of visual feedback to the device. That could make the Monome part of your visuals, or might simply help you build more interesting interfaces. (It also means doing Tenori-On-style animated LED visuals would be easy.) It’s written in Java, but designed to be scripted using something like Groovy or Python; examples are in Python. (They use Jython inside Max/MSP, which I suppose goes well with the Max-centric patches for Monome, though there are ways of doing that without using quite so many tools if you’d rather do something simpler!)
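To make the “compositing and sprite” idea concrete, here’s a toy version of the concept in plain Python – emphatically not Shado’s actual API, just a sketch of the technique it implements: sprites are small bitmaps with a position, and each render pass composites them onto a blank grid frame.

```python
# Toy sprite compositor for an LED grid (hypothetical, not Shado's API).
# Sprites are small 0/1 bitmaps with an (x, y) position; render() lays
# them onto a blank frame with an OR, clipping anything off-grid.

class Sprite:
    def __init__(self, bitmap, x=0, y=0):
        self.bitmap = bitmap  # list of rows of 0/1
        self.x, self.y = x, y

def render(sprites, width=8, height=8):
    frame = [[0] * width for _ in range(height)]
    for s in sprites:
        for dy, row in enumerate(s.bitmap):
            for dx, v in enumerate(row):
                px, py = s.x + dx, s.y + dy
                if v and 0 <= px < width and 0 <= py < height:
                    frame[py][px] = 1  # OR-composite; sprites may overlap
    return frame

# Two 2x2 blocks, one hanging off the bottom-right corner (it gets clipped):
block = [[1, 1], [1, 1]]
frame = render([Sprite(block, 0, 0), Sprite(block, 7, 7)])
```

Animate by nudging each sprite’s position between render passes and resending the frame, and you’ve got the Tenori-On-style LED animation mentioned above.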

Shado Project Page @ loadbang.net, via Synthtopia, via AudioPornCentral

Here’s the Monome using a Max/MSP patch (not Shado) to display 8-bit Atari ASCII (ATASCII) characters. I could imagine using this to turn your performance rig into an interactive display when you’re not actively visualizing. As created by Beanbag Amerika.


ATASCII on the 40h. from Beanbag Amerika on Vimeo.
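The basic mechanics of a character display like that are simple enough to sketch. Here’s a hand-drawn “A” glyph (made up for illustration, not the real ATASCII ROM data) with each row packed into one byte, which suits the row-oriented LED messages some Monome protocols offer – the MSB-first bit order here is an assumption.

```python
# Sketch: an 8x8 glyph with each row packed into one byte, expanded
# into per-LED states for the grid. Hand-drawn "A", not real ATASCII data.

GLYPH_A = [  # 8 rows; bit 7 is the leftmost column (an assumption)
    0b00011000,
    0b00100100,
    0b01000010,
    0b01000010,
    0b01111110,
    0b01000010,
    0b01000010,
    0b00000000,
]

def row_bits(byte):
    """Expand one packed row byte into a list of 8 LED states, MSB first."""
    return [(byte >> (7 - x)) & 1 for x in range(8)]

grid = [row_bits(b) for b in GLYPH_A]
```

Swap in a full 128-character table and you have a scrolling-text-capable 8x8 display.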

In case you were wondering, it is possible to control LED intensity, as in this hack by Jul of Brussels:


Per led intensity: with sound from Jul on Vimeo.

That kind of visual feedback would, of course, also be very powerful for running live visuals on the monome.

Does this sort of thing make people happy? Yes. Yes, it does. Just watch Julian squeal with delight:


Julian Loves his Monome from simon on Vimeo.

We know quite a lot of you are doing wonderful things with monome and Processing, for one. I hope to look at more of those in the near future. If you’re out there doing something cool, do get documenting! (Obviously Vimeo is a popular place to stick those videos!)

Stay tuned for more.

Huge credit to Corporation, through whose FriendFeed I’ve been watching all this visual goodness!

http://friendfeed.com/corporation

More Examples – Live Visual Performance with Monome


video mlr clone demo from themoves on Vimeo.
Monome as Visual Controller [here on Create Digital Motion]
In the above example, Joshua Nugent tries translating the audio manipulation metaphors of the Max patch included with the Monome to visuals.

CDMotion contributor and visualist-at-large Momo the Monster has been perfecting his own rig, using the Monome to sequence audio and visuals, as he notes in comments.

From April 2007, his tutorial on Monome audiovisualism:

More details on this technique at VJ Kung Fu:

AV Sequencing with Live + VDMX + Monome

Here’s what some of the results are looking like currently. (It bears a slight, though more rhythmically coordinated, resemblance to what’s happening on some digital TV channels I was trying to pull down on UHF here in my apartment, with building obstructions. Erm, in other words, TV becomes somewhat less linear.)

Momo just needs a picture-in-picture feature so you can see the controller working in his documentation!


Ho-Lee-Cow from momo_the_monster on Vimeo.