If you’ve ever ordered sushi from one of those rotating belts, you’ll love this musical hack that takes it to an entirely new place. For Red Bull Music Academy (RBMA) Tokyo, Native Instruments engineers teamed up with Just Blaze and Tokimonsta to turn a sushi restaurant into a live electronic remix instrument.

And these aren’t tricks – slick as the music video at top may appear. They really did use a combination of cameras and software to turn colored plates into a working interface for music.

RBMA produced a video that shows some of what’s going on behind the scenes, below. But we weren’t satisfied until we knew the specifics – after all, we’d love to see more unique musical interfaces around the world. So, CDM talked to developers Bram de Jong and Michael Hlatky of Native Instruments to find out more. And you might learn something you can apply – or get a bit hungry for fish, depending. Michael answers.

The answers get tasty, indeed: we learn everything from how the camera “sees” the plates using their original software to how Maschine acts as a controller for sushi-triggered Ableton Live.

CDM: Why sushi?

Michael: The RBMA was hosted in Tokyo this year, and besides the lectures and concerts, they wanted to bring together Japanese culture and music technology in a surprising and unconventional way. So, obviously, they wanted to build a sushi drum machine. The end result was not exactly that, but it was a lot of fun anyway.

And why was it important to do this in a real-world version?

It was probably 50/50 “because we can” and “it’s no fun if we fake it”. RBMA wanted an actual sushi drum machine, and having seen the LEGO sequencer we built during this year’s MIDI/Music Hackdays, they asked us whether we thought it’d be possible to turn a sushi conveyor belt into a drum machine. They sent us a shaky video from the place, we rolled up our sleeves, and we wrote optical sushi plate and color detection in OpenCV with some Python glue code. Due to the slow speed of the belt, we suggested triggering loops quantized, rather than single shots as a drum machine would. They liked the idea, we got on a plane, and you can see the result in the videos.
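
If you want to picture what “quantized” triggering means here: a recognized plate doesn’t fire a sound the instant it passes the camera, it launches a loop on the next bar line. In the actual rig, Ableton Live’s launch quantization takes care of that; the little Python sketch below is only a toy illustration of the timing idea, not code from the project.

```python
# Toy illustration of quantized triggering: hold each trigger until the next
# bar boundary instead of firing it the moment a plate is recognized.
# (Live's launch quantization does this in the real setup.)
import time

BPM = 120.0
BEATS_PER_BAR = 4
BAR_SECONDS = 60.0 / BPM * BEATS_PER_BAR
SONG_START = time.monotonic()  # stand-in for the transport start time

def trigger_on_next_bar(fire):
    """Wait until the next bar line, then call the trigger function."""
    elapsed = time.monotonic() - SONG_START
    time.sleep(BAR_SECONDS - (elapsed % BAR_SECONDS))
    fire()

# e.g. trigger_on_next_bar(lambda: print("launch the red-plate loop"))
```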

What’s the camera?

It’s an AlliedVisionTec Mako that we brought along from Germany. The nice thing about this camera is that you can mount standard lenses, and it connects and is powered over Ethernet, so we could just roll up, run a long cable from the laptop to the camera, pick the lens with the right focus angle, and we were good to go.

OpenCV is the library, correct – what’s it running in, OpenFrameworks?

As the Mako camera comes with a really straightforward .NET SDK, we used the OpenCvSharp OpenCV wrapper for .NET and rewrote the Python code of our prototype in C# while we were in Japan.

The “tracking” screenshot is the actual software, or no?

No, the actual software probably didn’t have a CSI-enough interface for RBMA and the video producers 😉 but the real interface, which is pretty much just a display of the camera’s live video stream plus some debug color output, can be seen in the videos as well. It’s the laptop to Tokimonsta’s left. The whole software was written during two very jetlagged days, so we didn’t spend much time on making it pretty.

So, I see it uses color tracking – how does the color tracking not get confused by other colors apart from what’s on the plates themselves?

We use OpenCV’s Hough circle transformation to first detect sushi plate-sized circles in the video stream and then use a “donut-shape” mask to cut out only the colored outer bits of the plates for the color detection. We “calibrate” the color detection by running a couple of plates of each color underneath the camera and hand-tag them, then compute the average hue and saturation histograms for each tagged color. When the system runs, for each plate that it detects it computes the hue and saturation histograms and looks up the nearest neighbor to the average histograms generated during calibration – that should be the color. We could also have used a neural network or some other AI for the color detection, but for our purposes, this was more than enough.
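
If you’d like to tinker with something similar, here’s a minimal Python/OpenCV sketch of that pipeline – Hough circles to find plate-sized shapes, a “donut” mask to isolate the colored rim, and nearest-neighbor matching of hue/saturation histograms. The team’s prototype was Python + OpenCV (the shipped version was C#), but every function name, threshold, and radius below is our own illustrative guess, not their code.

```python
# Sketch of the plate detection described above. Radii, thresholds and bin
# counts are placeholders you would tune for your own camera and plates.
import cv2
import numpy as np

def detect_plates(frame_bgr, min_radius=80, max_radius=140):
    """Find sushi-plate-sized circles in a video frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)
    circles = cv2.HoughCircles(
        gray, cv2.HOUGH_GRADIENT, dp=1.5, minDist=2 * min_radius,
        param1=100, param2=60, minRadius=min_radius, maxRadius=max_radius)
    return [] if circles is None else np.round(circles[0]).astype(int)

def rim_histogram(frame_bgr, x, y, r, rim_width=20):
    """Hue/saturation histogram of the colored outer rim ("donut") only."""
    x, y, r = int(x), int(y), int(r)
    mask = np.zeros(frame_bgr.shape[:2], np.uint8)
    cv2.circle(mask, (x, y), r, 255, -1)            # the whole plate...
    cv2.circle(mask, (x, y), r - rim_width, 0, -1)  # ...minus the middle
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], mask, [30, 32], [0, 180, 0, 256])
    return cv2.normalize(hist, hist).flatten()

def classify(hist, calibrated):
    """Nearest neighbor against averaged histograms from hand-tagged plates."""
    # `calibrated` maps a color name to the averaged histogram computed during
    # calibration; the smallest Bhattacharyya distance wins.
    return min(calibrated, key=lambda color: cv2.compareHist(
        hist.astype(np.float32), calibrated[color].astype(np.float32),
        cv2.HISTCMP_BHATTACHARYYA))
```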

Did you learn anything from the hack day and working with LEGOs?

Yes, a lot, actually. First, neither of us had ever used the OpenCV library prior to building the LEGO sequencer, and we improved the color detection a lot compared to earlier versions. We were actually able to detect all 12 differently colored plates the sushi place had with uncanny accuracy, provided the Japanese lighting department didn’t move all their lights around too much after we had calibrated the system.

Looking at the music end of things – you’re using Device Racks? (Nice color coding!)

Both Just Blaze and Tokimonsta got a quick rundown of how the system was going to work a couple of days before the shoot, and then prepared a Live set. We then color-coded each scene with the color of the sushi plate that would trigger it.

How is the sequencing actually occurring? That is, how is the Maschine project itself set up, first? And then loops are being triggered in Live? (MIDI loops to Maschine? Just guessing here.)

Live does all the sequencing. We could’ve used Maschine as a drum sampler/synthesizer, or instrument/effect host in Live, but both Just Blaze and Tokimonsta had prepared their sets using only pre-produced loops, so we only used Maschine as a hardware controller for Live and to color-code the pad LEDs in the color of the last detected sushi plate.

— and how does the control work, the interface between the visual software and music? You mention the ability to run with one cable – what’s the connection?

Plain old MIDI. Every detected sushi plate triggers – depending on its color – a different note-on event, which we mapped to trigger scenes in Live. Between the laptops runs a single MIDI cable. The Maschine controller is also connected to the laptop running the plate recognition software, so we could set the pad LED colors via NI’s own HID protocol. The encoder’s MIDI events were simply forwarded to the laptop running Live over the same MIDI connection.
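
For a rough idea of what that MIDI glue can look like, here’s a short Python sketch using the mido library with an invented color-to-note table. The real software did this from C#/.NET, so treat every name, note number, and port choice below as an assumption rather than how NI actually wired it.

```python
# Hypothetical MIDI glue: one note-on per detected plate color, sent to the
# laptop running Live, which maps each note to a scene launch.
import mido

NOTE_FOR_COLOR = {"red": 36, "blue": 37, "green": 38, "yellow": 39}  # made-up mapping

out = mido.open_output()  # the MIDI port cabled to the Live laptop

def on_plate_detected(color):
    """Fire a note for the detected plate color; Live handles the quantized scene launch."""
    note = NOTE_FOR_COLOR.get(color)
    if note is None:
        return  # unknown / uncalibrated color: ignore it
    out.send(mido.Message('note_on', note=note, velocity=127, channel=0))
    out.send(mido.Message('note_off', note=note, velocity=0, channel=0))
```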

So if not Maschine, what is Tokimonsta actually controlling? (She’s really virtuosic – excellent.)

A Live Effect rack. But never before was one controlled better 😉

Can you think of some applications for this technology outside this video? What’s the benefit of thinking of these sorts of applications – and is it something Maschine users might also try, for instance?

At NI, we tend to think a lot about tangible and tactile interfaces for music. So the combination of sushi plates and a sequencer was not too far off 😉 This very technology could very well be reused in different contexts, be it an artistic installation or an actual music performance (which is what Just Blaze and Tokimonsta did; it was amazing to see how quickly they learned to interact intuitively with the system).

We’ll leave it to our readers to think of more there. Let us know!