Image courtesy Machine Orchestra.

Ed.: From modern electronica to South Asian Classical music, and from machines to humans, the Machine Orchestra is doing fascinating things with electrically powered, digitally manipulated, physically robotic music. Here’s more about what makes the ensemble tick.

It’s been nearly three months since I had the opportunity to guest blog here on CDM about a project I’m involved in called the Machine Orchestra. In Pt. 1 you were introduced to the directors behind the ensemble, Dr. Ajay Kapur and Michael Darling. Today, however, we look at the Machine Orchestra from the inside out and explore a few of the interfaces, artists, and technologies that make the show a reality.

From the very beginning, a primary goal of the Machine Orchestra has been to explore novel human-machine interaction: how could we exploit the strengths of our computers and robotic musicians (e.g., their extremely accurate metronomic precision) and, at the same time, perform with a high level of musical expression? As we attempted to answer these questions, we made several discoveries that helped us fulfill our desire to musically interact with both our robotic counterparts and our computers.

KarmetiK Machine Orchestra Live at REDCAT from KarmetiK on Vimeo.

The video above gives you a glimpse of the evening, which, to throw names around loosely, combined musical elements ranging from glitch to IDM, traditional North Indian Classical to Balinese gamelan, and post-rock to new music. Oh yeah, let’s not forget the human-interacting machines!

The Speakers.

In addition to exploring new ways to interact with our machines, and taking inspiration from the laptop ensembles that preceded us, we spent a great deal of time researching ways to reproduce our electronic sounds on stage, as well as experimenting with mains sound reinforcement. At every point in the show we tried to communicate a strong connection between the individual musicians themselves and the sounds they were creating. To achieve this, each musician had a hemispherical speaker system and/or a big-ass JBL sub for extended low-frequency reinforcement where needed. Additionally, a 5.1 mains mix was used to reinforce each musician’s location on stage and provide a cohesive house mix for the audience.

The Interfaces.

The diversity of the Machine Orchestra allowed for many types of novel physical interaction. The Machine Orchestra included the following custom interfaces and instruments: Arduinome, the SqueezeVox, ESitar (sitar hyper-instrument), MLGI (laser controller), Helio (touch-strip controller), EDilruba (Dilruba hyper-instrument), and the DigitalDoo. These interfaces were used to control software instruments on each musician’s computer, and also to remotely control the three robots via an OSC/MIDI network designed specifically for the Orchestra.
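To make that a little more concrete, here is a rough sketch, in ChucK, of what one hop of that network might look like: a controller gesture being forwarded to a robot as an OSC message. The address, IP, port, and arguments are purely illustrative; they are not the Orchestra's actual protocol.

```chuck
// Illustrative only: forward one controller gesture to a robot over OSC.
// The "/robot/strike" address, IP, and port are assumptions, not the
// Machine Orchestra's actual message format.
OscSend xmit;
xmit.setHost( "192.168.1.50", 6449 );   // robot controller's address (assumed)

// pretend a sensor just produced a strike gesture
Math.random2( 40, 127 ) => int velocity;

// send one strike message: which arm, and how hard
xmit.startMsg( "/robot/strike, i i" );
2 => xmit.addInt;         // arm/beater index
velocity => xmit.addInt;  // strike velocity
```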

Interaction and Sync.

During our work with the musical robots, interesting challenges emerged that called for creative use of our controllers and technology. One of the most difficult was maintaining stable “sync” between musicians, computers, and the robots. As we’ve briefly discussed in other articles and threads here on CDM, and recently at the CDM-mediated NAMM After-Hours Party panel discussion, stable sync between machines is an extremely complex issue, both in terms of technological implementation and its actual uses. When controlling multiple mechanical instruments on stage, and communicating between ten electronic musicians, clock is much more than a way to make up for inaccurate timing; it is the essential foundation for fast and accurate communication between robots and performers. We needed a system that allowed complex MIDI routing over a network, sent clock to all performers so that tempo changes could be made dynamically and on the fly, and let performers exit or enter the sync stream at any time. We came up with the following solution.

In the Machine Orchestra, all electronic musicians (clients) receive sync from a hub/switch connected to a dedicated server machine via Ethernet. The server runs a custom application we developed in ChucK, building off the framework developed for PLOrk. Our additions implement a few extra features for interfacing with the robots, and address some of our stability concerns (e.g., the case where a musician loses sync in the middle of the performance).
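The heart of the server is conceptually very simple: take each clock tick coming from the master and fan it out to every client. The sketch below captures that relay idea in ChucK; the addresses, port, and OSC message format are stand-ins, and the real server also handled the MIDI routing and robot interfacing described above.

```chuck
// Clock-relay sketch (not the actual Orchestra server): listen for tick
// messages from the master clock and re-broadcast them to every client.
// IPs, port, and the "/clock/tick" address are assumptions.

// client addresses, hard-coded here; the real system resolved these dynamically
[ "192.168.1.101", "192.168.1.102", "192.168.1.103" ] @=> string clients[];
OscSend out[3];
for( 0 => int i; i < clients.size(); i++ )
{
    out[i].setHost( clients[i], 6449 );
}

// listen for ticks from the master clock
OscRecv recv;
6449 => recv.port;
recv.listen();
recv.event( "/clock/tick, i" ) @=> OscEvent @ tick;

while( true )
{
    tick => now;                      // wait for the next tick from the master
    while( tick.nextMsg() != 0 )
    {
        tick.getInt() => int count;   // running tick count

        // fan the tick out to every client
        for( 0 => int i; i < clients.size(); i++ )
        {
            out[i].startMsg( "/clock/tick, i" );
            count => out[i].addInt;
        }
    }
}
```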

We discovered that ChucK implements MIDI using the RtMidi library, which by default ignores MIDI clock. To enable MIDI sync in ChucK, the server and client applications are bundled with a custom ChucK binary compiled with MIDI clock enabled. Additionally, a MIDI sync client application should configure itself automatically (assigning an IP address, etc.) and connect to the MIDI server; to facilitate this, we wrote a custom script that dynamically resolves a local IP for the client ChucK applications. Finally, one musician is set as the master clock, sending clock to the server, and all other clients are slaved to this clock.
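On the master machine, the job is just as small: watch the local MIDI port for realtime bytes and pass the ticks on to the server. Here's a sketch of that idea; it assumes a ChucK binary with MIDI clock enabled (as described above), and the server address and OSC messages are again illustrative.

```chuck
// Master-clock sketch: read MIDI realtime bytes from the local port and
// forward ticks to the server. Requires a ChucK build with MIDI clock
// enabled; the server IP/port and OSC addresses are assumptions.
MidiIn min;
MidiMsg msg;
if( !min.open( 0 ) ) me.exit();          // device 0: the master's clock source (assumed)

OscSend server;
server.setHost( "192.168.1.100", 6449 ); // sync server (assumed)

0 => int ticks;
while( true )
{
    min => now;                          // wait for incoming MIDI
    while( min.recv( msg ) )
    {
        if( msg.data1 == 0xFA )          // MIDI start: reset the count
        {
            0 => ticks;
            server.startMsg( "/clock/start, i" );
            1 => server.addInt;
        }
        else if( msg.data1 == 0xF8 )     // MIDI clock: 24 ticks per quarter note
        {
            ticks++;
            server.startMsg( "/clock/tick, i" );
            ticks => server.addInt;
        }
    }
}
```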

Typically, if a computer loses sync, the master clock needs to stop and restart in order to transmit the initial MIDI clock start byte and allow that machine back into the sync stream. In practice, this would mean that each time a musician or instrument dropped (or exited) sync, all musicians would have to be stopped and restarted by the master clock to get that one machine back in sync. Because of the number of musicians and robots receiving clock during the show, this simply was not an acceptable solution. Instead, we implemented a keyboard command (‘G’ for “Go!”) that each client could press manually if they lost sync. Although not a very complicated solution (it simply forces a stop and start message from the client), it was very effective in allowing a performer to jump back into the sync stream.
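In ChucK terms, the "Go!" key can be as simple as the sketch below: catch the keystroke and push a MIDI stop followed by a start to the local output, so the client falls back in line on the next tick. The real client did its resetting inside the sync application itself, so treat this only as an illustration of the stop/start trick; the choice of MIDI destination is an assumption.

```chuck
// "Go!" resync sketch: on 'G', force a MIDI stop then start so this
// client rejoins the sync stream. The local MIDI destination (a virtual
// port) is an assumption.
KBHit kb;
MidiOut mout;
MidiMsg msg;
if( !mout.open( 0 ) ) me.exit();   // local MIDI destination (assumed)

while( true )
{
    kb => now;                     // wait for a keystroke
    while( kb.more() )
    {
        kb.getchar() => int c;
        if( c == 71 || c == 103 )  // ASCII 'G' or 'g'
        {
            // stop...
            0xFC => msg.data1; 0 => msg.data2; 0 => msg.data3;
            mout.send( msg );
            // ...then start, so playback re-enters on the next clock tick
            0xFA => msg.data1;
            mout.send( msg );
            <<< "Go! -- rejoining sync stream" >>>;
        }
    }
}
```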

With stable sync, and clock communication between all musicians and machines, we were finally ready to explore the different ways to use our custom controllers.

In the piece Voices, various controllers were used to explore vocal synthesis techniques and granular control of vocal sounds. Meason Wiley used his Multi-Laser Gestural Controller (MLGI) to drive a custom Reaktor ensemble with in-air gestures, while Jim Murphy used his new touch-sensor-based controller, the Helio (akin to a vertically controlled Stribe), to control a custom Reaktor granular synthesis instrument he developed with Charlie Burgin. Similarly, Ajay Kapur controlled a granular ChucK patch using his ESitar's extensive array of sensors (triple-axis accelerometers, thumb-pressure sensors, and fret sensors). Interestingly, each interface's design imposed a very different use of the granular patch that Charlie, Jim, and Ajay were all using, resulting in dramatically different effects.
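For readers who haven't built one, the skeleton of a granular patch is small; the expressive part is deciding which sensors push which parameters. Below is a toy example in ChucK (it is not the Voices instrument): position and grain length are left as the two "performable" knobs that an accelerometer, fret sensor, or touch strip might drive.

```chuck
// Toy granular sketch (not the actual Voices patch): fire short grains
// from a sample, with position and grain length as the parameters a
// sensor could control. "special:dope" is a sample bundled with ChucK.
SndBuf buf => ADSR env => dac;
"special:dope" => buf.read;
env.set( 2::ms, 5::ms, 1.0, 10::ms );

0.5 => float position;    // 0..1 position in the file (a sensor could move this)
60::ms => dur grainLen;   // grain length (another sensor could scale this)

while( true )
{
    // scatter grain start points slightly around the current position
    position + Math.random2f( -0.02, 0.02 ) => float p;
    ( p * (buf.samples() - 1) ) $ int => buf.pos;

    // slight pitch scatter per grain
    Math.random2f( 0.9, 1.1 ) => buf.rate;

    // play one grain
    env.keyOn();
    grainLen => now;
    env.keyOff();
    10::ms => now;
}
```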

Other (personal) highlights included being able to work with the visionary electronic and interface pioneers Perry Cook and Curtis Bahn. The vast assortment of interfaces (SqueezeVox, DigitalDoo, EDilruba, etc.) and experience they brought to the show was invaluable. In Voices, Perry used the SqueezeVox to control synthesis models (written in ChucK) via an assortment of controls, including tilt/acceleration sensors, air-pressure sensors in place of an accordion's reeds, force sensors, and linear/rotary potentiometers: forty-one buttons of pure vocal-synthesizing chaos. Throughout the performance, Curtis' use of the EDilruba beautifully translated human gesture into musical control via accelerometers and pressure sensors on the instrument and bow.
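Perry's synthesis models are his own, but to give a flavor of what "a sensor driving a ChucK vocal model" can mean, here is a minimal stand-in that maps a single continuous value (imagine bellows pressure) onto the loudness and vibrato of the STK VoicForm unit generator; every mapping choice here is an assumption made for illustration.

```chuck
// Stand-in sketch (not Perry's SqueezeVox code): one continuous sensor
// value shapes loudness and vibrato of an STK VoicForm vocal model.
VoicForm voc => dac;
"ahh" => voc.phoneme;
220.0 => voc.freq;
voc.speak();

// map a normalized 0..1 "pressure" value to the voice (assumed mapping)
fun void applyPressure( float pressure )
{
    pressure => voc.loudness;
    pressure * 0.2 => voc.vibratoGain;
}

// stand-in for the sensor stream: sweep the pressure up and down
while( true )
{
    for( 0 => int i; i <= 100; i++ )
    {
        applyPressure( i / 100.0 );
        10::ms => now;
    }
    for( 100 => int i; i >= 0; i-- )
    {
        applyPressure( i / 100.0 );
        10::ms => now;
    }
}
```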

Due to its strength as a reconfigurable device, the Arduinome proved to be a particularly well-suited interface for the Machine Orchestra. One of the ways we used our Arduinomes, for a robot-centric piece called Mechanique, was by setting up 64 MIDI clips in Ableton and then MIDI-learning them to individual buttons on our Arduinomes (we MIDI-mapped our Arduinomes using a Reaktor patch we made called nomeState). Each MIDI clip was scored with various sequences/patterns, complete with velocities. Additionally, each clip was paired with MIDI clips sent back to ArduinomeSerial for light animations on the Arduinomes. Columns on the Arduinomes represented patterns designated for individual arms and beaters of the three robots. By combining different patterns, it was possible to play the robotic instruments in real time, from simple one-shot triggers to complex synced patterns. Completely human controlled, the robots could accurately respond with extremely difficult and complex rhythms, while the clock kept them precisely synchronized. The robots not only provided traditional drum sounds, but also effects that would be extremely hard for even the best human musicians to achieve, e.g., extremely tight (and fast!) rolls, polyrhythms, and syncopation.
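The button-to-clip mapping itself is nothing exotic. In the show it lived in the nomeState Reaktor patch, but the sketch below expresses the same idea in ChucK: turn each grid press coming out of ArduinomeSerial into a unique MIDI note that a clip slot has learned. The OSC address, port, and note numbering are assumptions.

```chuck
// Grid-to-clip sketch (the show used the nomeState Reaktor patch; this
// is just the same mapping idea). The "/40h/press" address, port, and
// note layout are assumptions.
OscRecv recv;
8000 => recv.port;                  // port ArduinomeSerial sends to (assumed)
recv.listen();
recv.event( "/40h/press, i i i" ) @=> OscEvent @ press;

MidiOut mout;
MidiMsg msg;
if( !mout.open( 0 ) ) me.exit();    // virtual MIDI port that Ableton has learned (assumed)

while( true )
{
    press => now;
    while( press.nextMsg() != 0 )
    {
        press.getInt() => int x;       // column: which robot arm/beater
        press.getInt() => int y;       // row: which pattern for that arm
        press.getInt() => int state;   // 1 = pressed, 0 = released

        // one unique note per button: an 8x8 grid maps to notes 0..63
        y * 8 + x => int note;
        if( state == 1 ) 0x90 => msg.data1;   // note on, channel 1
        else 0x80 => msg.data1;               // note off, channel 1
        note => msg.data2;
        100 => msg.data3;
        mout.send( msg );
    }
}
```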

The Arduinomes were also used in many other ways. For example, we mapped the buttons through Ableton's Scale MIDI effect and used the Arduinome as a pitch-based controller for playing soft synths live; the matrix layout allowed for interesting cross-relationships between the intervallic layouts of the different scales.
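If you want to experiment with the same idea away from Ableton, the layout math is tiny. Here's a sketch that snaps an 8x8 grid to a minor-pentatonic scale, with each row offset by a few scale degrees; the scale, root, and offsets are just one of many possible choices.

```chuck
// Pitch-layout sketch (in the show, Ableton's Scale effect did the
// quantizing): every button snaps to a minor-pentatonic degree, and
// each row is offset by three degrees. Scale, root, and offsets are
// assumptions.
[ 0, 3, 5, 7, 10 ] @=> int scale[];
48 => int root;   // C3

fun int buttonToNote( int x, int y )
{
    // a row offset of three scale degrees gives string-instrument-like
    // cross relationships between rows
    y * 3 + x => int step;
    return root + ( step / scale.size() ) * 12 + scale[step % scale.size()];
}

// print the resulting layout for a few rows
for( 0 => int y; y < 4; y++ )
{
    for( 0 => int x; x < 8; x++ )
    {
        <<< "button", x, y, "-> MIDI note", buttonToNote( x, y ) >>>;
    }
}
```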

Each piece in the show called for extremely different methods of interaction between musician and machine. It would be impossible for me to detail every way the instruments were used to control the musical robotics live, or every piece of software involved (e.g., Ableton, ChucK, Reaktor, and Max/MSP). We would, however, like to use this opportunity to open up a discussion on the future of laptop ensembles, and to promote the sharing of ideas gained from performing with other laptop musicians, interfaces, and/or musical robotics. We gratefully thank everyone who came out to support the Machine Orchestra and made it a sell-out debut, as well as those who shared links and spread the word via Twitter, Facebook, email, and word of mouth. For those of you who were unable to make it out, have no fear: the Machine will come to you soon!

Group Shot

The KarmetiK Machine Orchestra is:
Music Director, Co-Creator: Ajay Kapur
Production Director, Co-Creator: Michael Darling
Guest Electronic Artists: Curtis Bahn & Perry Cook
World Music Performers: Ustad Aashish Khan, Pak Djoko Walujo, & I Nyoman Wenten
Multimedia Performer-Composers: Charlie Burgin, Dimitri Diakopoulos, Jordan Hochenbaum, Jim Murphy, Owen Vallis, Meason Wiley, and Tyler Yamin
Visual Design: Jeremiah Thies
Lighting Design: Tiffany Williams
Dancers: Raakhi Sinha, Kieran Heralall
Sound Design: John Baffa
Production: Lauren Pratt