Cigarette lighters in the air may have given way to smartphones – but it’s hardly fitting at a concert to watch everyone checking their SMS inbox. In a new twist, Dan Deacon concerts use all that computational power in people’s pockets to make these devices part of the show, refocusing fans on the music.
The work of Wham City Apps and developer Keith Lea, the Dan Deacon app synchronizes sound and light to make a sea of phones into objects of wonderment rather than business machines or Facebook hubs. Away from the show, the app doubles as a musical instrument. Under the hood, it was prototyped with Peter Brinkmann’s open source libpd library, a means of distributing Pure Data as a DSP library, in which I was honored to have a small part.
This isn’t the first time apps have extended the live experience. In 2010, the iOS app SYNK “augmented” Richie Hawtin’s Plastikman tour with layers of video and information. But that app, developed by RJ “hexler” Fischer (of TouchOSC fame) and Bryan McDade (Minus), required a WiFi network. The Dan Deacon app takes a lower-fidelity approach – and thus one that promises fewer technical hurdles. I asked Keith to tell us more about what’s going on here.
PK: So, it doesn’t use data or wifi — how does it respond to the event?
Keith: The phone acts on data that has been encoded into sound waves played from the stage – much like dial-up modems, or even Morse code.
The current implementation is a bit complex, but the idea is simple. In our early prototypes, we just used Pd to generate a randomized sequence of tones (sine waves) in a certain order. Then, using libpd and some Objective-C, we made the app continuously perform an FFT and keep track of which tones it heard. When it had heard at least 60 of the 80 tones, it would activate the light show.
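To give a sense of what that prototype approach might look like outside of Pd – purely a sketch, not the app’s actual code – here is a C++ tone tracker that measures the energy at each target frequency with the Goertzel algorithm and fires once enough distinct tones have been heard. The frequency list, block size, and thresholds are placeholders:

```cpp
// Sketch: detect which of N known tones are present in incoming audio
// and trigger once "enough" distinct tones have been heard.
// Illustrative parameters only; not the Dan Deacon app's actual code.
#include <cmath>
#include <cstddef>
#include <set>
#include <vector>

// Goertzel: energy at a single target frequency over one block of samples.
double goertzelPower(const float* block, std::size_t n,
                     double targetHz, double sampleRate) {
    const double kPi = 3.14159265358979323846;
    const double w = 2.0 * kPi * targetHz / sampleRate;
    const double coeff = 2.0 * std::cos(w);
    double s0 = 0.0, s1 = 0.0, s2 = 0.0;
    for (std::size_t i = 0; i < n; ++i) {
        s0 = block[i] + coeff * s1 - s2;
        s2 = s1;
        s1 = s0;
    }
    return s1 * s1 + s2 * s2 - coeff * s1 * s2;
}

class ToneTracker {
public:
    ToneTracker(std::vector<double> tonesHz, double sampleRate,
                std::size_t requiredCount, double powerThreshold)
        : tones_(std::move(tonesHz)), rate_(sampleRate),
          required_(requiredCount), threshold_(powerThreshold) {}

    // Feed one block of mono samples; returns true once the show should start.
    bool process(const float* block, std::size_t n) {
        for (std::size_t i = 0; i < tones_.size(); ++i) {
            if (goertzelPower(block, n, tones_[i], rate_) > threshold_)
                heard_.insert(i);   // remember every distinct tone we've heard
        }
        return heard_.size() >= required_;
    }

private:
    std::vector<double> tones_;
    double rate_;
    std::size_t required_;
    double threshold_;
    std::set<std::size_t> heard_;
};
```

With the numbers Keith mentions, you would construct the tracker with 80 target tones and a required count of 60.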
Since those prototypes, however, the protocol has changed a lot. Now the “control pad” software – the program Dan uses on stage – generates about 80 sine waves on top of each other. On the phone, some complicated C++ code performs an FFT and decodes the sine wave patterns into data. The data contains short, simple commands like “make the screen blue” or “play the light show for True Thrush.”
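The encoding side can be imagined along the same lines. Here is a hedged sketch – a guess at the general idea, not the actual control-pad software – in which each carrier frequency stands for one bit of a command word, and the stage simply plays the sum of the “on” bits (the real system reportedly uses around 80 carriers; a 32-bit word keeps this sketch simple):

```cpp
// Sketch: encode a small command ID as a sum of sine waves, one frequency
// slot per bit. Frequencies, spacing, and slot count are illustrative only.
#include <cmath>
#include <cstdint>
#include <vector>

std::vector<float> encodeCommand(std::uint32_t commandBits,
                                 int numSlots,      // bit slots, up to 32 here
                                 double baseHz,     // first carrier frequency
                                 double spacingHz,  // gap between carriers
                                 double sampleRate,
                                 double durationSec) {
    const double kPi = 3.14159265358979323846;
    const std::size_t numSamples =
        static_cast<std::size_t>(durationSec * sampleRate);
    std::vector<float> out(numSamples, 0.0f);

    for (int bit = 0; bit < numSlots; ++bit) {
        if (!(commandBits & (1u << bit))) continue;  // only "on" bits get a tone
        const double freq = baseHz + bit * spacingHz;
        for (std::size_t i = 0; i < numSamples; ++i) {
            out[i] += static_cast<float>(
                std::sin(2.0 * kPi * freq * i / sampleRate));
        }
    }
    // Scale down so the summed tones can never clip.
    for (float& s : out) s /= static_cast<float>(numSlots);
    return out;
}

// Example: a hypothetical "make the screen blue" command, half a second long.
// auto buffer = encodeCommand(0x02, 16, 1000.0, 200.0, 44100.0, 0.5);
```

Decoding is the mirror image: measure the energy at each carrier (as in the Goertzel sketch above) and set the corresponding bit whenever it crosses a threshold.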
What happens if you launch in concert mode when you’re … not at a concert?
Well, right now, if you’re not at a Dan Deacon show, nothing happens at all. We are looking into ways to take advantage of this technology in other contexts, though.
Any description of the instrument?
The instrument, developed by Patrick McMinn, loops a randomly generated sequence of notes, and you, the user, control the timbre, key/mode, and tempo. You affect the sound in real time by tapping different notes and by tilting or rotating your phone in various directions.
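The real instrument lives in a Pd patch, but for the curious, here is a rough, hypothetical sketch of the kind of sequencing logic Patrick describes – a looping random sequence quantized to a key/mode, with tilt mapped to tempo. The scale, ranges, and mappings below are invented for illustration:

```cpp
// Sketch: a looping, randomly generated note sequence quantized to a mode,
// with tempo controlled by a tilt value. Illustrative only; the actual
// instrument is implemented as a Pure Data patch.
#include <cstddef>
#include <random>
#include <vector>

class TiltSequencer {
public:
    TiltSequencer(int rootMidiNote, std::vector<int> modeIntervals,
                  std::size_t length, unsigned seed)
        : root_(rootMidiNote), mode_(std::move(modeIntervals)) {
        std::mt19937 rng(seed);
        // Pick scale degrees spanning two octaves of the chosen mode.
        std::uniform_int_distribution<int> pick(
            0, 2 * static_cast<int>(mode_.size()) - 1);
        for (std::size_t i = 0; i < length; ++i) steps_.push_back(pick(rng));
    }

    // Map a scale degree (possibly beyond one octave) onto a MIDI note.
    int noteAt(std::size_t step) const {
        const int d = steps_[step % steps_.size()];
        const int octave = d / static_cast<int>(mode_.size());
        const int degree = d % static_cast<int>(mode_.size());
        return root_ + 12 * octave + mode_[degree];
    }

    // Tilt in [-1, 1] scales tempo between 60 and 180 BPM.
    static double tempoFromTilt(double tilt) {
        const double minBpm = 60.0, maxBpm = 180.0;
        const double t = (tilt + 1.0) * 0.5;  // map to [0, 1]
        return minBpm + t * (maxBpm - minBpm);
    }

private:
    int root_;
    std::vector<int> mode_;
    std::vector<int> steps_;
};

// Example: an eight-step loop in C minor pentatonic.
// TiltSequencer seq(60, {0, 3, 5, 7, 10}, 8, 42);
// int firstNote = seq.noteAt(0);
// double bpm = TiltSequencer::tempoFromTilt(0.25);
```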
How was it to use libpd?
I loved libpd because it let us experiment and be creative in designing the app and the communication protocol. Someone would have an idea at a meeting, and within minutes we had a working demo, which would have taken me hours to code in C++.
I did eventually port the audio encoding and decoding functions to C++, for reasons of performance and complexity, but as I said, we couldn’t have made such rapid progress early on without Pd.
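For readers who haven’t embedded Pd before, here is roughly what driving a patch through libpd’s C API looks like – a bare-bones sketch with a placeholder patch name; a real iOS app would route audio through Core Audio or libpd’s Objective-C wrapper rather than a main() like this:

```cpp
// Sketch: loading a Pd patch and pulling one block of audio through
// libpd's C API. Patch name and channel counts are placeholders.
#include <vector>
#include "z_libpd.h"

int main() {
    const int sampleRate = 44100;
    const int inChannels = 1;   // microphone input for the decoder
    const int outChannels = 2;  // stereo output for the instrument

    libpd_init();
    libpd_init_audio(inChannels, outChannels, sampleRate);

    // Turn DSP on: the equivalent of sending [; pd dsp 1( inside Pd.
    libpd_start_message(1);
    libpd_add_float(1.0f);
    libpd_finish_message("pd", "dsp");

    // Load the patch (hypothetical file name).
    void* patch = libpd_openfile("decoder.pd", ".");
    if (!patch) return 1;

    // Process one block: libpd works in ticks of 64 frames per channel.
    const int ticks = 1;
    std::vector<float> inBuf(ticks * 64 * inChannels, 0.0f);
    std::vector<float> outBuf(ticks * 64 * outChannels, 0.0f);
    libpd_process_float(ticks, inBuf.data(), outBuf.data());

    libpd_closefile(patch);
    return 0;
}
```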
Patrick, a computer music student at Peabody, seemed to really enjoy programming the instrument in Pd.
Oh, and any other concert footage?
Not yet – it seems like everyone with a smartphone has the app open and isn’t taking pictures or video! There is some fan footage on YouTube:
Thanks, Keith! If you make it to a Dan Deacon concert, readers, let us know how it works – or if you hack into the “concert mode” at home.
INTRODUCING THE DAN DEACON SMARTPHONE APP [Domino Records USA]
http://keithlea.com/project/dan-deacon-app/