Game Audio: Selected Student Works from Matt Ganucheau on Vimeo.

In the early days of game sound, musical soundtracks were largely adaptive and interactive, fused with the sound effects of the game and the logic of gameplay. Scores were less Alfred Newman or John Williams, more Spike Jones. Today, game music has the potential to reinvent composition itself, to help us reimagine what makes a musical score as on-screen user action drives musical ideas. But with a few notable exceptions, most modern titles have opted for big, Hollywood-style soundtracks – and the linear composition that goes with them, as though someone just took a film score CD and hit play.

It’s one thing to talk about that in theory. Better yet: give it a shot yourself. So why not teach game music as its own discipline?

Matt Ganucheau, a composer, sound designer, and interactive developer/artist, is teaching just that, working with students at Expression College in Emeryville, California. The accelerated course works with the elegant Unity game engine and a clone of the legendary Space Invaders arcade game, adding music built in Max/MSP. If Max seems an unlikely choice, its open source cousin Pure Data (Pd) is actually integrated with the game engine for Electronic Arts’ Spore, with music by Brian Eno working with EA’s Kent Jolly and contributor Aaron McLeran. So, this could be the wave of the future. The first problem: figuring out how to actually compose.

The results are astonishing, given that the students were just learning Max and had extremely limited time. I asked Matt to write up for CDM how the coursework evolved; he shares his process and what he learned as a teacher. We’re also working on open-sourcing the coursework content and the patches, which we’ll soon provide for both Pd and Max/MSP. I’m doing some work on the game side so that you can play with game mechanics in Processing. Stay tuned for more on that.

We spoke a bit about this process – and interactive music in general – with Xeni Jardin and Boing Boing in their Game Developers Conference livecast a week ago Friday. Edited video of that is coming soon.

Here’s Matt on the coursework itself:

When faced with the challenge of updating our Game Audio course at Expression College, we wanted to create a course that reflected the growing interest in adaptive and interactive audio in the current game industry. To do this successfully, we had to make sure our students understood how audio engines have evolved over the past eight years. Since our terms are only five weeks long and our student body is made up of non-programmers, this seemed like quite a daunting task. But having carefully fine-tuned the details, we feel we have a good recipe.

First, we begin by having the students build simple environments and place audio emitters inside the Unreal 2k environment. This shows them the restrictions of audio functionality in a proprietary engine. After a few labs with Unreal, the students are introduced to the concept of a middleware platform, using Audiokinetic’s Wwise connected to the game Cube. Here, they are able to explore more interactive audio, such as real-time control parameters and dynamic music changes. Finally, the students are introduced to Max/MSP. Led through labs covering synthesis, sampling, basic programming concepts, and sound design, they gain all of the information needed to create their own generative audio engine inside Max/MSP. By hacking away at a recreation of Space Invaders posted to the Unity3d forums (thank you, Eric Haines), we are able to pipe all of the real-time game data to Max/MSP via UDP (with help from Bjerre).
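To make the plumbing concrete, here is a minimal sketch of the same idea in Processing (per the Processing port mentioned earlier) rather than Unity: game code broadcasting its state over UDP for a Max/MSP or Pd patch to pick up. The oscP5 library, the port numbers, and the /invaders/proximity address are placeholders of mine, not the course’s actual Unity2Max protocol.

// Minimal Processing sketch: broadcast game state to Max/MSP (or Pd) over UDP as OSC.
// Assumes the oscP5 library is installed. Port 7400 and the /invaders/proximity
// address are invented for this example; match them to whatever your patch expects
// (e.g., [udpreceive 7400] on the Max side).
import oscP5.*;
import netP5.*;

OscP5 osc;
NetAddress maxPatch;
float invaderProximity = 0.0;  // 0 = invaders far away, 1 = invaders at the bunkers

void setup() {
  osc = new OscP5(this, 12000);                  // local listening port (unused here)
  maxPatch = new NetAddress("127.0.0.1", 7400);  // where the patch is listening
}

void draw() {
  // Stand-in for real game logic: the invaders creep closer every frame.
  invaderProximity = min(1.0, invaderProximity + 0.0005);

  OscMessage msg = new OscMessage("/invaders/proximity");
  msg.add(invaderProximity);
  osc.send(msg, maxPatch);
}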

Source patches coming soon.

Inside Max/MSP, the game data is received in our Unity2Max patch. With this initial infrastructure in place, the students are able to use the real-time events to remix the classic arcade game with their own audio engine. Piece by piece, we recreate the original audio engine through tasks such as creating the alternating pitched footsteps for the invaders, building a UFO noise from a sine wave and a flanger, and mapping the invaders’ proximity to the music’s speed. For their final project, the students are allowed to use these tools to go in any stylistic direction they wish, as long as the music is adaptive.
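If you have never patched a mapping like that, the sketch below expresses the same two ideas, proximity driving tempo and alternating footstep pitches, as plain Processing code instead of Max objects. Every number in it is an illustrative guess of mine, not a value from the class.

// Illustrative only: the proximity-to-tempo and alternating-footstep mappings
// described above, written as Processing functions. The coursework builds these
// with Max objects; all numbers here are invented for the example.

float stepIntervalMs(float proximity) {
  // Closer invaders mean a shorter gap between footsteps, so the music speeds up:
  // 800 ms between steps when the invaders are far away, 120 ms when they are close.
  return lerp(800, 120, constrain(proximity, 0, 1));
}

int footstepPitch(int stepCount) {
  // Alternate between two MIDI pitches on successive invader steps,
  // echoing the original game's alternating "footstep" bass notes.
  return (stepCount % 2 == 0) ? 38 : 36;
}

void setup() {
  // Example: halfway down the screen, footsteps land every 460 ms.
  println(stepIntervalMs(0.5));   // prints 460.0
  println(footstepPitch(3));      // prints 36
}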

We did not give students access to all of the game events, because we didn’t want them to become overwhelmed with options. To our surprise, these restrictions had the opposite effect. Students were frustrated by not having a message saying that the “UFO was destroyed”, so they hacked their own way to find out, deducing it from the change in points. In another example, a student wanted the missile explosion sound to play when a bunker was hit, so he placed a threshold on the missile’s flight time to detect the hit. Hacks like these began to appear all over the students’ projects. These may seem like basic programming techniques to some, but seeing this kind of development come from a class of audio engineers is quite amazing.
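As a concrete illustration of those hacks, here is the same deduction logic written out as a small Processing sketch. The point and timing thresholds and the sound-trigger stubs are hypothetical; in the class, this logic lived inside the students’ Max patches.

// Hypothetical sketch of the students' event-deduction hacks, in plain code.
// The thresholds and the trigger stubs below are invented for illustration.
int lastScore = 0;

void onScoreChanged(int newScore) {
  int delta = newScore - lastScore;
  lastScore = newScore;
  // No "UFO destroyed" event is exposed, but only the UFO is worth a large
  // point jump, so a big delta implies the UFO was just shot down.
  if (delta >= 50) {
    triggerUfoExplosion();
  }
}

void onMissileEnded(float flightTimeMs) {
  // A missile that ends its flight early never reached the top of the screen,
  // so it must have hit a bunker on the way up.
  if (flightTimeMs < 400) {
    triggerBunkerHit();
  }
}

// Stand-in sound triggers; a real patch would play the corresponding samples.
void triggerUfoExplosion() { println("UFO destroyed"); }
void triggerBunkerHit()    { println("bunker hit"); }

void setup() {
  onScoreChanged(30);    // an invader: no UFO sound
  onScoreChanged(180);   // 150-point jump: treated as the UFO
  onMissileEnded(250);   // short flight: bunker hit
}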

Although this new course design has only been active for four months, we have seen a dramatic increase in interest from our students. Once a cultural standard like Space Invaders is deconstructed, the students become extremely excited to explore a new direction for the classic game. It still amazes me just how far students can go with only three weeks of Max/MSP instruction.

Unity Game Engine (recently updated to 2.5, and now available on both Mac and Windows)

Cycling ’74, Makers of Max/MSP

Expression College for Digital Arts

And the bits for this game, specifically:

Unity Invaders on the Unity Community Forum (the Space Invaders game used in the class)
Unity Invaders Site with downloadable, playable versions of the game
Discussion of UDP communication between Max and Unity, with the patch solution by Bjerre

Also, don’t miss Andy Farnell’s fantastic Pd-based book Designing Sound (well worth a read for Max users, as well). It’s an entire textbook built on the idea of doing interactive sound design in Pd, useful for games but also for other live and interactive sound – and while the emphasis is sound design rather than music per se, it remains a great reference on learning to patch and learning about audio synthesis.
