“It’s almost like there’s an echo of the original music in the space.”
After years of music being centered on stereo space and fixed timelines, sound seems ripe for reimagination as open and relative. Tim Murray-Browne sends us a fascinating idea for how to do that, in a composition in sound that transforms as you change your point of view.
Anamorphic Composition (No. 1) is a work that uses head and eye tracking so that you explore the piece by shifting your gaze and craning your neck. That makes for a different sort of composition – one in which time is erased, and fragments of sound are placed in space.
Here’s a simple intro video:
I was also unfamiliar with the word “anamorphosis”:
Anamorphosis is a form which appears distorted or jumbled until viewed from a precise angle. Sometimes in the chaos of information arriving at our senses, there can be a similar moment of clarity, a brief glimpse suggestive of a perspective where the pieces align.
The head tracking and most of the 3D are done in Cinder using the Kinect One. This pipes OSC into SuperCollider, which handles the sound synthesis. It’s pretty much entirely additive synthesis built around the harmonics of a bell.
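The actual piece does its synthesis in SuperCollider, but the core idea – additive synthesis around the partials of a bell – is easy to sketch in a few lines. Here’s a minimal Python illustration, assuming a handful of textbook bell partial ratios (hum, prime, tierce, quint, nominal) and made-up amplitudes and decay rates, not Tim’s actual tuning:

```python
import math

# Approximate partial ratios of a bell relative to its prime
# (hum, prime, tierce, quint, nominal), with illustrative amplitudes.
BELL_PARTIALS = [(0.5, 1.0), (1.0, 0.6), (1.2, 0.4), (1.5, 0.25), (2.0, 0.2)]

def bell_sample(t, f0=220.0, decay=1.5):
    """Additive synthesis: sum decaying sinusoids at bell partial ratios."""
    return sum(
        amp * math.exp(-decay * ratio * t) * math.sin(2 * math.pi * f0 * ratio * t)
        for ratio, amp in BELL_PARTIALS
    )

def render(duration=1.0, sample_rate=44100, f0=220.0):
    """Render a mono buffer of the bell tone as a list of float samples."""
    return [bell_sample(n / sample_rate, f0) for n in range(int(duration * sample_rate))]
```

In the installation, head and gaze position would then modulate which partials you hear and how loudly – that mapping is the compositional layer this sketch leaves out.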
I’d love to see experiments with this via acoustically spatialized sound, too (not just virtual tracking). Indeed, this question came up in a discussion we hosted in Berlin in April, as one audience member talked about how his perception of a composition changed as he tilted his head. I had a similar experience taking in the work of Tristan Perich at Sónar Festival this weekend (more on that later).
On the other hand, virtual spaces will present still other possibilities – as well as approaches that would bend the “real.” With the rise of VR, the question of point of view in sound will become as important as point of view in image. So this is surely the right time to ask this question.
Something is inevitably lost online, so if you’re in London, check out the exhibition in person. It opens on the 27th: