What if you could mash, mangle, mush, and morph sounds with your fingers on a screen, watching the waveforms dance in response in three dimensions? That “what if” is expressed beautifully in a project by musician-developer Christian Bannister of Portland, Oregon, who works under the name Subcycle Labs.
The result is like being able to touch sound directly.
Three-dimensional forms morph and vibrate using visuals programmed in Processing, making architectural-organic shapes and spaces that really begin to “look” like sound. These forms can represent synthesis and effects parameters (Christian has done some work with the Massive synth from Native Instruments), or can allow navigation through loops using touch. Gestures remap audio offsets and durations, scrub and slice loops, and apply granular resynthesis.
Controls use multiple touch points on a screen (apparently via Community Core Vision and reacTIVision), with sound from Logic, Reaktor, and Max/MSP, plus auxiliary control from an array of joysticks and a KORG KAOSS Pad.
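To make the gesture-to-sound idea concrete, here is a minimal sketch of how normalized touch coordinates (the kind a TUIO-style tracker such as reacTIVision or Community Core Vision reports) might map onto granular-playback parameters. This is an illustrative assumption, not Subcycle Labs’ actual code; the function name, parameters, and ranges are all hypothetical.

```python
# Hypothetical mapping from a touch point to granular-playback parameters.
# The x axis picks the playback offset within the loop; the y axis picks the
# grain size, so dragging upward yields longer, smoother grains.

def touch_to_grain(x, y, loop_seconds=4.0, min_grain=0.02, max_grain=0.25):
    """Map a touch point in [0, 1] x [0, 1] to (offset, grain duration),
    both in seconds. All names and ranges here are illustrative."""
    if not (0.0 <= x <= 1.0 and 0.0 <= y <= 1.0):
        raise ValueError("touch coordinates must be normalized to [0, 1]")
    offset = x * loop_seconds                        # seconds into the loop
    grain = min_grain + y * (max_grain - min_grain)  # grain duration, seconds
    return offset, grain

# A touch at the center of the screen lands mid-loop with a mid-sized grain.
offset, grain = touch_to_grain(0.5, 0.5)
print(offset, grain)
```

In a real setup, values like these would be sent on to the sound engine (Max/MSP or Reaktor, in this project’s case) every time the tracker reports a touch update.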
Here’s what happens with a Massive bass line:
It’s spectacular, gorgeous work, and I can’t wait to see more. It’s well worth reading through the whole description on the blog for more details, technical, musical, and artistic: