Moog Music is putting the animation into Animoog with a new edition for Apple’s mixed reality headset, Vision Pro. It’s a stunning, immersive spatial synth they’re calling Animoog Galaxy.
Our friend Geert Bevin is again leading Moog’s software team on this, and he describes really well what Animoog Galaxy is about. This is very much not just making a virtual version of the synth for your VR headset and calling it a day. They’ve added procedurally generated galaxies to the experience, as well as full gestural control of the synth using your hands and gaze. This is more like Spore meets Lawnmower Man – like the trippy fantasies you imagine when you play synths. And since it works in shared space, you can mix playing it with playing physical synths:
We’ll talk to Geert more about this next week, including his experience of the platform and of developing for it, but here are his first thoughts:
Animoog was always a three-dimensional idea: comets flying in orbits along a path with captivating colors, motions and sounds. We’ve now brought this to its fullest spatial realization and let you be transported into ever-changing procedurally generated galaxies while sculpting and playing synthesized sounds.
Animoog Galaxy has a new expressive and probabilistic step sequencer that can randomize melodies using the musical scale of a preset. All of Animoog’s synthesis features are available and can be tweaked by simply looking at a control, pinching and dragging. While in edit mode, a dynamically appearing HUD will float above your hand to give a detailed read-out of parameters without taking up permanent UI space.
Here it is in motion, though everyone is telling me that’s tough to capture:
Animoog Galaxy can be played over MIDI, and together with Vision Pro’s progressive immersion, you can see both your real-world keyboard and Animoog Galaxy floating above with all its virtual controls accessible. It’s also possible to control other instruments over MIDI from Animoog Galaxy, allowing it to play sequences while using its innovative touch-less playing keyboard to control iPad synths on the Vision Pro or external synthesizers.
As a final callout, Animoog Galaxy is also a visual sound machine in Vision Pro’s shared space, living alongside your reality and virtual reality, and infusing it with ever changing sounds and colors. Choose from any of the 120 built-in presets or create your own to select atmospheres that perfectly complement your mood while being engaged in other activities.
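That probabilistic, scale-aware step sequencer Geert mentions is worth unpacking for a moment. Moog hasn’t published how Animoog Galaxy implements it, but the basic idea – randomize which steps fire, and constrain whatever fires to the preset’s scale – is simple to sketch. Here’s a toy version in Swift; every name in it is hypothetical and none of it is Moog’s code:

```swift
// Toy sketch of a scale-constrained, probabilistic step sequencer.
// Hypothetical names only; this is not Moog's code.
struct Step {
    var probability: Double   // chance this step fires at all (0...1)
    var gate: Double          // note length as a fraction of the step
}

struct ProbabilisticSequencer {
    var scale: [Int]          // scale degrees as semitone offsets from the root
    var rootNote: UInt8       // MIDI root note, e.g. 48 = C3
    var steps: [Step]

    // Returns a MIDI note for this step, or nil if the step is skipped.
    func note(forStep index: Int) -> UInt8? {
        let step = steps[index % steps.count]
        guard Double.random(in: 0...1) < step.probability else { return nil }
        let degree = scale.randomElement() ?? 0
        let octaveJump = Int.random(in: 0...1) * 12   // occasionally leap an octave
        return UInt8(Int(rootNote) + degree + octaveJump)
    }
}

// Example: a sparse C minor pentatonic pattern over 16 steps.
let seq = ProbabilisticSequencer(
    scale: [0, 3, 5, 7, 10],
    rootNote: 48,
    steps: (0..<16).map { _ in Step(probability: 0.7, gate: 0.5) }
)
for i in 0..<16 {
    print("step \(i):", seq.note(forStep: i)?.description ?? "rest")
}
```

The interesting design question is where the randomness lives: per-step probabilities keep the rhythm loose, while the scale constraint keeps whatever does fire inside the preset’s harmony.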
Features:
- Full ASE (Anisotropic Synth Engine) from Animoog (that’s the x/y/z timbral space that made this synth unique in the first place)
- Immersive UI, full support for visuals / graphics / gestures / interactions / sounds of Vision Pro
- Modulation and pitch shifting with configurable scales, pitch correction, and glide
- Look, pinch, and drag to control multiple per-voice parameters. And yeah, wow, polyphonic/multi-voice gestural control!
- Step sequencer
- EG/LFO (three independent six-stage DAHDSR envelope generators, three independent LFOs, 1-8 repeats)
- 10-lane modulation
- Delay, Unison, Bitcrush, Drive, Filter, Arpeggiator, and Recorder
- MIDI integration, including MPE – and you can use the Vision Pro as a MIDI/MPE expressive controller for your outboard gear
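On that last bullet: MPE works by giving each sounding note its own MIDI channel, so pitch bend and pressure can be applied per note rather than per channel – which is exactly what you want when each voice is being dragged around independently. How Moog wires this up internally isn’t public; the following is just a rough Swift sketch of the raw bytes an MPE controller sends, with hypothetical names throughout:

```swift
import Foundation

// Hypothetical illustration of the raw MIDI bytes behind MPE-style per-note
// expression: each note gets its own "member" channel, so pitch bend and
// pressure only move that note. This is not Moog's implementation.
struct MPENote {
    let channel: UInt8   // member channel index 1...15; channel 0 stays the zone's master
    let note: UInt8

    func noteOn(velocity: UInt8) -> [UInt8] {
        [0x90 | channel, note, velocity]
    }

    // 14-bit pitch bend centered at 8192; MPE's default member-channel range is +/-48 semitones.
    func pitchBend(semitones: Double, bendRange: Double = 48) -> [UInt8] {
        let raw = Int(8192 + semitones / bendRange * 8192)
        let value = UInt16(min(16383, max(0, raw)))
        return [0xE0 | channel, UInt8(value & 0x7F), UInt8(value >> 7)]
    }

    // Per-note pressure (channel aftertouch on the note's own channel).
    func pressure(_ amount: UInt8) -> [UInt8] {
        [0xD0 | channel, amount]
    }

    func noteOff() -> [UInt8] {
        [0x80 | channel, note, 0]
    }
}

// Example: bend one voice up a whole tone while a second voice holds steady.
let voice1 = MPENote(channel: 1, note: 60)   // middle C
let voice2 = MPENote(channel: 2, note: 64)   // E above it
let messages = [
    voice1.noteOn(velocity: 100),
    voice2.noteOn(velocity: 100),
    voice1.pitchBend(semitones: 2),          // only voice1 moves
]
for bytes in messages {
    print(bytes.map { String(format: "%02X", $0) }.joined(separator: " "))
}
```

In a real app those bytes would go out through CoreMIDI (or stay internal), but the channel-per-note layout is what makes per-voice bending possible in the first place.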
More on the Moog site. But this is probably the opposite of what you expected (it’s certainly the opposite of what I expected). Instead of a stripped-down “VR” version of Animoog Z, we’re getting an even deeper Animoog, one that transforms the Vision Pro into a gesturally controlled synth with an immersive interface and makes it an expressive controller for other hardware.
And I do hope we see some of this functionality on other mobile and desktop platforms, too – that seems more likely the more I see of this.
More on this soon, so feel free to ask questions.
And, uh, in addition to thinking of Opcode Studio Vision Pro every time I hear the name of Apple’s platform, I now want to play Super Mario Galaxy.
Of course, if this is all over budget, you still have the deep modular environment Patchworld on Meta Quest – which, as an aside, already demonstrated that VR can be a serious environment for synthesis and performance and not just a toy or game.
Updated: Here’s more from Geert, so you can see how responsive this is in action:
Seriously, though, I don’t know why everyone keeps complaining about how hard it is to capture video on Vision Pro – looks easy enough. I need to update my MIDI certification.