For all the great sounds they can make, software synths eventually fit a repetitive mold: lots of knobs onscreen, simplistic keyboard controls when you actually play. ROLI’s Cypher2 could change that. Lead developer Angus chats with us about why.
Angus Hewlett has been in the plug-in synth game a while, having founded his own company, FXpansion, maker of various wonderful software instruments and drums. That London company is now part of another London company, fast-paced ROLI, and thus has a unique charge to make instruments that can exploit the additional control potential of ROLI’s controllers. The old MIDI model – note on, note off, and wheels and aftertouch that affect all notes at once – gives way to something that maps more of the synth’s sound to the gestures you make with your hands.
So let’s nerd out with Angus a bit about what they’ve done with Cypher2, the new instrument. Background:
A soft synth that’s made to be played with futuristic, expressive control
Peter: Okay, Cypher2 is sounding terrific! Who made the demos and so on?
Angus: Demos – Rafael Szaban, Heen-Wah Wai, Rory Dow. Sound Design – Rory Dow, Mayur Maha, Lawrence King & Rafael Szaban
Can you tell us a little bit about what architecture lies under the hood here?
Sure – think of it as a multi-oscillator subtractive synth. Three oscillators with audio-rate intermodulation (FM, S&H, waveshape modulation and ring mod), each switchable between Saw and Sin cores. Then you’ve got two waveshapers (each with a selection of analogue circuit models and tone controls, and a couple of digital wavefolders), and two filters, each with a choice of five different analogue filter circuit models – two variations on the diode ladder type, OTA ladder, state variable, Sallen-Key – and a digital comb filter. Finally, you’ve got a polyphonic, twin stereo output amp stage which gives you a lot of control over how the signal hits the effects chain – for example, you can send just the attack of every note to the “A” chain and the sustain/release phase to the “B” chain, all manner of possibilities there.
Controlling all of that, you’ve got our most powerful TransMod yet. 16 assignable modulation slots, each with over a hundred possible sources to choose from, everything from basics like Velocity and LFO through to function processors, step sequencers, paraphonic mod sources and other exotics. Then there are eight fixed-function mod slots to support the five dimensions of MPE control and the three performance macros. So 24 TransMods in total, three times as many as v1.
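For readers who like to see structure on the page, here’s a rough Python sketch of the voice architecture as Angus describes it, with the per-note amp stage steering the attack of a note toward effects chain “A” and the rest toward “B”. Everything here – class names, defaults, the phase-based split – is an illustrative assumption, not Cypher2’s actual code.

```python
# Illustrative sketch only -- not FXpansion code; names and structure are
# assumptions drawn from the description in the interview.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Oscillator:
    core: str = "saw"        # switchable saw or sin core
    fm_depth: float = 0.0    # audio-rate intermodulation from a neighbouring osc

@dataclass
class Shaper:
    model: str = "analogue_circuit_1"  # or a digital wavefolder
    tone: float = 0.5

@dataclass
class Filter:
    circuit: str = "diode_ladder_a"    # one of the six selectable circuits
    cutoff_hz: float = 1200.0
    resonance: float = 0.3

@dataclass
class Voice:
    """One polyphonic voice: 3 oscillators -> 2 shapers -> 2 filters -> amp."""
    oscillators: List[Oscillator] = field(
        default_factory=lambda: [Oscillator() for _ in range(3)])
    shapers: List[Shaper] = field(default_factory=lambda: [Shaper(), Shaper()])
    filters: List[Filter] = field(default_factory=lambda: [Filter(), Filter("comb")])

    def amp_stage(self, sample: float, env_phase: str) -> Tuple[float, float]:
        """Per-note twin stereo output: weight the signal between effect
        chains A and B. Here, attack goes to A, sustain/release to B."""
        if env_phase == "attack":
            return (sample, 0.0)
        return (0.0, sample)
```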
Okay, so Cypher2 is built around MPE, or MIDI Polyphonic Expression. For those readers just joining us, this is a development of the existing MIDI specification that standardizes additional control around polyphonic inputs – that is, instead of adding expression to the whole sound all at once, you can get control under each finger, which makes way more sense and is more fun to play. What does it mean to build a synth around MPE control? How did you think about that in designing it?
It’s all about giving the sound designers maximum possibility to create expressive sound, and to manage how their sound behaves across the instrument’s range. When you’re patching for a conventional synth, you really only need to think about pitch and velocity: does the sound play nicely across the keyboard? With 5D MPE sounds, sound designers start having to think more like a software engineer or a game world designer – there are so many possibilities for how the player might interact with the sound, and they’ve got to have the tools to make it sound musical and believable across the whole range.
What this translates to in the specific case of Cypher2 is adapting our TransMod system (which is, at its heart, a sophisticated modulation matrix) to make it easy for sound designers to map the various MPE control inputs, via dynamically controllable transfer function curves, on to any and every parameter on the synth.
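As a very rough illustration of what a slot like that boils down to – assuming nothing about TransMod’s real internals – the sketch below takes a per-note MPE source (each sounding note arrives on its own MIDI channel, so it carries its own pressure, slide and pitch values), passes it through a shapeable transfer curve, scales it by a depth, and adds it to a destination parameter. All identifiers are invented for the example.

```python
# Hypothetical sketch of a per-note modulation slot; names are invented,
# not ROLI/FXpansion API.

def transfer_curve(x: float, shape: float) -> float:
    """Map a normalised control value (0..1) through a bendable curve.

    shape < 1.0 bows the curve upward (more sensitive at low input),
    shape > 1.0 bows it downward, shape == 1.0 is linear.
    """
    return x ** shape

class ModSlot:
    def __init__(self, source: str, destination: str, depth: float, shape: float = 1.0):
        self.source = source            # e.g. "mpe.press" -- per-note pressure
        self.destination = destination  # e.g. "filter1.cutoff"
        self.depth = depth
        self.shape = shape

    def apply(self, note_state: dict, params: dict) -> None:
        """Add this slot's contribution to the destination parameter for one note."""
        value = transfer_curve(note_state.get(self.source, 0.0), self.shape)
        params[self.destination] = params.get(self.destination, 0.0) + value * self.depth

# Example: the same slot produces a different cutoff offset under each finger,
# because each note carries its own MPE dimension values.
slot = ModSlot(source="mpe.press", destination="filter1.cutoff", depth=0.6, shape=0.5)
note_a, note_b = {"mpe.press": 0.2}, {"mpe.press": 0.9}
params_a, params_b = {}, {}
slot.apply(note_a, params_a)
slot.apply(note_b, params_b)
print(params_a, params_b)   # different per-note modulation amounts
```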
How does this relate to your past line of instruments?
Clearly, Cypher2 is a successor to the original Cypher, which was one of the DCAM Synth Squad synths; it inherits many of the same functional upgrades that Strobe2 gained over its predecessor a couple of years ago – the extended TransMod system, the effects engine, the Retina-friendly, scalable, skinnable GUI – but goes further, and builds on a lot of user and sound-designer feedback we had from Strobe2. So the modulation system is friendlier, the effects engine is more powerful, and it’s got a brand new and much more powerful step sequencer and arpeggiator. In terms of its relationship to the original Cypher – the overall layout is similar, but the oscillator section has been upgraded with the sine cores and additional FM paths; the shaper section gains wavefolders and tone controls; the filters have six circuits to choose from, up from two in the original, so there’s a much wider range of tones available there; the envelopes give you more choice of curve responses; the LFOs each have a sub-oscillator and quadrature outputs; and obviously there’s MPE as described above.
Of course, ROLI hope that folks will use this with their own hardware. But since part of the beauty is that this is open via MPE, are there any interesting applications working with other MPE hardware? Have you tried it out on non-ROLI stuff (or with testers, etc.)?
Yes, we’ve tried it (with Linnstrument, mainly), and yes, it very much works – although with one caveat. Namely, MPE, as with MIDI, is a protocol which specifies how devices should talk to one another – but it doesn’t specify, at a higher level, what the interaction between the musician and their sound should feel like.
That’s a problem that I actually first encountered during the development of BFD2 in the mid-2000s: “MIDI Velocity 0-127” is adequate to specify the interaction between a basic keyboard and a sound module, and some of the more sophisticated stage controller boards (Kurzweil, etc.) have had velocity curves at least since the 90s. But as you increase the realism and resolution of the sounds – and BFD2 was the first time we really did so in software to the extent that it became a problem – it becomes apparent that MIDI doesn’t specify how velocity should map on to dB, or foot-pounds-per-second force equivalent, or any real-world units.
That’s tolerable for a keyboard, where a discerning user can set one range for the whole instrument. But when you’re dealing with a V-Drums kit with, potentially, ten or twelve pads of different types to set up, and little in the way of a standard curve to aim for, the process becomes cumbersome and off-putting for the end user. What does “Velocity 72” actually mean from Manufacturer A’s snare drum controller, at sensitivity setting B, via drum brain C triggering sample D?
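To make the problem concrete: MIDI hands the receiver an integer from 0 to 127, and the curve that turns it into decibels (or picks a sample layer) is left entirely to each manufacturer. The toy comparison below uses two plausible but arbitrary mappings – a linear-in-dB curve and a square-law curve – to show how far apart “Velocity 72” can land.

```python
import math

def velocity_to_db_linear(velocity: int, floor_db: float = -40.0) -> float:
    """Linear-in-dB mapping: velocity 127 -> 0 dB, velocity 1 -> near floor_db."""
    return floor_db * (1.0 - velocity / 127.0)

def velocity_to_db_squared(velocity: int) -> float:
    """Square-law mapping: amplitude proportional to velocity squared."""
    v = max(velocity, 1) / 127.0
    return 20.0 * math.log10(v * v)

v = 72
print(velocity_to_db_linear(v))   # about -17.3 dB
print(velocity_to_db_squared(v))  # about -9.9 dB
```

Both curves are defensible choices, and nothing in MIDI or MPE says which one a given controller, drum brain or sound module assumes – which is exactly the calibration gap being described here.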
Essentially, you run into something of an Uncanny Valley effect (a term from the world of movies and games where, as computer-generated graphics moved from obviously artificial 8-bit pixel art to today’s motion-captured, super-sampled cinematic epics, audiences would paradoxically in some cases be less satisfied with the result). So it’s certainly a necessary step to get expressive hardware and software talking to one another – and MPE accomplishes that very nicely indeed – but it’s not sufficient to guarantee that a patch will result in a satisfactory, believable playing experience out of the box.
Some sound-synth-controller-player combinations will be fine, others may not quite live up to expectations, but right now I think it’s natural to expect that it may be a bit hit-and-miss. Feedback on this is something I’d like to actively encourage; we have a great dialogue with the other hardware vendors and are keen to achieve a high standard of interoperation, but it’s a learning process for all involved.
Thanks, Angus! I’ll be playing with Cypher2 and seeing what I can do with it – but fascinating to hear this take on synths and control mapping. More food for thought.
https://fxpansion.com/products/cypher2/