Pair a veteran software synth maker with a traditional artist, throwback graphics with expressive interactive control of live sound, and what do you get? SynthX is an iPad synth that gives some hope for the genre, tailored to the medium by Way Out Ware and Jim Heintz, who created the critically acclaimed TimeWARP ARP simulation (as endorsed by 2600 creator Alan Pearlman) and versatile synth KikAXXE. It’s a synth that actually feels like it fits the iPad, but one that also seems ready to coexist with other tools in your studio or stage rig.
I think there’s a common notion that iPad synths are gimmicks. The sequence goes something like this: you see a popular, growing platform with some unique features and run off to make something that fits it, whether it’s musically useful or not. Toys and mass-market noisemakers win out over real tools. That’s true to an extent, of course – and maybe there’s not even anything wrong with it – but people who have been doing computer music for some time may see it differently. The iPad, in that view, is an outgrowth of the possibilities of computing. It’s a chance for ideas that have been gestating for some time to become practical, provided the design is matched to the platform.
SynthX, following in the footsteps of apps like Jordan Rudess’ MorphWiz, centers around a horizontal touch area for playing the instrument. In addition to touch surfaces for playing expressively, there’s a grid layout and lots of visual feedback.
We got to speak to creator Jim Heintz about his creation and how he put it together – an ideal companion to an extended interview we did last week with another iOS synth designer:
Imagining a Tablet Synth: Developer Christopher Penrose Shows Us SynthTronica for iPad
The synth engine itself is retro, virtual analog territory with rich sounds. Jim says he wanted to pair that sound with a visual aesthetic – in his mind, it was his beloved Amiga. But don’t think that means this is primitive; he tells us that, while it required a lot of optimization work, he didn’t have to make many compromises in terms of sound quality. (That said, he also says he’s eager to get his own iPad 2.)
For a review, I can’t improve upon what Len Sasso has written for GearWire:
WayOutWare SynthX Review by Len Sasso: Surprising Playability From An iPad Synth
But here’s Jim with a behind-the-scenes look at how the design came to be.
CDM: Can you tell us a bit about how SynthX came about?
Jim: SynthX actually has a pretty interesting background. I started on it initially because I was working on an enhanced iPad version of iSample, and I decided that it needed a killer synth to go along with it. That made me consider my options for about 30 seconds, at which point I realized that I already had a great synth that might work quite well: the AXXE emulation in our plug-in product KikAXXE. It had a couple of nice advantages: first, it sounds great and has a big library of professionally designed patches, and second, it is a pretty easy synth to understand and program, so beginners and novices won’t be out of luck if they try their hand at sound design, not to mention the fact that I designed and wrote it.
What went into programming the synth engine? Was this something you built from scratch — in fact, is that generally how you work — or are you able to build upon things you’ve done in the past?
Initially I was very concerned about the amount of processing required for virtual analog on the iPad, so I focused on optimization from the very start and re-coded the whole synth engine in fixed-point integer math, because the ARM processor handles this quite well. I was surprised to find that it worked very well; on an iPad 1, SynthX is capable of six voices (probably more, but I felt six was a good starting point). As far as the genesis of the code base goes, I drew on KikAXXE for most of it. A lot of the UI basics came from KikAXXE as well; however, I re-coded all of the actual rendering to take advantage of OpenGL ES, so as to offload the main processor as much as possible.
It feels like a lot of thought went into the user interface, both visually and in terms of interaction. How did you develop that interactive paradigm, and the visualization for the sound?
The UI design was a collaboration. I have been working with an artist named Nigel Robertson (http://www.nigelrobertson.com | contains fine art nudity), who is currently studying at the Florence Academy of Art. He and I wanted to produce a UI that evoked the fond memories we both share of ’80s-era computers and gaming devices, to give it a retro feel. Sort of to make you feel like a kid again, or to bring back the vibe of the era these sounds would have been heard in. Besides, I was bored with the rendered-3D hardware look that everything else seems to have these days.
Touch is a challenging medium in some ways for playability, because of the lack of tactile feedback. That said, have you seen other examples that inspire you?
This is why I felt it necessary to display the waveforms on the tip of your finger… I think it helps a lot as far as feedback goes. I find it really stimulating to see the exact waveform my ears are hearing when I play SynthX. As for inspiration, Bebot and Jordan Rudess’ MorphWiz were inspirations for the XY screen, and Roger Linn Design’s LinnStrument and Mugician were inspirations for the grid screen. I play the violin, so I have been interested in playing expressively on a touch screen for a long time. I think we are just at the beginning in this area; there is a long way to go before the possibilities of this paradigm are realized, and I hope to be part of making that a reality.
A big concern users raise – tablet lovers and naysayers alike – has to do with workflow, especially for people accustomed to the versatility of plug-ins and such. How do you imagine people will use SynthX? Do you think tablets could benefit from the kinds of technologies for interoperability on the desktop? Or should everyone stop worrying, plug in that minijack (or USB audio interface, even), and play?
Well, this is a very good question. First, I believe there are a lot of different ways to incorporate tablets into music workflows: MIDI, in-app recording, file sharing, and AudioCopy/Paste can all play a part. Of course, the minijack works as well. I think as time goes on, the options will continue to expand. It would not surprise me to see a plug-in format appear in iOS, but I don’t expect it right away. I think that as tablets get more and more powerful, this will become necessary at some point. Customers will demand it.
I believe that if we try to turn a tablet into a desktop computer, we will lose many of its benefits. From an app designer’s standpoint, being able to rely on the fact that my app will occupy the whole screen when it is active, and not be partially exposed or otherwise reduced in functionality, gives me more to work with and allows me more freedom to create. I think that at some point it will also be possible to create apps that work together better when backgrounded. It will be interesting to watch how this all works out.