
Working with samples is great fun, but there’s a certain sameness to the approach. Load a sample. Play back a sample. Slice a sample. FLESH takes a unique angle: it analyzes sound samples and mangles them into new animals.

And it’s the latest from Tim Exile, himself a one-man live performer of madness (Warp, Planet Mu) and one of Reaktor’s greatest patching virtuosos on Earth. His first two instruments, THE FINGER and THE MOUTH, were already weird and wonderful tools for performance, but FLESH could be the deepest one yet. (Yes, that’s just Flesh, not The Flesh. So it could be, basically, any flesh. Yours, mine, some random guy’s flesh… you get it.)

At the heart of FLESH is a screen on which you can drag and drop loops. It wisely borrows a page from the terrific Loopy on iOS and displays that audio in a circle – a visual that makes sense for loops.

You can stick up to 12 samples in there, though once you hear all you can do with them, you may be fine with a much smaller number.

The ingredients that follow are tools we’ve heard and seen before in some form. What’s novel is packaging them together into a single, live and improvisatory workflow. FLESH analyzes the sounds for both transient and spectral profile, then uses that information to transform the results.
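To make that a little more concrete: FLESH’s actual analysis code isn’t public, but “transient and spectral profile” generally means finding where the hits are and how bright or noisy each moment is. Here’s a minimal Python sketch using the librosa library – my own illustration of the general technique, not NI’s implementation:

```python
# Rough sketch of offline loop analysis: detect transients (onsets) and
# summarize the spectral profile. Illustrative only -- not NI's code.
import librosa
import numpy as np

def analyze_loop(path):
    y, sr = librosa.load(path, sr=None, mono=True)

    # Transient profile: onset times mark where slices/grains could start.
    onset_frames = librosa.onset.onset_detect(y=y, sr=sr)
    onset_times = librosa.frames_to_time(onset_frames, sr=sr)

    # Spectral profile: per-frame centroid (brightness) and flatness (noisiness).
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)[0]
    flatness = librosa.feature.spectral_flatness(y=y)[0]

    return {
        "onsets_sec": onset_times,                 # where the transients sit
        "brightness_hz": float(np.median(centroid)),
        "noisiness": float(np.median(flatness)),   # ~0 = tonal, ~1 = noise-like
    }

print(analyze_loop("loop.wav"))
```

A profile like that is enough to drive resynthesis decisions – where to slice, which regions are pitched enough to feed an oscillator, and so on.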


There are four engines:

1. Sample Engine is more conventional sample stuff – length, envelopes, modulation.

2. Monosynth turns those samples into wavetables (there’s a rough sketch of that idea just after this list).

3. Polysynth turns them into chords (or can be a multi-voice monosynth with stacked voices).

4. Subsynth produces bass frequencies from an incoming pitch signal.
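NI hasn’t documented exactly how the Monosynth derives its wavetables, but the general technique is well established: slice the sample into equal chunks, resample each chunk into a single-cycle frame, then let an oscillator scan through the frames. A minimal numpy sketch of that idea – the frame size and frame count are arbitrary choices of mine, not FLESH’s:

```python
# Illustrative only: one common way to turn a sample into a wavetable --
# resample fixed-size chunks into single-cycle frames, then scan them.
import numpy as np

FRAME = 2048  # samples per single-cycle frame (a common wavetable size)

def sample_to_wavetable(signal, n_frames=64):
    """Slice `signal` into n_frames chunks, resample each to FRAME samples."""
    chunk = len(signal) // n_frames
    table = np.empty((n_frames, FRAME))
    for i in range(n_frames):
        seg = signal[i * chunk:(i + 1) * chunk]
        # naive linear-interpolation resample of the chunk to one cycle
        x_old = np.linspace(0.0, 1.0, len(seg))
        x_new = np.linspace(0.0, 1.0, FRAME)
        frame = np.interp(x_new, x_old, seg)
        peak = np.max(np.abs(frame)) or 1.0   # avoid dividing by zero
        table[i] = frame / peak               # normalize each cycle
    return table

def render(table, freq, position, sr=44100, seconds=1.0):
    """Play one frame of the wavetable at `freq`; `position` in 0..1 picks the frame."""
    frame = table[int(position * (len(table) - 1))]
    phase = np.cumsum(np.full(int(sr * seconds), freq / sr)) % 1.0
    # wrap the frame's first sample onto the end so interpolation loops cleanly
    return np.interp(phase * FRAME, np.arange(FRAME + 1),
                     np.append(frame, frame[0]))
```

Sweeping `position` over time while the oscillator holds pitch is what gives the wavetable scan its characteristic morphing sound.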


Then, as if that weren’t enough, there’s a whole effects section and additional modulation. There’s a dub delay, itself controlled by modulation from the sound, and a Mod Page. Because you can drive all of this from the sample modulation, you can get really wild effects.
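As a toy illustration of that idea: a dub delay is essentially a feedback delay line, and “controlled by modulation from the sound” can be as simple as an envelope follower on the input driving the feedback amount. This sketch is my own construction, not FLESH’s DSP:

```python
# Toy dub delay: a feedback delay line whose feedback amount is modulated
# by an envelope follower on the input signal. Illustrative, not FLESH's DSP.
import numpy as np

def dub_delay(x, sr=44100, time_s=0.375, base_fb=0.35, mod_depth=0.4):
    d = int(time_s * sr)        # delay length in samples
    buf = np.zeros(d)           # circular delay buffer
    out = np.empty(len(x))
    env = 0.0
    for n in range(len(x)):
        env = 0.999 * env + 0.001 * abs(x[n])      # crude envelope follower
        fb = min(base_fb + mod_depth * env, 0.95)  # louder input -> more feedback
        delayed = buf[n % d]    # read the sample written d samples ago
        out[n] = x[n] + delayed
        buf[n % d] = x[n] + fb * delayed           # write back with feedback
    return out
```

Clamping the feedback below 1.0 keeps the repeats from running away – the classic dub trick is riding exactly that edge.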


Because of these different engines, you could get results quite different from the Tim Exile-y ones you hear in the demo videos. You might focus on just one engine, or just one set of parameters.

But the other thing that’s important is centralizing sound control. Tim himself works with an enormous table full of gear (including a lot of Behringer controls). This isn’t just about working on sound design as a slow, studio-driven process. It’s about improvisation, which is always what impressed me about Tim’s performances. Now, I’m not Tim, you’re not Tim, but the live control element means a chance to really play with sound.

The controls are integrated into big macros for Spectrum, Character, Length, and Mod, so it’s really fast to play and tweak.

FLESH isn’t a sampler – that part may disappoint some people. You do need to prepare samples in advance.

But there is a major improvement in Reaktor 6 under the hood. New sample tables make the drag-and-drop ease here possible for the first time. So while FLESH is great on its own, it’s also the first real showcase of what’s possible in Reaktor 6 (apart from those Blocks).


This being NI, they’ve also come up with some slick integration with the KOMPLETE KONTROL keyboard line – and, I hope, soon the more compact Maschine controller, too.

Watch:

The combination of all these elements really does make FLESH into an integrated performance instrument – like, you might not need Ableton Live any more for playing out. Load your samples and parameters, and do it all live. Tim gives a quick demo of what that looks like:

Here’s a promo video from NI, proving that Tim is the virtual reality presence we all knew him to be. People are already down-voting it on YouTube, because… well, either there are some angry jealous Cylons subscribed to NI’s channel, or the downward-facing thumb has some meaning in other cultures. (I never understood why people do that on YouTube. Come on, kids. Lighten up.)

You can join Tim and me for a live Periscope. Tim asked me to come over while he’s in Berlin, so we’ll be online on the main page in a little while. (7 PM Berlin time, that’s 1 PM in New York and 10 AM in California.)

If you miss the live video, it’ll be up for 24 hours.

Find that, plus everything on FLESH, on NI’s site:

Native Instruments FLESH

Runs inside Reaktor 6, or if you don’t have Reaktor, inside the free Reaktor 6 Player.

Tim also talks to CDM about what it was like building this instrument.

CDM: You’ve used a project called the Flow Machine for some years now. How does The Flesh relate to that?

Tim: Flesh is made to fit into the Flow Machine workflow. The Flow Machine is kind of my R&D playground performance instrument. All the technology in the Flow Machine is optimized for spontaneous electronic music performance.

How is it different making something like this for other musicians than when you’ve done something just for yourself?

You have to bear a lot more variables in mind. Personally I avoid DAWs like the plague as I prefer to create spontaneously, but most people still use them, of course. Making Flesh work both as a standalone performance instrument with its own approach to music making as well as a plugin which plays nicely with a DAW was a real challenge. The challenges of each mode of operation kind of multiply each other. Then you have all the platforms – different DAWs behave slightly differently, the new version of Reaktor has lots of small differences, Komplete Kontrol, Maschine etc., etc. It gets complicated very quickly!

Is The Flesh something you’re using for performance already? If so, how?

I have a prototype version of The Flow Machine already running with Flesh, but to be honest, there’s been so much effort in the last few months preparing for this release that I haven’t fully integrated it yet. But even with the prototype, I can’t quite get my head round the possibilities. The next thing I want to do is spend time really learning how to play Flesh both on its own and in the context of The Flow Machine.

How do you hope others will use this? Do you feel like you’re giving away some secret Tim Exile sauce – or do you imagine they’ll make something different?

I have no idea how people are going to use it! I feel it has quite a new way of approaching music making and performance and that in itself could go wherever anyone’s imagination might take it. It has a distinctive sound and a distinctive interaction and I can’t see any particular limits on how that could be interpreted. I suppose my ultimate dream is that it gives rise to its own genre of music – better still music performance. That’s a big goal, but why not?

You’re something of a Reaktor master. But how do you actually start a project – especially one this big? Where do you begin when patching?

That’s very kind of you. My projects usually start out with a big dream, then some small first steps, a little bit of humiliation at my programming abilities, the resolve to get over it, then just lots and lots of work.

When did you first get started learning Reaktor?

I got my first copy of Reaktor in 2001, I think. Originally I had Syncmodular, the software made by Vadim Zavalishin before NI bought out his project, which then became Reaktor Core. NI gave all of his registered users a free copy of Reaktor. And it just went from there…

Do you use the User Library (or the factory library, for that matter), or are you deep enough in your own creations?

For my sins, I don’t really. In that sense I’m pretty myopic and deep in my own Reaktor projects.

Is there a way for people to look inside what you’ve done, since this is Reaktor, and learn from it? In particular, is this drag-and-dropping of samples augmented by the new table stuff in Reaktor 6?

If you have the full version of Reaktor, you can have a look at the whole of Flesh, right the way down to the inner workings of the live oscillators. The sample drag and drop and the offline analysis are all implemented using the new Table Framework in R6.

If you dive into Flesh, you’ll see the chaos of my mind writ large!

So, what’s next – putting mouth and flesh and fingers together?

First up, I want to spend some time actually playing the instruments I’ve been making, improving my performances, and writing some music. Alongside that, the Flow Machine project continues. As of earlier this year, I have a talented developer working with me (hi Ash!) so I can spend more time performing, making music, and figuring out what the hell to do with the Flow Machine. He’s working on some really exciting new features, and it would be great if as many of them as possible become things that more people than just me get to use.