Native Instruments Spark plus Blackbox from Create Digital Media on Vimeo.

Hands-on control is a wonderful thing, as NI founder and Reaktor “mastermind” Stephan Schmitt noted in our story yesterday on his creation Spark. And LFOs are often not terribly interesting. But even using your feet for modulation, you may eventually run out of limbs. So if you want to record automation but keep the human element, a motion recorder is not a bad way to go. Spark is just out, but our NI minisite writer Peter was so into it that he built himself a motion recorder just to use with it:

BlackBox Recorder: Free Reaktor tool to Enhance Spark and Kore

Now, as it happens, you don’t necessarily have to use this with either Spark or Kore, so it’s worth mentioning here. You will need Reaktor to use the patch, though maybe this will give folks ideas for creating something similar with Max or Pd.

It’s a simple tool, but motion recording can lead to all sorts of other ideas. Got a favorite tool for recording human automation quickly? Let us know.

  • meimeimei

What's so special about it?

    For me it's a waste of money…

  • Well, Blackbox is free, it's just a utility. Spark is $60, but that's not what's being shown here. Not sure what you're asking.

  • Nice work! This will be a great tool for performing on the lappy w/o looking at the screen! Thanks!

Great little tool.
    I'm sure it's not hard to add, but it could do with some loop syncing so the control data repeatedly loops in time.

  • Darren Landrum

    One of my ideas was to build a synth without envelopes or LFOs by allowing automation of any parameter by way of marking multiple settings as "key frames." The way that a parameter moves between two (or more) key frames can then be a function f(t) or f(c), a function whose value is determined by time or by a controller. Loops between two values could then be established for LFO-like functionality.

    I wrote a wiki article for an OSS project I'm involved in that helps to explain it. It's not a great article, and needs some further explanations. If you look at the history, the first revision is from some number of months ago.

    This motion recorder is kinda heading in the direction that I was going with this "key frame parameterization" idea, so I thought I'd mention it here.
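The "key frame" idea Darren describes can be sketched in a few lines. This is a rough illustration only, not his implementation; the function names and the choice of linear interpolation are assumptions:

```python
# Sketch of the "key frame parameterization" idea described above.
# Each key frame pairs a position (time for f(t), or a controller
# value for f(c)) with a parameter setting; we interpolate between
# neighbouring key frames. All names are hypothetical.

import bisect

def keyframe_value(keyframes, t):
    """Interpolate a parameter value at position t.

    keyframes: sorted list of (position, value) pairs.
    """
    positions = [k[0] for k in keyframes]
    if t <= positions[0]:
        return keyframes[0][1]
    if t >= positions[-1]:
        return keyframes[-1][1]
    i = bisect.bisect_right(positions, t)
    (p0, v0), (p1, v1) = keyframes[i - 1], keyframes[i]
    frac = (t - p0) / (p1 - p0)
    return v0 + frac * (v1 - v0)

def looped_value(keyframes, t):
    # Looping between key frames gives LFO-like behaviour.
    span = keyframes[-1][0] - keyframes[0][0]
    return keyframe_value(keyframes, keyframes[0][0] + (t % span))
```

Swapping the linear blend for another curve (sine, Bezier, a custom function) changes how the parameter moves between two key frames without touching the key frames themselves.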

  • Damon

    "These should put you even more in the mood, then."

    After watching the cone heads thing, not sure that is my first choice in moods. Yes, I remember the product, but were Pavlov to have anything to say about it, the product is creepy and slightly disturbing.


  • Ben

Have you thought about using bezier curves between the key-frames you mentioned? That would be a good way to get a variable shape between the key-frames, and it isn't very processor intensive to calculate. It's an idea I've wanted to try for a while, but my studio now takes up so much of my time that I don't get the chance to do any programming. Email me if interested in discussing it (

Keep an eye out for the "Complete Ableton Live 7" tutorial I am authoring, which will be released in mid-January by Streamworks Audio.

    Sorry about the plug Peter, couldn't help myself 🙂

  • Darren Landrum

    Bezier, linear, and sine were all going to be options, along with the ability to define a custom function for really funky effects. I'll shoot you an email tomorrow (it's the middle of the night here and I'm up with a cough and need to get back to bed) so we don't take the thread off-topic.
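The curve options mentioned here are all cheap to evaluate per sample. A rough sketch of what linear, sine, and Bezier shaping between two key-frame values might look like (all names illustrative, not from any actual implementation):

```python
# Easing curves for moving between two key-frame values.
# Each maps u in [0, 1] to a shaped fraction in [0, 1].

import math

def ease_linear(u):
    return u

def ease_sine(u):
    # Smooth ease-in/ease-out using half a cosine cycle.
    return 0.5 - 0.5 * math.cos(math.pi * u)

def ease_cubic_bezier(u, c1=0.1, c2=0.9):
    # 1-D cubic Bezier with fixed endpoints 0 and 1 and two control
    # values -- just a cubic polynomial in u, so cheap to compute.
    v = 1.0 - u
    return 3 * v * v * u * c1 + 3 * v * u * u * c2 + u ** 3

def blend(v0, v1, u, ease=ease_linear):
    # Move a parameter from key-frame value v0 toward v1.
    return v0 + ease(u) * (v1 - v0)
```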

  • poopoo

    Great! He did something kind of similar in the synth inside the roux sequencer.

  • Peter Dines

    Mattbronka: Yes, quantized looping isn't difficult to implement for the knob macros – it's just a matter of synchronized resetting of a counter. My 2 cents is that this makes things too static, though. I much prefer ragged loops to bar-length ones.

    Darren Landrum: as it happens, your keyframes are a perfect analogy for the sound variations in Kore. Each sound variation holds different synth knob settings and depending on what cells you save the variations into and how you morph through them, you get slight or radical movements of the different synth controls. You can also snap instantly to the different variations with the controller. The interesting thing is, this works for any third-party instrument or effect that allows host automation and can be hosted in Kore, not only for NI plugins.
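The synchronized counter reset Peter mentions for quantized looping could look roughly like this. A minimal sketch under assumed names; the real Reaktor patch works differently internally:

```python
# Quantized looping via a counter that wraps in sync with the bar
# length, so recorded control data repeats in time.

def make_looper(samples_per_bar, bars_per_loop=1):
    loop_len = samples_per_bar * bars_per_loop
    counter = 0

    def tick():
        # Return the current playback position, then advance,
        # wrapping on the loop boundary.
        nonlocal counter
        pos = counter
        counter = (counter + 1) % loop_len
        return pos

    return tick
```

Dropping the modulo reset (or wrapping on an odd length) gives the "ragged", non-bar-length loops Peter says he prefers.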

  • Darren Landrum

    @Peter Dines: As it turns out, that basic idea was also implemented in the Yamaha AN1x virtual analog synth, way back in the 90s. They called it "scenes" and each patch had two "scene memories" that would store knob settings to morph back and forth between. In the end, it's really not a new idea, but I'm aiming to take it a step further and use the idea to replace the concept of envelopes and LFOs with something that I think can be a lot more flexible.

  • Peter Dines

    Ooh, neat! I had no idea something like that was already implemented in hardware. Yeah, that's a great idea to apply automation to morphing.

I had a look at your wiki article – I'm particularly interested in your idea of audio-rate keyframe morphing. Are you talking about modulating that at an audio rate, producing FM and AM effects (depending on the synth controls affected), or just having the modulation curves work at extra high sampling rate and bit depth for super smoothness and accuracy?

  • Darren Landrum

    @Peter Dines: I'll try to give a rundown in as few sentences as possible here. 🙂

    The audio-rate key frames would only apply to key frames that are set up as a function of time f(t). Yes, the idea is that they'll evaluate for every sample (in blocks, like all the other sample-rate stuff). What this would possibly allow is to run a function so fast that it outputs a signal that can be heard.

    So, to answer your question, yes. 🙂
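A minimal sketch of the block-wise, per-sample evaluation Darren describes, assuming a 48 kHz sample rate and hypothetical names. Cycled fast enough, the f(t) output lands in the audible range:

```python
# Evaluate a time-based key-frame function f(t) once per sample,
# in blocks, like any other sample-rate DSP process.

SAMPLE_RATE = 48_000
BLOCK_SIZE = 64

def render_block(f, start_sample):
    return [f((start_sample + n) / SAMPLE_RATE)
            for n in range(BLOCK_SIZE)]

# Example: a ramp looping at 220 Hz -- fast enough to be heard as a
# tone rather than as a parameter sweep.
def fast_ramp(t):
    return (t * 220.0) % 1.0
```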

  • poopoo

    Have you checked the meta-surface in audio mulch? It is a neat way of morphing between many parameters sets simultaneously. It handles morphing between many scenes instead of just two.

  • Darren Landrum

Unfortunately, this is where my communication skills break down. There are no "scenes" in my keyframing scheme. Each and every DSP parameter can have its own controlling function applied to it, and they all behave and operate independently of each other by default.

Of course, a controlling function, f(t) or f(c) (c for controller input), can also be controlled by another controlling function, but that's its own can of worms. I really need to update that article and add some diagrams, to see if I can clarify things. I really hate not being able to communicate what I can see clearly in my head.

  • ehdyn

    Similar to your concept.

    Beta and Presets just wrapped up on this one.

  • Darren Landrum

    That is indeed extremely close to the general idea, though I had a very different interface in mind. My idea also has a lot more flexibility in that changes in parameters can happen over a variety of curves (not just linear) and can be a function of more than just time. The trade-off is that my idea would require preset creators to be capable of more abstract thought processes.

    It's close enough that if they patent it, I'll be extremely pissed off.

  • O…M….G…..!!!!!!!!

amazing ensemble, Peter.
    this takes my performances to the next level.
    i modified it so it can automate 8 CCs at any given time.
    awesome to use with Absynth 5, and with the "MIDI program change" MIDI FX of Kore 2 i can switch between different automations.

    many many thanks!

    P.S.: is it ok if i upload it to the Reaktor user library? (i'll be sure to mention that the original was created by you)