It makes things look prettier, with something called “physically based rendering.” It has crazy compositing and capture powers. It’s networked with Web support, talks to DMX gear, and intelligently handles all your MIDI gadgets and capture cards and everything else. It handles VR with HTC Vive. And that’s just a few examples of what is new or improved in version 099.
TouchDesigner is a name you’ll see coming up regularly in projects. It’s an all-encompassing visual development environment, using a patching (or “dataflow”) metaphor, like Pd, Max/MSP/Jitter, vvvv, and others. Its interface is one of the best-looking zoomable patch environments, too — enough so that it’s sometimes been featured as part of artists’ performances. And it covers a range of OpenGL-based visual operations for generative graphics, video, lighting control, and other media, making it a favorite tool for live visual performers and installation designers. It just happens to be really good at these things.
TouchDesigner is, basically, one of those rare tools that pushes forward the state of visual expression and electronic performance. And now, after a long history of being Windows-only, it’s in experimental (but very usable) form on macOS, too – though it might get Mac users hooked enough on GPU power that it eventually sells them on new PC hardware.
099 is available now in beta – though it already seems stable enough to start projects and so on. (You might want to keep it away from that six-figure contract.)
That now includes even the non-commercial version – meaning you can try this for free if you’re not using it on paid projects. (099 was first released to the wild in October, but now includes the free version, and is nearing release with various updates.)
And wow, is it packed. So, there is a Mac version – details on that soon, including how it differs from the Windows version. Even leaving out some features that require specific Windows integration, it’s an unprecedented level of functionality and expression coming to macOS. But maybe just as important is everything 099 adds to the core tool, far beyond that cross-platform effort. That’ll be significant to new users (Mac and Windows) and vets alike.
Derivative (the developers, and talented artists themselves) go into detail on their site about what’s new – links / resources below. But here’s a quick overview:
Physically-based rendering. Imagine being able to play with substances and materials in a way that looks beautifully realistic – and that integrates into workflows alongside other tools supporting this method. New environmental lighting support takes full advantage of this, as well.
Compositing improvements. There’s a bunch of stuff here that adds to the powerful compositing toolbox (including noise-based tools), plus new high-performance capture on Windows.
Virtual reality. HTC Vive Development Environment support is the first step into high-performance VR – and involves some complex work behind the scenes involving multiple views (because of the perspective and interaction needed).
By the way, here’s an example of projection mapping using the Vive:
Web connections. Now you can render Web pages — which means you could even create controls with HTML5, too, as well as integrated Web sources in projects.
Capture, DMX, and more hardware integration. Expanded DMX support lets you use more LED lighting environments. There’s added native SDK support for devices like Bluefish and AJA (think 4K capture and output). There’s new motion tracking support, and MIDI improvements for the rest of us. Plus – Blackmagic, OpenVR, Oculus updates, etc. TouchDesigner somehow manages to remain at the forefront of all this.
Scales to high-resolution / high-density monitors. Mac users have a pretty easy time on their Retina displays, but on Windows you need to manage scaling to avoid tiny, tiny, tiny text – and Derivative have added complete integration for that. (Just in time for a new laptop to arrive here – great!)
More scripting. Mmmmm… Python. Python scripting gains various improvements, including OpenCV support for computer vision analysis. There’s also documentation to make this friendly to openFrameworks veterans / switchers.
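To give a flavor of the kind of per-frame analysis this enables, here’s a minimal, standalone Python sketch – plain Python only, no TouchDesigner or OpenCV required, and the function names are purely illustrative, not the TD or cv2 API. It computes the mean brightness of a grayscale “frame” and the change between two frames, the sort of values computer vision scripting typically feeds back into a network as control signals:

```python
# Illustrative sketch only: in a real TouchDesigner project you'd pull pixel
# data from a TOP and lean on OpenCV, but the analysis idea is the same.

def mean_brightness(frame):
    """Average pixel value of a 2D grayscale frame (list of rows)."""
    total = sum(sum(row) for row in frame)
    count = sum(len(row) for row in frame)
    return total / count

def frame_delta(a, b):
    """Mean absolute per-pixel difference between two same-sized frames."""
    diffs = [abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb)]
    return sum(diffs) / len(diffs)

if __name__ == "__main__":
    dark = [[10, 12], [8, 14]]
    light = [[200, 210], [190, 220]]
    print(mean_brightness(dark))        # 11.0
    print(frame_delta(dark, light))     # 194.0
```

A value like `frame_delta` is a crude motion detector: threshold it and you have a trigger you could map to anything in a patch.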
Jessica Palmer talks about using Python with TD:
Ableton Sync support. Not to be confused with Ableton Link, this is a proprietary, expansive system for coordinating visual sets with the sound cues in Ableton Live. So that includes not only timing information and synchronization, but scenes, cues, MIDI notes (and thus musical patterns), loops, and controller information. All the material in your set, in other words, can be the basis of live visuals. It’s even been tested on Amon Tobin and Plastikman tours. More details are in the wiki. (Correction: an early draft of this story very incorrectly identified this as Ableton Link support. I’d still like to see Ableton Link support, in that this means support for other apps and tools, not just Live. But that is really a solution to a different problem. I regret the error.)
And a lot more… Panels have been organized and cleaned up, tons of shader improvements, sample-based workflows, expanded geometry export, math and Syphon improvements, and lots of little details like – “multi-projection fisheye+cube+equirectangular, all with multi-picking.” Awesome, actually. Didn’t know I wanted that, but now I kinda do. Performance is improved, as well.
About that Mac version…
System requirements – you’ll need a decent GPU and, on Mac, macOS 10.11 or later (that’ll knock out some Mac users, of course)
Licenses begin free, and even commercial licenses top out at US$599. (The version that costs two grand features stuff that … well, if you need it, you can afford the two grand version, I bet.)