This is some Kodachrome-level color voodoo – color grading and shot matching powered by machine learning. And it comes from a collaboration with some friends of ours from the artist and live visual side, so it’s doubly worth mentioning.

What if the current techniques called AI turned out to be really important to creative artists – just not for the reason the general public expected? That’s sure what Colourlab Ai looks like. It harnesses the massive pixel-crunching that this generation of “AI” was actually built for, and applies it to making your video look amazing. For beginners, it makes color matching easier. For experts – right up to the Hollywood sorts who demand a ton of control and are not going to be satisfied with some presets – it makes the tool work faster, up against their deadlines, in the way they expect.
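(For the curious: Colourlab’s actual models are proprietary and far more sophisticated, but if you want a feel for what “matching one shot to another by crunching pixels” even means, here’s a minimal, hypothetical sketch of the decades-old statistical baseline – Reinhard-style mean/std transfer in Lab space – in Python with OpenCV. It is emphatically not Colourlab’s method, just the textbook version of the same problem.)

```python
# Minimal, illustrative shot matching: push one shot's color statistics toward
# a reference frame. NOT Colourlab Ai's algorithm -- just the classic baseline.
import cv2
import numpy as np

def match_shot_to_reference(shot_bgr: np.ndarray, ref_bgr: np.ndarray) -> np.ndarray:
    """Transfer per-channel Lab mean and standard deviation from ref to shot."""
    shot = cv2.cvtColor(shot_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)
    ref = cv2.cvtColor(ref_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)

    for c in range(3):  # L, a, b channels
        s_mean, s_std = shot[..., c].mean(), shot[..., c].std() + 1e-6
        r_mean, r_std = ref[..., c].mean(), ref[..., c].std()
        # Recenter and rescale so the shot's distribution lands on the reference's.
        shot[..., c] = (shot[..., c] - s_mean) * (r_std / s_std) + r_mean

    matched = np.clip(shot, 0, 255).astype(np.uint8)
    return cv2.cvtColor(matched, cv2.COLOR_LAB2BGR)

# Usage (hypothetical file names):
# graded = match_shot_to_reference(cv2.imread("shot.png"), cv2.imread("hero_frame.png"))
```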

That seems a lot better than using AI for nefarious plots like dystopian surveillance, or automating human choice out of music listening, or other horrid ideas.

Now, I took a look at Colourlab Ai before – and the collaboration that brought in renowned live visual artist and developer vade, and the software house Vidvox. (The latter are the fine makers of tools like VDMX.) I talked about why that collaboration was important, and how it’s a new model for how open source AI research can also lead to commercial products. (That’s important, too, because bills and buying food and stuff.)

But let’s skip ahead to today.

Not only is 1.0 here, but the developers have revealed their roadmap going forward.

Colourlab is a serious tool. So you get a free trial, then either a US$99/month subscription (good if you just want to try it on a job) or $499/year (so the subscription doesn’t eat too far into the money you’re earning).

What’s great is you get a lot of functionality right away – integrating with a ton of macOS tools – and a lot more coming, including a Windows beta. (Sadly, the Mac version arrives just as M1 and Big Sur are new and not quite ready for primetime, so it’s too bad the timing isn’t flipped – but then for Windows pros, April is at least when you might get out of quarantine in the northern hemisphere.)

I thought 1.0 would be fairly basic, though. It’s not. You already get 9 matching models and the ability to create custom ones, automatic matching by camera, and a ton of SDR and HDR settings here on day one.

The look that says, “I’m getting on my motorcycle and picking up my new Mac mini so I can do better color grades in Resolve.”

And this integrates with – well, most everything:

  • DaVinci Resolve 16 and 17 (with custom node treessssdsdsss I just drooled on my keyboard and I’m a music and synth person) – also on Linux and Windows
  • Final Cut Pro roundtrip (using proxies)
  • Premiere roundtrip (also with proxies)
  • Avid and Filmlight integration (via CDLs – see the quick CDL sketch after this list)
  • Frame.io support
  • Media relink
  • Export look decks as PDF

…and a lot more.
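Since the Avid and Filmlight handoff above travels as CDLs, a quick aside on what an ASC CDL actually encodes: just per-shot slope, offset, power, and saturation numbers. Here’s a small Python sketch of that standard math (the values are invented, and this obviously isn’t Colourlab code) – which is exactly why a grade expressed this way moves so cleanly between tools.

```python
# Illustration of the ASC CDL transform: out = clamp(in * slope + offset) ** power,
# followed by a saturation adjustment around Rec.709 luma. Values below are made up.
import numpy as np

REC709_LUMA = np.array([0.2126, 0.7152, 0.0722])

def apply_cdl(rgb: np.ndarray, slope, offset, power, saturation=1.0) -> np.ndarray:
    """Apply slope/offset/power per channel, then saturation, to float RGB in [0, 1]."""
    rgb = np.clip(rgb * slope + offset, 0.0, 1.0) ** power
    # Saturation: blend each pixel toward its Rec.709 luma.
    luma = (rgb * REC709_LUMA).sum(axis=-1, keepdims=True)
    return np.clip(luma + saturation * (rgb - luma), 0.0, 1.0)

# Example: warm the image slightly and desaturate a touch (hypothetical numbers).
frame = np.random.rand(1080, 1920, 3).astype(np.float32)  # stand-in for a decoded frame
graded = apply_cdl(frame,
                   slope=np.array([1.05, 1.00, 0.95]),
                   offset=np.array([0.01, 0.00, -0.01]),
                   power=np.array([0.98, 1.00, 1.02]),
                   saturation=0.9)
```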

Just as crucially, this is ready for a ton of cameras and codecs on day one.

Cameras: Arri, Blackmagic, Canon, DJI, Fujifilm, GoPro, Nikon, Panasonic, RED, Sony.

Codecs include RAW (REDCODE and Blackmagic), AVC and whatnot, ProRes, H.264, HDV, XDCAM… and the list goes on.

There’s some exciting stuff on the roadmap: AAF import in December; Arri RAW and import of timelines from FCP and Premiere in January, plus a Look Designer (a la the OFX plugin in DaVinci Resolve) and Canon RAW integration.

By February, you get Sony RAW support and full integration with Final Cut Pro.

By April, there’s Dolby Vision integration. And yeah, this means studio-grade stuff but also workflows that might involve your iPhone 12.

And finally, also in April, there’s the Windows beta.

This is either an illustration of a really beautiful color grade, or a metaphorical picture of how fast the whole thing is, or an instructional shot of what safe social distancing looks like in most of the USA in November 2020. Or all of those.

Look, far be it from me to interrupt slapping Oberheim synths with something from the video world, only to then make outrageous claims about the future of color. But seriously – people will look back and say, oh yeah, that’s when basically we changed how color works in video, because AI.

And not that AI started writing scripts or replacing actors – it was more that we got our color grading done and then went out and chilled together. This is good, because hopefully on the roadmap sometime later in 2021 we get to hang out with humans again.

When we are hanging out with humans, we’ll want to shut off these damn computers and just be with people, so getting color grades done more quickly has some appeal.

Seriously. Talk to me in December 2021 and let me know if I was right.

Meanwhile, I have a Mac mini with Apple silicon, so I’ll let you know how it works with this – and if that neural processing stuff in the Apple chipset benefits performance. Stay tuned, colorful people.