So, speaking of NVIDIA goodies and why they’re powerful for artists – Notch just added support for all those NVIDIA Broadcaster features covered here in the fall. It’s like a motion capture studio and green screen rig, without the studio or screen.

Notch just today dropped a video showing off why this is cool, which you should definitely watch:

And the software update hit, too. It’s unceremoniously dubbed 0.9.23.195. (I hear the kids now love lots of decimal places, though.)

But wow – if you’ve got an RTX GPU, you can take advantage of all of this:

  • NVIDIA Virtual Background support (AI-driven background removal, without the green screen)
  • NVIDIA AR Body Tracker (AI-powered 3D skeleton tracking using a 2D video feed – no depth camera required)
  • NVIDIA AR Body Tracker Skeleton (a dedicated node for working with that tracked skeleton data)
  • An interactive waterfall sample, a sample showing how to work this background subtraction into larger contexts, and templates for all the new features

Eat your heart out, Bruce Boxleitner.

It’s not just NVIDIA. You also get the latest Azure Kinect SDK support for even more tracking joy, and Kinect 1, 2, and Azure all get expanded Nodes for motion capture data processing and skeleton manipulation. (Azure Kinect Sensor SDK 1.4.1 and Body Tracking SDK 1.1.0 are included.)

That means if you can’t get your hands on a new RTX card, but do happen to have an old Kinect lying around and a PC capable of running Notch, this is still very useful.

This is on top of depth camera and Kinect features added late last year, plus a ton of other refinements for processing color, light, and other features and fixes. Check the changelog:

https://www.notch.one/releases/0-9-23/

But let me say this explicitly – it’s not just that Notch adds some kind of support for this stuff; it’s that it’s all natively part of the modular, node-patching interface of Notch itself. That lets you quickly experiment with building larger creations out of these tools.

And what that can do artistically is just astounding. In case you missed it, definitely check out what Defasten shared in the video he made for us back in February. It’s worth rewatching that video and imagining what you could do just by inserting yourself into a scene with these features.

With all due respect, I hope this eliminates a number of trends:

Firstly, I won’t complain if this kills the use of truly horrifically bad tracking as some kind of de facto art-y thing. It’d be nice to see bad tracking used intentionally, not because it’s the only thing anyone has working.

Secondly, and more importantly, the ability to transform our identity in virtual space suggests to me a chance to escape the panopticon of selfies and imposed “real” identity online.

Plus… I mean, we have these computers. Haven’t you wanted to be expressive and free of your own body basically since the first time you saw Tron as a kid? Sometimes there’s more on our insides than people see on our outsides.

Those reflections aside, Notch is awesome. Go be with Notch.

https://www.notch.one/

I don’t know about “the real-time graphics tool,” but it is definitely “a real-time graphics tool” and one to keep on your radar:

And speaking of the other real-time graphics tools, I’m eager to see where NVIDIA Broadcaster support (and the updated Microsoft SDKs for Kinect) pop up next.