Hey, that other engine starting with the letter ‘U’ is looking pretty amazing, too. Unity 2021.2 Tech Stream is here, with loads of artist-friendly features. And here you were worrying about what to do with your winter (southern hemisphere summer).
In fact, it’s all enough to make you do a double-take – yes, that’s Unity, not the certain other one. Since choice is great, I’m glad to see it.
This whole article is a great read, CDM Create Digital Motioneers:
What’s new for artists and designers in Unity 2021.2
There’s plenty here that could power expressive artistic applications. There are Terrain Tools with some beautiful new sculpting brushes, erosion, and material painting – plus updated vegetation powers – all of which could make you feel like your very own Slartibartfast. (He created Norway.) There’s gorgeous real-time global illumination that can shine sun across your scene – perfect for us here in northern Europe staring down winter – via Enlighten, plus other updated lighting pipelines and features, and lens flares.
Clouds – wonderful clouds and fog features are available, too, for the sky above all that new earth.
Using phone apps for virtual cinematography and face capture is simply amazing. I recently found myself, weirdly, watching a documentary on the trainwreck of – sorry, “making of” – Star Wars: Episode I The Phantom Menace. A lot of it features a very nervous-looking John Knoll, the effects legend behind ILM and Adobe Photoshop, trying to work out how to do all this virtual effects and mixed reality stuff. Now you can get a lot of it on your phone with free software and go totally punk with it – and you don’t have to answer to George Lucas, either.
It’s also now in Unity, much as we watch NVIDIA and Epic/Unreal chasing the same stuff:
Hey, let’s watch it in Indonesian, actually! (For my Indonesian-speaking friends, I couldn’t resist.)
There’s FBX export from the Face Capture app, and it all uses Apple ARKit – meaning you might use this even if you’re not a Unity user.
But maybe the most exciting and emotion-triggering feature here is the Visual Effect Graph, which creates powerful visual effects on higher-end platforms (PC, console, and more capable mobile graphics). (You still have the more mobile-friendly, lowest-common-denominator approach with the Particle System.)
It’s beautiful in 2D rendering, too, which I hope appeals not only to game creators but to artists as well. You can build your own shaders in Shader Graph and target VFX Graph with them, too. Upshot of that: “These shaders can also use new lighting models like HDRP hair or fabric, or they can modify particles at the vertex level to enable effects like birds with flapping wings, wobbling particles like soap bubbles, and much more.”
As it happens, my friend Lewis Hackman actually has his quick demo of this featured in the Unity blog – and it looks just gorgeous, even through some social media compression. It’s a perfect example, too, as it takes Craig Reynolds’ famous 1986 3D Boids simulation and remakes it in Unity VFX with eye-popping results (check Lewis’ other stuff, too, in Unity and TouchDesigner):
Check out the Visual Effect Graph documentation:
Dig deep and you get into other lighting and pipeline stuff – and hair. As I’d written before, this also means the growing benefit of some of the upscalers we’ve seen lately – NVIDIA’s Deep Learning Super Sampling (DLSS), AMD’s FidelityFX Super Resolution (FSR), and Unity’s homegrown Temporal Upscaler. These do what AI turns out to be really good at, which is not in fact selling you out on your space mission and throwing you out the airlock, or taking over your job as a writer and composer, or building a metaverse or something. No, AI is really, really good at upscaling rendered images in ways that just weren’t possible on home computers before but now are. (Expect more NVIDIA news in these parts soon.)
The fact that Unity has its own Temporal Upscaler also means you could see some of this stuff work on Apple’s platforms, too – since those won’t have NVIDIA or AMD chips in them going forward. I haven’t had time to dig into too much of that yet, but suffice to say, using a platform like Unity also means you let the engine deal with things like Metal optimization so you don’t have to think about it. (I’m simplifying, but that’s not an unfair description of why people look this way.)
But the part that has me intrigued – especially for quick sketches – is the ability to work with Graphics Buffers. Normally, you’d have to bake data for complex simulations into textures. Now, “simulations like boids, large data rendering, fluids, hair simulation, or crowds” can read data you’ve written from C# or compute shaders directly out of Graphics Buffers.
Or to make that simplification simpler – you can do fancy computation for some pretty eye candy without the previously required, counterintuitive code gymnastics.
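To give you the flavor of the per-agent math involved – entirely outside Unity, since the Graphics Buffer plumbing lives in the engine – here’s a minimal, illustrative sketch of Reynolds’ three boids rules (separation, alignment, cohesion) in plain Python/NumPy. All the function names and weights here are my own invention; in Unity, an update like this would run in C# or a compute shader, writing into a Graphics Buffer for VFX Graph to read.

```python
# Minimal boids step: Reynolds' three rules, applied to flat arrays of
# positions and velocities -- the same shape of data a structured
# Graphics Buffer would hold. Weights and radius are illustrative.
import numpy as np

def boids_step(pos, vel, dt=0.1, radius=2.0,
               w_sep=1.5, w_ali=1.0, w_coh=1.0):
    """Advance N boids one timestep. pos and vel are (N, 3) arrays."""
    n = len(pos)
    new_vel = vel.copy()
    for i in range(n):
        offsets = pos - pos[i]                    # vectors to every boid
        dist = np.linalg.norm(offsets, axis=1)
        mask = (dist > 0) & (dist < radius)       # neighbors of boid i
        if not mask.any():
            continue
        # 1) separation: steer away from close neighbors
        sep = -(offsets[mask] / dist[mask, None] ** 2).sum(axis=0)
        # 2) alignment: steer toward neighbors' average velocity
        ali = vel[mask].mean(axis=0) - vel[i]
        # 3) cohesion: steer toward neighbors' center of mass
        coh = pos[mask].mean(axis=0) - pos[i]
        new_vel[i] += dt * (w_sep * sep + w_ali * ali + w_coh * coh)
    return pos + dt * new_vel, new_vel

# Two boids inside each other's radius, flying the same direction:
# both keep moving forward while steering symmetrically.
pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
vel = np.array([[0.0, 1.0, 0.0], [0.0, 1.0, 0.0]])
pos2, vel2 = boids_step(pos, vel)
```

Flat arrays like these map naturally onto structured buffers, which is exactly what makes the new workflow appealing: the simulation state never has to detour through a texture.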
And yes. VFX bubbles. Lots of bubbles.
Sorry, my brain goes… places. (Oddly, the YouTube uploader here seems to think this is some CIA mind-ops, as opposed to a kids’ program that got 2021’s climate change warnings right several decades early… though to keep us on-topic: you can also do mixed reality by projecting some stock footage on a wall and running a bubble machine in front of it, which looks surprisingly great – clever PBS!)