It was the dawn of the third age of audiovisual creation.* Yes, Unreal keeps adding eye-popping, ear-bursting media powers, both in its stable 4.2x branch and early-access UE5. That makes this free-to-use game engine a must for futuristic artists.
UE5’s MetaSounds getting tons of love
The team at Epic have been gradually building a full-on procedural, modular audio system into Unreal. It’s good enough that one VCV Rack developer very publicly shifted their development over to Unreal. (I’m really eager to see what Aria makes.)
Can this make more interesting sound effects? Yes. Can it make music? Yes. Could it make music that uses sound effect techniques to, for instance, create drums, and do it all in a 3D UI that’s unlike what you’ve seen before, and make it into an integrated, synesthetic audiovisual show? Well, that’s up to your imagination.
Here’s the official overview:
But for more, check out lead Aaron McLeran talking about their baby:
And there’s already an intensive community growing around this tool, sharing at an obsessive level. It’s clear some of these folks are totally music people. Here’s Dan Reynolds’ excellent series, for instance:
And, since we’re music people, can it MIDI? Yes, it can.
Back in stable land, there’s Unreal Engine 4.27. It’s got some stunning visual effects capabilities – enough so that even people who aren’t VFX artists suddenly look at the new camera system and think, hey, maybe I do want to get into that.
New in this version:
- Overhauled in-camera VFX
- Enhanced Virtual Camera system, building on 4.26 improvements.
- Live Link Vcam for iOS so you can drive the whole thing from your iPad, etc. (See it in the video. Yep, looks good.)
- nDisplay 3D Config Editor for multiple-display applications. This is being pitched with lots of xR examples, but it also opens up new installation and live performance scenarios. And multi-display was one thing Unreal frankly didn’t do so well at the start (since it was, you know, a game engine). Related – nDisplay Root Actor and multiple camera improvements.
- Multi GPU support. (Again, huge for live performances – and hey, I hear those are slowly coming back.)
- Color calibration, also essential for LED volume and xR.
- Remote control Web UI builder. Now, this is really cool – graphical creation of widgets you can use to control the engine from a tablet or laptop. So you can steer your Unreal rig from an iPad or phone – useful, again, for xR, installations, and performance.
- Level Snapshots
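That remote control layer isn’t limited to the Web UI builder, either – with the Remote Control API plugin enabled, the editor exposes an HTTP endpoint (port 30010 by default), so any script on the network can poke at engine objects. Here’s a minimal Python sketch of writing a property over that HTTP API; the object path is a hypothetical example – swap in one from your own level:

```python
import json
import urllib.request

# Default Remote Control HTTP endpoint (Remote Control API plugin enabled in the editor).
REMOTE_CONTROL_URL = "http://localhost:30010/remote/object/property"

def build_property_request(object_path, property_name, value):
    """Build the JSON body the Remote Control HTTP API expects
    when writing a single property on an engine object."""
    return {
        "objectPath": object_path,
        "propertyName": property_name,
        "propertyValue": {property_name: value},
        "access": "WRITE_ACCESS",
    }

def set_property(object_path, property_name, value):
    """PUT a property change to a running Unreal editor.
    Only works against a live editor with the plugin enabled."""
    body = json.dumps(build_property_request(object_path, property_name, value)).encode()
    req = urllib.request.Request(
        REMOTE_CONTROL_URL,
        data=body,
        method="PUT",
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    # Hypothetical object path -- replace with an actor/component from your level.
    print(build_property_request(
        "/Game/Main.Main:PersistentLevel.PointLight_0.LightComponent0",
        "Intensity",
        5000.0,
    ))
```

Point that at a light, a material parameter, whatever – and suddenly your phone (or your sequencer) is a control surface for the whole rig.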
All in all, this means you can grab an iPad, a machine with multiple GPUs, and go into an arena and shoot a virtual xR show (now, while lockdowns continue) or play for a big audience (next… year? in more places? safely even? we hope?).
Advanced users, you can check out that in-camera VFX toolkit and even an example project. But notice also things like traveling shot motion blur, for extra evidence Hollywood productions have been messing about with Unreal lately.
Plus more goodness:
- Faster light baking / GPU Lightmass
- Path Tracer for accelerated progressive rendering
- New built-in video codecs and compression, for free
- Enhanced USD and Alembic support
- More VR and AR templates
Oh yeah, and MetaHumans, too
The funny thing about MetaSounds is that it has briefly stolen some thunder from the other community darling of late, MetaHuman Creator – a free cloud-based app for making your own photorealistic human figures that you can rig in Unreal. We’ve somehow climbed up the other side of the uncanny valley, basically. Well, at least we can get to the level of what was until recently a high-end game cinematic, for free.
So, since I don’t particularly like how I look on camera – who’s with me in making a virtual DJ self to play the virtual gigs?
See you there. Or see your new avatar, that is.
There’s Houdini news, too, but let me not cram everything into one story. Keep on as you Create Digital Motion, y’all.
*Fine, yes, this is a reference to Babylon 5’s opening. It’s all part of my plan to bring new, younger audiences to CDM by consistently referencing 1990s stuff. I mean, maybe they heard it in the womb?