Unreal Engine’s sonic powers are extraordinary and free of cost – not only for games and audiovisual environments, but powerful enough to consider Unreal a music tool in its own right. The challenge is navigating its deep interface. The Sound FX Guy is one of the must-have learning resources for Unreal sound and music.
Here are some favorites – MIDI, creating soundscapes, exploiting the dynamic and spatial possibilities of the engine, and even using your mic as a game controller.
Side note: someday I may have to do a “nonviolent” game template that gets rid of the default of people with guns!
MIDI
If you’re ready to make the leap to Unreal Engine 5.4, the release is packed with new audio features. MIDI has long been a part of Unreal – you may have seen videos of folks plugging a MIDI controller keyboard into their Unreal setup and jamming. What’s new is the Harmonix plug-in I wrote up recently. So let’s start with a video from March on working with 5.4’s Harmonix plug-in – a few months late, yes, but I think it’s more stable now than it was then.
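Whatever tool ends up receiving the notes, the underlying MIDI math stays the same. Here’s a quick Python sketch (not Unreal code, just the standard conversions) of turning note numbers and velocities into frequencies and gains:

```python
def midi_note_to_hz(note: int, a4_hz: float = 440.0) -> float:
    """Equal-tempered frequency for a MIDI note number (A4 = note 69)."""
    return a4_hz * 2.0 ** ((note - 69) / 12.0)

def velocity_to_gain(velocity: int) -> float:
    """Map MIDI velocity (0-127) to a linear gain in 0..1."""
    return max(0, min(velocity, 127)) / 127.0

# A4 (note 69) is 440 Hz; going up an octave doubles the frequency.
print(midi_note_to_hz(69))    # 440.0
print(midi_note_to_hz(81))    # 880.0
print(velocity_to_gain(127))  # 1.0
```

Feed those values into an oscillator or a sampler voice and you have the skeleton of any MIDI-driven patch.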
Use a microphone
Audio input is always fun in Unreal – and (topic for another time) you’ll find plenty of sound-reactive ideas floating around, perfect for VJing or music videos. But if you want to get really advanced, try wiring up the mic to interaction – love this:
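If you want a feel for what’s happening under the hood, the usual trick is an envelope follower: smooth the mic’s amplitude into a control signal, then map that to any game parameter. A minimal Python sketch of the idea (my own simplified attack/release coefficients, not engine code):

```python
class EnvelopeFollower:
    """One-pole attack/release envelope follower over audio samples.
    The coefficients here are illustrative, not Unreal's."""
    def __init__(self, attack: float = 0.5, release: float = 0.95):
        self.attack = attack    # smoothing when the level rises (lower = faster)
        self.release = release  # smoothing when it falls (closer to 1 = slower)
        self.env = 0.0

    def process(self, sample: float) -> float:
        level = abs(sample)
        coeff = self.attack if level > self.env else self.release
        self.env = coeff * self.env + (1.0 - coeff) * level
        return self.env

follower = EnvelopeFollower()
# Feed a loud burst, then silence: the envelope rises quickly, then decays.
trace = [follower.process(s) for s in [0.9] * 20 + [0.0] * 100]
```

The resulting control value is what you’d wire to a door, a light, a character – whatever you want shouting at your mic to do.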
Spatial composition and design
There’s a lot you can do with space in Unreal that conventional DAW environments lack. Unreal is a fully native 3D interface rather than a virtual copy of a mixing board – something a lot of us have dreamt of for a long time.
What’s interesting is that you could use this for sound design in games and virtual environments – but you could also use it as a way of mixing and composing spatially, even if the end result is audio-only.
Experimental software has attempted that for years, but because Unreal is a game engine, it adds a greater degree of interactivity. You could use the engine’s attenuation and occlusion features to make really realistic things happen in space – or you could make something totally artificial, using that same space as an interactive score. [mind blown]
Let’s understand the basics first, starting with this video on making variations linked to distance:
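Conceptually, distance-based attenuation boils down to a falloff curve. A Python sketch of the idea (illustrative values, not Unreal’s actual code): full volume inside an inner radius, fading linearly to silence over a falloff distance, which is roughly what an attenuation setup lets you configure:

```python
def linear_attenuation(distance: float,
                       inner_radius: float = 200.0,
                       falloff_distance: float = 1000.0) -> float:
    """Volume multiplier for a listener at `distance` from the source.
    Full volume inside inner_radius, linear fade to 0 over falloff_distance."""
    if distance <= inner_radius:
        return 1.0
    t = (distance - inner_radius) / falloff_distance
    return max(0.0, 1.0 - t)

print(linear_attenuation(100))   # 1.0 (inside the inner radius)
print(linear_attenuation(700))   # 0.5 (halfway through the falloff)
print(linear_attenuation(2000))  # 0.0 (out of range)
```

Swap the linear ramp for a logarithmic or custom curve and you get the other falloff shapes these tutorials walk through.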
Then there’s the Soundscape feature introduced in 5.1, which has a ton of compositional possibilities:
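At its core, a procedural soundscape is a randomized scheduler deciding when each ambient element fires next. A toy Python sketch of that idea (the element names and timing ranges here are made up):

```python
import random

def schedule_soundscape(elements, duration, seed=0):
    """Return a time-sorted list of (time, name) trigger events.
    Each element re-fires at a random interval within its (min, max) range."""
    rng = random.Random(seed)
    events = []
    for name, (min_gap, max_gap) in elements.items():
        t = rng.uniform(min_gap, max_gap)
        while t < duration:
            events.append((round(t, 2), name))
            t += rng.uniform(min_gap, max_gap)
    return sorted(events)

ambience = {
    "wind_gust": (4.0, 12.0),  # sparse
    "bird_call": (1.0, 5.0),   # busier
}
for time, name in schedule_soundscape(ambience, duration=30.0)[:5]:
    print(f"{time:6.2f}s  {name}")
```

Layer in random pitch, volume, and position per trigger and you’re most of the way to a generative ambience, which is exactly the kind of thing these settings expose without code.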
Real-time volume control:
Actually, while we’re at it, his latest video on sound concurrency is worth investigating. (One thing about Unreal: fancy, new, sometimes unstable features coexist with much older ones, and they can even compete for attention. Part of why it’s useful to have a guide is that you might miss the old stuff while Epic hypes up the new shiny.)
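The concurrency concept itself is easy to sketch: cap how many instances of a sound can play at once, and pick a rule for which voice to steal when the cap is hit. A toy Python version of a “stop oldest” rule (a simplification for illustration, not Unreal’s API):

```python
class ConcurrencyGroup:
    """Limit simultaneous sounds; when full, stop the oldest voice.
    A simplified take on the 'stop oldest' style of resolution rule."""
    def __init__(self, max_count: int):
        self.max_count = max_count
        self.active = []  # oldest first

    def play(self, name: str):
        """Start a sound; return the name of any voice that was stolen."""
        stopped = None
        if len(self.active) >= self.max_count:
            stopped = self.active.pop(0)  # steal the oldest voice
        self.active.append(name)
        return stopped

group = ConcurrencyGroup(max_count=2)
print(group.play("explosion_a"))  # None (slot free)
print(group.play("explosion_b"))  # None (slot free)
print(group.play("explosion_c"))  # explosion_a (oldest stolen)
print(group.active)               # ['explosion_b', 'explosion_c']
```

Other resolution rules just change which voice gets popped: quietest, farthest, lowest priority, and so on.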
Assembling MetaSounds projects
Cool as Unreal Engine is, it’ll probably be utterly baffling at first to music users – even to a lot of experienced users of node-based environments. You just have to wrap your head around the layers of abstraction that make this tool work inside a game engine, which means different structures from those you’d find in tools like Max, Reaktor, or TouchDesigner.
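One loose analogy that may help (my analogy, not MetaSounds’ actual object model): a Patch is like a reusable function with inputs and outputs but no audio output of its own, while a Source is the top-level graph the engine can actually play. In Python terms:

```python
import math

def sine_patch(freq: float, sample_rate: int = 48000):
    """A 'patch': a reusable building block you compose into graphs,
    not something playable on its own."""
    phase, step = 0.0, 2 * math.pi * freq / sample_rate
    while True:
        yield math.sin(phase)
        phase += step

class Source:
    """A 'source': the top-level graph that actually renders audio.
    It owns its patches and mixes their outputs into the final block."""
    def __init__(self, *patches):
        self.patches = patches

    def render(self, num_samples: int):
        return [sum(next(p) for p in self.patches) / len(self.patches)
                for _ in range(num_samples)]

# Compose two reusable patches inside one playable source.
src = Source(sine_patch(220.0), sine_patch(440.0))
block = src.render(256)
```

The point of the analogy: you build a library of patches once, then wire them into whatever sources your game actually triggers.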
So here are some good breakdowns of how to think about Patches vs. Sources and more:
And on sources/routing/buses:
Putting it together
Here’s a fantastic look at assembling a whole system around all of this:
More:
https://www.thesoundfxguy.com/