Every once in a while, the kind of cool thing you’d imagine actually happens. Harmonix, the music game dev, is still thriving at Epic, and they’ve got a surprise. They’ve built a full-blown modular synth that runs inside Fortnite, thanks to Unreal Engine and MetaSounds. It can do everything from letting creators add their own interactive music engine to powering multiplayer music, concerts, and more.

Ryan Challinor, Principal Designer at Harmonix, writes to tell us this talk just went online from Unreal Fest. He’s onstage with fellow Harmonix vet Zoe Schneider, Epic Marketing Manager. Unreal Fest had the unfortunate timing of arriving alongside big cuts at Epic, following major cutbacks across tech and media this year. But Unreal Engine and Fortnite (and particularly Fortnite’s creator ecosystem) remain strong points for the company. And it’s encouraging to see creative ideas like this enduring.

(and yes, that’s the same Ryan Challinor who’s behind the desktop modular, Bespoke!)

The basic notion here is, you get a collection of instruments that you can use in a modular performance rig. It’s Meta-MetaSounds – everything is built with the guts of MetaSounds, Epic’s interactive music and sound engine, inside, but presented as friendlier, high-level instruments, effects, and basic audio routing and triggers.

Included in the launch selection:

  • Note sequencer, progressor, trigger
  • Drum sequencer, player
  • Omega synth
  • Instrument player
  • Echo and distortion effects
  • LFO and step modulation
  • Value setter
  • Speaker 
  • Music manager

At first I thought this was just some fun musical instruments to play around with inside Fortnite, but they show a number of use cases:

  • Yes, a modular rig / studio rig you can play in-game, but also —
  • Dynamic music and sound for creators
  • Virtual concerts with fully interactive, musically-synced animations
  • Real-time multiplayer music
  • — and this being Harmonix, music-synced gameplay

Yeah, sure, Harmonix are the folks who brought the world Rock Band (and Guitar Hero, before they sold it), but a lot of us remember them for Frequency and Amplitude, the fast-paced action games for PS2. And so this is wild: now you can basically make your own game like that just by putting together some blocks. Given the wild stuff people made with Super Mario Maker, it’s sort of mind-boggling imagining what Fortnite creators could do with this.

It holds up!

Okay, well, in a retro way! (Should be about time for everything in that video to come back into vogue.)

The modular interaction demos in Patchwork are really reminiscent of PatchWorld, PatchXR’s independent, standalone environment built for Meta’s Quest and other VR platforms. PatchWorld has a bigger library, too, and its unique approach is to use VR itself as the dev environment. (See below.) I think Patchwork could even give PatchWorld a lift, if the concept is catching on.

The unique draw in Patchwork/Unreal is the ability to run what you make live in Fortnite – taking advantage of Fortnite’s existing audience, interactions, and multiplayer facilities, plus the powerful functionality of Unreal Engine and MetaSounds. You can even play your musical modules directly inside Unreal Editor for Fortnite (UEFN). (UEFN is a special version of Unreal Editor made for Fortnite creators, giving them access to Epic’s platform-wide developer features.) That makes for great quick prototyping, because you get drop-in musical engines with high-level features even if you’re not adept with MetaSounds – and the full modular capabilities of MetaSounds are available should you want them.

And crucially, all of MetaSounds’ usual interactions between engine and game work here. You can drive gameplay elements from sound and musical triggers – and sound and musical triggers can be driven by gameplay.
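To make that two-way coupling concrete, here’s a minimal sketch in plain Python – not the MetaSounds or UEFN API, just the general pattern – of a trigger bus where a musical event (a beat) drives gameplay, and a gameplay event (a pickup) fires a sound in return. The names `TriggerBus`, `beat`, and `pickup` are illustrative assumptions, not anything from Epic’s toolset.

```python
# Conceptual sketch (plain Python, NOT the MetaSounds API): a two-way
# trigger bus where musical events drive gameplay and vice versa.

class TriggerBus:
    def __init__(self):
        self._handlers = {}  # event name -> list of callbacks

    def on(self, event, handler):
        self._handlers.setdefault(event, []).append(handler)

    def fire(self, event, **payload):
        for handler in self._handlers.get(event, []):
            handler(**payload)

spawned, stingers = [], []
bus = TriggerBus()

# Music -> gameplay: spawn an obstacle on every downbeat (every 4th beat).
bus.on("beat", lambda beat: spawned.append(beat) if beat % 4 == 0 else None)

# Gameplay -> music: picking up an item fires a one-shot musical stinger.
bus.on("pickup", lambda item: stingers.append(item))

for b in range(8):
    bus.fire("beat", beat=b)
bus.fire("pickup", item="coin")

print(spawned, stingers)  # [0, 4] ['coin']
```

In the real thing, the “handlers” on either side are MetaSounds graphs and Fortnite gameplay logic; the point is just that triggers flow in both directions through one routing layer.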

There’s even a potential ecosystem here for instrument and effect builders and whatnot, as you can wrap any MetaSounds creation into a Patchwork interface.

A lot of the expanding power of MetaSounds, thanks to Epic’s audio team, makes this work: UE 5.3’s ability to dynamically construct MetaSounds at runtime comes into play here, for instance. Existing multiplayer and sound capabilities tie it all together, and as Ryan notes, that means you get multiplayer “for free” – with control signals sent over the network and all audio generated locally on the fly. I’d always eyed Fortnite’s multiplayer features and SDK and wondered if they’d work well for music – now, here’s a great illustration.
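The “control signals over the network, audio rendered locally” idea is worth sketching, because it’s what makes networked music cheap: each client receives the same tiny note events and synthesizes identical audio on its own. This is a toy illustration in plain Python under my own assumptions – Epic’s actual netcode and MetaSounds rendering are of course nothing like this – but the bandwidth math is the point.

```python
# Conceptual sketch (NOT Epic's netcode): multiplayer music where only
# compact control events cross the network, and every client renders the
# audio locally and deterministically from the same event stream.

import json, math

def render(events, sample_rate=8000, tick_seconds=0.01):
    """Deterministically render note events to audio samples on a client."""
    out = []
    for ev in events:
        freq = 440.0 * 2 ** ((ev["note"] - 69) / 12)  # MIDI note -> Hz
        n = int(sample_rate * tick_seconds)
        out.extend(math.sin(2 * math.pi * freq * i / sample_rate)
                   for i in range(n))
    return out

# The "network": each note is a tiny JSON message, not an audio stream.
wire = [json.dumps({"note": note, "step": step})
        for step, note in enumerate([60, 64, 67])]  # a C major arpeggio

# Two clients decode the same messages...
events = [json.loads(m) for m in wire]
client_a = render(events)
client_b = render(events)

# ...and produce bit-identical audio, with only a few dozen bytes sent.
assert client_a == client_b
print(len(wire), "messages,", sum(len(m) for m in wire), "bytes on the wire")
```

Sending the control layer instead of audio is the same trade-off MIDI made decades ago – and it’s why existing game replication machinery can carry a jam session without any audio streaming at all.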

Let us know if you’ve got questions; a blog post with more is coming next week. You’ve got one of the very first glimpses outside Unreal Fest.

Previously – on the capabilities of PatchWorld and multiplayer music in VR:

https://cdm.link/2023/07/patchworld-multiplayer-vr-music-integrates-with-ableton-live/

And there’s more on MetaSounds features to come as I dig through the Unreal Fest sessions. Until then:

https://cdm.link/2023/02/learn-modular-metasounds-in-unreal-free/