Will the Unreal Engine 5 Realize the Metaverse’s Potential?

Epic Games’ new, real-time, photorealistic renderer reaches well beyond just games


Unreal Engine 5’s MetaHuman can be used to create photorealistic avatars, such as this feature demonstration from The Matrix Awakens.

Epic Games

The race to build the metaverse is on—but first, developers need the tools to do it.

Unreal Engine 5, released on 5 April, is among the leaders. Its creator, Epic Games, is best known for its megahit game Fortnite, but the company’s engine expertise has deep roots. The engine spun off from the 1998 shooter Unreal and seized an early lead when Epic licensed it for use in over a dozen games from 1999 to 2001.

“I think Unreal Engine 5 is a great step forward for the metaverse and augmented reality/virtual reality, especially when you consider that a lot of what the company has been building is thinking about the metaverse and AR/VR,” said Anshel Sag, principal analyst at Moor Insights & Strategy.

After all, immersion is the name of the game: real-time, photorealistic rendering that makes the experience as all-encompassing as possible.

To that end, Unreal Engine 5 (UE5) packs in dozens of features and refinements, but two take the spotlight: Lumen and Nanite.

Lumen handles global illumination, including diffuse reflections with potentially infinite light bounces. If you think this sounds like ray tracing, you’re right: Lumen performs it in software, which reduces accuracy, but early demos show results that hold up well at a glance. Nanite, meanwhile, is a virtualized geometry system that can alter polygon counts in real time to meet a performance target, helping developers achieve a level of optimization not previously possible.
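Nanite’s actual algorithm is far more sophisticated and proprietary, but the core idea of trading geometric detail against a frame-time target can be sketched as a simple feedback loop. The `GeometryScaler` struct, its gain, and its clamp limits below are illustrative assumptions, not Epic’s API:

```cpp
#include <algorithm>
#include <cstdint>

// Illustrative sketch only (not Epic's Nanite): scale a per-frame
// triangle budget up or down so measured frame time converges on a
// target, the way a virtualized-geometry system trades detail for
// performance.
struct GeometryScaler {
    double targetFrameMs;   // e.g. 16.6 ms for 60 frames per second
    std::int64_t budget;    // current triangle budget

    // Proportional controller: shrink the budget when a frame runs
    // long, grow it when there is headroom.
    void update(double measuredFrameMs) {
        double ratio = targetFrameMs / measuredFrameMs;
        // Damp the correction to avoid oscillating between extremes.
        double damped = 1.0 + 0.5 * (ratio - 1.0);
        budget = static_cast<std::int64_t>(budget * damped);
        budget = std::clamp<std::int64_t>(budget, 100'000, 50'000'000);
    }
};
```

Under this toy controller, a frame that takes twice the target time cuts the triangle budget by a quarter; a frame with headroom grows it, and the renderer would then pick mesh detail to fit the new budget.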

[Video: “What is virtualized micropolygon geometry? An explainer on Nanite” | Unreal Engine 5, www.youtube.com]

“In my opinion, Nanite could be a vital component to helping VR experiences expand in scope for high-end VR, such as [PC-tethered VR a.k.a. PCVR] or [Sony’s] PSVR 2, if it can help us to add more detail and scale to our worlds,” Alistair Hume, cofounder of metaverse developer Axon Park, said in an e-mail.

These headline features are complemented by MetaHuman, a tool for generating photorealistic avatars; World Partition, which divides world assets into groups for improved optimization; and Quixel Bridge, which imports objects from Megascans, a database of high-fidelity 3D assets.

Epic has leaned into metaverse hype by releasing The Matrix Awakens: An Unreal Engine 5 Experience alongside the film The Matrix: Resurrections. The demo delivers a virtual world as realistic as that in the first Matrix movie yet runs in real time on modern home consoles and PCs. UE5 includes the demo’s city as a free sample any developer can download—which has already led to a viral hit inspired by Superman.

You could be forgiven for thinking UE5 is ready to dominate AR/VR and metaverse development. This notion is reinforced by the company’s other metaverse efforts, which include a partnership with the LEGO Group focused on a kid-friendly metaverse experience.

Despite this, Lumen and Nanite do not support VR development at launch, and Epic has no timeline for their release. Developers hoping to use them must take a wait-and-see approach.

“[Extended reality or XR] developers need as much performance as they can get, and not being able to take advantage of Lumen or Nanite is challenging,” said Sag. “Unreal Engine is already a great engine for XR, but the lack of its flagship new features may turn off some developers.”

Epic thinks developers will look past the omission. “VR developers can leverage most of UE5’s production-ready tools and features now, such as the new [user experience], the new suite of modeling tools, creator tools such as Control Rig and MetaSounds, and World Partition for large open environments,” an Epic spokesperson said in an e-mail.

Developers have several alternatives, but one stands out: Unity. Though less frequently used by big-budget releases, Unity is the world’s most popular game engine by market share.

Zenith: The Last City is a massively multiplayer VR game built with the popular game engine Unity.

Ramen VR

Unity powered 2016’s AR hit Pokémon Go and Beat Saber, a 2018 VR game that’s still among the most popular for Oculus headsets. More recently, Unity has touted the success of Ramen VR’s Zenith: The Last City, a VR game with an open multiplayer world. Zenith highlights the strength of Unity’s Asset Store, which offers over 60,000 assets for download. Ramen VR used store assets to build Zenith with a team of fewer than 10 developers.

It’s unlikely either engine will gain a permanent upper hand. The metaverse is just one of several frontiers for game engines; others range from product prototyping to CGI and 3D animation for television and film. In short, both UE5 and Unity have room to grow—and that means metaverse development will be just one of many tasks each engine supports.

“I think both Epic Games and Unity 3D realize that the broader their engines can be applied, the more scale they will be able to drive,” said Sag. “Thinking of them as just game engines is a 5-year-old mentality.”
