Inside The Matrix Awakens: a vision for the future of real-time graphics
For Digital Foundry, the highlight of The Game Awards wasn’t actually an award as such or even a massive triple-A reveal but rather the debut of The Matrix Awakens: An Unreal Engine 5 Experience. Sandwiched between CG trailers, its impact may not have been appreciated during the event, but once downloaded onto your console, it’s clear that this is a genuinely important moment in real-time rendering. What Epic Games and partnering studios including The Coalition have achieved is the closest we’ve seen to an interactive motion picture, delivering new levels of fidelity in character realisation, environmental rendering, lighting quality and post-processing. If you own a PlayStation 5 or Xbox Series console, you owe it to yourself to check this out – particularly if you’re feeling the effects of cross-gen fatigue.
However, Epic’s objectives with this demo are many and varied, beyond the obvious visual spectacle. We spoke to key members of the firm’s special projects team to gain a better understanding of the full significance of this landmark release, and discovered that the group has some serious pedigree. It made the original Reflections demo for the Nvidia Turing launch, using Star Wars: The Force Awakens assets to demonstrate hardware-accelerated ray tracing. It’s the same group that worked on Lumen in the Land of Nanite – the stunning real-time demo that showed the world that the new consoles were capable of so much more. Valley of the Ancient followed, emphasising the quality delivered by the Nanite micro-polygon system and enhancements to Lumen-powered global illumination, and was made available on PC. This team also worked with Lucasfilm, creating the remarkable LED wall used in the production of The Mandalorian’s virtual sets. For this talented group, The Matrix Awakens is a chance to validate Epic’s technology on the kind of mass scale we’ve not seen so far. It’s about wrapping the technology up and delivering it to the new generation of mainstream gaming consoles.
“Let’s just release it, let’s release it to the public, right?” says Jeff Farris, the Technical Director of Special Projects. “[Let’s allow] developers and customers to put their hands on it and hold ourselves accountable for making sure that this tech is 100 percent ship-ready. And that’s what we pushed through to make sure that yes, you can put this on real world hardware – this level of graphics and this level of interactivity is achievable.”
“We didn’t want to see a YouTube comment that says, oh, it’s running on a massive PC, it’s not a real PS5,” adds Jerome Platteux, Art Director Supervisor. “Yes, it’s on every next-gen console, including Xbox Series S.”
The story of how this collaboration happened is fascinating, and indicative of how rendering in the gaming and motion picture spaces is converging. Epic’s CTO, Kim Libreri, came to the company after a highly accomplished career working on major Hollywood pictures including the Matrix trilogy, and kept in touch with Lana Wachowski. This opened the door to Epic gaining access to the IP and assets. And maybe that’s how The Matrix Awakens manages to deliver a shot-for-shot real-time remake of the classic scene where Neo is awoken by Trinity reaching out to him via his computer. It’s certainly how the iconic bullet-time scene is recreated in real-time – Epic had access to the original assets and actually increased the resolution of Neo himself, with UE5’s Chaos physics engine used to accurately map the slow-motion movement of his coat. It’s also how the team were able to recreate the original Neo and Trinity, even as they custom-scanned today’s Keanu Reeves and Carrie-Anne Moss for integration into Epic’s MetaHuman character rendering system.