Building the Open Metaverse

Moviemaking in the Metaverse

Kim Libreri, formerly of Lucasfilm and current Chief Technology Officer at Epic Games, joins hosts Patrick Cozzi (Cesium) and Marc Petit (Epic Games) to discuss moviemaking, world-building, and VFX technology in The Matrix films and beyond.

Guests

Kim Libreri
Chief Technology Officer, Epic Games


Announcer:

Today on Building the Open Metaverse.

Kim Libreri:

A totally, fully simulated world, say the whole of America or the whole planet? There are still challenges. That demo runs on a single computer, gameplay and all. We're doing something that most games don't do in that demo, in that all of the AI characters and vehicles are asynchronously simulating continuously.

So, that's why you see on YouTube all these traffic jams that happen in the demo, because a traffic jam will propagate and just get worse and worse and worse until some lucky event happens with an AI car and it just clears itself. But if you wanted to do that with millions of vehicles and millions of people, then we need to start thinking about, "Well, how do you program AI and simulation across multiple computers?" Because you can't possibly hold it in one computer.

Announcer:

Welcome to Building the Open Metaverse where technology experts discuss how the community is building the open Metaverse together. Hosted by Patrick Cozzi from Cesium and Marc Petit from Epic Games.

Marc Petit:

Hello everybody. And welcome to our show, Building the Open Metaverse podcast where technologists share their insight on how the community is building the Metaverse together. Patrick, hello. How are you?

Patrick Cozzi:

Hi, Marc. Doing great. Just not my usual voice today.

Marc Petit:

That's unfortunate. Patrick Cozzi, our co-host. And today, we have a very special guest, and full disclosure, a colleague of mine and a longtime friend of mine. We're super happy to welcome Kim Libreri, the CTO of Epic Games, to the podcast. Welcome Kim.

Kim Libreri:

Hey Marc, hey Patrick. How's it going?

Marc Petit:

Good. So, I'll do a quick intro and then I'll ask you to describe a little bit of your journey to the Metaverse. You've been CTO of Epic Games for more than seven years now. Prior to that, you were driving technology at Lucasfilm. And before that, among a number of other things, you led the technology and VFX supervision on the three original Matrix movies.

Marc Petit:

So, a pretty impressive and busy career. Thanks for being there with us to share your vision on the Metaverse. So, Kim, let's start by having your own version of your journey through CG and into the Metaverse.

Kim Libreri:

Okay. All right. So, I went to the University of Manchester. Graduated in '89 and was super into computer graphics. That's what I specialized in during my last year of university. And at the time, in '89, when you looked at video games... I used to play lots of video games. In fact, I had an Atari 800. Actually, it would've been an Atari ST by that period, which I taught myself to program on as a kid, like many of us old-timers did.

Kim Libreri:

I had to make a choice about my career. I knew I wanted to do computer graphics. And the challenge at the time was that video games, which I loved, were just nothing; there was nothing really happening in real-time CG at the time. So, I'm like, "Well, maybe I'll go and work on movies." And I finished university, did the usual thing of taking a year out of school, going to Australia, and then finding a job. And when it came to finding a job, the first movie job that I was able to get was at a small company where I first met Marc many years ago, called The Computer Film Company, where they were making films using computers, which was a novel thing at the time.

Kim Libreri:

There were very few places. There were a couple of places in California, maybe on the east coast, and, obviously, the famous ILM was doing amazing computer graphics at that period. So, I started there and just got lucky over time, worked on more and more movies. Initially as a software engineer, and gradually I ended up getting into the creative side of things as a visual effects supervisor, mostly for the reason that... In the early days, if you wanted to pioneer, it was pretty obvious that the visual effects supervisors on movies were the ones that called the shots in terms of what technology you could use.

Kim Libreri:

So, eventually, I got a job offer to come to California to work on a movie called What Dreams May Come. And then, straight after that, we all, myself, John Gaeta, and a bunch of my close friends that I'm still very much in touch with today, worked on a movie called The Matrix. Everybody knows what the movie's about. In fact, there's a connection to Marc there, because Softimage, where Marc was working at the time... Softimage and Mental Ray were at the core of the original Matrix movie. And we used them extensively for the bullet time shots in Matrix one.

Kim Libreri:

But what happened with those movies... We made all three of them, not totally back to back; there was a break of about a year between Matrix one and Matrix Reloaded. The core philosophy was, "Hey, let's build visual effects the way that programmers in the future would build a virtual universe." And the Matrix itself is kind of a Metaverse, in terms of it being a simulated world, using computer graphics to make you believe that you're actually looking at something that's real instead of it being fake. And part of that project was digital humans, computer-generated fire, computer-generated clouds, digital cities. Loads and loads and loads of breakthrough ray tracing.

Kim Libreri:

I think we were one of the first shows to use a ton of ray tracing in a movie. And it just stuck with me: how do you use a computer to build a photoreal rendition of the world? And that stayed with me all through my career. I did movies for another decade or so, and then eventually, thanks to my friend, Paul Meegan, I managed to get a transfer from ILM, where I had been working, into LucasArts, and LucasArts was the video game division of George Lucas's Lucasfilm. And we just tried to do all the stuff we'd done on movies in video games.

Kim Libreri:

And that's what brought me to Unreal Engine and understanding its capabilities... I think it was Unreal Engine 3 that we were using at the time. And eventually, you start to see that there's going to be a revolution in the entertainment industry, where the stuff that you used to make video games is going to be used for movies, for designing cars, for all sorts of industrial capabilities, beyond just making cool video games that we love to play. And I'm like, "I really want to be part of that. And I think that being a part of Epic..." Especially as they were about to think about going free with the engine, "Would be a really cool journey." So, that's what brought us into our Metaverse.

Kim Libreri:

During the interview with Epic, my wife, Sandra... We came back to North Carolina to check the place out before I took the job offer and played a game that they were working on called Fortnite. And Sandra was like, "Wow, this is going to be awesome. It's going to be the best thing ever." And I'm like, "You know what? It is pretty cool." Who would have known that years later, it would start to evolve into this massive, socially connected gaming space where all sorts of experiences are possible.

Kim Libreri:

So, that's my story of how I got into this world of the Metaverse. This year we did this Matrix Awakens demo, as you're all familiar with. And that was about coming full circle: how do we take the things that we made in the first, second and third movies and show the world that they can happen on a video game platform, and actually look better than they did 20 years ago when we made them for the movies? So, that was the whole full circle.

Marc Petit:

Before we dig deeper on The Matrix and The Matrix Awakens, can you talk to us about Star Wars 1313? I think that was a big moment as well. At least for me, it was a big eye opening moment that you are responsible for.

Kim Libreri:

So, what happened is that Paul Meegan was running LucasArts at the time. LucasArts had gone through a few iterations. Companies are constantly reinventing themselves, and they really wanted to make a splash to re-announce that they were back in triple-A game development; they were going to work on two games. One of them was going to be a Star Wars action-adventure game. And Paul said, "Hey, do you want to come for a sabbatical at LucasArts and see what you can do?" And I was introduced to a fantastic team.

Kim Libreri:

My friend Roger Cordes and Lutz Latta and the rest of the team that were basically making that game were originally planning to do a gen-three game, I guess it was a gen-three, PlayStation 3 game. And we knew that the new PlayStation 4 was going to come out, and we were like, "Shall we try and see if we can take all the lessons that we've learned making computer graphics, both when I was on The Matrix and also what ILM had done over the years, and bring together a hybrid team?"

Kim Libreri:

So, we did this prototype demo that went to E3 and won all those awards that you're talking about for 1313. And it really did show the way... Hayden [Landis], who invented the... Hayden and Hilmar [Koch], who developed the ambient occlusion techniques at ILM, they were part of that project. So, we really ported a lot of the ILM shader technology and the philosophies of how you make things look real. Turntable renders, calibrated lighting environments, all the things that you'd normally use to make a movie look great, we tried to apply to making a video game, and the result was that demo.

Marc Petit:

Yeah. And Patrick, that was a big moment for me, because after spending all my career in the offline world, looking at the demo that came out of Star Wars 1313... When I got the phone call from Epic, from Kim, that's why I said, "Yeah, I believe we can get all of this stuff working in real time, and if someone can do it, it's him." And so, as an aside, that's also why I joined Epic.

Kim Libreri:

We had an awesome team. In fact, a lot of that team are part of ILMxLab. So, when you see their cool VR experiences... I think they worked on the Millennium Falcon thing that's at the park. So, they're a pretty killer team.

Patrick Cozzi:

First, Kim, it's such an honor to have you here. I really love your passion, and I love all the work that you're doing to take what was once movie rendering and make it real time. I'm very much a believer that game engines aren't just for games, they're for all things 3D, and I love that you're pushing the limits of real-time 3D. So, The Matrix Awakens, I have to say, that's the best real-time 3D experience or demo that I've ever seen. Just the scale, the rendering quality, the interactivity. We'd love to hear a bit about the backstory of making it, and also some of the biggest technical challenges.

Kim Libreri:

Okay. All right. So, the backstory. We talked a little bit about how trying to do what we'd done in the original Matrix movies in real time has been a driving theme in my career and the team's career. And in fact, many people that worked on Matrix Awakens actually worked on the Matrix movies. Jerome, who's our art director and basically put the piece together, worked on the Architect scene in Matrix Reloaded and Revolutions.

Kim Libreri:

Anyway. So, how did it start? So, obviously, I'm good friends with Lana Wachowski and John Gaeta. And we ended up going to dinner one night, and she tells us that she's going to make another Matrix movie. And part of it was trying to get the band back together in classic Lana fashion. And I'm like, "Look, I can't make the movie for you. We can help, because the engine can do all this cool stuff now for visual effects." And thankfully, Lana believed in that, and we were able to have Dan Glass and Double Negative run with making part of the movie with Unreal Engine.

Kim Libreri:

But I'm like, "How about this? We'll help on the movie a little bit, but how about you let us make a demo? Every year we do these cool tech demos to show off Unreal Engine. We have the latest version, Unreal Engine five. And Tim's been asking us to... Can we try and show the way forward for Unreal Engine users how to make a big open world, a city level simulation. And the Matrix, what cool a way to talk about the emerging Metaverse, the photo realism of virtual environments, a living, breathing city than the Matrix?" And Lana was "Sure, that sounds fantastic."

Kim Libreri:

Obviously, we had to work with our friends at Warner Brothers. They're one of our biggest customers in the gaming segment; almost all their studios run on Unreal Engine. And they were super excited to help out and bless the project. So, we started, and the technical challenges... It's a new version of the engine. A new version of the engine isn't like the version of the engine that our customers use; it's one that, quite often on a daily basis, is probably not working exactly the way you would want it to. So, the real challenge was working with our awesome engine team to make sure enough functionality was coming online that we could actually build this massive, large-scale environment.

Kim Libreri:

And I've got to hand it to the engine team and the gameplay team and the art team; there's quite a lot of complex sequencing there. So, being able to deal with, how are we going to build this procedural city? How do we integrate our Houdini workflows into the engine while the engine is still a baby? It's not grown up yet. That was quite a challenge at the time. But in terms of the really hard stuff, I think it was trying to work out, how do you build a city with a relatively small team? How do we use proceduralism? How many building blocks?

Kim Libreri:

It was a really big journey of discovery. And although the team that built the demo, the special projects team, and not just special projects, people from all over Epic were involved in the project, had made a game before, we made Robo Recall, we'd never built a large-scale world of this size, and it's not a huge team. So, at the core of it, we really wanted to show, "Hey, how do we get these two tools to work together, Houdini and Unreal Engine? How do we put the right amount of proceduralism in the engine? How do we use Nanite and Lumen to their best capabilities?"

Kim Libreri:

So, there were a lot of challenges there. The other elephant in the room is that we have two very famous actors in the demo, and making sure that we did not disrespect their performances, or put the demo in a weird, uncanny valley, was quite challenging. And thankfully, we've got an amazing team, 3Lateral and Cubic Motion, that were able to work with our local character team here in the Lockesburg office. And then, the other bit of it was we really wanted to ship. So, making it run on an Xbox Series X, Series S, and PlayStation 5 was quite the challenge.

Kim Libreri:

But it's super authentic to our audience. It's like, "Look, it's actually running on our consoles. It's not running on a supercomputer." So, we felt it was important. I think that summarizes the majority of it. The physics system was pretty heavily used; there were a lot of upgrades there. You name it, there's a feature in Unreal Engine 5 that's used in this demo.

Patrick Cozzi:

And how many developers and how many artists over what time period?

Kim Libreri:

The artist count is... It's actually difficult to come up with a number, because we did a bunch of outsourcing. We worked with our friends at Weta VFX to do some of the building modeling and a little bit of the character simulation stuff. We had our friends at Halon work on it. So, I think the total number of artists that touched this demo would've been around 50 or 60 in total, but not continuously.

Kim Libreri:

The special projects art team is actually pretty small. And then developers, well, it's a brand new engine. So, you could argue that the entire engine team was involved, because they're working hard on UE5. And we don't just ship an engine where, "Okay, there's the code. It's complete. Out it goes." They're able to watch the artists and gameplay people and gameplay engineers making this demo and go, "Is it working? Does it work right? Is it too easy? Do we need to move the buttons around?"

Kim Libreri:

So, really, a large proportion of the engine team. So, it probably ends up being... The number of engineers that would've touched this demo would've probably been more than 100 people. But they're not dedicated to it; they're working on the engine, but it's, "Hey, this building doesn't load quickly enough, what's going on?" "Oh, let me optimize my code for streaming geometry." Or whatever they would be doing.

Patrick Cozzi:

Got it. And then, you mentioned the upcoming micropolygon renderer in Unreal Engine 5, Nanite, as well as the dynamic global illumination system, Lumen. I'm curious, were there any lessons learned, or best practices, for using some of these new UE5 features?

Kim Libreri:

Well, just getting the art... Making a game that looks photoreal is quite different from making a movie that has to be photoreal. In the movie business, we have these things called compositors, and they do an amazing job of taking lots and lots of lighting elements, lots and lots of passes, adding some practical elements, and making stuff look real. Now, in our world, where it's total real-time computer graphics, you don't totally control where the camera's going. So, the assets have to look photoreal just naturally, without a composite process.

Kim Libreri:

So, Nanite enabled us to get unbelievable detail, but we actually had to learn how to make art that uses that detail. If you use photogrammetry, it works. But a lot of the buildings aren't based on photogrammetry; there's a lot of custom-made stuff. The cars, for example. The cars had to look great, and we didn't have scans, and we couldn't base them on existing cars, otherwise we'd have a copyright challenge. So, they're all made from scratch.

Kim Libreri:

So actually, there were a lot of lessons learned in terms of fabrication, how much procedural texturing to use, when to use photogrammetry, when to use reference. And I think our art team leveled up really well by the end of it. Even our friends at Weta Visual Effects learned lessons as they were doing this, because it really... It's like, "No, it just has to work. It just has to look real."

Marc Petit:

Yeah. No, there is no more room for cheating. I mean, usually in compositing you can actually fix things, but there is no “fix it in post” anymore.

Kim Libreri:

Yeah, exactly. Fix it in dev. The humans worked pretty hard. One of the nice things is that 3Lateral and Cubic Motion, they're the same company now, we kept their names, it's just the way it is, were evolving new capture technologies. So, Vlad and the team have an amazing new-generation photogrammetry scanner that we were able to get Keanu and Carrie-Anne onto to do their performances. In fact, we flew them over to Novi Sad in Serbia to actually get the shoot done.

Kim Libreri:

Yeah. It was a cool experience for the actors because they've been in Berlin shooting the movie. They'd never seen anything at this level of technology. The stuff that Vlad has over in Novi Sad is unbelievable. But matching it exactly... We were fortunate enough that Lana Wachowski shot us live action reference. And actually, we used a tiny bit of the live action in the demo and we cut between the computer generated version and the real version of Keanu to enhance the effect of the narrative that we're giving.

Marc Petit:

So, I mean, it was an amazing demo, and already at scale. Some numbers have been shared; it's basically the size of Los Angeles, thousands and thousands of cars, tens of thousands of people and everything. But how do you scale to a fully simulated world? Do you think the technology would allow us to scale to represent a fully simulated world?

Kim Libreri:

A totally, fully simulated world, say the whole of America or the whole planet? There are still challenges. That demo runs on a single computer, gameplay and all. Even though... We're doing something that most games don't do in that demo, in that all of the AI characters and vehicles are asynchronously simulating continuously. So, that's why you see on YouTube all these traffic jams that happen in the demo, because a traffic jam will propagate and just get worse and worse and worse until some lucky event happens with an AI car and it just clears itself.

Kim Libreri:

But if you wanted to do that with millions of vehicles and millions of people, then we need to start thinking about, "Well, how do you program AI and simulation across multiple computers?" Because you can't possibly hold it in one computer. So, that's a subject very dear to Tim's heart right now. And I think you'll probably hear more from him in the next couple of years in terms of the way that we're going to solve those problems. But yeah, it's a big challenge.

Kim Libreri:

The engine definitely helps. The current version of the engine, UE5, the one we're on the edge of shipping, definitely helps in terms of managing the complexity of a big world. But a fully simulated world, with clouds and weather, and the butterfly effect actually being able to propagate, is a challenge that I think the computer science and video game worlds have to work on. There are a few things out there that divide the world into a uniform grid of simulations and so on, but they don't quite deal with the fact that, in a video game, you can teleport from anywhere to anywhere, and you're looking across near and far distances. There are a lot of challenges ahead of us, but you know what? That's the cool thing about this industry: there's always something new to work on that's cool and exciting.
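
The uniform-grid idea Kim mentions can be sketched very simply. The following is a minimal, hypothetical Python sketch, not Epic's implementation or any Unreal Engine API: the world is split into cells, each cell nominally owned by a different server, and agents are handed off as they cross cell boundaries. Everything runs in one process here for clarity; the point is the ownership and handoff bookkeeping, which, as Kim notes, still doesn't solve teleportation or long view distances.

```python
import random

CELL_SIZE = 1000.0   # metres per simulation cell (illustrative value)
GRID_DIM = 4         # 4 x 4 cells, each nominally owned by one server

class Agent:
    def __init__(self, agent_id, x, y):
        self.agent_id = agent_id
        self.x, self.y = x, y

    def step(self, dt):
        # Each agent simulates continuously, independently of the others.
        self.x += random.uniform(-1, 1) * 10.0 * dt
        self.y += random.uniform(-1, 1) * 10.0 * dt

def owning_cell(agent):
    # Map a world position to the grid cell (i.e. server) that owns it.
    cx = int(agent.x // CELL_SIZE) % GRID_DIM
    cy = int(agent.y // CELL_SIZE) % GRID_DIM
    return (cx, cy)

# One bucket of agents per cell; in a real deployment each bucket
# would live on a separate machine.
cells = {(i, j): [] for i in range(GRID_DIM) for j in range(GRID_DIM)}
for n in range(1000):
    a = Agent(n, random.uniform(0, GRID_DIM * CELL_SIZE),
                 random.uniform(0, GRID_DIM * CELL_SIZE))
    cells[owning_cell(a)].append(a)

def tick(dt):
    handoffs = []
    for cell, agents in cells.items():
        for a in agents:
            a.step(dt)
            if owning_cell(a) != cell:
                handoffs.append((cell, a))   # agent crossed a boundary
    # Migrate agents to the cell (server) that now owns them.
    for cell, a in handoffs:
        cells[cell].remove(a)
        cells[owning_cell(a)].append(a)

for _ in range(60):
    tick(1.0 / 30.0)
print({cell: len(agents) for cell, agents in cells.items()})
```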

Marc Petit:

So, Kim, The Matrix Awakens was both a game, but it was also a very cinematic and very choreographed experience. How did you achieve that?

Kim Libreri:

That's such a good question, Marc. It happened in this order. So, we built the engine, we built the city with the engine, with some help from Houdini as well. And then, we populated it with AI, we populated it with traffic, and then we drove the vehicles in the virtual world. And because this is running in the world of a simulation, just like the Matrix itself, everything is recordable. So, Colin, who's our lead cinematics artist on the project, is able to drive that Camaro, power-slide it around corners, weave it in and out of traffic, reset the simulation, record it all and then go, "Wow, that was a neat maneuver. Let me put cameras that follow the action around."

Kim Libreri:

So, we literally filmed inside the world of a simulation. Every filmmaker I've talked to, I'm like, "No, it really... It's much more akin to Star Trek and the holodeck." We ran a sim and we filmed inside it. We staged the action. We loaded the actors into it. And even down to the lighting simulation: Lumen is so good that if we want to get a little bit more fill light onto a character, we just put a white card in the car, switched so it's not visible to the camera, but it adds to the lighting and we get nice fill light. So, yeah. It's very analogous to filmmaking inside the Matrix, is what it is.
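
The workflow Kim describes, drive the take live, then place cameras afterwards, boils down to recording the simulation's state every frame and replaying it deterministically. Here is a minimal, hypothetical sketch of that record-and-replay idea; the names and the toy "car" are illustrative, not how the demo is actually implemented.

```python
import math

def simulate_car(t):
    # Stand-in for a live, player-driven simulation: here the car
    # just slides around a circle as a function of time.
    return {"x": 50.0 * math.cos(t), "y": 50.0 * math.sin(t),
            "heading": t + math.pi / 2}

# 1) Live take: drive the car and record every frame of state.
FPS = 30
take = [simulate_car(frame / FPS) for frame in range(FPS * 10)]

# 2) Later, replay the recorded take and attach any camera you like;
#    the take itself never changes, so camera choices are free.
def chase_camera(state, distance=8.0, height=3.0):
    cx = state["x"] - distance * math.cos(state["heading"])
    cy = state["y"] - distance * math.sin(state["heading"])
    return {"pos": (cx, cy, height), "look_at": (state["x"], state["y"], 1.0)}

for frame, state in enumerate(take):
    cam = chase_camera(state)
    if frame % 60 == 0:
        print(frame, cam["pos"], "->", cam["look_at"])
```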

Marc Petit:

Yeah. So, it's essentially cinematography inside a simulation. And that's what allows you to create that. I mean, those shots were actually beautiful. So, who directed the piece? Was there an equivalent of a director?

Kim Libreri:

Actually, the way we tend to work in video games is quite different from the film business. It's not really one person, but it started... The original story, Lana Wachowski wrote it. So, it's a custom script. And that was mostly the intro at the beginning, where they question the nature of what is real and what is not real. And then, we transitioned it into the car, because Lana was like, "Whoa, you're going to just have Trinity driving a car, and it's a person that nobody even knows driving..." We're like, "It's a video game, you can suspend your disbelief." She's like, "No, I'm going to write you something."

Kim Libreri:

So, that whole scene of them taking the mickey out of the marketing people, Lana wrote that. And then, the rest of it, that was Epic. And it's a combination of... John Gaeta was involved, Gavin Moran, Colin, and just trying ideas out and trying to stage and work things out... Because John was designing the city a little bit. So, we would change the freeway a little bit. It's like, "Well, where are we going to..." We'd have the freeway, we'd scout it a little bit. And we're like, "No, we need someone to do a handbrake turn here." "Oh, okay. Let's adjust this freeway a little bit."

Kim Libreri:

So, we terraformed the city as the sequence evolved, but a lot of that stuff was internal, the team just trying to work out what they could pull off. And then, you've got this extra challenge in a virtual simulated world. If you move the camera from one side of the city to the other, then the assets... Because it's a PlayStation or Xbox, it hasn't got infinite memory. That stuff may not be in memory. So, the streaming systems in the engine have to prime and load the content as you teleport.

Kim Libreri:

So, it's actually quite important that you think about, as you go from cut to cut, that if you teleport yourself too far down the street, you're going to cause a hitch in the playback. And the one thing with that sequence is we didn't want to cheat with pre-rendered video, because people are cooler than that. We really, truly wanted to render on the console that's in front of you. So, a lot of thought went into, where do we place the camera? Can we have repeating geometry on that stretch of the street? Actually, it's pretty clever the way that Colin set it up.
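
The teleport problem Kim describes is essentially a prefetch problem: before a cut jumps the camera across the city, the streaming system has to be told about the destination so the cells there are resident when the cut lands. A hypothetical sketch of that bookkeeping follows; the class and method names are illustrative, not Unreal Engine's streaming API.

```python
import time

CELL = 200.0  # world-partition cell size in metres (illustrative)

class StreamingWorld:
    def __init__(self):
        self.resident = set()     # cells currently loaded in memory
        self.load_time = 0.05     # pretend each cell takes 50 ms to stream

    def cells_around(self, pos, radius=2):
        cx, cy = int(pos[0] // CELL), int(pos[1] // CELL)
        return {(cx + i, cy + j) for i in range(-radius, radius + 1)
                                 for j in range(-radius, radius + 1)}

    def prefetch(self, pos):
        # Request the destination's cells ahead of the cut and wait until
        # they are resident; a real engine spreads this over many frames.
        for cell in self.cells_around(pos) - self.resident:
            time.sleep(self.load_time)   # simulated streaming cost
            self.resident.add(cell)

world = StreamingWorld()
cut_list = [(0.0, 0.0), (150.0, 80.0), (2500.0, 2300.0)]  # camera position per cut

for pos in cut_list:
    world.prefetch(pos)   # prime the next shot before cutting to it
    print(f"cut to {pos}: {len(world.resident)} cells resident, no hitch")
```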

Marc Petit:

So, I mean, it's going to be interesting for filmmakers, because they're going to have to learn new ways and new techniques. They're going to have to relearn, but the creative freedom they get out of it seems to be well worth the price. How are we doing with the adoption of game mechanics to support moviemaking? Where do you think we are? I know it's something that you care a lot about and have been advocating for a long time.

Kim Libreri:

I'm hoping that, as more people make movies using our engine, they start to lean into this cinematics-through-simulation approach, as you were saying earlier. The old-school way of doing visual effects is, you set up a particle system, you try to make it look like rain, and you tweak it on a per-shot basis. I would love to see a more systems-level approach where, if you have a vehicle that you need to drive around in a car chase, whether you're shooting on an LED wall or it's a fully synthetic shot, you rig up a car and make it drivable.

Kim Libreri:

Yeah. It's actually fun. It's more akin to physical production than traditional visual effects production. If you want weather, build a weather system, maybe we ship one at some point with the engine. Use our time-of-day system to change the angle of the sun in the sky. Take a very procedural and systemic approach to how you build the world, as opposed to trying to do every little component as bespoke. And as studios do more and more in the engine, they'll build more and more of a digital backlot, not just of models and textures, but of these clever systems.
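
A time-of-day system of the kind Kim mentions can be as small as a function that turns an hour of the day into a sun direction for a directional light. The sketch below is deliberately simplified, a plausible arc rather than a real solar ephemeris, and is purely illustrative of the systemic approach rather than any shipping feature.

```python
import math

def sun_direction(hour, max_elevation_deg=60.0):
    """Map an hour in [0, 24) to a unit sun-direction vector.

    Noon gives the highest elevation; night hours dip the sun below
    the horizon. This is a toy model, not an astronomical one.
    """
    day_fraction = (hour - 6.0) / 12.0          # 6:00 -> 0.0, 18:00 -> 1.0
    elevation = math.radians(max_elevation_deg) * math.sin(math.pi * day_fraction)
    azimuth = math.pi * day_fraction            # sweep from east to west
    x = math.cos(elevation) * math.cos(azimuth)
    y = math.cos(elevation) * math.sin(azimuth)
    z = math.sin(elevation)                     # negative at night
    return (x, y, z)

for hour in (6, 9, 12, 15, 18, 23):
    print(hour, tuple(round(c, 2) for c in sun_direction(hour)))
```

A weather or lightning system would sit on top of the same idea: one authored system driven by a few parameters, reused across every shot instead of being rebuilt per shot.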

Kim Libreri:

"Oh, you want something that looks like a lightning strike or a thunderstorm? Here's the thunderstorm piece of content that we've made." "Oh, what type of thunder clouds?" It's really start to... And that's another reason why I like the use of the Matrix for this, start to think, if you had the power to control the Matrix and you were filming the Matrix, what cool things would you want to be able to do? "Oh, if you crash a car, reset it instantly? Yeah. Let's just make it so there's instantly resettable just like they would be in a video game." So, that's the bit that I want people to lean in. It's beginning to happen. Finally, I'm seeing the film team start to go, "Well, hold on. It's just a big video game running." Yes.

Patrick Cozzi:

So, Kim, do you think interactive movies, where maybe you're watching a movie and Marc is watching the same movie, but you're both using different camera angles, do you think that will catch on?

Kim Libreri:

Obviously, there's a lot to be said for the craftsmanship that goes into telling a good story, but I do feel that the ability... If you're building your story with real time technology, the ability to go back into that story, explore it in different ways, share it together. I even think that, as we see a new generation of storytellers and filmmakers, I do think they'll start to evolve into these hybrid experiences that are part interactive and part cinematic.

Kim Libreri:

And who knows? The human imagination is endless. So, I think now this new tool appearing in front of creators, I think we're going to actually see a new generation of creators that, are they filmmakers? Are they game makers? They make experiences. And some of it may be a traditional story and some of it may be a very, very deep trippy interactive experience.

Patrick Cozzi:

Well said.

Marc Petit:

So, as you know, one of our favorite topics here on the podcast is the open Metaverse and opening up those data sets. So, how do you envision the path forward, so that we can create those amazing experiences like The Matrix Awakens, but also, eventually, share them, share those assets, and make that city a truly open city?

Kim Libreri:

Okay. So, that is a long path for everybody involved in the whole Metaverse of things. I see good progress, thanks to Pixar kicking off open standards, and ILM as well with EXR. There are beginning to be open static containers for assets, but primarily they revolve around static or pre-animated assets, not smart assets. So, for example, a car, a Ferrari that drives like a Ferrari but is fully digital, is not just a model. It has a lot of logic. And depending on how deep you want to go down the rabbit hole of simulation, it could have cylinders and internal combustion all running in its virtual simulation.

Kim Libreri:

So, I think we're a fair way off from being able to encapsulate the transportability of smart assets. But I do feel that machine learning... If you show a computer enough examples of input changing output, then eventually it can do a pretty good job of emulating that stuff. And we've seen some great research happening over the last few years with deep learning. And I think we may actually find that, instead of trying to standardize physics systems, mechanicals, all the complexities of what makes a real-world object complex...

Kim Libreri:

I'm sat on a swivel chair right now. I'm probably irritating the camera people as I slide around here, but this is a reasonably complicated thing. A car or a vehicle, they're so much more complicated. And I think that machine learning may end up being the container that we use to transport complex rules. Unless we come up with a standardized programming language and a completely standardized physics system, there's a lot of work. It'll happen over the next decade; I'm pretty sure these things will happen. But right now, I think it's nice to have an open dialogue like this to exchange ideas between different companies, different creators, the games industry versus the movie industry.
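
The "learn the input-to-output mapping" idea can be illustrated with a toy: record samples from some black-box dynamics, fit a small model to them, and ship the fitted model instead of the original simulation code. The sketch below is purely illustrative, using a damped spring as the stand-in behavior and linear least squares as the stand-in learner; a small neural network would play the same role for harder behaviors.

```python
import numpy as np

def black_box(position, velocity):
    # The "real" behaviour we don't want to standardise or port:
    # acceleration of a damped spring (stiffness 40, damping 2).
    return -40.0 * position - 2.0 * velocity

# 1) Observe the black box: inputs -> outputs.
rng = np.random.default_rng(0)
inputs = rng.uniform(-1.0, 1.0, size=(2000, 2))        # (position, velocity)
outputs = black_box(inputs[:, 0], inputs[:, 1])

# 2) Fit a tiny surrogate model to the observed samples.
features = np.column_stack([inputs, np.ones(len(inputs))])
weights, *_ = np.linalg.lstsq(features, outputs, rcond=None)

def learned_model(position, velocity):
    return weights[0] * position + weights[1] * velocity + weights[2]

# 3) The learned model is now the portable asset: any runtime that can
#    evaluate it reproduces the behaviour without the original code.
print("black box :", black_box(0.3, -0.1))
print("surrogate :", round(learned_model(0.3, -0.1), 4))
```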

Kim Libreri:

I think having this open forum for talking about the big challenges is important. I'm a big believer in incremental progress. What was it that Casey Kasem used to say? You can have your head in the stars and keep your feet on the ground, or something like that. I think it's good to think about where we have to go, but not go into analysis paralysis, where we're like, "Well, we can't make it until we try." I think there's so much experimentation that needs to happen.

Kim Libreri:

And I think as long as people experiment in an open way... One of the nice things about the way that VR evolved in the early days is that people were really transparent about what they were doing, what they were trying to achieve, the hardware they were making. And I think the Metaverse needs that level of transparency and exploration and discussion to help us really solve these problems.

Patrick Cozzi:

Yeah. I love that, and I love to see themes across podcast episodes. Marc, if you remember, we had Vlad from Unity on the podcast. Kim, Vlad is the creator of WebGL, and he encouraged everyone to experiment in the open, which I think is exactly what you're saying as well.

Kim Libreri:

Exactly. I'd love to do some crazy experiments where you have two engines working together. We've had discussions with people in the past about, if you think about driving a vehicle, sure, all the logic for making the steering work and the traction and all that stuff can be contained within the logic of one particular game engine. But its interface to the virtual world, it's actually quite simple. It just needs to know about the terrain and the inputs from the user.

Kim Libreri:

And whether it's going to collide with any objects. So, there are some simple physics transports that you can do between two different systems. And I'd love to see some open experiments where we really try to mix it up and do some crazy stuff.
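
The narrow interface Kim describes, terrain and user inputs going in, a vehicle transform coming out, can be captured in a handful of data structures. The following is a hypothetical, engine-agnostic sketch of that contract; the types, fields, and the trivial driving model are all illustrative, not any existing engine's API.

```python
import math
from dataclasses import dataclass

@dataclass
class VehicleInputs:          # sent from the hosting world each tick
    throttle: float           # 0..1
    steer: float              # -1..1
    terrain_height: float     # ground height under the vehicle
    dt: float                 # tick duration in seconds

@dataclass
class VehicleState:           # returned to the hosting world each tick
    x: float
    y: float
    z: float
    heading: float            # radians
    speed: float

def step_vehicle(state: VehicleState, inputs: VehicleInputs) -> VehicleState:
    # All the interesting logic (traction, suspension, engine model) stays
    # inside whichever engine owns the vehicle; only this state crosses over.
    speed = max(0.0, state.speed + 5.0 * inputs.throttle * inputs.dt - 0.5 * inputs.dt)
    heading = state.heading + inputs.steer * 0.8 * inputs.dt
    return VehicleState(
        x=state.x + speed * math.cos(heading) * inputs.dt,
        y=state.y + speed * math.sin(heading) * inputs.dt,
        z=inputs.terrain_height,          # snap to the host world's terrain
        heading=heading,
        speed=speed,
    )

# In a cross-engine experiment these structs would be serialised over a
# socket or shared memory; here we just call the function directly.
state = VehicleState(0.0, 0.0, 0.0, 0.0, 0.0)
for _ in range(300):
    state = step_vehicle(state, VehicleInputs(throttle=1.0, steer=0.1,
                                              terrain_height=0.0, dt=1.0 / 60.0))
print(state)
```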

Marc Petit:

Actually, Kim, we just recorded one of our previous episodes with Juan from Godot and Royal from O3DE, the open source 3D engine. And I think about where that conversation went, about making experiments, trying to get a drivable car to go from Unreal Engine to O3DE and vice versa. And I think there's a lot of appetite across the industry, and we're very close to Neil Trevett and the Khronos Group, to try to make those things happen.

Marc Petit:

But I think you had a very interesting point, and I think it's the first time we've heard this on the podcast: this idea that you could transport behaviors without really understanding them. This notion of having a machine learning algorithm learn about inputs and outputs is probably a great intermediate step, so that we don't have to effectively standardize every aspect.

Kim Libreri:

As you probably know, cloth simulation is super complicated. There's no way you're going to run super complicated cloth simulations on a console anyway. So, can we get a deep learning algorithm to do a pretty good job of faking the cloth? And the answer is, actually, yeah. We could still do with one more console generation's worth of improvement in the deep learning hardware, but it's some interesting stuff.

Kim Libreri:

Even lighting, you think about... We did this project, four or five years ago now, with the Blackbird. And what they do is a panoramic environment capture with a spherical camera rig on the top of the Blackbird. If you want to light a car that's actually driving through a world in a different engine from the one that's simulating the car, then all you need for lighting... You can do a pretty good job by just generating a real-time light probe from one engine and feeding that video data into the other engine, which is how we did The Human Race. And it looks pretty good.

Kim Libreri:

And it's no secret that a lot of visual effects companies, when they do their character renders, don't ray trace and light against the entire environment. They generate lat-longs and use them for lighting. So, I do think an experiment... I'd love to get involved. In the next year or two, try something crazy. See if we can mix it up between different platforms.
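
The lat-long (equirectangular) lighting trick Kim mentions comes down to mapping a world-space direction to a pixel in a panoramic capture and treating that pixel as incoming light. A minimal sketch, assuming a probe stored as a height x width x 3 array of linear radiance values; the synthetic sky-and-ground probe here is only a stand-in for a real capture.

```python
import numpy as np

def sample_latlong(probe, direction):
    """Look up incoming radiance from an equirectangular (lat-long) probe.

    probe: (H, W, 3) array of linear RGB radiance.
    direction: 3-vector in world space (x right, y forward, z up).
    """
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    u = 0.5 + np.arctan2(d[1], d[0]) / (2.0 * np.pi)        # longitude -> [0, 1]
    v = 0.5 - np.arcsin(np.clip(d[2], -1.0, 1.0)) / np.pi   # latitude  -> [0, 1]
    h, w, _ = probe.shape
    return probe[int(v * (h - 1)), int(u * (w - 1))]

# A stand-in probe: blue-ish sky above, grey ground below.
H, W = 64, 128
probe = np.zeros((H, W, 3))
probe[: H // 2] = [0.4, 0.6, 1.0]   # upper half = sky
probe[H // 2:] = [0.2, 0.2, 0.2]    # lower half = ground

print("up   :", sample_latlong(probe, (0.0, 0.0, 1.0)))
print("down :", sample_latlong(probe, (0.0, 0.0, -1.0)))
```

If one engine regenerates a probe like this every frame and streams it to another, the receiving engine can light its object against the sender's world without sharing any scene data, which is the essence of the cross-engine experiment described above.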

Marc Petit:

Well, Patrick, when Kim says let's try something crazy, it's actually crazy. Actually, I have one question, coming back to close the loop on The Matrix Awakens: that was a big endeavor. Did you have any doubt at any moment that you would make it?

Kim Libreri:

There's always fear when you deliver any big project, even The Matrix movies. The hardest project ever to deliver, for me, was Matrix Reloaded. And you go through these experiences; you just have to believe you've got the best team. And we do. We have some of the greatest people in the world working on it. And if you believe, and not to cheesily riff on Ted Lasso, but "you've got to believe" and you'll get there. And what we do on these big projects is adapt. There was a kung fu fighting moment in the project at one point, and it was just out of scope. We just couldn't finish it. So, we took it out and fixed the holes, and it's still a good experience.

Kim Libreri:

Yeah. It's scary, and we didn't quite hit the date we were originally wanting to hit. But fortunately, The Game Awards came along and we were like, "Wow, it's before the movie. The Game Awards, it's got massive..." Oh, it's even better. It couldn't have worked out better. So, even though we were probably two or three months later than we would've originally planned, by magic, these things came together. There's always a silver lining. Well, almost always a silver lining.

Marc Petit:

Yeah. I mean, what an incredible achievement, and the impact that it has had. I think it was fantastic to watch this happen. So, is there any topic that you think we should be covering to further the conversation about the open Metaverse, anything you would've liked to discuss today or that we should cover in another episode?

Kim Libreri:

I do think about... If you want to go really crazy, it's doing a brainstorm between two engine teams. Maybe it's the two big engine teams, or a proprietary engine, the Frostbite team and Epic or something: "Come on, let's do some crossover stuff. Let's do something really crazy." And do that in an open way.

Marc Petit:

Our next guest is Natalya Tatarchuk from Unity. Do you have a question or a proposal for her?

Kim Libreri:

How would we do something cool that uses both engines at the same time?

Marc Petit:

Okay, we'll ask.

Kim Libreri:

Yeah. That is the stuff of the Metaverse. And then, the other thing that I'm really looking forward to is simultaneous events in the real world and the virtual world, and bringing a different twist to each: if you're in the game, you're going to get something a bit different, but it's the same event. The one thing that we can do in these big, massively multiplayer games that we have is connect all races, all people across the entire planet, to come together and have fun and enjoy themselves.

Kim Libreri:

And I think that we can do the same thing between the virtual world and the real world in really interesting ways. It would be exciting to ask, what would Live Aid be in 2024 or 2025? There's so much potential for creators to come together. So, yeah. Five years from now, what should we have done? What have we failed at if we haven't done it in five years? That would be a cool thing to talk about.

Marc Petit:

And our last question, any organization or person that you'd want to give a shout out to?

Kim Libreri:

Well, I'm probably going to miss a lot of people. Obviously, our confidants and family on The Matrix: Lana and Dan Glass and John Gaeta and James McTeigue and Keanu and Carrie-Anne. The list is endless. Everybody from The Matrix crew that helped us out. Warner Brothers were amazing. Our friends at SideFX, and all the third-party vendors that helped us on the demo. And honestly, our engine team. In the visual effects business, where I worked for 20 years, we love our engineers because they help us make images we've never seen before.

Kim Libreri:

But everything's quite off-the-shelf nowadays. There's some proprietary stuff left, but it's not what it used to be. And I think that knowledge of how powerful a great engineer can be, and how you can transform the ordinary into the amazing... This demo did not run at frame rate a month before we shipped, and not just by a... It wasn't like a 5% thing. It was a lot. And the engineers optimized their code, came up with new techniques, thought of new ways of doing it. The artists worked with them.

Kim Libreri:

That collaboration between technologists and creatives and game designers is super, super important. And I think that we ain't going to make a great Metaverse unless we respect the technologists and the artists that are actually going to make these things. And I think it's very easy for people to think of big IP and big brands, but it's really... If you want to make something amazing, you need the creativity and the technology totally working together.

Marc Petit:

Absolutely. Thank you. Well, Kim, it was a treat to have you with us today. Thank you very much. Congrats, again, on The Matrix Awakens and everything you've been doing. I think it's very promising for the Metaverse, for the quality of content we're going to get to experience. So, thank you so much for everything.

Kim Libreri:

Thanks. Thanks Marc. Thanks Patrick.

Patrick Cozzi:

Kim, really appreciate you joining us today. And yeah, you've had just an amazing journey in CG. Really great of you to come.

Kim Libreri:

And I'm not that gray yet, getting that way, but me and Marc share this, we've been around for ages. We aren’t showing our age totally yet!

Marc Petit:

So, I want to thank our audience. The feedback on the podcast continues to be great. We're so lucky to have amazing guests, and you were one of them, Kim. So, as usual, everybody, if you have feedback, reach out to us on social, and find the podcast on your favorite platform and share it. Thank you everybody. Thank you, Patrick. Thank you, Kim.