Building the Open Metaverse

Spearheading the 3D Evolution

Rev Lebaredian, Vice President, Omniverse & Simulation Technology at NVIDIA, joins Patrick Cozzi (Cesium) and Marc Petit (Epic Games) to discuss his journey in tech, his current work, and computer graphics and AI for the Metaverse.

Guests

Rev Lebaredian
Vice President, Omniverse & Simulation Technology, NVIDIA

Announcer:

Today on Building the Open Metaverse...

Rev Lebaredian:

We do more on the internet than just entertain ourselves and just socialize. We use the internet for work. We use the internet to build things. We use the internet to operate our companies and machinery and all kinds of stuff. So, all of those things are going to also be important in the Metaverse.

Announcer:

Welcome to Building the Open Metaverse, where technology experts discuss how the community is building the open Metaverse together, hosted by Patrick Cozzi from Cesium, and Marc Petit from Epic Games.

Marc Petit:

Hello, everyone, and welcome to our show, Building the Open Metaverse, the podcast where technologists share their insight on how the community is building the open Metaverse together. My name is Marc Petit from Epic Games, and my cohost is Patrick Cozzi from Cesium. Patrick, how are you today?

Patrick Cozzi:

Hey, Marc. I'm doing great. I've been looking forward to this episode for quite a while.

Marc Petit:

Yeah, indeed. It's season three and we finally get to have with us Rev Lebaredian, VP of Omniverse and Simulation Technology at NVIDIA. Rev, welcome to the show.

Rev Lebaredian:

Thank you so much for having me. I'm a big fan of this show. I've watched every episode and I'm happy to be on here with you guys.

Marc Petit:

Yeah, we're happy to have you with us.

Patrick Cozzi:

That's cool. I'm so happy that you've watched every episode. So Rev, as you know, then, we love to kick things off by asking folks about their journey to the Metaverse. Look, I think you have a very inspiring story, the way that you found computer graphics and found the internet, so please, tell us about it.

Rev Lebaredian:

Yeah, I mean, if we're going to go back into history, you might as well start from the beginning. I was really fortunate. I'm a child of the '80s, and when I was very young... I was six years old when my father decided to buy me a computer. This is 1982, I believe. Got a Commodore VIC-20, and I just loved this thing. The idea that I could come up with an idea and type something in and the computer does things that I tell it to do was just amazing to me, and so I stuck to it. A few years later, when I was about 10 or 11 in the mid '80s, 1985, the Amiga 1000 computer had been released. This was a giant leap forward in computing, especially at home. It had 4,096 colors, it had 16-bit sound, it could do animation, it could do all these things.

Rev Lebaredian:

This was in the era when... Macs didn't get color for five years, PCs were still that amber and green monochrome. I was reading a computer magazine that was talking about the Amiga, and then there was another article right after it which had a picture that I just couldn't make sense of. I stared at this picture and I couldn't understand what it was. There were two spheres floating on a checkerboard floor; one sphere was transparent, the other was reflective. And I read the article, and what I understood from it was that the picture wasn't drawn, nor was it a photo. It was a computer algorithm, a program, that generated the image. And at that moment, I was hooked. I was like, "I've been wanting to draw my whole life." Half my family, my mother's side, they're all naturally artistic. They could draw from the moment they could lift a pencil. I couldn't do that, but I could program a computer. So, I said, "This is what I'm going to do for the rest of my life."

Rev Lebaredian:

And I managed to find this amazing ASCII-based newsletter called Ray Tracing News on bulletin board systems. This is pre-internet, and through that I learned how to do some basic ray tracing and ray tracing mathematics, vector math and whatnot. I went searching for more. I found the internet, because that's where all of this stuff originated. Turns out the guy who edited it, Eric Haines, one of the greats in computer graphics history, works here now, and I have the pleasure of working with him. That image was Turner Whitted's famous image from the 1980 paper on ray tracing. I got to work with him, too. So, that led me to visual effects. The year I turned 18 was the year the web was born, the year NVIDIA was born, and the year Jurassic Park came out.
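
(For readers curious what that "ray tracing mathematics" looks like at its most basic, here is a minimal sketch, not from the episode, of the ray-sphere intersection test at the heart of images like Whitted's two spheres. The scene values are made up for illustration.)

```python
import math

def ray_sphere_intersect(origin, direction, center, radius):
    """Distance along the ray to the nearest hit, or None on a miss.

    Solves |origin + t*direction - center|^2 = radius^2, a quadratic
    in t. `direction` is assumed to be normalized.
    """
    # Vector from the sphere center to the ray origin.
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c  # a == 1 because direction is normalized
    if disc < 0.0:
        return None  # the ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

# A ray shot straight down the z-axis at a unit sphere 5 units away:
print(ray_sphere_intersect((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```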

Rev Lebaredian:

It was 1993. So, there was this huge demand for people who knew computer graphics in Los Angeles, and I managed to find my way into Warner Brothers and then into Disney. I got into rendering naturally, so I wrote the hair renderer and hair system for Mighty Joe Young at Disney's Dream Quest. Then after that, I started my own company. I created my own renderer and licensed it back to a lot of the big studios, like Disney and Sony and Digital Domain. So, eventually I was called by NVIDIA, and this was at a very special moment in computer graphics history. I had heard rumors that NVIDIA was working on programmable shading, which was a really, really big deal. My whole world was always offline rendering, because I wanted to do the highest-quality things, match reality as closely as possible. And real-time 3D at that time was still too simplistic, but programmable shading held the promise that what we were doing in the offline world might become real time.

Rev Lebaredian:

And so, the things that we were creating for the movies, we might be able to experience and go be inside. So, I joined NVIDIA and started working on the first hardware shading language, Cg. I was one of the first few people to work on that, and I thought it'd only be a few years before we got to offline quality in real time and could be immersed in it. Turns out it took a little bit longer than that. It's been 20 years now.

Rev Lebaredian:

So, in the time I've been here, this is what I've been working towards the whole time: trying to make what we've been doing in the offline visual effects world real time, so we can apply it to everything. Once it becomes interactive and immersive, everything will change.

Marc Petit:

Absolutely. It's fantastic. So, let's talk about Omniverse. I mean, this is one project that's near and dear to your heart. When did you start it again, just for the record?

Rev Lebaredian:

Well, it depends on how you measure when it started. In some ways, I've been working on it the whole time I've been here or even before. It's been a progression, an evolution towards it. We started calling it Omniverse in 2017.

Marc Petit:

Okay.

Rev Lebaredian:

And that's when we called it... And even the definition of it has evolved since then, but for at least five years it's been called Omniverse.

Marc Petit:

Fantastic. So, tell us about Omniverse and what NVIDIA is trying to achieve with Omniverse.

Rev Lebaredian:

Yeah, so if you look at NVIDIA from the beginning, you can kind of divide NVIDIA's history into three eras. All along, we've been essentially doing the same thing: we build computing systems, computers and the stacks, to accelerate algorithms that solve really, really hard problems. The first problem we went to attack was rendering, which is a form of physics simulation, if you think about it. It's the simulation of how light interacts with matter. We use it primarily for entertainment purposes, for generating beautiful imagery for video games and visual effects and whatnot, but really, what we're trying to do is simulate how light interacts with matter so that we can create those images.

Rev Lebaredian:

Once we introduced programmable shading, about 10 years into NVIDIA's history, that opened up possibilities to accelerate different types of algorithms. That's when we introduced CUDA; that allowed us to build supercomputers and high-performance computing systems to accelerate simulations of general physics. You could use it for seismic analysis and medical imaging, you could use it for weather prediction, and so on and so forth. About 10 years ago, so 10 years after that, a new era for NVIDIA came into existence. On top of CUDA, the whole deep learning AI revolution was born. The first thing that sparked this was at the University of Toronto. Some grad students in Geoff Hinton's group, Alex Krizhevsky among them, took an old idea, neural networks, plus a bunch of new data that was now available because of the internet, and combined it with, essentially, a supercomputer that was in their gaming system, a gaming GPU, and were able to do things that had previously eluded computer scientists.

Rev Lebaredian:

Up until that point, we had had no way of creating an algorithm that could reliably tell the difference between a cat and a dog in images. And so, overnight, that changed everything. Now, we could write software that writes software, and when that happened, everything kind of changed. We realized that the way the most advanced software is going to be created, software that simulates intelligence, is fundamentally different from how all of the software we've created before has been created. In order to create this new software, these new algorithms, you need an immense amount of data, and this data has to be very specific and, in most cases, it has to match the real world.

Rev Lebaredian:

So, for example, if we want to create robots like the ones we're trying to make to drive on our roads out there, these self-driving cars, we need algorithms that give these robots intelligence to understand the world around them. They're going to see that world, they're going to perceive it. And in order to do that, to create those algorithms, we have to feed the training mechanism, the way we create it, with data, which is another way of saying, "We're going to feed it with life experience." We're going to give it hours and hours and hours of experience of seeing things so it can learn, much like how humans learn when they're born as babies. We learn how to see; that's how all creatures learn. And it became clear to us pretty early on that the only way we're really going to be able to do this is by synthesizing that life experience for these robots. We're not going to be able to gather all this information from the real world. We're just limited by time and space and cost, and in many cases, it's impossible or unethical to get some of the data you need.

Rev Lebaredian:

If we want our robots to be intelligent enough not to drive over children when they're on the road, we need them to experience what it's like to have a child in front of them in every weather condition, every lighting condition. So, how are we going to create this? And the conclusion we came to was, well, we need to simulate it. We need to create simulations of these virtual worlds so that we can have these new intelligences we're creating be born and raised inside these virtual worlds. And it turns out all of the accelerated computing we've been doing all these years has all the ingredients for the things we need to construct those worlds: rendering, physics simulation, and the new AIs we're creating to populate those virtual worlds in the first place, or to help us build them.

Rev Lebaredian:

And so, Omniverse basically came from that. We started building the computing stacks for self-driving cars, for robotics, and essentially digital twins of the advanced things we're trying to build internally here. And we always try to use all of the tools, everything that's already available out there before we create something new, but when we see that there's a gap, that there's something that's missing that we need and nobody else is building it, then we go build that thing. But we try to bias towards connecting to all of the things that already exist there so we don't have to duplicate effort.

Rev Lebaredian:

So, you see this with Omniverse. Omniverse is kind of two things. First, it's a system for aggregating, or connecting, all of the tools and data sources you might have for building virtual worlds. We built it around USD, Pixar's open description of virtual worlds, so that we could avoid having to build all the tools we might need to construct these worlds. We want to collect them all together. And then we've built a specialized computing stack for doing these kinds of simulations, designed to scale from relatively powerful computers like our NVIDIA workstations up to supercomputers that have many, many GPUs and many nodes, so that you don't have to make a trade-off between the accuracy and fidelity of your physics in world simulation and speed. Those are kind of the two sides of it. But in many cases, we choose... or we need to run simulations in different simulators, so just having the world all aggregated into this open description allows us to use, potentially, any simulator or engine out there for the particular problem at hand.
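
(A minimal sketch of the aggregation idea Rev describes, using Pixar's USD Python bindings. The layer file names here are hypothetical; the point is that independently authored layers compose into a single stage.)

```python
# Compose independently authored USD layers into one stage.
from pxr import Usd, UsdGeom

stage = Usd.Stage.CreateNew("factory.usda")
UsdGeom.Xform.Define(stage, "/Factory")  # a root transform for the scene

# Each tool or team keeps authoring its own layer in its own workflow;
# sublayering them means everyone is "speaking the same language".
root = stage.GetRootLayer()
root.subLayerPaths.append("building_architecture.usda")   # e.g., from a CAD tool
root.subLayerPaths.append("robot_cells.usda")             # e.g., from a robotics tool
root.subLayerPaths.append("lighting_and_materials.usda")  # e.g., from a DCC tool

root.Save()
```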

Marc Petit:

Actually, there's something I wanted to say there. Rev, thank you for that. I think we have to give credit where credit is due. We all had high anticipation for USD; we all had the intuition that USD could be very powerful, but I think it took you and your team on Omniverse to actually prove it out. And now the fact that USD is a candidate to become, quote unquote, "the HTML of the Metaverse," yes, it's due to the brilliance of the Pixar engineers, Guido (Quaroni) and the people who invented it. But the work of your team to prove it out has massively accelerated the fact that we can consider USD for such a prominent role in the conversations we're currently having around the Metaverse Standards Forum.

Marc Petit:

So, I think we owe this to you and to your team. A lot of us, and I would include us at Epic, have dipped our toe in the USD water a little bit. We've done some work with it, but you guys have been all in and really pushed it to a level that makes us really comfortable thinking it's going to work for everyone. I just wanted to call this out and thank you for that.

Rev Lebaredian:

Yeah, I think from our perspective, when we started this we said, "Well, no one tool, no one simulator, no one engine is going to solve even all of our needs here at NVIDIA, let alone all of the world's needs." But one thing that's always a huge problem for us, anytime we want to do anything, is just collecting all the data together. When we want to do a simulation of our headquarters... Like when we built this building here, NVIDIA Endeavor, our second-to-last building (Voyager, next door, came after it), we ran simulations of how light would interact with the building. We had skylights that allowed a lot of light through, lots of windows on the sides. When we ran the simulations, we found out that if we built it with the original design, we would fry our employees, all the humans in here. It would've been way, way too hot.

Rev Lebaredian:

So, they had to resize everything down and fix it. That would've been a very expensive problem to solve later; we solved it early on. But just getting the data of the building and all of the furniture and all the things we needed to put in there to run that simulation is a nightmare, and it's because everybody's speaking different languages. All of this data lives in different islands elsewhere. So, it was clear to us early on that that's the first problem that needs to be solved. We all have to talk the same language. If we can't, then we have no hope of simulating whole worlds, because for all of the stuff being put into the real world here, the digital versions of it live in different islands. So, looking around, we're like, "Well, we could create something from scratch, but that always sucks."

Rev Lebaredian:

It's never a good idea to start from the beginning. Then you have to convince everybody to use it, and convince them that you don't have nefarious, evil purposes behind it, to lock them in and all that stuff. When we saw that Pixar had done this, that they'd open-sourced it, that was an aha moment. Like, "Wow, Pixar has been building large virtual worlds for longer than any other company, any other group in the world, and they've been using all these different tools with different people, with different skills, all working simultaneously together for longer than anyone else. What they've built is probably pretty good, and there's probably a lot of wisdom imbued inside that system." We're certain it's far from perfect and far from everything we need, but better to start from something that exists and build on top of that wisdom than to build something from scratch.

Patrick Cozzi:

Rev, yeah. Look, I agree with the whole philosophy, especially enabling everyone to work together, and the challenges of collecting all the data and making it interoperate. So, when you look at USD, how do you think it will evolve over the next few years?

Rev Lebaredian:

Well, you guys were at SIGGRAPH with me, and we were in the Metaverse course together; there was a lot of USD talk there. I think this year it was pretty clear that it's tipped over. I think there's a lot of momentum behind USD, and a lot of people in different industries have come to the realization that it's the best option we have to do this. There's a lot of work that still needs to be done, but I feel like everybody is coming together in good faith now, wanting to extend it and build it in an open manner so that we can have this interoperability.

Rev Lebaredian:

It's to everybody's benefit if we can communicate with each other, and I think history has shown that. On the web, with the HTML analogy, there were points in time where some actors were trying to lock HTML and the web away from us, and that just didn't work out, ultimately. Eventually, we got to HTML5, which was open and more advanced than all of the proprietary technologies that people tried to insert into the web in that timeframe, and I think we can skip all of that stuff now. Let's just go straight to what the right answer's going to be anyway.

Marc Petit:

Yeah. And it probably needs... We need to turn it into a real standard, more than just an open-source library.

Rev Lebaredian:

Yes. Well, that's a whole separate discussion, splitting the standard from the library, and I think that's inevitable. We just have to figure out how to do it.

Patrick Cozzi:

Cool. And Rev, speaking of the Open Metaverse course at SIGGRAPH: for season three, episode one, we had Neil Trevett back on the podcast, and Marc and I were telling Neil that we just tried to invite all the right folks to come to that course, technologists with a vision, and it turned out that they all were interested in talking about USD. That's kind of the industry speaking, and I thought that was cool.

Marc Petit:

Yeah, it was not rigged. We did not organize a USD conference.

Rev Lebaredian:

Yeah, it's turning out to be the right answer, and there are a lot of smart people in that course who are peering into the future, and so they're seeing the right answer. But a lot of it was about all the things that USD needs to have that it doesn't have yet, what all the gaps are to get there. It's great. That's the discussion we want to have.

Patrick Cozzi:

And Rev, that was a great segue in terms of peering into the future. One thing you talked about at SIGGRAPH that I thought was super inspiring was giving people superhuman powers. We just talked a little bit about digital twins and simulation, but you also spoke about real-time synchronization between the physical and virtual worlds, and how that could enable teleportation or traveling to the past or the future, or even a modified future. Do you want to tell folks about this?

Rev Lebaredian:

Yeah, I think a lot of the Metaverse talk right now is largely about fanciful, more entertainment-oriented things. People, when you say Metaverse, they imagine something like Ready Player One or this idea of, essentially, a large social space or video game. Which, definitely, I believe will be a huge part of the Metaverse. Of course. But if you think about the Metaverse as an evolution, as a continuation of the internet, it's a new mode of interacting with the internet. Of course, we do more on the internet than just entertain ourselves, than just socialize. We use the internet for work. We use the internet to build things. We use the internet to operate our companies and machinery and all kinds of stuff. So, all of those things are going to also be important in the Metaverse, and a key thing that we need for the Metaverse to be useful for all these other things is a link back to this reality. The one that we're in.

Rev Lebaredian:

For entertainment purposes, you almost want the opposite. You want to go escape, you want to go into magical worlds, you want to be a superhero, you want to do all that stuff. But for all the other stuff we do in the world, in life, it's important that the internet and the things we have in there reflect the real world. And if you extend this to a 3D, spatial, immersive internet, if you can make that link happen between the real world and this form of the internet, then you get these superpowers I was talking about.

Rev Lebaredian:

So, the way I think about it, the first one you get is kind of the no-brainer one: teleportation. If you have something in the real world... The example I think I used there was a factory. If I have a factory like the one we've been showing in a lot of our GTC keynotes, the BMW one, and you have this link where the state of your factory, all of the joint angles of every robot that's operating in the factory, the position of the conveyor belts, the poses of the humans that are in the factory, you can gather all of that information and quickly send it to the Metaverse, to the digital twin, to the virtual version of that thing, and have it match closely enough, then effectively, anybody who has access to that virtual version will be teleporting to that factory.

Rev Lebaredian:

They can go experience that factory, assuming that the simulation, including the rendering and the physics and everything that's happening there, matches. It's kind of the same thing. And if you can record that state, the state of the factory over time, then you get the ability to essentially rewind. You can jump back to the past, to whatever you have recorded that's still in your storage, and so now you get some kind of time travel. If you want to go debug your factory, there was a problem somewhere in the line, anyone who has access to that, anywhere in the world, can go back in time and see what happened. But it gets really, really powerful when you have a simulator that's accurate enough to predict the future for the things that you care about. So, for the factory, if you can make a simulator that can predict that you're going to have a failure a minute from now, then you have the potential to peer into the future.

Rev Lebaredian:

You can teleport to any part of that factory and look at that future, and if you can do that simulation faster than real time, faster than our time out here, then you can run many possible simulations in that same period of time and do experiments. You can say, "Well, what if I tweaked my factory around? I changed the speeds and feeds of the conveyor belts, my robot configurations, the amount of energy I'm using? How can I optimize for energy, for human safety, for all these other things?" And I can search for the best possible future and go implement that one, instead of just waiting for whatever happens in real life.
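
(The record/rewind/predict pattern Rev describes can be sketched in a few lines of Python. This is an illustration, not NVIDIA's implementation: `step` and `score` stand in for a real physics simulator and an optimization objective, and all names are hypothetical.)

```python
import bisect
from dataclasses import dataclass, field

@dataclass
class TwinHistory:
    """Timestamped state snapshots of a digital twin."""
    times: list = field(default_factory=list)
    states: list = field(default_factory=list)

    def record(self, t, state):
        """Stream a new state snapshot in from the real factory."""
        self.times.append(t)
        self.states.append(state)

    def rewind(self, t):
        """'Time travel': the latest recorded state at or before t."""
        i = bisect.bisect_right(self.times, t) - 1
        return self.states[i] if i >= 0 else None

def best_future(state, step, candidates, horizon, score):
    """Roll each candidate configuration forward faster than real time
    and return the one whose simulated future scores best."""
    def rollout(cfg):
        s = state
        for _ in range(horizon):
            s = step(s, cfg)  # one tick of the (stand-in) simulator
        return score(s)
    return max(candidates, key=rollout)
```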

Rev Lebaredian:

So, that pattern, I think, applies to just about everything. If you can reflect the real world, whatever it is, whether it's a factory, whether it's your car, whether it's the whole Earth, whatever it is, if you can reflect it accurately enough, you can make that link between the real world and the virtual one and you can create a great simulator that can be accurate enough in its predictions. Then you gain all of these amazing superpowers.

Marc Petit:

Yeah, absolutely. That's a fascinating perspective, and I think what you guys are showing tells us that it's around the corner.

Rev Lebaredian:

Yeah, I think it's going to be... This is one of those awesome endless tasks. From my view, this is the grandest of all computer science challenges: simulating the world in all its glory. It's endless because you can't actually build a computer that's big enough to simulate everything down to the quantum level in the universe; you'd need a computer that's orders of magnitude larger than our universe to do that. But in order for it to be useful, we don't necessarily need that. For the specific things we need to predict the future about, where we need to teleport, we can already get close enough with a lot of the technologies we have today to do some really useful things.

Marc Petit:

Wonderful. So, let's zoom back a little bit and look at NVIDIA as a whole. We're seeing a company that has a lot of vertical integration, from GPUs to servers, to networks, to clouds, to software layers and application layers. At the same time, it feels like a company that's committed to openness. So, how do you maintain openness at every one of those levels, and what's your strategy there?

Rev Lebaredian:

Yeah, that's a really good question, because it is something that's somewhat unique about us compared to many other companies. Fundamentally, NVIDIA's a technology company, and there are many technology companies out there, but we see ourselves as a pure technology company. By that I mean our product, the thing that we actually sell, that we make money from, is technology itself. We don't typically make end-user solutions, end-user applications, the final thing; we create a lot of technology that's very hard to create. We go focus on the things we're particularly good at, and then we rely on others to take that and integrate it into their products, their applications, their solutions. And that's how we scale out. That's fundamentally how NVIDIA works.

Rev Lebaredian:

However, the technology that we create is essentially a specialized computing stack. We don't build general-purpose computers; there are other companies that do that. Our computers, from the start, have always been specialized towards solving super hard problems that require much more of the stack to solve. Computer graphics, doing rendering in real time, you can't just do that with a CPU. It's not enough to just have an ISA like x86 or ARM; you need lots and lots of system software. You need a very hefty driver, and you need a deep understanding of the applications.

Rev Lebaredian:

We have an army of engineers that go and help application developers and other developers, like Epic, optimize their software and their applications for our whole stack. And so, we have these two things: we provide technology and we want others to go take that technology and formulate solutions, but the kinds of problems we're attacking can't be solved at only one layer of the computing stack. They're full-stack problems. So whenever we're addressing a new kind of problem, we first have to get a deep understanding of that problem in order to build any layer of the stack correctly.

Rev Lebaredian:

You can't, for example, create the algorithms and the computer for a self-driving car without actually making a self-driving car first. We can't just go ask a carmaker, "What kind of chip do you need? What kind of systems do you need? What types of algorithms do you need?" Because they don't know. It hasn't been done yet. So, we have a fleet of our own self-driving cars, or the prototypes that we're building over here, not because we plan on building and manufacturing those cars, but because we need a deep understanding of the problem to even implement any layer of the stack.

Rev Lebaredian:

Once we have that, once we have these different layers, we're more than happy to license or provide this technology at any layer to anyone who wants it. We're not offended if somebody only wants our chip. If you only want our chip and you don't want the rest of the stuff for your self-driving car, so be it. That's okay. Go ahead and build on top of that. But if you want the rest, too, we'll license you the stack above it. And the mere fact that we actually built that stack made the chip better. You benefited from it regardless of whether you license it or not.

Marc Petit:

Yeah. This concept of dogfooding is very important in technology. You can actually tell who does it and who does not.

Patrick Cozzi:

So, Rev, switching gears, we want to talk a little bit about AI. So, NVIDIA has been such a leader in applying AI to computer graphics, and I know that you're such a proponent for AI for the Metaverse, so would love to hear what's exciting you in AI today.

Rev Lebaredian:

Yeah, I mentioned earlier how we've been building Omniverse so that we can go create AI. We believe it's a fundamental prerequisite: there's no way we're going to create advanced AI unless we have world simulators and unless we build high-fidelity virtual worlds that we can go train them in. But the inverse is true as well. We believe that in order to advance computer graphics, to advance virtual worlds and simulations, we need AI. We can't actually create all of the worlds we need to create without the assistance of these artificial intelligences. If you think about it, there aren't that many people in the world today who can create a high-fidelity virtual world. They're either at AAA game companies or visual effects studios. I don't know what the exact number is, but I would imagine we'd be lucky if there are 100,000 or 200,000 people in the world who could really do this.

Rev Lebaredian:

That's obviously not enough if we're going to have a Metaverse where everyone is participating inside these virtual worlds. The thing that made the internet, and the web specifically, so successful was that it was created by everyone. Anyone can go create HTML, anyone can go create a webpage. Anyone can go upload a video and become a YouTube star and create a podcast these days. It's not limited to just a small number of people, but that's unfortunately not true for 3D. Creating 3D virtual worlds is just extremely hard, and it takes decades to master even very niche aspects of the craft as a whole, and so we need AI to democratize the creation of virtual worlds.

Rev Lebaredian:

AI is going to help us ingest the real world and turn it into a virtual world so we can have digital twins of the real things, and then we're going to be able to use the things we collect from the real world to remix them and compose new ones. And AI assistants will help us generate new things and create new designs in there, because every human, every child, has a virtual world, or a number of them, trapped in their mind. When you talk to a six-year-old, they'll tell you all about these virtual worlds, and they communicate them to you with words, incepting your mind with their vision.

Rev Lebaredian:

We want every child to be able to actually turn that into a real virtual world in the Metaverse. The key to that is, it has to be AI. There's no other way we're going to be able to do that. You need an AI to understand what that child is saying and convert it into the triangles and textures and rigs and all the other things that are so hard to create right now.

Marc Petit:

Yeah, I agree, and in the spirit of giving credit where credit is due, AI denoising is how we got real-time virtual worlds. We were wondering, will we ever have enough compute to ray trace worlds? But the thing is, with AI denoising we guess as many rays as we actually compute, and so we got this boost in performance, and that's accepted now. That's a given, that we do AI denoising, and we're going to see so many more of those examples moving forward.

Rev Lebaredian:

Yeah, I mean, AI basically comes down to... All AI is, is the ultimate generalized function approximator. It can take any function, whatever it is, and if you have enough data and enough computing power, you can train this network, the system, to approximate that function. So, denoising is just one of the first functions we're doing that with, but we should eventually be able to extend it to others. We're seeing all this magic in the 2D world with DALL·E and Stable Diffusion; we want to see more and more of that come to the 3D world. That's where it becomes really useful, as far as I'm concerned.
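
(Rev's "ultimate function approximator" framing fits in a few lines. Below is a toy sketch, not from the episode, using PyTorch to fit sin(x) from samples; denoising is the same recipe with noisy renders as inputs and clean renders as targets.)

```python
# Train a tiny MLP to approximate a function from samples.
import torch
import torch.nn as nn

x = torch.linspace(-3.14, 3.14, 256).unsqueeze(1)
y = torch.sin(x)  # the function we want the network to approximate

net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

for _ in range(2000):  # enough data + compute -> a good approximation
    opt.zero_grad()
    loss = loss_fn(net(x), y)
    loss.backward()
    opt.step()

print(loss.item())  # small residual: the mapping has been learned
```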

Marc Petit:

Absolutely. All right. So, we've covered a lot of topics. We were super happy to see NVIDIA among the founding companies of the Metaverse Standards Forum, joining the initial group of companies. What are your expectations for the Forum?

Rev Lebaredian:

Yeah, I mean, I'm really glad that Neil (Trevett) pushed this, creating the Metaverse Standards Forum. I'm actually the one who signed the check for us joining it; Neil came to me with that one. I'm a little bit surprised at how much interest there's been. There are almost 2,000 entities there, which is great. We love the fact that there's so much interest in the Metaverse and people want to discuss the standards, but I think now we have to figure out what that means. How do we deal with thousands of people, all with their own ideas of what the Metaverse standards should be? I'm looking forward to seeing how these... I don't know what Neil and you guys are calling it, it's like subcommittees or...?

Marc Petit:

Yeah, domain working groups.

Rev Lebaredian:

...how the domain working groups work out, so that we can get just the right number of voices who actually know each domain well enough to come together and build it properly.

Marc Petit:

Yeah, that's the challenge: managing an open process and making sure that the right people get a chance to be heard.

Rev Lebaredian:

Yeah, we want everybody to have a voice, but not every voice is equal in terms of wisdom and experience. So, you want to bias and weight towards the ones that have actually done it a little more than those that haven't, but-

Marc Petit:

We had Michael Kass sign up from your team, so...

Rev Lebaredian:

I'm sorry?

Marc Petit:

We had Michael Kass sign up from your team.

Rev Lebaredian:

Yes, yes, we have Michael Kass and Guy Martin as well.

Patrick Cozzi:

Yeah, we didn't know that the Metaverse Standards Forum was going to get that big that fast. It was kind of a surprise that we went from 35 to 1,600 in maybe two months or so. But yeah, Marc and I have constantly been saying, "Hey, Neil, okay, this is cool, but how do we organize it?"

Rev Lebaredian:

What was the number you expected? I mean, I'm surprised, too. I didn't think that many people would be willing to go sign up and actually do that. Being in standards forums and stuff, that's not the sexiest thing out there. People usually avoid that like the plague.

Patrick Cozzi:

Yeah. I mean, we were originally interested in 3D asset interoperability, just that scope, which is a big scope. So, I think we were thinking, I don't know, Marc? 10 people maybe, or 10 organizations. But the swath of the Metaverse is big, so yeah, I'm excited to see where it can go.

Rev Lebaredian:

You were just off by two orders of magnitude, maybe three by the time we're done with this.

Patrick Cozzi:

So Rev, as you know, we like to wrap up the episode with two questions, and the first one is, are there any topics that we didn't cover that you want to talk about?

Rev Lebaredian:

We talked about almost everything I love. We talked about computer graphics and AI, the Metaverse, computing history. I really can't think of anything I could summarize in a minute that would add to that.

Marc Petit:

And so the other question is, is there a person, institution, or organization that you would like to give a shout out to today?

Rev Lebaredian:

Well, I touched upon it earlier. I'd like to give a big shout-out to Pixar. What we've built with Omniverse, and now what the industry is starting to move towards with USD in general, couldn't have happened without their foresight and the risk they took by opening it up so early. They put it out there in 2015, and they've been engaged with the community ever since, sharing their most valuable resources, their engineers, with the rest of us. And now they're doubling down on that. So, I'd like to give a shout-out to all of the Pixar folks, particularly those in the USD community, Spiff and the great people who are still at Pixar working on this, and Steve May for funding it.

Marc Petit:

Yeah, absolutely. Well, Rev Lebaredian, VP of Omniverse and Simulation Technology at NVIDIA, thank you for sharing your passion and your expertise. Again, kudos on the Omniverse project. You guys are really leading on a lot of interesting tracks, so thank you so much for being with us today.

Rev Lebaredian:

Thank you so much for having me. This was fun.

Marc Petit:

And Patrick, we want to thank our audience. We keep telling people, "Hit us up on social." Let us know what you like and don't like about this podcast. Let us know who you want to hear from. And Patrick, thank you so much as well. Rev, thank you very much again, and we will see you all for another episode soon. Thank you.