Building the Open Metaverse

Cloud Infrastructure for the Metaverse

Bill Vass, VP of Engineering at Amazon Web Services, joins hosts Patrick Cozzi (Cesium) and Marc Petit (Epic Games) to discuss Cloud Infrastructure for the Metaverse.

Guests

Bill Vass
VP, Engineering, Amazon Web Services


Announcer: Today on Building the Open Metaverse.

Bill Vass: We work with NASA on the Mars Rover with RoboMaker. And so that obviously has to process on the robot on the edge, right? Because it's 20 minutes each direction to Mars, so you got to push that ML to the edge. And that's what a lot of what we do with RoboMaker, for example, is this seamless, you train on the cloud. We have a thing called World Forge, that's like the No Man's Sky, a lightweight No Man's Sky, does the interior spaces today and it randomizes interior spaces. So you can say, I want 6,000 houses that range from one to seven bedrooms, with this kind of furniture and this, whatever. You click a button and it generates those worlds for you and guarantees that they're all random, and then you test your robot out in it.

Announcer: Welcome to Building the Open Metaverse, where technology experts discuss how the community is building the open metaverse together, hosted by Patrick Cozzi from Cesium and Marc Petit from Epic Games.

Marc Petit: All right. Welcome everybody. Today, on Building the Open Metaverse, the podcast where technologists share their insight on how the community is building the metaverse together. Here with my co-host, Patrick Cozzi, CEO of Cesium.

Patrick Cozzi: Hi folks.

Marc Petit: We have the immense pleasure to have Bill Vass from AWS with us today. Thank you, Bill, and thanks for being with us.

Bill Vass: Yeah, it's great to be here. Thanks for having me.

Patrick Cozzi: Thanks, Marc. And welcome, Bill. So our podcast is both about the technology of the metaverse and about the people who are building it. And Bill, you've worked on many interesting things, from your time at the Pentagon, to Sun, to what you're up to now at AWS with cloud and edge, gaming, simulations, movies, and robotics. To start off the podcast, we'd love to hear a bit about your personal journey to the metaverse.

Bill Vass: Yeah. So it's been a long one. I've always thought of this as, the metaverse is an interesting name for it, but 3D spatial computing, and then linking the physical world with the virtual world, is really where I've been focused. I've done a lot; I worked on RenderMan in the early days, and OpenGL, and things like that. But my first experience working in this space was really at the Pentagon, where I was CTO of the Army.

We were working on a thing called Force XXI, where we had an ocular that came down and did 3D mapping for soldiers in the field, and we were trying that out. And I learned a lot about augmented reality then, and how hard that actually was. An interesting thing in all of these visions: when I was at the Pentagon, we were also working on what we called 3D faxing. I know that sounds very old, but now we call that 3D printing. And so we would scan things in, in 3D. And we were attempting, for example, for a tank, if a sprocket broke on a tank, rather than having the parts, we would print it out in 3D, which we actually do today.

I was over at Blue Origin looking at them printing the whole engine in 3D and things like that now. But back then, the sprocket was all wavy and you couldn't really use it; it used thin sheets of metal and a plasma beam to weld them together. The idea was, you put a CD in, remember those, and you could create a 3D model off the CD, print the part out in your tank, and fix your tank. So we worked in that space.

And then we started working on these things called 3D CAVEs, if you remember. They had the glasses where the eyes would flip back and forth, with the LCD shutters, and you walked around. This was when I was at Sun; we were doing them for the National Labs, where the floor, the ceiling, and the walls were all projected. So you walked around this giant cube with 3D glasses on, and you could walk around for design and things like that. And then we would turn those designs into real physical environments that you would walk through, so you could see if you were going to hit your head on a pipe, those kinds of things, and be more immersive.

But then we also took that to simulation and the ability to simulate environments, and do predictive simulation based on that, and that has continued to evolve. A lot of these things go way back, like the early days when I worked on artificial intelligence in 1978. I know I'm old. We had undersea autonomous drones that were giant diesel subs, and they would get lost all the time. My boss used to say artificial intelligence isn't very intelligent, but we were doing neural network programming.

And the thing is, we're doing the same thing today with machine learning. The algorithms really haven't changed that much; other than deep learning, they haven't changed that much. They're still scalar vector models. We just didn't have enough storage and compute. And I think the same thing applies in this metaverse discussion: we didn't have enough storage and compute.

And now, with the cloud, with edge capabilities, with high-performance GPUs, with all these things we have today, some of this vision of merging the virtual and physical worlds, and being able to move back and forth between the two seamlessly, is becoming real. Like, my iPhone has LiDAR on it, and I can go around with Polycam or other things like Matterport, create 3D worlds, and bring them right into my real 3D world. And I can talk a lot about RoboMaker and all the other things we're doing in that area. But that's been my direction here.

Another thing: because of working on this metaverse piece, I also worked in the oil and gas space, which I didn't call the metaverse back then, just very advanced 3D high-performance computing, and that has all led to this as well. And then I ran an autonomous vehicle company called Liquid Robotics that did large-scale, long-duration, autonomous ocean robotics; it got bought by Boeing. We did giant ocean-size simulations of the robots, and then we'd run them in the real world. And they would self-navigate and swarm themselves, those kinds of things.

So pretty fascinating stuff across the board on the whole journey. I worked a lot on NASA WorldWind, if you remember that, and that's still out there; I helped to fund that when I was at Sun. I helped to try to keep OpenGL alive, and I love seeing Vulkan taking that over. It's been a fascinating journey and I'm really excited about the future and where we're going. Epic, for example, has been very visionary here, and I'm also a gamer, so that also makes it fun. I can always tell an Unreal game because it's got great graphics and it's very immersive, so I enjoy that also.

Marc Petit: Well, thank you, Bill. So we're going to talk about the future in a minute, but before that, I think it would be interesting to give us an overview of the current state of real-time 3D on the cloud. What's working at scale, what's in production? We hear about a lot of experiments, but I believe a lot more is in production than we actually think. So from your vantage point at AWS, where are we?

Bill Vass: So there are a number of different dimensions to this. There's actually this thing, I'm surprised how well it's doing, and I love that it's doing well, called virtual try-on on Amazon.com, where people are creating 3D environments using their iPhone to scan a room, choose furniture, and drop it into the room, along with trying on makeup and clothing and other things like that. And that's continuing to evolve.

In the robotics space, we launched a product called RoboMaker, which is used for autonomous vehicles, autonomous cars, robots, all those kinds of things. We just launched, as a matter of fact, the ability to run it on Unreal, along with Unity and O3DE, so you can choose your engine, choose your physics, choose your models, and then press a button, and it will swarm that out.

And so you can do a year's worth of testing in 1,000 randomized environments in an hour, which you just can't do in the real world, right? You can do millions of miles of driving in a car, those kinds of things. And I think that is really advancing. A lot of the high-performance computing we see on AWS today is in the simulation world, whether it's weather simulation or work in this space.

We also launched a product called Amazon Location, which has 2D mapping now and 3D mapping coming, which will let you pull in 3D worlds from the real physical world, very much like what we can deliver with our partnership with Cesium, and then render it in Unreal or Unity or O3DE or whatever else you want, and then drop a vehicle in there and do an ML training set with photorealistic rendering of those environments.

And so we also see a world where that links into our retail side and into gaming as well. And I see this with LED walls too. We have a product called Nimble Studio, where we're working with, again, Unreal on this idea that, instead of a green screen, you have an LED wall behind you and the actors can actually see the things they're interacting with, which is amazing to me given how much they've done with green screens in the past. And so that's all linking together.

And I see this convergence happening between gaming rendering, the simulation world, the retail virtual try-on world, the AR/VR world, and the movie rendering world. And then all of these physics and simulation capabilities are coming together now, along with 3D mapping and geospatial worlds. So I'm pretty excited about where it is and that we can actually do it now.

A lot of our Snowball customers today do edge processing with drones and map the data into 3D point clouds as they collect the drone information, for example. They then ship the Snowballs back in, and these are also used for ADAS data collection and logging in cars.

So there's a lot going on, and I think you're also going to start to see very large-scale, real-time simulations, where you're able to simulate, if you like, the 3D world going across many servers with shared memory. And I think that's going to be transformational as well. So watch for cool things coming in those areas.

Patrick Cozzi: Thanks, Bill. So first, I really loved your introduction, and it sounds to me that a lot of the metaverse experiences we're thinking about nowadays, folks have thought about in the past, but now the hardware and the tech stack underneath are starting to catch up. So I think that's a really cool theme that you brought forth. And then I wanted to ask: you mentioned the convergence of so many interesting areas, and another theme in the metaverse has been openness, breaking down silos, and interoperability. Do you have a perspective on open standards and creating interoperability among all those areas?

Bill Vass: Yeah. It's a real challenge. There are hundreds of different CAD formats, hundreds of different image formats, and hundreds of different 3D formats, and we've really got to work to converge those, since everybody built their own. We've got to start converging on a common set of standards that allows portability, because we really should be able to take a CAD/CAM model of something, import it into a simulation, import it into an engineering simulation, import it into a video game, import it into a movie's CGI, and then also print it out in 3D.

And so I think as 3D printers also get much more advanced, you've got multi-material 3D printers now. I saw my first cell phone printed out on one 3D printer, the entire cell phone, the battery, everything printed out on a 3D printer.

It didn't have an LCD screen, but it had buttons, it had a battery, it had a fractal antenna, it had all of the circuit board, all printed in one unit. As these things start to occur and you start getting more advanced 3D printing, like I said, we've moved to a resolution where SpaceX, and Blue Origin, and Astra, and a whole bunch of others print things that are immediately used.

Where it used to be you just printed molds and those kinds of things. There's a really cool startup called PolyWorks, which I'm fascinated with. In a shipping container, it's got a metal grinder on one side; you throw metal of any kind in there, it grinds it up and puts it into a plasma forge that liquefies the metal, and then a centrifuge breaks it into the different types of metal.

And then they atomize the metal into a powder, and that goes directly into the 3D printers. They have all of their 3D models on Snowballs, and they're controlling all this. And then out the other side, products come out.

And so it's an amazing thought: you throw an old car in one end and you get new airplane parts out the other. It's really fascinating. They're using them on oil rigs, and in the military, and other places like that.

The metaverse to me is anything where the virtual and physical worlds combine, and the ability to move seamlessly between simulation worlds, the real world, the virtual world, and the physical world.

And I think, yes, it's gaming and VR headsets and those kinds of things, but that is not really what it's about. I think about the commerce that's going to happen there, the experiences that are going to happen there. If you take a look at what Niantic is doing with Pokémon, the 3D-layered metaverse they're building for augmented reality is getting to be more and more popular.

I think as headsets get more interesting, like, I can't get my wife to wear my Oculus today, but maybe somebody makes a good pair of glasses that are lightweight that I could get her to wear, or something like that. I got her to wear an Apple Watch and she likes that. So it's a matter of figuring out: what would be that next thing for that interface?

But you can interface with it; it isn't limited to AR/VR, it's anything with a screen that you can interface through. And even, like I said, with 3D printing, you could ideally take anything that is visualized in the metaverse, print it out in 3D, and then touch it.

And then you could take anything that you scan in ideally, and visualize it in the metaverse. And I think those technologies are going to continue to converge and evolve.

I think if you take a look at any of the big video games now, they're significantly more complicated in production than a movie. It's certainly amazing what goes into the video games.

And then if you take a look at a movie, the 3D artists in the end credits, there are thousands of them. My daughter's a 3D artist; she's going to school at a college of the arts down in LA. So I look at all the CGI stuff she's doing and things like that.

And again, she does the same thing. She'll do this CGI robot, and then she'll print it out in 3D. So this whole idea of moving back and forth is pretty exciting.

Patrick Cozzi: Yeah. And Bill on that note, to enable all these metaverse experiences, there's going to be a lot of 3D content, right?

Bill Vass: Yeah.

Patrick Cozzi: It could come from the scanned world, like the drone example you mentioned, it could come from artists, procedural algorithms. And I think over time there's going to be more and more demand for this content, very large worlds, very high resolution worlds.

And today you look at AWS and we know it powers a lot of streaming video at scale, and what do you think about the current state and the future of stepping up to streaming the real-time interactive 3D of these massive worlds?

Bill Vass: So we already have a product called Luna, which you may recognize, where we're delivering streaming video games, and I think as long as you're under about 100 milliseconds of latency, it's usable. And as you get down to 90 milliseconds or less, it's great.

And so I think that's going to be capable, and we're doing more and more with a product called Wavelength, where we push the cloud to the 5G hubs, and with the low latency of 5G you can get a lot of capability there. And with Outposts, you can even push it into a building. So you have the ability to deploy all the way out into a building or into a hub.

I think there are still going to be challenges. Like, if I'm on Verizon on a 5G hub, playing with Patrick on the same 5G hub on our mobile devices, streaming that game, that's probably going to be okay. But if Marc is on AT&T, or any carrier other than Verizon, and the game is at the Verizon hub, then you've got to do a network hop, and when Marc tries to play with Patrick and me, his latency is going to be messed up because he's on a different network.

Not that there's anything wrong with AT&T, I love AT&T too, that's not the issue. It's just that that bridge across networks is going to cause problems. And the challenge is, we do this kind of video conferencing all the time now, and when my video slows down at home, everybody just says, "Oh, Bill, you're frozen, could you repeat that?" But in the middle of a game I'm playing with Marc, when I freeze, he just kills me.

And that's very frustrating in a game. It's a different story when you're watching Netflix and you get the little spinner while it's caching; you'd rather not have it, but it's fine. In a video game, or any of these areas, it's not. And that's why, in connected cars, and drones, and even the work we did with NASA on the Mars Rover with RoboMaker, the processing obviously has to happen on the robot at the edge, right? Because it's 20 minutes each direction to Mars, so you've got to push that ML to the edge.

And that's a lot of what we do with RoboMaker, for example: it's seamless, you train on the cloud. We have a thing called World Forge that's kind of like No Man's Sky, a lightweight No Man's Sky; it does interior spaces today.

And it randomizes interior spaces. So you can say, "I want 6,000 houses that range from one to seven bedrooms, with this kind of furniture and this," whatever, you click a button and it generates those worlds for you and guarantees you that they're all random. And then you test your robot out in it.

And then you can make one millisecond equal an hour and run a year's worth of testing in 6,000 houses. You get great ML results from that, and then you can click a button and deploy that code onto the robots, because a lot of it needs to run locally.
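
To make that accelerated-testing loop concrete, here is a minimal Python sketch of the same idea: generate a batch of randomized house specifications, then run each one at a large time-scale factor. The names HouseSpec, generate_houses, and run_episode are illustrative placeholders, not World Forge or RoboMaker APIs, and the metrics are faked.

```python
import random
from dataclasses import dataclass

@dataclass
class HouseSpec:
    """One randomized interior-world specification."""
    bedrooms: int
    furniture_density: float  # 0.0 = empty rooms, 1.0 = cluttered
    seed: int

def generate_houses(count: int, min_bedrooms: int = 1, max_bedrooms: int = 7) -> list[HouseSpec]:
    """Produce `count` randomized house specs, each with its own RNG seed."""
    return [
        HouseSpec(
            bedrooms=random.randint(min_bedrooms, max_bedrooms),
            furniture_density=random.random(),
            seed=random.getrandbits(32),
        )
        for _ in range(count)
    ]

def run_episode(house: HouseSpec, sim_hours: float, time_scale: float) -> dict:
    """Stand-in for a physics/robot simulation run.

    `time_scale` is how much faster than real time the simulator steps,
    so wall-clock cost is roughly sim_hours / time_scale.
    """
    rng = random.Random(house.seed)
    collisions = rng.randint(0, house.bedrooms * 3)  # fake metric for the sketch
    return {
        "seed": house.seed,
        "sim_hours": sim_hours,
        "wall_clock_hours": sim_hours / time_scale,
        "collisions": collisions,
    }

if __name__ == "__main__":
    houses = generate_houses(6000)
    # A "year" of testing per house, stepped far faster than real time.
    results = [run_episode(h, sim_hours=24 * 365, time_scale=3_600_000) for h in houses]
    total_wall_clock = sum(r["wall_clock_hours"] for r in results)
    print(f"{len(results)} houses, {total_wall_clock:.2f} wall-clock hours if run serially")
```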

The other thing we've done some experiments with is this idea that, if you take a bipedal robot and it's walking along, you can actually offload the balancing algorithm to one of those 5G hubs, as long as the latency is 30 milliseconds or less. And so that's an interesting thing. If you get beyond 30 milliseconds, it starts to look like a drunken sailor; at about 60 it starts to fall over.

So I think you wouldn't do that with a balancing algorithm; you would keep that on the robot. That was just an experiment we did. But a lot of the voice interactions and things like that, as you see, for example, with Alexa products, there's very little on the Alexa itself; almost all of it's running back in the cloud.
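
A hedged sketch of that latency-gated offload decision: the correction is computed at the edge only when the measured round trip stays within budget, and otherwise falls back to the onboard controller. The 30 ms budget comes from the anecdote above; the function names and controller math are made up for illustration.

```python
import random
import time

LATENCY_BUDGET_MS = 30.0  # above this, balancing degrades; keep it onboard

def measure_round_trip_ms(probe) -> float:
    """Time one small request/response to the edge endpoint."""
    start = time.perf_counter()
    probe()  # e.g. a ping or a tiny echo RPC
    return (time.perf_counter() - start) * 1000.0

def balance_onboard(imu_sample: dict) -> float:
    """Fallback controller running on the robot itself (placeholder math)."""
    return -0.5 * imu_sample["tilt_deg"]

def balance_on_edge(imu_sample: dict) -> float:
    """Pretend offloaded controller; in reality this would be an RPC to the 5G hub."""
    return -0.6 * imu_sample["tilt_deg"]

def balance_step(imu_sample: dict, probe) -> float:
    latency = measure_round_trip_ms(probe)
    if latency <= LATENCY_BUDGET_MS:
        return balance_on_edge(imu_sample)
    return balance_onboard(imu_sample)  # too slow: never trust the network with balance

if __name__ == "__main__":
    fake_probe = lambda: time.sleep(random.uniform(0.005, 0.060))  # 5-60 ms round trips
    for _ in range(5):
        correction = balance_step({"tilt_deg": 3.2}, fake_probe)
        print(f"correction={correction:+.2f}")
```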

And so you get this idea of a very stupid device that acts very smart, a very cheap robot that acts very smart because its brain is extended into the cloud; that brain is trained in the metaverse, it's trained in these simulations, it's trained in these environments.

And so your iRobot runs on RoboMaker. It's trained in these synthetic houses, and then it's deployed out there onto the edge. So I think there's always going to be some edge processing, and in game streaming, I think ultimately what we need to do is make it... What we're trying to do with Greengrass, which is a product that can run at the edge, is let you distribute your program into what's going to run on the cloud, what's going to run on a PoP, what's going to run on Wavelength, what might run on a Snowball, and what might run on the device that you're putting Greengrass onto, based on the latency and processing capability the programmer picks.

So if they have a really powerful thing, like an iPhone, they can run a lot there, or a car, they can run a lot there. If it's less powerful, and they have good network, they'll offload more and more to those tiers.
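
One way that tiered placement could look, as a rough sketch: each component of the app declares the latency it can tolerate and the compute it needs, and a simple scheduler picks the most capable tier that still meets the budget. The tier names and numbers are assumptions for illustration, not how Greengrass actually partitions applications.

```python
from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    round_trip_ms: float   # measured latency from the device to this tier
    compute_units: float   # rough available processing at this tier

@dataclass
class Component:
    name: str
    max_latency_ms: float  # worst latency this piece of the app can tolerate
    needed_compute: float

def place(component: Component, tiers: list[Tier]) -> str:
    """Pick the most capable tier that still meets the component's
    latency budget and compute need."""
    candidates = [
        t for t in tiers
        if t.round_trip_ms <= component.max_latency_ms
        and t.compute_units >= component.needed_compute
    ]
    if not candidates:
        return "device"  # nothing qualifies: run it locally and degrade gracefully
    return max(candidates, key=lambda t: t.compute_units).name

if __name__ == "__main__":
    tiers = [
        Tier("device", 0.0, 1.0),
        Tier("5g-hub", 15.0, 20.0),
        Tier("regional-cloud", 60.0, 1000.0),
    ]
    app = [
        Component("input-handling", max_latency_ms=5, needed_compute=0.1),
        Component("physics", max_latency_ms=30, needed_compute=5.0),
        Component("world-streaming", max_latency_ms=150, needed_compute=50.0),
    ]
    for c in app:
        print(f"{c.name} -> {place(c, tiers)}")
```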

What we really need ultimately in this metaverse gaming environment is the ability to do that with our rendering, essentially. And so you can have a situation where maybe Marc has poor bandwidth while he's playing, so most of it's rendered on his device, whereas I've got really great bandwidth, so most of it's rendered in the cloud. And then Patrick, you're somewhere in between and it's like half and half.

Now, of course, then we also run into time, right, this speed-of-light thingy that we haven't solved yet. So we're trying to figure out what is real, the time series across our environments, because maybe Marc is, say, 200 milliseconds away from Patrick and me, and so when Marc shoots us, his bullet looks like slow motion compared to Patrick and me. So we've got to figure out what real time is. And you don't want the rubber-banding effect you see in games where they're trying to sync time up. I think that's going to continue to be a problem for all of these MMOs that we have to work through, and that's an area where standards could be good. I think you need some open standard for deciding on the fly what parts of the game run on the device versus what runs in the cloud.
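
The usual answer to the "whose clock is real" problem is some form of lag compensation: the server keeps a short history of world state and rewinds by a player's measured latency before resolving their action. A toy sketch of that idea follows; it is not any particular engine's netcode.

```python
from collections import deque

class LagCompensator:
    """Keep recent snapshots of player positions so a shot fired by a
    high-latency client can be checked against the world *as they saw it*."""

    def __init__(self, history_ms: int = 500, tick_ms: int = 16):
        self.tick_ms = tick_ms
        self.history = deque(maxlen=history_ms // tick_ms)  # (timestamp_ms, positions)

    def record(self, timestamp_ms: int, positions: dict[str, tuple[float, float]]):
        self.history.append((timestamp_ms, dict(positions)))

    def world_as_seen_by(self, now_ms: int, client_latency_ms: int) -> dict:
        """Return the stored snapshot closest to (now - latency)."""
        target = now_ms - client_latency_ms
        return min(self.history, key=lambda snap: abs(snap[0] - target))[1]

if __name__ == "__main__":
    comp = LagCompensator()
    for t in range(0, 512, 16):  # simulate ~half a second of server ticks
        comp.record(t, {"patrick": (t * 0.1, 0.0)})  # Patrick moving along x
    # Marc is ~200 ms behind: resolve his shot against the older snapshot.
    rewound = comp.world_as_seen_by(now_ms=496, client_latency_ms=200)
    print("Patrick's position as Marc saw him:", rewound["patrick"])
```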

Well, a lot of the game studios I talk to, their idea of streaming is that they actually want to take advantage of the local hardware. Because it costs money to run things on stuff they don't own; it doesn't cost money to run things on stuff they own. It costs money to run things at the back end. And so what they want is, rather than waiting two hours for the game... I got Doom and it sat there, and I got Cyberpunk and it sat there for two hours downloading before I could play it. It may show you all this cool stuff while it's downloading, but you just leave the room.

What you'd want to do is stream the game immediately while you're downloading all that stuff, so you can be making progress in the game. And I think that's what a lot of the studios are looking for. And then once it's downloaded, it takes advantage of the local hardware. A PS5 is an amazing piece of kit.
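
A small sketch of that "stream first, go local later" flow: frames come from a cloud stream until the local install finishes, then rendering flips to the local GPU. The numbers, rates, and function names are invented for illustration.

```python
def play_session(total_assets_gb: float, download_rate_gbps: float, minutes: int) -> None:
    """Toy frame loop: render via the cloud stream until the local install
    completes, then flip to the local GPU. Purely illustrative numbers."""
    per_frame_s = 1.0 / 60.0
    frames = minutes * 60 * 60
    downloaded_gb = 0.0
    for frame in range(frames):
        # Background download continues while the player is already in the game.
        downloaded_gb = min(total_assets_gb,
                            downloaded_gb + (download_rate_gbps / 8.0) * per_frame_s)
        source = "local-gpu" if downloaded_gb >= total_assets_gb else "cloud-stream"
        if frame % 3600 == 0:  # report once per minute of gameplay
            print(f"t={frame * per_frame_s / 60:5.1f} min  rendering on {source}  "
                  f"({downloaded_gb:.1f}/{total_assets_gb:.0f} GB installed)")

if __name__ == "__main__":
    # 70 GB game on a 500 Mbps connection: the flip happens around minute 19.
    play_session(total_assets_gb=70.0, download_rate_gbps=0.5, minutes=30)
```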

Marc Petit: You mentioned the Apple Watch connecting to our phones, connecting to the cloud. I think we have a suspicion that the model could be the same for real-time 3D content: glasses connected to maybe a phone, connected to the cloud. Is this the architecture that you expect to be the solution? Because it's a big challenge from a software engineering perspective to get those distributed, since the amount of power we're going to get at each point is so different: probably two watts on the glasses, 10 watts on the phone, 100 watts on the edge. So what's the path to finding a solution there?

Bill Vass: Yeah, I think it's going to be very much engaged with the phone, as you mentioned. I think the phone is going to become your central processor. If you take a look at an Oculus today, it does a lot of processing on the thing that you wear on your head. And actually, I love my Oculus; the new one is amazing. It shows you your hands and stuff like that. They did amazing work with that. But I think you're not going to have the average person wearing a heavy headset. I think you're going to need some way to project that looks more normal. And it's disappointing, for example, that 3D movies aren't as popular as I thought they would be either. I like them, but at home you can't even buy 3D TVs anymore; some of them have it, some of them don't. And getting my family to all put the glasses on to watch a 3D movie at home, they don't want to do it.

So I think there are still going to be user interface challenges for a long time. And I think a good amount of it will process on the phone. Phones are only going to get more powerful. And so the glasses would basically just be a display device, and the rendering would be happening on the phone, with caching and streaming. I think we need to work on this idea of segmenting the application in a way that developers' minds don't explode. I know we don't use CUDA a lot in the gaming industry because we want to write directly to the hardware; even that is too much of an abstraction, because we want to use all the hardware. But CUDA has enabled a lot of things by letting people not have to understand what's underneath.

And my goal, with the distributed architecture we're working on, is to make it so the average programmer doesn't have to understand that. Yeah, I'll always give them a door where they can go directly to the hardware. But if I can abstract that away so they don't need to think about where it's running... That's one of the things computers can do pretty well: measure latency, measure processing power, measure networking, and decide in real time what to run where. But you're going to have to have the app broken into these loosely coupled components. And that's going to be a challenge for rendering engines. It's going to be a challenge for game developers. And then you still have this speed-of-light problem with people playing from all over the world in MMOs that you have to resolve. This doesn't fix that.

But I think streaming is going to get there for many people, though there's going to be an awful lot that still has to run on the edge. Your autonomous vehicles, and robots and things like that, are still going to do a lot of local processing. I don't see that changing. I can't imagine a world where there aren't going to be a lot of autonomous things, and I can't imagine a world where those things won't be connected to a network. And if they're connected to a network, they're going to be connected to a cloud. Those are things I can count on being true. But I can also see a world where some of it... One of the challenges we had with our autonomous systems for RoboMaker is that these drones are in the ocean, crossing the Pacific on their own. And it's amazing, actually, how far offshore a cell signal will get. It's more than you think, because it doesn't have trees and stuff in the way. But when it's near a shore, it's running WiFi for its communication. As it goes further offshore, it moves to cellular. As it goes further offshore still, it moves to satellite. And satellite's got a whole different latency model.

The scary part is when you're right on that edge where it's going back and forth between satellite and cellular, so it's going synchronous, asynchronous, synchronous, asynchronous. And so, for example, we specifically built a thing in Greengrass called the Stream Manager that is designed to help developers not have to think about that, because you get these out-of-order executions and things like that in that space. And the same thing with Kuiper coming out and the SpaceX equivalent rolling out: this idea that you will be pretty much pervasively connected, but at all different levels of bandwidth and all different types of radios. So how do you make that work? That's our goal with Stream Manager in Greengrass, to make it so the developer doesn't need to understand it. It discovers all the communication, it manages the ordering and execution to avoid out-of-order execution, and then makes those decisions. We need to get to that place in the metaverse where we can deliver rendering and actions with that kind of queuing as well, in a way that doesn't make programmers' heads explode. And it's hard.
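
The out-of-order problem on a link that flaps between WiFi, cellular, and satellite is typically handled by buffering records locally and releasing them strictly in sequence. Here is a hedged sketch of that pattern in Python; it is not the actual Greengrass Stream Manager API, which ships with its own SDK.

```python
import heapq
import itertools

class OrderedUploader:
    """Buffer records locally and release them strictly in sequence, so a link
    that flaps between WiFi, cellular, and satellite can't reorder delivery."""

    def __init__(self, send):
        self._send = send                 # callable that may raise on link loss
        self._seq = itertools.count()
        self._pending = []                # min-heap of (sequence, record)
        self._next_to_deliver = 0

    def enqueue(self, record: dict):
        heapq.heappush(self._pending, (next(self._seq), record))

    def flush(self):
        """Deliver everything we can, in order; stop at the first failure."""
        while self._pending and self._pending[0][0] == self._next_to_deliver:
            seq, record = self._pending[0]
            try:
                self._send({"seq": seq, **record})
            except ConnectionError:
                return  # link dropped: leave the record buffered, retry later
            heapq.heappop(self._pending)
            self._next_to_deliver += 1

if __name__ == "__main__":
    delivered = []
    flaky_calls = itertools.count()

    def flaky_send(msg):
        if next(flaky_calls) == 1:        # simulate the link dropping once
            raise ConnectionError("switched from cellular to satellite")
        delivered.append(msg["seq"])

    up = OrderedUploader(flaky_send)
    for i in range(5):
        up.enqueue({"telemetry": i})
    up.flush()   # delivers 0, then fails on 1 and stops
    up.flush()   # link "recovered": delivers 1..4 in order
    print(delivered)  # [0, 1, 2, 3, 4]
```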

Marc Petit: So you touched a little bit on 5G. 5G brings the telcos into the cloud business. What does it do to the dynamics of the ecosystem? Does it create another layer, as you said, of indirection, of abstraction? What about the relationship between the big hyperscale cloud companies such as Amazon and the telcos? How do we envision that whole ecosystem?

Bill Vass: Yeah, so with our Wavelength product we're partnering with a lot of the telcos. The nice thing about 5G is it's a hub-and-spoke architecture, and so we have these hubs where we put our Outpost products, which lets you extend your AWS environment right there. So everything you get on the cloud, you get in this hub much closer to the customer from a latency perspective. That solves the speed-of-light problem to a certain extent. And then for underserved areas, you can go further out; we're doing a lot of that with Snowballs. These little Snowballs and Snowcones can provide private networks locally, private 5G. And the interesting thing is that you see people doing that on university campuses and in farmers' fields and a whole bunch of places like that that aren't served by 5G, because the density of antennas that's necessary means carriers are going to focus on where the most people are at first.

And so you'll still have this ability to extend the cloud to those places and run on the 5G hub. And then you have this question of, "Well, what do I want to run on the client and what do I run on the hub?" The latency is transformational for 5G. Another thing that people don't realize is that the billing is transformational for 5G, which goes back to the economics of all this. The billing allows you to basically just pay for what you use rather than paying for the connection, whereas previously with IoT devices, you had to pay for the connection. And then there are other things we're deploying. Amazon has a terrestrial wireless system based on LoRaWAN, which we've deployed and which is available everywhere. You're not going to stream video over that; it's very low bandwidth, but very long range. It's really good for heartbeat information to figure out, "Well, here's where my drone is. Here's where my car is," those kinds of things, without having to run on the 5G network for things you don't need 5G for.

There's also prioritizing communication: you might have machine learning running on the edge, like in a car, collecting data all the time, and it only sends back over 5G the things it thinks are instantaneously important. And then when you pull into your garage at night, it connects to your WiFi and sends the rest. And so I think those are the kinds of things you'll see continue. Tesla does that today, obviously, but a lot of the auto manufacturers are planning that. And we're trying to develop that with Stream Manager in that space as well.
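
A minimal sketch of that prioritization idea: classify each reading on the device, push only urgent events over the metered cellular link, and hold everything else until a cheap connection such as home WiFi appears. The severity threshold and class names are assumptions for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class VehicleUplink:
    """Send urgent events immediately over cellular; batch everything else
    until the car is parked on WiFi."""
    cellular_send: callable
    wifi_send: callable
    backlog: list = field(default_factory=list)

    def ingest(self, reading: dict):
        if reading.get("severity", 0) >= 8:        # e.g. hard braking, airbag, fault code
            self.cellular_send(reading)            # worth paying for right now
        else:
            self.backlog.append(reading)           # routine telemetry can wait

    def on_wifi_connected(self):
        for reading in self.backlog:
            self.wifi_send(reading)
        self.backlog.clear()

if __name__ == "__main__":
    sent_cell, sent_wifi = [], []
    uplink = VehicleUplink(sent_cell.append, sent_wifi.append)
    uplink.ingest({"event": "hard_brake", "severity": 9})
    uplink.ingest({"event": "tire_pressure", "severity": 2})
    uplink.ingest({"event": "cabin_temp", "severity": 1})
    uplink.on_wifi_connected()                     # pulled into the garage
    print(len(sent_cell), "sent over cellular,", len(sent_wifi), "sent over WiFi")
```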

And you see the work we're doing with QNX, where they're extending the cloud APIs into the vehicle OS and making that available. So you can do in-vehicle communication and then always be connected to these 5G hubs. The other thing that's happening with 5G is they're moving from proprietary switching environments to software-defined environments running on servers, versus the proprietary switching components they used to run before. And that changes the economics of some of this deployment for them as well. So it's pretty exciting to see that happening also, but there's a lot going on here.

Marc Petit: Yeah, thanks, Bill. It's actually fascinating. You're giving us a glimpse of how sophisticated all of those software layers are going to be. There will be no magic; brute force is not going to take us to the metaverse. We're going to have to go through layers of abstraction and optimization. So there's a lot of complex engineering work ahead of us.

Bill Vass: Yeah, but I think, though, it rolls back to standards. I think we need a good set of standards for 3D asset management, and a good set of standards for importing mapping and object information from the 3D world. We also need a really good set of standards for cross-metaverse interoperability. I mean, as a gamer, I will switch to the gaming view.

I mean, I love to be in Cars 2. I have a simulator at home that actually moves, a thing called a Fasetech game simulator. Oh, they're great. Everyone should have one; go take a look. They're made overseas, I had to get it imported and stuff, but it's great. It twists and also moves around, and you can put the Oculus on and use that as well.

And so I'm in Cars 2, and I paid for a Lamborghini Aventador. I'd love to now go to Grand Theft Auto and import my Lamborghini Aventador from Cars 2. And I'm okay paying for that, just like in the real world you pay an import fee and a transportation fee, right? I should pay the Grand Theft Auto people to get it there. And then I should be able to drive it around in Grand Theft Auto. And then I should be able to go rob a bank in Grand Theft Auto and take the $100,000 I got from robbing the bank and go over to Fortnite.

Marc Petit: And spend it.

Bill Vass: Yeah, exactly. Exchange the currency, right? And maybe pay some taxes to Fortnite, which is completely reasonable to do. And then be able to spend it. And then when I'm over there, I'm like, "Well, I really want my Aventador." So I should be able to import it in there as well, right?

And so that cross-metaverse stuff, and the commerce necessary to do that, and NFTs and all these other kinds of things, that's another set of standards we have to develop, and I think it's going to be foundational for people. They want the ability to do that today.

That's a really easy thing for me to say and a really hard thing to do, because the texturing of the car is different, and how the skeleton and the connection to the wheels are set up is different in every game. It's a very hard thing to do. And this commerce thing needs to be resolved; it needs to be a trusted commerce component.

But it's all going to have to happen. It's just a matter of... And that's the hard work, unfortunately. I'm not really excited about building a commerce and identity management system in the metaverse, but you have to have it. It's the thing that's going to make the engine run.

And you could link that into physical commerce as well. So if I'm playing a video game, or even talking to you now, and I see there's a little rocket statue on your shelf back there, I should be able to click on that, and it should pop up on Amazon and say, "Buy that," and it shows up at my house.

Marc Petit: It's a unique piece of art. It should not be on Amazon first.

Bill Vass: Oh, well. But then I should be able to print it out in 3D.

Marc Petit: Yeah, no, absolutely.

Bill Vass: And have it sent. And if I'm rich, I can print it out in titanium. If I'm not, I'll just get it in plastic. Right. I mean, it's just, that kind of seamless across metaverse and cross environments is...

Marc Petit: I'm curious, it's a bonus question because as a scaling expert, do you think the distributed ledger is a viable foundation to manage those things?

Bill Vass: Yeah. I don't know that you need a distributed ledger. I'm obviously not against them; I built a distributed ledger system here called QLDB, and if you don't understand QLDB, you don't understand how AWS works. It's the foundation for how our distributed compute works on AWS. It's just a lot faster; it's encrypted the same way, but it's a lot faster than a blockchain system.

I think blockchain is interesting, but every time I deal with blockchain, I'm always like, "Well, you could already do that, right? You don't really need blockchain for that." I think there's a lot you can do with or without blockchain. A lot of people, all the haters in the comments, are going to say, "He's an idiot, he doesn't like blockchain," but I don't...

And it's great for replacing SWIFT, or for doing cryptocurrency, and a number of other things. But there's an awful lot you can do where you don't need blockchain to do it. I'll just put it that way: you can use blockchain to do it, but you don't need it to do it. There are lots of ways.

Marc Petit: I understand. Yeah.

Bill Vass: You understand: with the proper level of cryptography and authentication, you can do it. I know that Ethereum's working on a cross-metaverse currency capability, and I think they should do that, and that's great, and I'll partner with them and make sure it works in our environment. I know everyone's very excited about NFTs. I'm not so excited about them, but then again, I should have bought Bitcoin when I had the chance many years ago as well.

Don't base your investments on me, because I didn't buy Bitcoin. But I was like, "Well, this is pretty cool." I remember sitting with my wife: "This is pretty cool. We should buy some of that." And she's like, "Really? You think so?" I'm like, "Well, I guess not." And I should have. Would have, should have, could have.

It's like, back in the day, I had a chance to buy AOL stock, which gets back to how old I am. A lot of people listening to this will go, "What is AOL?" Anyway, I didn't do that, but that's water over the dam.

Patrick Cozzi: All right. So Bill, we've covered a lot of topics today, a broad breadth. Is there anything that we didn't cover that you'd like to talk about to wrap things up?

Bill Vass: Oh, I think we covered a lot of the challenges and what we need to do here. I think, in your area with Cesium, there's a lot more of this integration with the 3D mapping world coming. And the currency of the information is another thing we didn't talk about in the 3D mapping world.

You see Niantic and others constantly updating it with current information. And you get this interesting idea where you could have all the phones in the world constantly updating, all the cars in the world constantly updating, so the 3D mapping capability is always up to date.

Marc Petit: Or the Amazon trucks.

Bill Vass: Yeah, and the Amazon trucks as well, and the Amazon drones, and you name it. But then, the other thing we didn't cover here is privacy, which I think is going to be very important in this space as well. My wife, Shari Steele, was head of the Electronic Frontier Foundation for many years, so that keeps me always thinking about the privacy side and the Big Brother aspects that we need to protect people against as well.

I think the proper layers of cryptography and anonymity are going to be important as well, and how we do that while still allowing the commerce and all these other things. The privacy aspects and the security aspects are going to be important also.

That's another thing: security is going to continue to be a paramount and scary thing, especially as we link devices. And we didn't really talk about IoT. If you take a look at our digital twin product, you can go look at some of my videos of the digital twin product.

You're combining the real-time camera data coming in from the security cameras with IoT data and a 3D, video-game-like digital twin of your factory. The security needed for the IoT components is paramount, right? And for robotics and autonomous vehicles. And then how that will link into the metaverse and digital twins is going to be really, really important as well. We didn't talk a lot about digital twins, but there's a lot going on there as well.

Patrick Cozzi: Absolutely. Finally, is there any organization or individual in the context of the metaverse that you want to give a shout-out to?

Bill Vass: Well, I've got to give a shout-out to the O3DE team in the Linux Foundation. We just had a conference down there, and I'm pretty excited about the adoption that's happening. I think it's an exciting new space and I'm looking forward to it. And Cesium was involved with that too, and I think that has been great.

I do want to do a shout-out to Epic and all the great work we've been doing together in Fortnite and on our simulation environments and AWS environments as well. And the great integrations we're building across AWS. We've got a nice launch of a big simulator coming together. In addition to the RoboMaker simulator, we've got the work we're doing in the movie space, as I mentioned before and others. Thank you for your support there.

Marc Petit: You're welcome.

Bill Vass: It's really critical to our customers as well, and we're really excited about our partnership there and some of the deals we're doing together. And I look forward to working with you guys on photogrammetry as well. That's another area.

Marc Petit: Yeah.

Bill Vass: We've worked really hard on that, and we've got to continue to work it, because you really want the ability for everybody to pull things in from the real world easily. You want everybody contributing those objects, not just artists and engineers.

Marc Petit: We should talk about this in a few weeks, Bill.

Bill Vass: Yeah.

Marc Petit: There is some news coming up.

Bill Vass: Yeah. Yeah, absolutely.

Patrick Cozzi: Bill, thank you so much for joining us today on Building the Open Metaverse podcast. And a huge thank you to the community for joining us as well. If you like what you're hearing, please subscribe.