Building the Open Metaverse

Generating Content for the Metaverse

Patrick Cozzi (Cesium) and Marc Petit (Epic Games) are joined by Stefano Corazza, Head of Roblox Studio, to discuss 3D technology, the metaverse, machine learning, and the creative power of Roblox Studio. Discover the potential of these technologies and gain exclusive insights into the future of virtual experiences.

Guests

Stefano Corazza
Head, Roblox Studio

Listen

Subscribe

Watch

Read

Announcer:

Today on Building the Open Metaverse.

Stefano Corazza:

The community that Roblox has, you have tens of millions of free 3D assets on the marketplace that people are reusing. People are making 90,000 experiences or games per day.

Announcer:

Welcome to Building the Open Metaverse, where technology experts discuss how the community is building the open metaverse together. Hosted by Patrick Cozzi from Cesium and Marc Petit from Epic Games.

Marc Petit:

I'm Marc Petit from Epic Games, and my co-host is Patrick Cozzi from Cesium. Patrick, how are you?

Patrick Cozzi:

Hey, Marc. I'm doing well. As usual, I'm looking forward to today's episode. Like many people in the community, I'm excited about the barrier to entry to becoming a 3D creator and creating 3D experiences becoming easier and easier. And we have someone who is really leading the way here.

Marc Petit:

And by the way, congrats on the Google deal.

Patrick Cozzi:

Oh, thank you, we're excited for that.

Marc Petit:

A victory for open standards and a great moment for Cesium, so congrats.

You hinted at it, Patrick. We have a great guest today, somebody who's already made a big impact on the industry and who recently joined another very impactful company, which is Roblox.

We're super happy to welcome Stefano Corazza, VP and Head of Roblox Studio at Roblox. Stefano, welcome to the show.

Stefano Corazza:

Thank you so much. Thank you for having me here.

Marc Petit:

It's a pleasure. As you know, the first thing we ask our guests on the podcast is to introduce themselves and tell us their journey to the metaverse.

Stefano Corazza:

My journey is definitely not a straight line. I actually started as a mechanical engineer in Italy, so something very far away from Silicon Valley, if you want. But I joined Stanford as part of an exchange program and then became a researcher there, working on machine learning back in the day, before convolutional networks were a thing. I was very passionate about animation, in particular character animation, and also about democratizing the creation of content.

I started my first company, Mixamo, back in 2008; we were trying to democratize character animation, which is so hard to do, and then that was an amazing journey that lasted seven years. Then we were acquired by Adobe in 2015, and it was kind of the start of the 3D journey for Adobe, so a great time to join.

Fast-forward another seven years, and I joined like 10 months ago, Roblox. There, I'm leading the team that is responsible for Roblox Studio, our tool that creators use to make Roblox experiences. I'm also driving the generative AI strategy across the company, which is probably the most fun job I could ever dream about, because I'm a big avid user of generative AI myself, on the day-to-day.

It's really a privilege to be working on such a future-looking thing. I'm obsessed with the future. I love science fiction movies and books, and any way people think about how the future is going to be, and here it feels like we are actually living it.

This is all super conducive, of course, to the metaverse, which needs a ton of content, and it feels like we have another inflection point now: generating content with AI is going to be another multiplier.

Marc Petit:

You've always worked in the future. I remember the early days of Mixamo, saying, "What can this Italian guy do?" We were all highly skeptical, and you turned it into something pretty magical.

Stefano Corazza:

There's actually even more weird stuff about Mixamo. You've probably seen the news about Geoff Hinton, who just left Google, the father of backpropagation and convolutional networks; he used to be an advisor to Mixamo, and back then we hired one of his PhD students, Graham Taylor, to work at Mixamo.

In 2009, we released a generative animation engine basically using a restricted Boltzmann machine, an early precursor of today's deep neural networks, and the tech was not ready. Within, I think, three months, we swapped it for a different approach that was a lot less fancy, but those were basically the very early days of GenAI, when the tech was not even remotely close to where it is today. I had the luck of really working with Geoff Hinton and Graham Taylor and learned a lot from them.

Patrick Cozzi:

Well, I think we have a lot of topics to cover, but let's go back to your time at Stanford around 2005. You were working on markerless motion capture. I was curious about the techniques you were envisioning then and then where you see the state of the art today.

Stefano Corazza:

That sector on its own has had a massive, massive improvement in the last year. Back then, we were working on markerless motion capture from video using eight high-speed cameras, and now you can almost get the same accuracy with a single camera. Motion capture has been a little bit of an obsession of mine because ultimately it was the promise of delivering amazing, natural animation to anyone; one way is you do mocap, and the other way is you use AI, or you use data like we did with Mixamo. But the technology has come so far since then.

Now, using AI to regularize even a monocular video, you get amazing results. I am very lucky to be on the board of Rokoko, a company based in Copenhagen that is really focusing on accessible mocap and democratizing it.

They have the suit, but they also have sensors, and a couple of months ago they released this single-camera, AI-based tracking. We are seeing usage explode. The quality is not going to be at the AAA level for a while, maybe ever, but the accessibility of this is pretty mind-blowing.

Marc Petit:

Let's go back to Mixamo. You co-founded Mixamo, I think it was 2008, with Nazim Kareemi.

Stefano Corazza:

Yep.

Marc Petit:

It was acquired by Adobe in 2015, as you remember. And to this day, the Mixamo platform is still alive and kicking at Adobe.

First, maybe not everyone knows about Mixamo. Can you tell us about the idea behind it?

Stefano Corazza:

Mixamo was the first online service for character animation. Instead of having to learn complicated tools, you go to mixamo.com and upload your character; if it is rigged, we can apply animations from the more than 10,000 that we have there, or if it's not rigged, we actually use machine learning to rig your character, which historically was really the blocker.

A lot of 3D artists can model things really well. The vast majority of 3D artists can model, a smaller set can animate, an even smaller set can rig, and an even smaller set actually wants to rig. That's what we found.

We were trying to solve this very technical problem so that artists didn't have to. We implemented machine learning, and in full disclosure, that auto-rigging algorithm, which I think was finished in 2013, hasn't been touched since then as far as I know, and it's still performing pretty well; hundreds of thousands of people use it every month. We are very proud of that because, basically, it was just the simplest UI that solved the problem in a way that was super clean, super easy, and empowering for artists.

Marc Petit:

If you were to start an online character animation platform in 2023, what would you do differently? It looks like the technology has evolved so much.

Stefano Corazza:

Yeah, the technology is pretty amazing. I would say some of these AI-based monocular video technologies that can go from video to animation would probably be something we would offer on the platform. Also, we were always trying to go a little bit deeper into the character pipeline and not just give you the animations; we would want to give you a full package so you can plug an avatar system, with animation, into the game directly. I think there are companies that have been doing that.

A company that I have the privilege of advising and also I'm an investor in is Ready Player Me. They really took the avatar service to the next level providing actually a nice SDK, and now recently they also added animation. It's a way to go a little bit further in the vertical stack and give basically a package, then you can actually plug into your game, and you have a full avatar system, not just animations for a few characters that you rigged.

I think if I did it today, there would be a lot of focus on video-to-mocap that anybody can use, and also a deeper integration into the game engines, I would say.

Marc Petit:

Yeah. We had Timmu on the show, and that connects to my next question: how far away are we from having standard representations for characters? It seems we've been using the same character rigs for the past 15 years.

Do you think we could turn this into a documented standard?

Stefano Corazza:

I think there are de facto standards. HumanIK, which you know very well from your past at Autodesk, is also used in Mixamo. We have pumped out tens of millions of rigs into the market over the last decade, and I think that's probably the most prevalent rig. Then for the face, I remember when we sat down with Unity and Faceshift's CEO and said, okay, let's define a standard rig for the face, and we boiled it down to 42 or 44 blend shapes. Then they were acquired by Apple, and that basically became all the emojis that we see on the phone; it also became the AR standard.

Some of these are not official standards, but they are the most prevalent ways of doing things. Those 40 or 50 blend shapes are, from what I can tell, the most ubiquitous for the face, and HumanIK, and maybe also Biped to some extent, is probably the most prevalent for body rigs.

We're getting there. Maybe we should call it an official standard and make an explicit effort toward that, because the open questions are how you expand it and how you customize it. Everybody wants to customize the rig; everybody's going to have an extra tail on their rig. Figuring out the schema there, a standard for customizing those rigs, I think would be super valuable, because then we can have amazing interoperability not just between tools but potentially between metaverses. This is something that Ready Player Me has really pushed; they have thousands of companies now using the one rig. I think we could formalize it as a standard, and that would be great for the industry.

Marc Petit:

We discussed that a couple of weeks ago with Andre Gauthier from Unity and he has also presented at The Metaverse Standards Forum, so I think that that idea is moving along.

Just one last question for me. Can you load a Mixamo animation into Roblox Studio?

Stefano Corazza:

Of course. For me, the most heartwarming thing is the number of video tutorials and plugins that people have made to use Mixamo everywhere. It's just unbelievable. Sometimes I just Google things and find that some guy made a plugin for a tool I didn't even know existed.

Definitely, the community took it on. When you give something away for free (the Mixamo service used to cost a $1,500-a-year subscription, and since 2015 it's been free), the usage explodes, and of course, the community is enabling it to go anywhere.

Patrick Cozzi:

Stef, congrats on democratizing animation and building such a strong community and ecosystem.

We wanted to speak a little bit about Adobe. So Mixamo was acquired by Adobe, then you became the first leader of the 3D organization there, and you were in charge of AR. Looking back, I was curious what you're most proud of?

Stefano Corazza:

Those were seven really, really fun years. We developed the new-generation 3D tool that is now called Adobe Stager; Dimension and Stager are basically two SKUs of the same product. We added a lot of machine learning in there as well, including automatic UV creation, so people didn't even have to understand what a UV is. They could model, bring it in, and we would take care of that.

We also added machine learning automation to create IBLs. We were just trying to apply the same formula: take the most technical task with the least amount of creative input and completely automate it with no UI.

Who wants to spend hours unwrapping UVs? Nobody. So we tackled a lot of those tasks, and I think it was really fun.

Then it was obvious for Adobe... I've been friends with Seb and Alex from Allegorithmic for many years, and for Adobe, it was the most adjacent 3D business, if you want: Substance Painter is the Photoshop of 3D. It made so much sense, and I've been an actual user of those tools myself; I really like them. It made sense over time for Allegorithmic to join Adobe as well. I was on their board for two years and worked very closely with the team until we made the acquisition. That was, I think, a big multiplier: the 3D organization went from 30 people to 200 overnight, and then we also added AR.

Adobe didn't have a play in that space, and we partnered very closely with Apple. Adobe Aero recently announced a partnership with Google where you can geolocate experiences anywhere in the world and localize them. The team is doing a fantastic job.

And so these are all things that were built over the seven years that I'm very proud of, and I'm very proud of what the teams there have in terms of vision and where they're going to take it to the next level.

Marc Petit:

With the arrival of Guido Quaroni and the focus on USD, I think Adobe will play a very big role in the world of 3D.

Stefano Corazza:

Sure.

Marc Petit:

But recently, you joined Roblox as the head of Roblox Studio, the tool that creators use to make games and experiences. I was thinking about it, and Roblox, if I use a round number, has 250 million MAUs, monthly active users.

How many of them use the tool?

Stefano Corazza:

We report in terms of daily active users; I think we are at 67 million. I don't know if we publicly disclose the number of Studio users beyond the fact that it's in the millions, of course, as you can imagine. We are even more proud of how many developers are actually making money on it, which is a number that is constantly increasing.

I think it's probably the number we are most proud of. There are people making a living and really earning money. I think we have 2.7 million creators using Studio and earning real bucks.

Marc Petit:

What I was getting at is, with those numbers, it doesn't matter exactly which number it is, but you say it's in the millions; it's probably one of the most used 3D content creation tools in the world. So that's quite a responsibility you have.

Stefano Corazza:

It's a big responsibility. Also, one thing that people don't always realize is that Roblox Studio is, at any given time, running a single version, and we update it weekly. We don't have a version 5.7 that we keep supporting while we move on to the next one.

Everything needs to be backwards-compatible, and everything runs on a single version. That has major advantages: we can move faster because we support only one version. But there are also a lot of constraints, because we cannot break the workflows of creators who are making games; they're building a game, and tomorrow we update the version, so we can't break things. That is a unique challenge, really fun, and it basically makes everything more real-time than other tools in the industry, with new challenges and new advantages.

Marc Petit:

You've seen a lot of tools in your career; you integrated Mixamo in all the game engines, in all the content authoring tools. What's so specific about Roblox Studio?

Stefano Corazza:

The community that Roblox has, it's hard to imagine the size of it. We have tens of millions of free 3D assets on the marketplace that people are reusing, and people are publishing 90,000 experiences or games per day. The scale, I think, is on the same level as Spotify and how many songs go live there every day. That is pretty staggering, I would say.

What we have found is on the marketplace, people are very eager to make things and then just leave it there for other people to reuse. It's a community that likes to learn and to help each other. I think that's the unique value. When they are on Roblox and use Roblox Studio, they feel extremely opinionated about it and passionate about it. It's probably the closest relationship to the community that I've ever seen any tool have.

This week we published our roadmap for the year because the community was saying, we want to know what you're working on and what your priorities are, and we want to chime in on those. So we decided to go for full transparency, and we published on our dev forum the full roadmap for Studio and also beyond Studio.

It's probably the most symbiotic relationship with the community that I have ever seen for a tool. And that was new and also exciting, but at the same time, again, a lot of responsibility because they're also very vocal.

Patrick Cozzi:

We want to talk a little bit about generative AI. I know Roblox has already shipped a few features, and congrats on being so early to that, and thanks for your presentations at GTC.

I was curious: what techniques do you think will have the biggest impact in the short term, whether it's code generation or neural rendering?

Stefano Corazza:

At GTC, we launched Code Assist, which is basically the copilot for Studio, and we also launched the Material Generator, both using GenAI. This was just the start; we wanted to really understand how things were evolving and how this was perceived and received by the community. The feedback has been incredibly positive in terms of productivity, how much this increases people's productivity, and how much people really push the tool, especially the Material Generator, to create crazy stuff.

We have a clear path for those two lines of productivity. Right now, you can create only tileable materials. It would be nice if you could actually retexture things, and it would be even nicer if you could retexture your whole game based on a style; you could take 10 concept art boards, feed them into Stable Diffusion, and then say, generate only stuff that is compliant with that style.

That's one line of productivity that is pretty obvious: basically helping people texture using words and descriptors instead of pushing pixels. Then for code, we started with more experienced developers to make them more productive, but there are so many more people who don't know how to code. Actually, a lot of kids are learning to code on Roblox, and Lua is also pretty close to Python.

This serves them well going forward in their careers if they want to take on computer science, but we don't have the tools to really teach all those things: what functions to use if you want to add audio to a game object, and so on. We are now moving a little bit downstream, where we want to empower people who are less expert at scripting or coding and really bring them into the fold.

For those two productivity lines, we have a clear line of sight on where we want to be: basically, a conversational AI that helps you create all the interactivity very easily without understanding all the details of the code, and that helps you retexture. But if you really squint and look long term, I think the future is going to be large language models.

They will do everything. They will have an awareness of your scene graph; they will understand what you're trying to do, your creative intent, and what your development environment and toolkits can do. Then they will guide you through it like a companion, instead of the individual features we are seeing today, which are, by the way, super useful. But ultimately, I think we're going to have just one unified interface, run by one large language model, that will be able to do pretty much everything.

Marc Petit:

Do you think we can get to plausible intelligent NPCs?

Stefano Corazza:

NPCs are probably not a top priority for Roblox because everything is multiplayer, so there's always a bunch of people. But think about it: you take a book, you feed it into an AI, and that AI makes the game of that book. That would be unbelievable, right? Things along those lines, where you can just describe things in words and give a hint as to how it should happen. Imagine how much detail a novel uses to describe an environment, and imagine if you could just feed that in and then live in that world yourself. That would be pretty mind-boggling.

Patrick Cozzi:

Stef, I wanted to congratulate you on the release of glTF import in Roblox Studio. I love seeing the support for the open ecosystem. I was wondering if you could talk a bit about the philosophy and the importance of interoperability for Roblox Studio.

Stefano Corazza:

I don't deny that this was a project I've been very excited about and very, very involved in since I joined. In the end, glTF is maybe a second- or third-generation iteration of trying to build a standard, built on all the learnings of COLLADA, and I think the Khronos Group has done a fantastic job making a format this amazing for transmission and interop. We wanted to make sure that we had solid foundations in terms of ingestion of content, and I'm very excited that we have that in Studio.

One thing I also wanted to say is that there are a lot of advantages to the format in terms of performance, which for us is so important. There's always a discussion about USD versus glTF; people see them as potentially mutually exclusive. But actually, I think USD is becoming really great as an authoring format, while glTF shines for runtime, for delivery, and for interop when you're more downstream in the creation pipeline.

I think glTF is going to be pretty amazing, and some of the people at Roblox, and formerly Mixamo, helped the glTF 2.0 effort; I want to call out Emiliano, who I think is responsible for adding the blend shape support that the community really wanted back then. We're definitely very involved and very supportive of that.

Marc Petit:

You mentioned USD; there's a growing opinion that USD can be a great foundation to support layering and compositing, a little bit like HTML, and Nvidia has shown at SIGGRAPH how you could use USD to reference and compose a bunch of glTF files. Is that a vision that you share?

Stefano Corazza:

I was part of the Metaverse Standards Forum when I was at Adobe, and there was some discussion about how glTF, or some variation of it, could also be used as an experience descriptor, embed behaviors, and describe multiple scenes. I think it really comes down to whether the focus is on the powerful hardware used for content creation versus consumption, because the memory footprint and the compression level of USD and glTF are very different.

I see there is a little bit of overlap where they have similar capabilities. Both formats are growing, of course, so the features they have are increasing, and there is a little bit of increasing overlap between the two, if you want. But I think the split in focus, one on authoring and being all-encompassing, the other on being efficient for runtime and transmission, still stands.

I think they will both coexist because of that different focus.

Marc Petit:

Patrick and I are co-chairs, with Guido Quaroni, of the group at the Metaverse Standards Forum that looks after that. That's what we're trying to do: make sure that the two can evolve in a way that enables lossless conversion between them. That's really the work being done there.

Stefano Corazza:

There was an initiative; again, I didn't keep track, but I think also Nvidia...

Marc Petit:

You need to come back!

Stefano Corazza:

I would love to, by the way; life has just been so busy since joining Roblox. Given that you do not have a Roblox representative on the Forum, I'd be super happy to join. Actually, I wrote an email to Neil saying, "Hey, I'm interested."

But there was this test of whether you can distill glTF directly from USD, and I think it's a great idea.

Patrick Cozzi:

Stef, in some ways, this whole conversation has been about democratizing the ability to create 3D content and 3D experiences, and you and the team have done amazing work with Roblox Studio toward this mission. I didn't know if there were any anecdotes or examples you'd want to share.

Stefano Corazza:

I think what probably got the most press was Frontlines, the Call of Duty of Roblox, if you want, fully multiplayer. It attracted a lot of older players too. Our segment between 17 and 24 years old is the fastest growing of all, and that segment likes the more aged-up types of experiences.

This game is basically Call of Duty, fully multiplayer, with pretty amazing quality given the constraint that it also has to run on mobile Android, and so on. But the shocking part was that it was built by a team of six; you look at it and imagine hundreds of people at a big game company, and it was six people, because so much of the infrastructure, all the multiplayer, the matchmaking, all that stuff, is taken care of out of the box.

I've heard about other games with tens of millions of users that were built by two guys. Okay, now they hired a third one, so the team increased in size by 50%. Unbelievable.

I will give you my own personal anecdote. When I left Adobe, we had been discussing, hey, how do we make 3D modeling collaborative, and all that stuff. One weekend I just jumped into Roblox, and I wanted to try to build a multiplayer modeling tool, or call it a collaborative tool for modeling. It was very simple: you have a big cube, all voxels, and you click on the little voxels, and they disappear, so you can basically do volume carving. I built it, I could use it with six friends, and then I could push a button and run it on my phone.

On my phone, while my friend is on desktop, we are both modeling this thing together. This whole thing took six lines of code to build, and I was like, oh my god, there is so much infrastructure already there that it takes just six lines of code; and now I could probably do it with a single line of text that AI converts into code.
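[Editor's note: the kind of minimal experience described above can be sketched in a few lines of Roblox Luau. This is a hypothetical illustration, not the code Stefano actually wrote; it uses standard Roblox APIs (`Instance.new`, `ClickDetector`). Because a server Script's changes replicate to every connected client automatically, the multiplayer part comes for free.]

```lua
-- Server Script: a shared voxel-carving toy.
-- Builds a cube of voxel Parts; clicking a voxel destroys it,
-- and Roblox replicates the change to all players in the session.
local SIZE = 8
for x = 1, SIZE do
	for y = 1, SIZE do
		for z = 1, SIZE do
			local voxel = Instance.new("Part")
			voxel.Size = Vector3.new(1, 1, 1)
			voxel.Position = Vector3.new(x, y + 10, z)
			voxel.Anchored = true
			local click = Instance.new("ClickDetector")
			click.MouseClick:Connect(function()
				voxel:Destroy() -- removal replicates to every client
			end)
			click.Parent = voxel
			voxel.Parent = workspace
		end
	end
end
```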

But that gives you a sense: we are used to massive teams spending five years on a game, and now you can do so much more. The speed of iteration is going to get so fast as GenAI becomes more pervasive, and we're going to see amazing games made by a handful of people over a few months. I'm very much looking forward to that.

Marc Petit:

We talked about GenAI and open standards; what's the next big thing for Roblox Studio in your mind?

Stefano Corazza:

For the rest of the year, the big thing is the least sexy thing: we are focusing on making Studio very robust and taking care of some of the debt, and some of the features the community has asked about for a long time. That's a big focus. But we do have more GenAI that we are hoping to ship this year. Actually, now that the roadmap is public, I can openly talk about it.

We hope to really get a couple of next-level GenAI features in there. We are also excited about deepening the connection with some of the other DCCs and tools that people are using. Blender is one of the most used tools in our community, also because it's free like Studio, so there's no barrier to access, and we want to make sure we enable that workflow. glTF was part of that, but we are going to do something even more integrated in the coming months.

Marc Petit:

I know you well; you're more than a nerd. You're a deep technologist, but you're also an artist. At Adobe, you were running some sort of gallery, right? If I remember well.

Stefano Corazza:

Yeah, Festival of the Impossible, I started while at Adobe.

Marc Petit:

What have you been doing lately as an artist?

Stefano Corazza:

I really enjoy working with musicians. At Mixamo, we had the privilege of motion-capturing Bjork in our studio. She was dancing and singing unreleased songs, and for some of us, that was a mystical experience, so that was very exciting. Right now, I try to take on only one side art project at a time and be diligent about it, so I've paused music production a little, and I'm helping a friend of mine who's an amazing musician; Radical Face is the name of the band. He is working on a production that also involves animation. I don't think I can talk too much about it, but I'm spending time in Blender doing 3D VFX and pipeline work and compositing that with 2D content, and it's been super, super fun.

Marc Petit:

So what can Roblox bring to musicians?

Stefano Corazza:

This year, actually in December, we did a hackathon, and the focus of the project I did with five other fellows was to bring live concerts to the platform. We saw the Taylor Swift thing that happened with the tickets, where millions of people wanted to attend, and it was a moment of social panic because people couldn't. Imagine how many people would have gone to that concert: probably tens of millions, but the venue had maybe 50,000 or a hundred thousand. Imagine if you could create a digital twin of that experience in real time and have the concert streamed on Roblox, where anybody can attend. Epic has shown, and we have shown, that we can do one-off concerts, but what if that capability were released to any musician on the planet?

They could go to a venue, put on a suit and gloves, and play guitar or piano in real time. The animation, the performance, and the music all stream to an experience that everybody can join for maybe 1 cent instead of $20. That's something I'm super passionate about.

We did it as a hackathon; we proved it's possible and working, and now we are working to actually bring it to market. It's going to be a big revolution for the music industry because, ultimately, touring is one of the primary ways artists both get known and earn income. The fact that they could perform for anyone on the planet, I think, is going to really help the music industry and help artists and musicians make a living from it.

Patrick Cozzi:

We covered so much good stuff today, from animation to generative AI to Roblox Studio, it’s amazing what's being done to democratize access to 3D.

We'd love to wrap up if you'd like to give a shout-out to any person or organization or multiple ones.

Stefano Corazza:

I think the most intense years have probably been at Mixamo. Every startup goes through near-death experiences, and we had a few. My gratitude goes to Emiliano; we've been working together for 15 years now, and he's with me at Roblox Studio, so I'm working closely with him again. We were at Stanford together, where he was an intern at the time. He has been an amazing person to work with, and he built most of the technology at Mixamo, all the machine learning stuff. It's been amazing to have him as a partner.

Sylvio Drouin, Head of Innovation at Unity, was also at Mixamo for a period of time and definitely helped us a lot. Charles, one of our most amazing engineers, was among the people who turned down offers from Google to join Mixamo and build it up, and Jeremy was my director of engineering.

These are people who really believed in the vision, and some who believed in it were not engineers; they were investors. Alessandro Viro, my lead investor, really believed in the company throughout. I have a lot of gratitude for these people; they believed in the idea and in the journey, and I would love to take a moment to thank them.

Marc Petit:

Stefano, I've known you for a long time, and I know what kind of innovator and entrepreneur you are. I could not think of a better place than Roblox for you to demonstrate your skills. Thank you very much for being with us today, and thank you for everything you've contributed, especially around character animation.

Stefano Corazza:

Thank you so much for having me. It's been really fun.

Marc Petit:

And thank you, Patrick, and thank you, everybody who's listening.

As you know, you can find us on LinkedIn, on Twitter, and you can reach us directly at feedback@buildingtheopenmetaverse.org.

Thank you very much for being there, and we'll be back for another episode.