Building the Open Metaverse

WebGPU and Graphics on the Web

Senior Software Engineers at Google, Brandon Jones and Kai Ninomiya, join Patrick Cozzi (Cesium) to discuss the origin of WebGPU and the state of its ecosystem.

Guests

Brandon Jones
Senior Software Engineer, Google Chrome

Kai Ninomiya
Senior Software Engineer, Google Chrome


Announcer:

Today, on Building the Open Metaverse...

Kai Ninomiya:

When we were designing this API early on, I would say one of our major lofty ambitions for the API was that it was going to be the teachable API, the teachable, modern API, right? Something more teachable than at least Direct3D 12 and Vulkan. In the end, we have ended up with something that is fairly close to Metal in a lot of ways, just because the developer experience ends up being very similar. The developer experience that we were targeting ended up very similar to what Apple was targeting with Metal, and so we ended up at a very similar level. There are still a lot of differences, but we think that WebGPU really is the best first intro to these modern API shapes.

Announcer:

Welcome to Building the Open Metaverse, where technology experts discuss how the community is building the open metaverse together, hosted by Patrick Cozzi from Cesium and Marc Petit from Epic Games.

Patrick Cozzi:

Welcome to our show, Building the Open Metaverse, the podcast where technologists share their insights on how the community is building the metaverse together. I'm Patrick Cozzi from Cesium. My co-host, Marc Petit from Epic Games, is out this week, but he is here in spirit. Today, we're going to talk about the future of 3D on the web, specifically WebGPU. We have two fantastic guests today. We're here with Brandon Jones and Kai Ninomiya from the Google Chrome GPU team. They're both WebGPU specification co-editors. We like to start off the podcast with each of your journeys to the metaverse. Brandon, you've done so much with WebGL, glTF, WebXR, WebGPU. Would love to hear your intro.

Brandon Jones:

Yeah, so I've been working with just graphics in general as a hobby since I was really little, and then that evolved into graphics on the web when WebGL started to become a thing. Well before I started at Google or even moved to the Bay Area or anything like that, I was playing around with WebGL as a fledgling technology, doing things like rendering Quake maps in it. Just really, really early on, kind of pushing and seeing, "Well, how far can we take this thing?" And that led to me being hired as part of the WebGL team, and so I was able to actually help shape the future of graphics on the web a little bit more, which has been absolutely fantastic. It's been a really interesting way to spend my career.

Brandon Jones:

As you mentioned, I've also dabbled in other specs. WebXR, I kind of brought up from infancy and helped ship that, and am now working on WebGPU. I have dabbled a little bit in the creation of glTF, but honestly, the hard work there was mostly done by other people. I had a couple of brainstorming sessions at the very, very beginning of that, where I kind of said, "Hey, it would be cool if a format for the web did this," and then talented people took those conversations and ran with it and made it far more interesting than I ever would've.

Patrick Cozzi:

Cool. And I think the work that you did for Quake on WebGL, bringing in the Quake levels, that was big time. I think that was super inspiring for the WebGL community. And I still remember, it might've been SIGGRAPH 2011, when you and Fabrice showed a glTF demo on the web. That was before I was involved in glTF, and I was like, "Wow, they have the right idea. I gotta get in on this."

Brandon Jones:

Yeah. It was fun to work with Fabrice on brainstorming those initial ideas of what that could be, and really, it just came down to, "Okay, if you were going to build a format for the web using the restrictions that existed on the web at the time, what would be the best way to go?" That's where a lot of the basic structure came from... Let's use JSON for the markup that describes the shape of the file, and then bring down all the data as just big chunks of typed arrays, and stuff like that. That's where those things came from, and then a lot of the rest of it, things like the PBR materials that you see in glTF 2 these days, came from the Khronos standards body taking that and iterating on it and finding out what developers needed and pushing it to be the standard that we all know and love today.
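To make that concrete, here is a minimal sketch of the structure Brandon describes: a JSON document that declares the shape of the scene, while the heavy vertex data lives in separate binary buffers read as typed arrays. The field values are illustrative, not from the episode:

```typescript
// A minimal glTF 2.0-style document, written as a TypeScript object literal.
// The JSON carries structure; the binary buffer carries the actual vertex data.
const gltf = {
  asset: { version: "2.0" },
  scenes: [{ nodes: [0] }],   // scene graph roots
  nodes: [{ mesh: 0 }],       // a single node referencing a mesh
  meshes: [{ primitives: [{ attributes: { POSITION: 0 } }] }],
  // Accessors and bufferViews describe how to slice typed arrays
  // (here, 3 x vec3<float> positions) out of the binary payload.
  accessors: [
    { bufferView: 0, componentType: 5126 /* FLOAT */, count: 3, type: "VEC3" },
  ],
  bufferViews: [{ buffer: 0, byteOffset: 0, byteLength: 36 }],
  buffers: [{ uri: "triangle.bin", byteLength: 36 }], // hypothetical file
};
```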

Patrick Cozzi:

Yep. For sure. And Kai, I know you're a big advocate for open source, open standards, and super passionate about graphics. Tell us about your journey.

Kai Ninomiya:

Yeah, sure. So, yeah, first, I'm Kai Ninomiya. My pronouns are he/him or they/them. I started with graphics in high school, I guess. I had some friends in high school who wanted to make games, and we started just playing around with stuff. We were using like OpenGL 1.1 or whatever, because it was the only thing we could figure out how to use. And we did a little dabbling around with that and 3D modeling programs and things like that. And then, when I started college, I was intending to major in physics, because that had been my academic focus, but over time, it sort of morphed into like, "Yeah, I'm going to do computer science on the side. Actually, I'm going to do computer science and physics on the side." And I did a focus in 3D graphics at the University of Pennsylvania.

Kai Ninomiya:

And while I was there, in my later years of the program, I took CIS 565 with Patrick, back when you were teaching it, and I first sat in on the course one semester, because I was interested in it. And then, I took the course, and then the third semester, I TA'd the course. So, I was in that course three times, essentially. I'm responsible for probably the most devastatingly difficult assignments in that course, because I was not very good at figuring out how to create assignments at the time, so I think we toned things down after that.

Kai Ninomiya:

But yeah, so I worked with Patrick for a long time, and then sometime during that time, I also interned with Cesium. I worked on a number of graphics optimizations, like bounding box culling and things like that, in Cesium, over the course of a summer and a little bit of extra work after that, as I was finishing up my program in computer science.

Kai Ninomiya:

And then, after that, I got an offer from Google. I didn't have a team match, and Patrick just decided, "You know what? I'm going to send an email to the lead of WebGL at Google and say, like, 'Hey, do you have any openings?'" And it just so happened that not long before that, Brandon had switched full time to WebXR, and so they did have an unlisted opening on the team. And so, I ended up on the WebGL team and I worked for the first couple of years on and off, basically, between WebGL and WebGPU. WebGPU as an effort started in 2016, right around the time that I joined the team, and I was working on it occasionally for like a couple days here and there on our early prototypes and early discussions for a long time before I eventually fully switched over to WebGPU and then later became specification editor as we started formalizing roles and things like that.

Kai Ninomiya:

So, yeah, I've been working on WebGPU since the beginning. It has been quite a ride. It's taken us much longer than we thought it would, and it's still taking us longer than we think it will, because it's just a huge project. There's so much that goes into developing a standard like this that's going to last, that's going to be on the web for at least a decade or more, something that's going to have staying power and is going to be a good foundation for the future. Yeah, it's been a ton of work, but it's been a pretty amazing journey.

Brandon Jones:

"It's taking much longer than I think it will," I think, is the unofficial motto for web standards, and, I suspect, standards as a whole.

Patrick Cozzi:

Kai, awesome story. I think you still hold the record for being in CIS 565 in three different capacities, three different times. Love the story on how you got involved in WebGL and WebGPU. I think that's inspiring to everyone who's interested in doing that sort of thing. Before we dive into WebGPU, I wanted to step back, though, and talk about the web as an important platform for 3D and why we think that... maybe why we thought that in 2011, when WebGL came out, and why maybe we believe that even more so today with WebGPU. Brandon, you want to go first?

Brandon Jones:

Yeah, it's been really interesting for me to follow this renaissance of 3D on the web from the beginning, because it started out in this place where there was a bunch of back and forth about, "Well, we want rich graphics on the web. We don't necessarily want it all to be happening in the context of something like Flash. How should we go about that?" It wasn't a foregone conclusion that it would look like WebGL at the beginning. There was O3D. There was WebGL. There was... some back and forth around which proposal would get carried forward. Eventually, WebGL was landed on, because OpenGL was still one of the prominent standards at the time, and it was something that a lot of people knew. A lot of resources were available to explain to people how it worked, and it would provide a good porting surface going forward.

Brandon Jones:

And so, moving forward from there, I think that there was a lot of expectation at the time that, "Oh, we will do this, and it will bring games to the web. We're going to add a 3D API, and people will make lots of games for the web." And the interesting thing to me is that that's not exactly what happened. There are certainly games on the web. You can go and find web-based games, and some of them are really great and spectacular, but the wider impact of graphics on the web, I think, came from unexpected places where there was suddenly an opening for, "Hey, I want to do something that's graphically intensive, that requires more processing than your average Canvas 2D or Flash could do." But it doesn't make sense to ship an EXE to the end user's machine. I'd want to do it in an untrusted... or, well, a trusted environment, so to speak. I don't want to have to ask for the user's trust that my executable isn't malicious. Or maybe it's just a really short thing, and it doesn't make sense to download a lot of assets for it, so on and so forth.

Brandon Jones:

Those were the uses that really latched on to graphics on the web in the most significant way, and it created not this rush of games like we thought it would, but a whole new category of graphical content that just really didn't make sense to exist before, and it's just grown from there. And I thought that was spectacular to watch that transformation, where we all went, "Oh, we didn't intend for that to happen, but we are so glad that it did."

Patrick Cozzi:

I agree. So many use cases outside of games exploded, I mean, including the work that we've done in geospatial, and I've seen scientific visualization, and so on. Kai, anything you want to add on this topic?

Kai Ninomiya:

Yeah, I can say a bit. I mean, I wasn't around, I wasn't working on this at the time, but I certainly have some history on it. Brandon is absolutely right. A lot of the things that we've seen WebGL used for, the things that have been the most impactful, have been things that would've been difficult to predict, because the whole ecosystem of how 3D was used in applications generally evolved simultaneously. And so, we've seen all sorts of uses. Obviously, there's Cesium and there's Google Maps and things like that. There's tons of geospatial. There's tons of very useful uses for 3D and acceleration in geospatial.

Kai Ninomiya:

Generally, though, WebGL is a graphics acceleration API, right? And people have used it for all sorts of things, not just 3D, but also for accelerating 2D, for sprite engines and game engines, image viewing apps, things like that. The impact definitely was in making the technology available to people... rather than building out a technology for some particular purpose. And having a general-purpose acceleration API with WebGL, and now with WebGPU, provides a really strong foundation to build all sorts of things, and it's the right abstraction layer. It matches what's provided on native. People on native want to access acceleration APIs. They want to use the GPU. They might want to use it for machine learning. They might want to use it for any sort of data processing, right? And just having that access at some low level lets you do whatever you want with it.
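As a concrete illustration of that general-purpose use, here is a minimal sketch of driving the GPU for plain data processing rather than 3D, using the WebGPU API as it currently stands. The shader name and buffer contents are illustrative, and error handling is omitted:

```typescript
// Double an array of floats on the GPU with a WebGPU compute shader.
// Assumes a browser (or runtime) with WebGPU enabled.
const adapter = await navigator.gpu.requestAdapter();
const device = await adapter!.requestDevice();

const module = device.createShaderModule({
  code: /* wgsl */ `
    @group(0) @binding(0) var<storage, read_write> data: array<f32>;

    @compute @workgroup_size(64)
    fn doubleIt(@builtin(global_invocation_id) id: vec3<u32>) {
      if (id.x < arrayLength(&data)) {
        data[id.x] = data[id.x] * 2.0;
      }
    }
  `,
});

const pipeline = device.createComputePipeline({
  layout: 'auto',
  compute: { module, entryPoint: 'doubleIt' },
});

// Upload some input data into a storage buffer.
const input = new Float32Array([1, 2, 3, 4]);
const buffer = device.createBuffer({
  size: input.byteLength,
  usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_SRC,
  mappedAtCreation: true,
});
new Float32Array(buffer.getMappedRange()).set(input);
buffer.unmap();

const bindGroup = device.createBindGroup({
  layout: pipeline.getBindGroupLayout(0),
  entries: [{ binding: 0, resource: { buffer } }],
});

// Record and submit the compute work.
const encoder = device.createCommandEncoder();
const pass = encoder.beginComputePass();
pass.setPipeline(pipeline);
pass.setBindGroup(0, bindGroup);
pass.dispatchWorkgroups(Math.ceil(input.length / 64));
pass.end();
device.queue.submit([encoder.finish()]);
// Reading results back would take one more copy into a MAP_READ staging buffer.
```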

Kai Ninomiya:

The web definitely evolved a lot over that time, with Web 2.0 sort of evolving more and more toward bigger applications, more than just a network of documents or a network of even web applications of that era, to full applications running in the browser, viewing documents, viewing 3D models, things like that. It was very natural for WebGL to be a technology that underpinned all of that and allowed a lot of the things that people were able to do with the web platform as a whole after that point, or as Web 2.0 evolved into what we have today.

Patrick Cozzi:

Yeah, and I think the start of WebGL just had fantastic timing, where GPUs had just become widely adopted and JavaScript was getting pretty fast. And now, here we are a little more than a decade later, and you all are bringing WebGPU to life. I would love to hear a little bit about the origin story of WebGPU. Kai, do you want to go first?

Kai Ninomiya:

Yeah, sure. Back in 2016, I think shortly before I joined the team, it was becoming very clear that there were going to be new native APIs that were breaking from the older style of Direct3D 11 and OpenGL, and it was becoming very clear that we were going to need to follow that trend in order to get at the power of those APIs on native. Right? So, we could implement WebGL on top of them, but we were still going to be fundamentally limited by the design of OpenGL, which I'll mention is over 30 years old, and at that time, was almost 30 years old. It was designed for a completely different era of hardware design. It was designed with a graphics co-processor that you could send messages to. It was almost like a network. It's a very different world from what we have today, although not as different as you might expect.

Kai Ninomiya:

Native platforms moved on to new API designs, and unfortunately, they fragmented across the platforms, but we ended up with Metal, Direct3D 12, and Vulkan. At that time in 2016, it was becoming very apparent that this was going to happen. I think Metal came out in 2014, and D3D 12 came out in 2015, and Vulkan had just come out, so we knew what the ecosystem was looking like on native and that we needed to follow that. But because it was very fragmented, there was no easy way forward, no relatively straightforward way of taking the APIs and bringing them to the web like there was with OpenGL. OpenGL was omnipresent. It was on every device already, in the form of either OpenGL or OpenGL ES, which are almost the same thing. That's no longer true with the new APIs, and so we had to start designing something.

Kai Ninomiya:

And so, our lead, Corentin Wallez, was on the ANGLE team at the time, working on the OpenGL ES implementation on top of Direct3D and OpenGL and other APIs. He principally started working on a design for a new API that would abstract over these three native APIs. And it is a big design challenge, right? We only have access to the APIs that are published by the operating system vendors. We only have Direct3D 12, Vulkan, and Metal. We don't have access to anything lower-level, so our design is very constrained by exactly what they decided to do in their designs.

Kai Ninomiya:

And so, this created a really big design problem of exposing a big API. There's a big surface area in WebGPU. It's a big surface area in graphics APIs, and figuring out what we could do on top of what was available to us and what we could make portable so that people could write applications against one API on the web, and have it target all these new graphics APIs, and get out the performance that's available both through that programming style and through the APIs themselves and the implementations themselves on the different platforms.

Kai Ninomiya:

And since then, we've basically been working toward that goal. We've spent more than five years now doing exactly that. Tons of investigations into what we can do on the different platforms. How can we abstract over them? What concepts do we have to cut out because they're not available on some platforms? What concepts do we have to emulate or polyfill over others? What concepts do we include just for when they're useful on some platforms and not on others? And also, how do we glue all these things together in such a way that we don't end up with an unusably complicated API?

Kai Ninomiya:

If we had started with all of the APIs and tried to take everything from everyone, we would've ended up with something impossibly complex and difficult to implement. So, yeah, it was in large part, I think, due to Corentin's amazing understanding of the ecosystem and how to build something like this, but it's been a group effort. There's been a huge effort across many companies and across many people to figure out what it really was going to look like, and we're almost there.

Patrick Cozzi:

Well, look, we really appreciate the effort here. I think you brought up a great point, too: WebGL, and the OpenGL before it, is 30 years old, and the abstraction layer needs to match what today's hardware and GPUs look like. A very much welcome update here. Brandon, anything you want to add to the origin story?

Brandon Jones:

Boy, not much. Kai did a really comprehensive job of covering how we got here. I will add that one of the motivators was that Khronos made it very clear that they weren't going to be pushing OpenGL forward any further. They've made some minor changes to it, but really, the focus from that group was going to be on Vulkan moving forward. Since then, Apple has deprecated OpenGL and put all their focus on Metal, and of course, Microsoft is really pushing Direct3D 12, so we just didn't want to be in a position where we were trying to push forward an API shape that wasn't seeing the same kind of maintenance on the native side that we had thus far been mimicking pretty well.

Brandon Jones:

Yeah. I will say, in service of what Kai was saying about trying to design an API that encapsulates all of these underlying native APIs without sticking to any one of them in a strict fashion or trying to expose every feature, I was aware of what was going on with WebGPU. I'd had some conversations with Corentin and other developers on the team as it evolved, but I was spending most of my time on WebXR, and so it was only once that got shipped and was feeling like it was in a fairly stable place that I came back around and started being interested in working on WebGPU again.

Brandon Jones:

And before I actually joined the team and went into it, I just picked up the API at some point. I think I literally just swung my chair around one day and said to Kai, "Hey, this WebGPU thing, how stable is it? If I write something in it right now, am I going to regret that?" It was a while back, there's been a lot of changes, but the general sentiment was, "No, it's in a good state to try things. It's in Canary right now. Go for it." And so, I just started poking at it more or less to get a sense of what the API would look like and how it would map to these modern sensibilities. I had tried Vulkan several times before that, knowing that that was kind of the direction that all of the native APIs were going, and I found it very difficult to really get into, because you spend so much of your time up front managing memory and going through and trying to reason about, "Well, these features are available on these devices, and I have to do things this way to be optimal here."

Brandon Jones:

There's a lot of necessary detail there for the people who really want to get the most out of the GPUs, but for me, who really, truly is primarily interested in just, "I want to disseminate something to as many people as possible. It doesn't have to be the most performant thing in the world. I just want it to be widespread," it felt like so much work. And so, I dived into WebGPU, a little apprehensive, and I walked away from it going, "That was so much better than I was worried about," because the API felt like something that was native to the web.

Brandon Jones:

It felt like something that was built to exist in the world that I liked to play in, and it encapsulated some of these concepts of how you interact with the GPU in a way that felt so much more natural to me than those 30-year-old abstractions that we've been muddling through with WebGL. Simply the ability to go, "Oh, hey, I don't have to worry about this state over here breaking this thing that I did over here" was fantastic. And so, those initial experiments really got me excited about where that API was going and very directly led me to going, "Okay, no, I really want to be part of this team now and push this API over the finish line."
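Brandon's point about state "over here" not breaking a draw "over there" is the pipeline model: in WebGPU, everything needed for a draw is baked into an immutable pipeline object at creation time, rather than living in a global state machine the way it does in WebGL. Here is a minimal sketch of that, assuming a device has already been acquired as in the earlier snippet:

```typescript
// All of the state for this draw (shaders, vertex layout, blend state,
// output format) is captured in one immutable object at creation time;
// nothing set elsewhere can silently change it.
const module = device.createShaderModule({
  code: /* wgsl */ `
    @vertex
    fn vs(@builtin(vertex_index) i: u32) -> @builtin(position) vec4<f32> {
      var pos = array<vec2<f32>, 3>(
        vec2<f32>(0.0, 0.5), vec2<f32>(-0.5, -0.5), vec2<f32>(0.5, -0.5));
      return vec4<f32>(pos[i], 0.0, 1.0);
    }

    @fragment
    fn fs() -> @location(0) vec4<f32> {
      return vec4<f32>(1.0, 0.5, 0.0, 1.0);
    }
  `,
});

const pipeline = device.createRenderPipeline({
  layout: 'auto',
  vertex: { module, entryPoint: 'vs' },
  fragment: {
    module,
    entryPoint: 'fs',
    targets: [{ format: navigator.gpu.getPreferredCanvasFormat() }],
  },
});
```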

Patrick Cozzi:

Brandon, the developer in me is getting really excited to use WebGPU. Tell us about the state of the ecosystem, the state of implementations. If I'm a student, or I'm maybe on the cutting edge of one of the engines, should I be using WebGPU today? Or maybe if I'm working at a Fortune 500 company, and I have a production system, can I jump into WebGPU?

Brandon Jones:

I'll take a crack at that so that Kai can have a break. He's been talking for a while. The state of things right now is that if you build something... If you pull up, say, Chrome and build something using Chrome's WebGPU implementation behind a flag, you are almost certainly going to have to make some minor adjustments once we get to the final shipping product, but they will be minor. We're not going to break the entire API surface at this point. There will be minor tweaks to the shader language. You might have... like, we recently replaced square brackets with at-symbols. You might have to do a couple of minor things like that, but largely, you will be able to build something that works today and that you can get working with the final shipping product with, eh, maybe half an hour of tweaks. The delta should not be huge.
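For a sense of scale, here is the kind of shader-syntax tweak Brandon is describing, sketched from the WGSL drafts of the period; the entry point is illustrative:

```typescript
// Old draft WGSL used double square brackets for attributes:
//
//   [[stage(fragment)]]
//   fn main() -> [[location(0)]] vec4<f32> { ... }
//
// After the change, those became @-attributes (initially @stage(fragment),
// later shortened to @fragment in the final language):
const fragmentShader = /* wgsl */ `
  @fragment
  fn main() -> @location(0) vec4<f32> {
    return vec4<f32>(1.0, 0.0, 0.0, 1.0);
  }
`;
```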

Brandon Jones:

Now, whether or not you want to dive into that right now is a good question. If you are the Fortune 500 company who is looking to launch something a month from now, no, this isn't for you yet. We will get there, but we're not on that tight of a timeline. It's probably worthwhile experimenting with it if you'd like. If you're looking at something and saying, "Hey, I'm going to start a project now, and I expect to ship it in a year," yeah, that's actually a really good point to start playing with this, because we are probably going to be shipping right around... Well, I hope we're not shipping in a year, but we will have shipped probably by the time you're looking at releasing whatever you're doing. And at that point, you can also claim the title of being one of the first WebGPU whatevers that you're working on.

Brandon Jones:

Taking a step back from that, if you are the type who's like, "I'm not really sure what I'm doing with 3D on the web. I just want to put fancy graphics on my screen," you probably don't want to turn to WebGPU first. You probably want to look at Three.js, Babylon, any of the other libraries. I mean, there's a lot of purpose-made things. If you want to do something with maps, for example, you probably don't want to turn to Three.js. You want to look at something like Cesium. And so, spend some time looking at some of the higher-level libraries that are out there that will help you along that journey, because in a lot of cases, those will provide some of the wrappers that help abstract between WebGL and WebGPU for you.

Brandon Jones:

And so, it might take a little bit longer to catch up, but you will most likely eventually reap the benefits of having that faster backend without too much work on your part. Babylon.js is a really good example of this. They are actively working on a WebGPU backend that, from what I hear from them, is effectively no code changes for the developer who's building content. Those are the kind of things that you want to look at.

Brandon Jones:

The last category that I would say is, if you are a developer who is interested in learning more about how graphics work... Let's take the web out of the equation here. You just want to know, like, "I have a GPU. I know it can put triangles on my screen. I want to know more about that." WebGPU is probably a really cool place to start, because if you dive straight into WebGL, you are going to be working against a very old API, a very old shape of API, that doesn't necessarily match the realities of what GPUs do today. If you want to do something that's a little bit closer, you're immediately jumping into the Vulkans or D3D 12s of the world, which are quite a bit more complicated and really designed to cater to the needs of the Unreals and Unitys of the world. Metal's a little bit better, but of course, that depends on you having an Apple device available.

Brandon Jones:

WebGPU is going to sit in this fairly nice midpoint where you are not doing the most complicated thing you could do. You are using a fairly modern API shape, and you are going to be learning some of those concepts that teach you how to communicate with the GPU in a more modern way. And so, it could be a really, really fun place to start as a developer who is not necessarily worried about shipping a thing, but really wants to know how GPUs work. I would love to see more people using this as a starting point for learning, in addition to actually taking advantage of the more complicated GPU capabilities.

Patrick Cozzi:

Right. I think this is sound advice across the board, and certainly on the education perspective, I think WebGPU will be fantastic. Kai, anything you want to add on the ecosystem?

Kai Ninomiya:

Yeah. Just in response to what Brandon was just saying, when we were designing this API early on, I would say one of our major lofty ambitions for the API was that it was going to be the teachable API, the teachable modern API, right? Something more teachable than at least Direct3D 12 and Vulkan. In the end, we have ended up with something that is fairly close to Metal in a lot of ways, just because the developer experience ends up being very similar. The developer experience that we were targeting ended up very similar to what Apple was targeting with Metal, and so we ended up at a very similar level. There are still a lot of differences, but we think that WebGPU really is the best first intro to these modern API shapes. And it is pretty natural to go from WebGPU toward those other APIs. Not everything is the same, but having an understanding of WebGPU gives you a really, really strong basis for learning any of these native APIs, and so in that sense, it's really useful. Yeah, I don't know of other particular things to touch on, but...

Patrick Cozzi:

And Kai, I believe the course you mentioned at the beginning, CIS 565, is moving to WebGPU, too.

Kai Ninomiya:

Yeah, that will be very exciting.

Patrick Cozzi:

Great. Moving the conversation along, one thing that comes up on almost every podcast episode is 3D formats, right? When we think of the open metaverse, we think of interoperable 3D, and USD and glTF keep coming up, and we love them both, right? USD coming from the movie and entertainment world, and glTF, as Brandon mentioned, coming from the web world. So, when you look at the web today and as we move forward into the future, do you think it is primarily going to be glTF, or will formats like USD, or other formats, also be web deployable? Brandon, you want to go first?

Brandon Jones:

Yeah, I will admit right off that I have a bias in this conversation. As I mentioned before, I've kind of been tagging along for the glTF ride, and so I have a certain fondness for it. Getting that out of the way. Yeah, I think you hit on something that's really important, in that glTF was designed for consumability by the web. It works very well in a lot of other cases, but that's really what it was designed for first and foremost. USD was designed by Pixar to manage massive assets across huge datasets with gigantic scenes and being able to share that between hundreds of artists, and it's a technical feat. It's an amazing format. The reason that it's entered the conversation in terms of a web format is because Apple picked that up and took a limited subset of it, an undocumented limited subset of it, and said, "Oh, we're going to use this as one of the native formats on our devices."

Brandon Jones:

Now, there is no reason that that shouldn't be able to work. They've obviously shown that they can use it as a good real-time format for a lot of their AR tools, and I think with appropriate documentation and standardization of exactly what that subset is that they're working with, we can get to a point where it's a perfectly viable, workable thing for a standards-based environment like the web. I think it's got a little ways to go, though. glTF is kind of ready to go right out of the gate, because it's been designed for that. It already is a standard. It's very well-defined what it can contain, and so my prediction here is that we will see glTF continue to be picked up as a web-facing format, more so than USD, at least initially. And... I lost track of the other point that I had to make, but that's effectively where we're at right now.

Brandon Jones:

Now, there are some possible exceptions to that. I do remember what I was going to say. There are conversations going on right now in the Immersive Web Working Group around the possibility of having a model tag, the same as we have image tags or video tags. It's something that Apple proposed: a model tag where you could just point it at one of these 3D assets and have it render in your page with very little work on the developer's part. It would be pretty much entirely declarative.

Brandon Jones:

And in an environment like that, if you have an OS that's already primed to show something like a USD file like Apple's is, it makes a lot of sense to just surface that through the web renderer, and that's certainly what they would like to do. It would be much more difficult for other platforms to support that, so we'll have to see where those conversations go, but that is a way that those could show up more prominently on the web on an earlier timeframe. But even then, I would say that the majority of the work needs to just go into actually standardizing what that subset, the USDZ subset that is intended to be used in real-time, actually consists of.

Patrick Cozzi:

All really good points. Yeah. Thank you, Brandon. Kai, anything you want to add on this?

Kai Ninomiya:

Yeah, I mean, I agree with all of that, again, with the caveat that I did a very, very small amount of work on glTF and am often surrounded by folks working on glTF. To relate it to WebGPU, I would say that one of the real benefits of both WebGL and WebGPU is that like I was mentioning earlier, they are hardware abstraction APIs first and foremost, and that means that you can do whatever you want on them, right? In principle, it doesn't really matter what format you're using. You could use your own proprietary format, which is very common in a lot of cases. For example, you've got CAD programs that have their own formats that are specialized for different use cases. You've got 3D Tiles for geospatial. You can build whatever you want on top of WebGPU and WebGL, because they're hardware abstraction APIs. They're hardware abstraction layers.

Kai Ninomiya:

And so, while glTF works great, and from a standards perspective, it seems like it's very mature, relatively more mature, and is a great format for shipping assets to the end user, in principle, you can do whatever you want. You can build whatever you want on top of WebGPU, and you'll be able to take any format, one that could even be specialized for your use case, for your application, and make that work great with your own code, because you control the entire stack from the format ingestion all the way to what you send to the hardware, essentially.

Patrick Cozzi:

Gotcha. I have many more questions about WebGPU, but I think we should start wrapping things up. And the way we like to do that is just to ask each of you if there's any topics that we didn't cover that you'd like to. Kai, you want to start?

Kai Ninomiya:

Yeah, I don't have much. There was one interesting topic that we didn't get to, which was building things for WebGPU as sort of a cross-platform API, right? WebGPU is a web-first abstraction over multiple graphics APIs, but there's nothing really web about it, right? It's a graphics API first and foremost. And so, we've collaborated with Mozilla on creating a C header, C being the lingua franca of native languages, which exposes WebGPU, the same API. And this is still... It's not fully stable yet, but it's implemented by our implementation and by Mozilla's implementation, and it's also implemented by Emscripten, which means you can build an application against one of these native implementations and get your engine working.

Kai Ninomiya:

If you're a C++ developer or a Rust developer, for example, you can get your stuff working against the native engine. You can do all your debugging. You can do all your graphics development in native, and then you can cross-compile to the web. Emscripten implements this header on top of WebGPU in the browser. It sort of translates the C down to JavaScript, and then the JavaScript in the browser translates that back down and runs through our implementation.

Kai Ninomiya:

So, we see WebGPU as more than just a web API. To us, it is a hardware abstraction layer. It is not web-only. It's just designed for the web in its design principles, in that it's write once, run everywhere. But those properties can be really useful in native applications, too, and we are seeing some adoption of that and hope to see more. We have quite a few partners and folks that we work with who are doing just this, with pretty good success so far. Yeah, so we're really looking forward to that future.

Patrick Cozzi:

Very cool, Kai. It would be amazing if we could write in C++ against WebGPU and target both native and the web. I think that would be a great future. Brandon, any topics that we didn't cover that you wanted to?

Brandon Jones:

Boy, I think we've hit a lot of it. Nothing jumps to mind right now. I did want to reiterate exactly what Kai said, in that we do talk about Dawn... WebGPU in the context of the web, but it really can serve as a great native API as well. On the Chrome team, our implementation of it is called Dawn, which is where the slip-up came from. If people are familiar with the ANGLE project, which is an implementation of OpenGL ES over the top of D3D and whatnot, Dawn serves very much the same purpose for WebGPU, where it serves as a native abstraction layer for the WebGPU API shape over all of these other native APIs. ANGLE is something that sees use well outside the web. It was, I think, initially developed for... used by game studios and whatnot, and I hope to see Dawn used the same way, or either Dawn or Mozilla's implementation of it; WGPU, I believe, is what they call it. They'll all have the same header. They should all be interoperable, but having those libraries available for use well outside the web is a really exciting idea to me.

Patrick Cozzi:

I agree. Okay. Last question for me is if you have any shout outs, to a person or organization whose work you appreciate or admire. Kai?

Kai Ninomiya:

Yeah. WebGPU is a huge effort. It's spanned so many people and so many organizations, but definitely top shout out to Dzmitry Malyshau, formerly of Mozilla, who was our co-spec-editor until recently. He had such a huge influence on the API, brought in so much technical clarity from the implementation side, and made so many contributions, just everywhere across the API and the shading language. Dzmitry recently left Mozilla and stepped down as spec editor, but he is still a maintainer for the open source project, WGPU, and so we are continuing to hear from him and continuing to get great contributions from him. So, that's the top shout out.

Kai Ninomiya:

I also want to mention Corentin Wallez, who is our lead on the Chrome team. He started the project on the Chrome side, as I mentioned earlier, and he's the chair of the community group and really has just such a deep understanding of the problem space and has provided such great insight into the design of the API over the past five years. It's really... Without him, we wouldn't be able to be where we are today. He just has provided so much insight into how to design things well.

Kai Ninomiya:

And there are a lot of other standards contributors. We have contributors from Apple. Myles Maxfield at Apple has been collaborating with us on this for a long time, and that's been a great collaboration. Again, extremely helpful and really useful insights into the API and into what's best for developers, what's best for getting things to work well across platforms. The folks working on WGSL, on the shading language, are numerous. There are many, across companies. The Tint team at Google has done an amazing job pushing forward the implementation, and in collaboration with the group has done an amazing job pushing forward the specification so that WGSL could catch up with the timeline and so that we could have WebGPU almost ready at this point after only about a year-and-a-half of that development. So that's been incredible work.

Kai Ninomiya:

And then, we also have a lot of contributors, both to the standardization and to our implementation, from other companies. We work with Microsoft, of course, because they use Chromium, and we have a lot of contributors at Intel who have been working with us, both on WebGL and WebGPU, for many years. We have contributors from the Intel Advanced Web Technology team in Shanghai, who've been working with us for more than five years, since before I was on the team, as well as contributors from Intel who formerly worked on EdgeHTML with Microsoft. And so, we have a ton of contributors there.

Kai Ninomiya:

And finally, partners at companies prototyping WebGPU. We've been working with Babylon.js since the early days on their implementation. We met with them in Paris. We had a hackathon with them to get their first implementation up and running. We've been working with them for a long time, and their feedback's been really useful. And tons of people in the community online have contributed so many things, just to the whole ecosystem, to the community. It's a wonderful community to work in. It's very active, and there are so many amazing people that have helped out.

Patrick Cozzi:

Kai, love the shout outs, and love that you're showing the breadth of folks who are contributing. Brandon, anyone else you want to give a shout out to?

Brandon Jones:

Kai stole all the thunder. He named all the people. I have no one left to name. No, actually, there are two people that I wanted to call out specifically that are not necessarily intimately involved in WebGPU... a little bit more so now, but in graphics on the web generally. Kelsey Gilbert, excuse me, from Mozilla, has been stepping in and taking care of some of the chairing responsibilities recently and has been a presence in WebGL's development for a good long time. Someone who just has an absolute wealth of knowledge about the web and graphics and how those two intersect.

Brandon Jones:

And then, in a similar vein, Ken Russell, who's the chair of the WebGL Working Group, who has done an excellent job over the years helping steer that ship, and really everyone who works on WebGL. But as I mentioned previously, that includes a lot of the same people who are working on WebGPU now, and Kai stole all of that thunder. But yeah, Ken and Kelsey both have been helping steer WebGL in a direction where it is a viable, stable, functional, performant API for the web, and have really done so much of the heavy lifting to prove that that kind of content and that kind of functionality is viable and is something that we actually want on the web.

Brandon Jones:

I've joked several times that new web capabilities seem to go through this cycle where they're impossible, and then they're improbable, and then they're buggy, and then they're just boring. You never get to a point where they're actually like, "Wow, this is cool." Everybody likes to say, "Oh, you could never do that on the web," and, "Okay, well, you've proven you can do it on the web, but it's not really practical," and, "Okay, well, yeah, sure. Maybe it's practical, but look, it's fragmented and everything," and, "Well, now that you've got it working, it's just boring. It's been around for years, so why do I care?"

Brandon Jones:

That's kind of the cycle that we saw WebGL go through, where there were a lot of naysayers at first, people saying, "Oh, the web and GPU should never touch," and, "What are you trying to do?" And it's individuals like Ken and Kelsey that have done an excellent job of proving the naysayers wrong and showing that the web really does need this kind of content, and they paved the way for the next steps with WebGPU. It's very easy to say that we literally would not have ever gotten to the point of considering WebGPU had WebGL not been the rousing success that it has been.

Patrick Cozzi:

Yeah. Great point, Brandon. Great shout outs, and then also a plus one from me for Ken Russell. I mean, his leadership as the working group chair for WebGL, I really admired it, and I really borrowed it as much as I could when I was chairing the (Khronos) 3D Formats Group. I thought he was very engaging and very inclusive. All right, Kai, Brandon, thank you so much for joining us today. This was super educational, super inspiring. Thank you for all your work in the WebGPU community. And thank you, the audience and the community, for joining us today. Please let us know what you think. Leave a comment, subscribe, rate, let us know. Thanks, everybody.