Building the Open Metaverse

Immersive Storytelling

Rob Bredow, Academy Award-nominated VFX supervisor with roles at Industrial Light & Magic, Lucasfilm, and the Academy Software Foundation, joins Marc Petit (Epic Games) for a discussion about the evolution of visual effects and immersive storytelling.

Guests

Rob Bredow
SVP, Chief Creative Officer, Industrial Light & Magic; SVP, Creative Innovation, Lucasfilm

Read

Announcer:

Today on Building the Open Metaverse ...

Rob Bredow:

And Trials on Tatooine was just a tiny little experiment, but it really gave the picture of what immersive storytelling ... Or the beginnings of a picture of what immersive storytelling could do that's different than what you can do in film. A little light interaction. Some fun game mechanics but a lot of character and a lot of interactions that you would normally think of as movie-type interactions in a totally different way.

Announcer:

Welcome to Building the Open Metaverse where technology experts discuss how the community is Building the Open Metaverse together, hosted by Patrick Cozzi of Cesium and Marc Petit from Epic Games.

Marc Petit:

So hello everybody, and welcome to our show, Building the Open Metaverse, the podcast where technologists share their insight on how the community is building the metaverse together. I am Marc Petit from Epic Games, and today I'm on my own, as my co-host, Patrick Cozzi from Cesium, could not make it. We did not want to delay today's recording, as we have a great guest for you. It's my immense pleasure to welcome Rob Bredow, the senior vice president and chief creative officer at Industrial Light & Magic. Rob, welcome to the show. We're thrilled to have you with us today.

Rob Bredow:

Thanks. Thanks for having me. Fun to do.

Marc Petit:

Yeah. And I'll say a few things, because I know you're too modest to say them yourself. You're a unique blend of artistry and creativity, as well as a technologist. You're an Oscar nominee for your work on Solo: A Star Wars Story as visual effects supervisor and producer. You've been CTO of ILM and of Sony Pictures Imageworks, important roles. And you've always been very involved with the visual effects community as a member of The Academy. You are the chairman of the governing board of the Academy Software Foundation, which we know is leading key open source projects for the movie industry. So quite a pedigree, very impressive. That being said, I'd like you to describe, in your own words, your journey through computer graphics and ultimately to the metaverse. Alternating between core technology roles and being an artist, how did you do that?

Rob Bredow:

Well, I kind of love everything around computer graphics and filmmaking. It's just such an interesting area, and there's been so much innovation since I've been in it. The first computer that I used was a Personal IRIS, a really beefy $250,000 workstation. And today, of course, my iPhone has way more memory, more graphics power, more CPU than that machine could ever dream of having. It ran at 25 megahertz, which is just a completely different world from what we are dealing with today. So the amount of innovation and the amount of change happening in the industry has been a lot of fun to keep up with, a lot of fun to learn from and break new ground with at each turn.

Rob Bredow:

And I really have enjoyed getting to keep one foot in the world of technology, because that's so intrinsic to all of the work we create. But tech for tech's sake just never appealed to me. The fact that we get to make pictures, tell stories, and collaborate with people from all different backgrounds has been something that I've just always loved. I truly learned the artistic and creative parts of the job at work, with people who had graduated from CalArts, world-class artists and world-class storytellers that I've gotten to work with over the years. I mean, I got to sit next to Ron Howard while he was directing Solo: A Star Wars Story, and he was generous with his learning and his sharing. You get to learn so much in this role, so I feel very, very fortunate to have had these experiences.

Marc Petit:

Yeah. Learning is so critical. You mentioned the evolution of the industry, and when you think about Lucasfilm and its role in the industry, from the invention of digital video editing, to the first digital creatures in Jurassic Park, to the invention of THX, it has been a source of innovation for the industry. And you did it again with virtual production and StageCraft and your work on The Mandalorian. So how did this happen?

Rob Bredow:

Yeah. Getting to put up that first LED wall that was going to be used in an extensive way to create The Mandalorian was just a huge thrill. We had been working on it and laying the groundwork for it since before I arrived at Lucasfilm and ILM. I remember in my first week I drew a drawing of, what would it be like if we could do a whole room that was surrounding you with LEDs? You could take the HDRI sphere that we usually shoot on the set, put it up on the LEDs, and use that as a reshoot stage. I drew these drawings and showed them around. I certainly wasn't the first person who thought of that, but it was definitely a passion project to try to make it a reality. Whether that was with projectors or LEDs, the technology wasn't quite there yet.

Rob Bredow:

I mean, as we were shooting Rogue One, we were really interested in this real-time technology, and we used some LEDs on the walls to do the illumination, but we needed to replace all that content. I'm just very fortunate to work at a place that's willing to jump into these experiments even before the tech is ready. John Knoll, the visual effects supervisor, was working with Greig Fraser, the DP on Rogue One, shooting these brilliant-looking shots where the backgrounds needed to be replaced, and we had these walls available. So Tim Alexander and I flew out to London, and as they were wrapping those stages, we got a day on them to shoot a bunch of tests. What if we could capture this stuff as in-camera finals, what would it look like? And we did all these calibration tests and did all this work.

Rob Bredow:

That was the work that eventually led to us getting in a room with Jon Favreau. He came with all of this virtual production experience, having just finished The Lion King, knowing those tools inside and out, knowing what they could bring to live-action filmmakers. But you weren't going to put VR headsets on your actors; you're going to photograph your actors. So how can we bring that into the real world?

Rob Bredow:

So I was part of the team that was in the room, pitching him options for how we could create this show. When I showed him the LED tests that we had done, and when I talked about the kind of ambition we had for making this possible, he really lit up. He had a vision for how that could change his show, how he could design his show around it, how he could shoot his show there in Manhattan Beach but bring in the whole world's locations.

Rob Bredow:

And of course the locations we already had in the library at the studio came to the sound stage in Manhattan Beach. That was a risky decision. It's one of the things we're very fortunate to have at Lucasfilm, and with some of the filmmakers we work with at ILM: they're not very afraid. This could have gone badly, and there were a lot of nerves and a lot of hard work that had to happen to make it work the first time. There were as many engineers on set, I think, as the rest of the crew. There were engineers writing code on the side of the set in those early days of The Mandalorian, people from Epic and people from ILM and other companies as well, working together just to make sure it held together long enough for that shoot. Now, quite a few years later, it's actually a pretty production-hardened system that we've got up and running, but those first few times were not guaranteed to be smooth.

Marc Petit:

All right. Let's stay on the topic of innovation and talk about a different area. Soon after you joined ILM in 2014, you created ILMxLAB in 2015. I don't know if you remember, but it was very early when it comes to interactive content and VR. And your work on Trials on Tatooine was a really innovative piece of storytelling in VR. So tell us about ILMxLAB and how far you've been able to push that exploration into the world of interactive content.

Rob Bredow:

Yeah. Getting to create early immersive entertainment that was really story-driven was just another one of those great opportunities. I remember I wrote the first draft of Trials and talked to Wayne in the middle of a jet-lagged night sometime when I was in Europe. Actually, I think I was shooting the tests we were just talking about for Rogue One, and I was jet-lagged that night, and I thought, "I've got an idea." So I wrote this thing on my laptop.

Marc Petit:

What a day!

Rob Bredow:

Yeah. That was a good day. That's a good point. And the studio just got behind doing these experiments. At the time it was our advanced development group. We didn't even have xLAB; it wasn't even named xLAB yet. The advanced development group was a team of engineers and artists all working together to push what the state of the art was, both in real-time filmmaking tools, but also in what people would experience in their homes. And Trials on Tatooine was just a tiny little experiment, but it really gave the picture, or the beginnings of a picture, of what immersive storytelling could do that's different from what you can do in film. A little light interaction, some fun game mechanics, but a lot of character and a lot of interactions that you would normally think of as movie-type interactions, told in a different way.

Rob Bredow:

And since then, xLAB has gotten to create more and more immersive experiences with amazing filmmakers. Like Carne y Arena, which ended up winning an Oscar for its innovative storytelling: a very powerful, interactive cinema piece that was well recognized. And then many hours of virtual reality story-based entertainment, from cartoony experiences like Ralph Breaks VR to Avengers: Damage Control, a location-based experience about 15 minutes long where you actually get to experience what it's like to be drafted into the Avengers.

Rob Bredow:

And then the Star Wars stories that have more recently been told with Vader Immortal and, most recently, Star Wars: Tales from the Galaxy's Edge. Those are multiple hours of Star Wars storytelling. Tales from the Galaxy's Edge actually takes place within Galaxy's Edge, the location you can go to at Disneyland or Walt Disney World. That's the hub, and then you branch out from there to get all of these other immersive stories told, and you experience them firsthand.

Rob Bredow:

So it's been a lot of fun to learn so much about what that balance is between storytelling and game mechanics and all the other things that go into making these immersive experiences. And from a technology perspective, it used to be that there was a long list of things you couldn't do in VR. Now, on a Quest 2, we can't do just anything we want, but we can have multiple real-time characters and very complex storytelling environments that we can use to bring these stories to life. So we're really starting to see the beginning of this industry becoming a reality.

Marc Petit:

So from a creative perspective, do you think we've cracked the code on mixing the agency of interactive content and the art of storytelling? It looks to me like it's a wide field of experimentation and you guys have done some, and you're not the only one, but there's not a lot of experimentation in mixing linear and interactive. So do you see potential there?

Rob Bredow:

I do. There's a lot of potential there. I just got to experience Star Wars Galactic Starcruiser, which is a two-night experience at Walt Disney World. Have you heard about this or talked about it?

Marc Petit:

Well, we've heard of it. We've guessed what it could be, but if you could describe it a little bit, that would be fantastic.

Rob Bredow:

Yeah. I mean, I can't say anything that's not publicly available yet. It opens in just a couple of weeks, starting, I think, March 1st. Anyone can book a seat on the Galactic Starcruiser, and it is a two-night immersive experience. You get to decide what kind of Star Wars character you want to be. Or you can just stand back, have a nice drink in the lounge, play some sabacc, and hang out and watch other people play.

Rob Bredow:

But if you do play, I mean, you're interacted with on your datapad, you're interacting with characters, they're sending you on missions. You're doing stuff all through the ship. There's a ground excursion to Batuu, the planet where Galaxy's Edge is set. And ILMxLAB helped to create some of the work that makes it immersive: all the screens that show the outside view of space and that kind of work.

Rob Bredow:

And when I was working on it, I had to be very intentional about using the right words. If I said, "the hotel," someone would correct me and say, "It's not a hotel, it's a ship." And I'd be like, "Okay, mental model: I've got to remember it's the ship." And I wondered, if you experienced it in that situation, whether you had to play along, whether you really had to work to suspend your disbelief, or whether the illusion would be complete. And I've got to say, and this is all credit to the Imagineers and everyone who created this experience, you walk in there and in the first 15 minutes you go from thinking you're checking into a hotel to feeling like you're on a ship. Every window you look at is space; you are welcomed at the captain's muster. It is like going on a cruise, except you're going on a cruise into space.

Rob Bredow:

And it's very high-end. It provides a very, very high level of service, very nice meals, all that sort of stuff. Anyway, what made me start thinking about that was your question about immersive storytelling and all of the tough challenges to crack there. Because as people, we like to be told stories by expert storytellers. The choose-your-own-adventure model doesn't always generate a great story.

Rob Bredow:

And most people don't want to choose their own adventure; they want to be told a fantastic story. In some of the things we've experimented with, in some of our immersive experiences, and in what the Imagineers and we have done together on Galactic Starcruiser, there is a master storyteller telling you a story, but you also get to interact with it. You get to interact with characters and change moments along the way, even though there is an overarching narrative that is very satisfying. You can just stand back and watch the whole thing, and it's a really fun show, or you can get as involved as you want. I think we're starting to see big plays like that in immersive entertainment. Galactic Starcruiser is going to stand on its own; it's a pretty unique experience. But we're starting to see models where this immersion really works.

Marc Petit:

Yeah. By the way, Patrick has got tickets already, and we've tried to get Bei Yang to come on the show. We haven't had a chance to talk to him yet, but we wanted to hear about that experience, so thanks for being generous and sharing some of this with us. Now, tell us about the intersection between the traditional VFX pipeline, the virtual production pipeline, and the work of ILMxLAB. Are you able to deliver on that dream of sharing assets across media and properties? What's the state of the art there, and what can we expect?

Rob Bredow:

Yeah. We fortunately have a long roadmap and have been able to deliver on some of that promise. Over 10 years ago, when we knew we were going to have 10 years of Star Wars films in a row, the decision was made that it made sense to invest in a library, in a way of storing our assets that was going to be reusable over time. And we settled on some formats. We've actually translated most of that to USD now. We created MaterialX, which is open source and more widely used now, but it was initially created in part to give us a translation layer, so that no matter whether an artist was using tool A or tool B to create an asset, we would have an interchangeable way of describing the geometry, the textures, and the materials that would stand the test of time.

Rob Bredow:

So literally, you can take an X-wing from Episode Seven, which was the first film done with the unified asset specification we laid out, and drop it into one of our shows today. The way I like to describe it is: the water streaking down the windows, that extra effect, we don't bother trying to standardize. That's a one-off. But all the rest of the materials, the paint colors, how metallic the materials are, how they reflect light, even how the glows work, those all work the same in our system from Episode Seven to today. I've been at a few different places over my career, and I've never experienced that sort of continuity of library and the value it's brought to the organization.
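To give a rough sense of the translation layer being described, a MaterialX document expresses a material as a renderer-agnostic node graph. The sketch below is illustrative only: the asset name, shader names, and parameter values are invented for the example, not taken from ILM's library.

```xml
<?xml version="1.0"?>
<!-- Hypothetical hull material for an illustrative X-wing asset.
     Any tool or renderer with a MaterialX integration can resolve
     this same description, which is what makes the library reusable
     across applications and across years of productions. -->
<materialx version="1.38">
  <standard_surface name="hull_shader" type="surfaceshader">
    <input name="base_color" type="color3" value="0.55, 0.56, 0.58"/>
    <input name="metalness" type="float" value="1.0"/>
    <input name="specular_roughness" type="float" value="0.35"/>
  </standard_surface>
  <surfacematerial name="hull_material" type="material">
    <input name="surfaceshader" type="surfaceshader" nodename="hull_shader"/>
  </surfacematerial>
</materialx>
```

Because the document names a standard shading model and typed inputs rather than a renderer's private parameters, the same paint color and metalness survive a move from tool A to tool B.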

Rob Bredow:

And that's true whether we're talking about a game that we're doing with a partner like Electronic Arts, or the work being done at ILM or ILMxLAB for immersive entertainment. It's been really important, especially as we get into doing work inside of StageCraft on our LED stages. The virtual art department is almost always done these days in Unreal, but there are other tools that get used in the pre-production process. We use a couple of different rendering techniques on the wall. Sometimes we're rendering to the wall with Unreal; sometimes we're rendering to the wall with Helios. And we need to make sure we have the same high-fidelity experience and full interchange of data between all the different tools our artists want to use.

Marc Petit:

Yeah, it's quite a feat. It looks like the dream is coming true. As you said, it's 10 years' worth of work, so thanks for your contribution. And that's a good segue into the open source conversation. You as an individual, and ILM, have been contributing a lot so that the rest of the industry can also benefit. Throughout your career, you've been a proponent of open source. You were even at the origin of Alembic, right?

Rob Bredow:

Yeah. Yeah. It was when I was at Sony and I was collaborating with Richard and the team at ILM and we teamed up to create Alembic.

Marc Petit:

Yeah, I remember that. It was a very interesting moment, because it was the first time, I think, correct me if I'm wrong, that competitors would actually start collaborating for the advancement of the industry. And I think it was a great thing to see happen.

Rob Bredow:

It was interesting because we both caught wind of it. I think it was in a meeting you organized when you were at Autodesk that we both realized ILM was working on something they thought might become an open source standard for geometry representation, and Sony was as well. And we both realized how silly it would be if there were two competing open source standards for geometry storage out there in the marketplace.

Rob Bredow:

And after a quick conversation we realized it was going to be better for both of us to team up and create one solution that would be useful for everyone. Alembic really took off once we had it finished and ready to share with the market. Back then, there wasn't really a centralized body facilitating that. Fortunately, there were great teams at Autodesk and other places that would facilitate these conversations, and eventually you'd have these side conversations where these kinds of things could be discovered. Now, with the Academy Software Foundation, there's actually a forum where you can find out what's going on with other people's ambitions in the open source space and hopefully keep that more coordinated, to keep the efforts as efficient as possible.

Marc Petit:

Yeah, that was a great achievement, because we've had projects pop up, live, and almost die in the industry because of people moving from job to job. And it's fair to say you were at the origin of the Academy Software Foundation, convincing The Academy to provide infrastructure with the Linux Foundation to put life into some projects and also to coordinate the industry.

Rob Bredow:

Yeah, it was a project I was chairing as a member of the Sci-Tech Council at The Academy. And to be honest, I might have been one of the most skeptical that the right place to solve this was inside The Academy, because it needed to be fast-moving and innovative, and it needed to be very freeform and unstructured. That's where I felt the most successful open source development happens: when it's really engineer-powered.

Rob Bredow:

But the more we looked into it, the more we realized that a foundation could be really efficient for the industry: one that would solve some of the problems of licensing, help coordinate the versioning challenges, and provide a centralized resource for the legal and all the other requirements around open source. And The Academy knew its biggest contribution could be the gathering of people, the people who want to participate in what The Academy represents, which is top-quality filmmaking and that kind of storytelling.

Rob Bredow:

And I think one of the key moments for me was when we met with the Linux Foundation. They came in with a system set up to partner with industry to solve these problems. So it could be branded as a collaboration, and it is a collaboration, between The Academy and the Linux Foundation, operated by the Linux Foundation, who make Linux and hundreds of other open source projects and are very successful at organizing open source collaboration. So now, on any Academy Software Foundation project, you can just go in, check in a change, test the code, add the documentation, and it can be in the next release with very little overhead, which is miles and miles away from where we used to be.

Marc Petit:

Yeah, absolutely. We had Royal O'Brien from the O3DE project, which also uses the Linux Foundation's infrastructure. And I think we cannot overstate the importance of that infrastructure and knowledge in the way you run the technology groups and empower the engineers. So kudos to the Linux Foundation; they created a really good model there.

Rob Bredow:

Yeah. It's been fantastic. And I've learned so much from their processes and procedures. And then of course we have incredible support from industry. We have more than 25 sponsoring companies, including both of the companies we work for, who are represented on that governing board, who put effort in and contribute engineering hours to these projects.

Rob Bredow:

So when a project comes into the Academy Software Foundation, it's not just, "Hey, it has a place to be hosted." There's lots of places you can go for that. But there's engineers from the industry who are going to be spending time on those projects in addition to anyone else who wants to volunteer time. But both of our companies pay people to work on these projects, which is a superpower to be able to do that.

Marc Petit:

Yeah. And actually, this is one of the reasons we do this podcast: we see the metaverse happening, the extension of the internet built on the technology we've been working on for the past 20 years. And I know, because of the work we've done together on the CTO council and the Academy Software Foundation, that there is a true appetite for the industry to come together and work at the technology level. Business models may differ, but it makes me very optimistic as we start on this endeavor.

Marc Petit:

We're working with The Khronos Group and with the Academy Software Foundation, with David, to try to advance USD and glTF and coordinate between open standards and open source. I do believe there is a real appetite for the people of the industry to come together and create an open endeavor. So tell me what you know. You've been using USD; you've been proposing MaterialX as an extension to USD. We had our friends from NVIDIA on the show saying, "Hey, USD is going to be the HTML of the metaverse." Do you share that optimism? Do you think we can get there?

Rob Bredow:

Well, there is no better platform, I think, for the kinds of things we all want to create than USD. It's designed for high complexity; it's designed for multiple users to layer their work on top of other users' work, which is how we're going to build complicated environments and do complicated storytelling in the worlds we create going forward, whether we call them the metaverse or not. Absolutely, those kinds of immersive experiences. And I don't think there's any major company working in computer graphics and entertainment that doesn't see USD as the next step.

Rob Bredow:

It encapsulates so many of the workflows that are essential to how we create, express, and share our assets. Just the fact that on an iPhone, if you fire off a scan of geometry, which you can do so easily on that device, and send it off via Messages, that's going through USD right now. That kind of adoption is rare. We've seen it with EXR, where on your desktop on a Mac, if you select an EXR and hit space, you actually get a preview that's perfectly accurate.

Rob Bredow:

You see that with USD and I think those are indicators of how important and prevalent it's going to be for the future where geometry and worlds that we've built are going to be a really important way to express storytelling. And they already are for sure, but this language is going to be a very, very good language for us to communicate highly complicated scenes.

Marc Petit:

Yeah. So we're using it to have a static declaration of scenes, albeit very complex ones. How do we get from there to a fully simulated world, in your opinion? In the crawl, walk, run, fly model, what's next? What should we tackle as a group to advance fully simulated worlds?

Rob Bredow:

To me, the first part comes before full simulation: it's the full workflows behind those static representations, and we're close, but we're not there yet. What we don't have right now, and I think we all want, is a software-agnostic interchange that does a nice job of describing the concise changes that were made in one application versus another on a complicated scene.

Rob Bredow:

We're all pretty good about dumping USD out now, but you end up with a big USD file, and you don't get just the changes that happened. And it's not a trivial job to author and tweak the kinds of changes you want to make, and then layer those back onto a USD file. So that's step one. Then, where do we go from there in terms of being able to actually encode an expressive character? There's certainly no limit to the kind of data USD can store, but the real trick is: when do we standardize what?
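USD's composition system already has a vocabulary for this kind of sparse edit. As a rough sketch, where the file and prim names are invented for illustration, an override layer can sublayer the base scene and carry only the deltas, rather than a full re-export:

```usda
#usda 1.0
(
    # Hypothetical override layer: it sublayers the base scene and
    # records only the edits made in one application.
    subLayers = [
        @xwing_base.usda@
    ]
)

# "over" means we are sparsely overriding an existing prim,
# not defining a new one.
over "XWing"
{
    over "Hull"
    {
        # The only change this layer contributes: a new display color.
        color3f[] primvars:displayColor = [(0.8, 0.1, 0.1)]
    }
}
```

The open problem described here is getting every tool to emit layers this minimal, so that interchange between applications is a stack of small, legible edits instead of competing whole-scene dumps.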

Rob Bredow:

Because I think we all know what it takes to move a video game character around, because the GPUs are optimized a certain way. So we can standardize and say, "Hey, you can move a character around, and here's a way to describe bones, and here's a way to describe point weighting," and the GPUs can perform this in real time. That's a useful standard to put into USD, and it will get you part of the way there. But when you want to do cloth, and when you want to do other things on top of that, or you want to do eyes with articulating lids, none of that is going to be captured articulately with just those controls.
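The bones-and-weights standard alluded to here exists in USD as the UsdSkel schemas. A minimal sketch, with an invented character and joint hierarchy, and omitting the bind and rest transforms a real asset would carry, looks roughly like:

```usda
#usda 1.0

def SkelRoot "Hero"
{
    def Skeleton "Rig"
    {
        # Joint paths encode the bone hierarchy.
        uniform token[] joints = ["hips", "hips/spine", "hips/spine/head"]
    }

    def Mesh "Body" (
        prepend apiSchemas = ["SkelBindingAPI"]
    )
    {
        rel skel:skeleton = </Hero/Rig>
        # One joint index and one weight per point in this toy example
        # (elementSize = 1); production rigs use several influences per point.
        int[] primvars:skel:jointIndices = [0, 1, 2] (elementSize = 1)
        float[] primvars:skel:jointWeights = [1.0, 1.0, 1.0] (elementSize = 1)
    }
}
```

This is the GPU-friendly subset that standardizes well; the cloth, muscle, and eyelid behavior mentioned above still lives in proprietary rigs outside any such standard.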

Rob Bredow:

So what are the things we choose to standardize, and what are the things we intentionally wait on, letting industry innovate and try 50 iterations until we all converge on the same things?

Rob Bredow:

So I see we've converged on a lot of workflows, like layering edits on top of complicated scenes. On characters, I don't think we've quite converged. I mean, Epic just put out that amazing Matrix experience, where you have all this complexity. There was a ton of innovation that went into creating that: new ways of doing level of detail, new ways of representing characters, new ways of creating these really dynamic, large worlds. My guess is, if Epic were to redo that today with everything they just learned, there would be all sorts of innovations going into that next generation. So the question is: when do we standardize, and when do we continue to innovate? I'm guessing we're in an innovation cycle as it relates to characters and the interchange of characters for a while still, for both real-time and film asset creation.

Marc Petit:

Yeah, I agree. And I think a lot of that is going to be at least a 10-year journey. Just to set expectations, we're not going to figure things out overnight. One thing we always come back to in these conversations is experimenting in the open, so that everybody learns from each other's experimentation. Let's try to create those forums, those places where we can experiment in the open.

Marc Petit:

I think it's interesting to see the benefit, and the parallel, of having an open standard like glTF, which doesn't prescribe an implementation. Those standards usually come after the fact, to rubber-stamp what works and drive commoditization, while open source projects can be places to drive innovation and experiment in the open.

Marc Petit:

So one of the ideas we promote here is trying to keep those two in lockstep, making sure that in the process of developing around USD and around glTF, we drive some level of synchronization between the two. That way, at the end, we don't end up with wildly different solutions; both serve a different purpose and will hopefully take a different journey to get us to the right place. So, do you have any view on open standards and how we should go about them?

Rob Bredow:

I think I'm with you, or at least with the way I understand what you're describing. There's a time for standardization: once things have matured and there are a few different ways of doing things, but they're all basically doing the same thing in different ways, that's when standardization is your best friend. And there's a time for the innovation phase, which is a lot of what we're all seeing right now, where you barely have time to write down what you're doing. But if you're able to, and I think this is where efforts like the Academy Software Foundation come in, along with others working in an open and collaborative way, like Pixar with USD, working in the open so that people are seeing this innovation, then, as you share, the best practices will absolutely rise to the top.

Rob Bredow:

And if somebody else can do it better, everyone's going to see that, if it's done in the open, and say, "Oh, that's a way better way to do that. It's probably worth changing our system to accommodate that." And I think that's where the open source workflows matter, whether that's all in software, like it is in USD, or whether there's other kinds of open sharing.

Rob Bredow:

It's not just about software. It can be about assets and the way assets are stored, or the way the hierarchies of assets are stored, or the tags, those kinds of things. All those kinds of workflows are really valuable for us to share. And sometimes that can be really formalized check-in processes. Sometimes it can just be, "Here's something we learned on the last show," which ILM has a long history of doing at events like SIGGRAPH, at View Conference, at FMX every year. Like, here are some of the things we learned on these shows. Which is an effort to, well, selfishly, it's an effort to recruit other people that are like-minded to us, but secondarily it's an effort to really push the industry forward, because we want to see everybody continue to advance in the industry and we want to recruit the best people in the world.

Marc Petit:

Yeah. Thanks for being honest about it.

Rob Bredow:

Yeah. It's both. It's both. It's not just because we're so nice.

Marc Petit:

Look it's fascinating to see the level of collaboration in the industry. I think maybe it's also because it's a relatively small industry and we know that what goes around comes around and we better all behave.

Rob Bredow:

That's right. That's right. For sure.

Marc Petit:

Yeah. Well, thank you for that. The conversation around USD is really one of the topics we cover on every episode of the podcast, and you're right that everybody sees it as an important part of the future. And we're looking forward to MaterialX, which is something I think we all need, and the path seems to be clear now on adopting it. So it's really great news for the industry. Thank you for that. So is there any other theme or topic that we should cover, or should have covered, today during this conversation? Something that seems important to you?

Rob Bredow:

Well, one of the things I'm interested in, though it's definitely a topic for probably a whole other podcast, is the future of procedural assets, both in movement and in style. So if we're building the metaverse and I want to sell you a lightsaber in Vader Immortal, or you earn a lightsaber in Vader Immortal, and we would like that to show up in Fortnite, because we have an agreement and we're building the metaverse where assets are going to be interchanged between these two, then, the NFT component aside, there are just different looks and different attributes that are relevant for both of those.

Rob Bredow:

And my question is whether that's clearly the future. Whether we're going to have a style layer that is going to say, "Here's the Fortnite style; it's going to go on this lightsaber."

Marc Petit:

Like CSS.

Rob Bredow:

Yeah, exactly. Like a style sheet for lightsabers, or for anything. Or is that just a complete pipe dream? Are we ever going to have procedural assets at that level? And I do think one of the reasons proceduralism hasn't taken off in visual effects production is because... Well, it has in certain areas. No one models a tree by hand anymore. We have systems that help us model beautiful trees. There would be no reason to model every leaf and put it on a tree, although I did that when I built my first pine tree, when I started. But now we have systems that build those for us. Yet we still build our walls by hand. And then when you find out that the Hulk is going to bust through that wall, you have to put the two-by-fours in, you have to put the different layers in, you have to define all those things, and then the Hulk can bust through it.
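The "style sheet for assets" idea could be sketched, very loosely, as a shared asset definition plus per-world style layers that override its presentation, the way a CSS rule overrides a default. Everything here (`Asset`, `STYLES`, `apply_style`, the attribute names) is invented purely for illustration; no real Fortnite or Vader Immortal pipeline is implied.

```python
# Hypothetical sketch: a shared asset plus per-world style overrides,
# loosely analogous to CSS. All names and values are illustrative.
from dataclasses import dataclass, field


@dataclass
class Asset:
    name: str
    attributes: dict = field(default_factory=dict)


# Each "world" declares how the shared asset's look should be overridden.
STYLES = {
    "fortnite": {"shading": "stylized", "outline": True, "poly_budget": 20_000},
    "vader_immortal": {"shading": "physically_based", "outline": False, "poly_budget": 150_000},
}


def apply_style(asset: Asset, world: str) -> Asset:
    """Return a copy of the asset with the world's style layer applied."""
    styled = dict(asset.attributes)   # base attributes survive...
    styled.update(STYLES[world])      # ...and the style layer adds the world-specific look
    return Asset(asset.name, styled)


saber = Asset("lightsaber", {"blade_color": "green", "length_cm": 120})
fortnite_saber = apply_style(saber, "fortnite")
```

The asset's intrinsic properties (blade color, length) travel unchanged between worlds, while each world contributes its own rendering attributes, which is one plausible reading of the style-layer idea.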

Rob Bredow:

But you have to simulate those sometimes all at once, sometimes separately, depending on your system. And you're still doing a lot by hand. We build a lot of walls. We build a lot of hallways, a lot of things that would require a lot of investment to create a completely procedural system that understands the architecture behind them. But how powerful would it be if you had a system that knew that? And you could say, "Oh, I'm designing something from the 1600s. No, I'm designing something that was built in 1980, so it's all stucco in California. I'm designing something that's built in 2020 with the latest materials." And it knows whether to put metal two-by-fours or wood two-by-fours in based on the construction style. Today we actually have companies like Epic, companies like Facebook, that are putting in the kind of money that could make these procedural systems possible.
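An era-aware procedural wall builder along these lines could be sketched as a function that picks framing and cladding from the construction year, so a destruction sim downstream gets the right layers for free. The era rules below are simplified illustrations of the idea, not real construction data, and the function name is hypothetical.

```python
# Hedged sketch of an era-aware procedural wall builder. The year
# thresholds and materials are simplified illustrations only.
def build_wall(year: int) -> dict:
    """Pick framing and exterior cladding based on construction era."""
    if year < 1900:
        framing, cladding = "timber_post_and_beam", "plaster_and_lath"
    elif year < 2000:
        framing, cladding = "wood_two_by_four", "stucco"
    else:
        framing, cladding = "metal_stud", "drywall"
    # Layers listed outside-in; a sim could fracture each layer separately
    # when, say, the Hulk busts through.
    return {"year": year, "layers": [cladding, framing, "interior_drywall"]}


wall_1980 = build_wall(1980)  # wood two-by-fours behind stucco
wall_2020 = build_wall(2020)  # metal studs
```

The point of the sketch is the shape of the system: the artist specifies intent ("built in 1980 in California") and the construction knowledge lives in the procedural rules, instead of being rebuilt by hand for every show.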

Rob Bredow:

It was really hard to do that when you were just doing one show at a time; you've got a lot of work to do. You're never going to have the time to sit back and say, wouldn't it be more efficient if I made a system that could build all the walls for me for the next 10 years? But I do see opportunities for these systems, and a large enough audience. It's no longer a relatively small community of tens of thousands, or maybe hundreds of thousands, of visual effects artists. Now it's millions of makers who are going to be creating things for these worlds. So yeah, I'm very interested in that whole space.

Marc Petit:

Yeah. When Kim (Libreri) was on the show a few weeks ago, we talked about the importance of simulation and how simulation can be put in service of moviemaking. He gave some examples from The Matrix Awakens. But I think you're also referring to a lot of work we've seen from NVIDIA on style transfer, or even on stylization. I mean, teaching a machine how Monet or Gauguin painted and being able to reproduce that. So do you believe that this concept of augmented artistry is something that can really happen, or will people just want to create by hand because it's going to be better?

Rob Bredow:

Oh, I think both. I think once the machine learning-based techniques become a useful artist-assistant tool, there's no question those are going to be valuable. If you had a brush that was going to do the style transfer, and you could determine the strength, how much Monet and how much of each style you're dialing into that brush, and you paint it on and you're like, "Oh, I need a little bit more Monet over here, and I need a little bit more of this here," then you have a controllable system in which to apply that. I think artists will absolutely use those brushes to create. What I'm not clear on is whether there'll ever be the automated translator that translates the Fortnite asset into whatever the other system is. That might be a pipe dream, but maybe there will be tools to make it manageable enough. And if it's user-generated or user-enhanced, maybe that will be the way we get to an interchange. But that's an interesting topic, and I definitely don't have the answer to that.
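The controllable "style brush" Rob describes boils down to the artist dialing in a strength per reference style and the tool normalizing those dials into blend weights that would drive a style-transfer model. This tiny sketch shows only that weighting step; the function name is hypothetical, and a real tool would feed these weights into a neural style-transfer backend.

```python
# Illustrative sketch: turn per-style "dial" strengths into normalized
# blend weights for a hypothetical style-transfer brush.
def blend_styles(dials: dict) -> dict:
    """Normalize dialed-in style strengths into weights summing to 1."""
    total = sum(dials.values())
    if total == 0:
        return {name: 0.0 for name in dials}
    return {name: strength / total for name, strength in dials.items()}


# "A little bit more Monet over here": dial Monet up for this stroke.
stroke_weights = blend_styles({"monet": 3.0, "gauguin": 1.0})
```

Each stroke then carries its own mix, which is what makes the system controllable rather than an all-or-nothing filter.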

Marc Petit:

Definitely something we should cover. We're going to have Mark Sagar and Vladimir (Mastilović) on to talk about digital humans, and I think it's interesting to see how they're using 4D capture and machine learning to actually assist the process of creating a plausible human. So I think there is probably an angle there. But yeah, very good point. Well, thank you for that. And finally, is there a specific person, organization, or institution that you would like to give a shout-out to today?

Rob Bredow:

Well, we've talked about the Academy Software Foundation, so I'll plug the website: it's aswf.io. Anybody can go and participate, sign up, and if their company isn't already a member, encourage their company to get in touch with us and become one. You can access all the software without becoming a member, but it doesn't exist without our sponsoring entities, so thanks to all of them who participate. And just a random shout-out to a guy named Sam Zeloof, whom I don't know, but I read an interesting article in Wired about him. He's building integrated circuits in his parents' garage. He's like a 22-year-old guy and is building them from first principles. He started with the simplest chips and is basically recreating what happened in the fifties, sixties, and seventies, decade by decade, in his garage, which experts... I don't know anything about making IC chips, but experts say it should be impossible to do what he did. He's getting thrown-off hardware, fixing it up, and then actually creating IC chips in his garage, which is actually kind of amazing. So yeah, I thought that was an interesting article.

Marc Petit:

Yeah. Thanks for pointing it out. I'll check it out for sure. So Rob, thank you so much for your time and your generosity. It was a very, very enlightening conversation. I'm sure ILM is keeping you very busy, so we very much appreciate the time you spent with us today. And to our audience, thank you so much. We're lucky to get great speakers, so the podcast is pretty popular and we hear good feedback. Keep giving us that feedback; tell us what you want to hear. So Rob, again, thank you for being with us today. Patrick was with us in spirit. He will do this Star Wars hotel experience, sorry, ship experience, for sure, and I'll try to make it myself as soon as possible. Rob, thank you again, and bye, everybody.