So welcome to another Vision Series; maybe we should call this one Vision Series X. This one is a collaboration between Lucasfilm, Substance, and Autodesk, and we have a lot of speakers for you today, so I'm really just the MC. I'm gonna get these guys up here and get them to talk about a lot of really cool stuff. So we have myself, we have Jonathan Stone from Lucasfilm, we have Iliyan, who's one of the core developers on Arnold. We have Davide from Substance, David Larsson, ex-Autodesk, ex a few other things, and now Substance. And then we have Nikola, who's gonna show us some really cool stuff; he's the product designer for lighting and rendering on Maya. So one thing that I do have to disclaim here is that because it's a Vision Series, we're gonna talk about things that don't really exist in products. So we really don't want you to make any purchasing decisions based on this. Feel free to take pictures and tweet and do whatever with it, but these are things that may or may not happen. They're coming out of our R&D labs, and they really represent our opinion on how some of these things are gonna go. So without further ado, I'll bring up Jonathan and he can start talking to us about MaterialX. Cool. Thank you so much, Eric, thanks for inviting us today. So yeah, I was just gonna start with the story of MaterialX and how it came to be. Back in April of 2013, Lucasfilm and Industrial Light and Magic were preparing for a brand new generation of Star Wars shows, and we launched an internal initiative to unify the way that our assets would be authored going forward. We knew that there would be a long future of Star Wars shows to come, or at least we were pretty confident there would be, and that both the tools we used to create the content and the media we developed for would be rapidly evolving along with advances in technology. We felt that it was important to standardize the way that our assets were created, shared, and archived.
That way, the content we authored today could benefit from new tools as they came along and would remain reusable by artists and creators long into the future. So there were a number of different facets of this standardization, but one of the most important was the choice of data formats that would be used to capture different components of these new Star Wars assets. Whenever possible, we wanted to use open source data formats, which are transparently defined and independent of the tools with which they were originally authored. And fortunately, we were able to leverage a number of great, mature formats, such as Alembic, OpenSubdiv, OpenEXR and OpenColorIO, for the geometry, textures and color spaces. But what was the equivalent for look development and material assets of the type that an artist would author in Substance, Mari or Katana? So it's worth noting, when we talk about look development and materials, we're not just referring to the baked textures that get consumed by our renderer. In this breakdown video of the Millennium Falcon from Star Wars: The Force Awakens, you can see that an artist's view of a material asset is a rich set of layers which are combined in a precise way to generate the different surface properties of the material. If we're transferring this material between different look development packages, or maybe we're archiving it for reuse in some future Star Wars show, it's that full layer description that we need to capture, so that all of the artistic flexibility and control of the original asset is maintained. So with that goal in mind, Lucasfilm decided to invest in a brand new standard called MaterialX. To capture the layered complexity of real-world assets, this new material description would be node based, with a powerful set of standard nodes and data types that could be extended and customized when needed.
Additionally, it was designed with color spaces as a first-class feature, allowing assets to be consistently interpreted and rendered across tools and media with very different color space assumptions. The very first show to use MaterialX as its canonical material format was The Force Awakens, where it was used to represent our brand new material library for painters and also to archive final looks at the end of the show. That same year, we used MaterialX in a second medium for the real-time experience Trials on Tatooine, where it captured the film look of the Millennium Falcon that we transferred to our customized real-time engine. Then in 2016, as the use of MaterialX was expanding within Lucasfilm, we published it as an open standard in order to get some feedback from other studios and companies in the industry, including Autodesk, Allegorithmic, and our Disney colleagues on the Pixar USD team. And then in 2017, we published MaterialX as an open source project on GitHub, allowing other teams to integrate it into their projects and to make direct contributions to the code base. So I'll hand it back to Eric for a bit. Yeah, and so while all this work was happening at Lucasfilm, at Autodesk we had similar problems, because we have a bunch of different DCCs, and some of our DCCs have a bunch of different renderers. So if you think of Maya at the time, it had Viewport 1, Viewport 2, and it had some offline renderers. And so we were looking at how do we have consistent shading between all these different renderers? And that was also around the time that artists were starting to wrap their heads around how physically based materials work. And obviously, the closer you get to physics, the closer you are to ground truth; you can be predictable, and it kind of makes sense. And so we started playing around with a bunch of different things. And one is a demo that we're showing here, which we were calling AMG.
So this was actually something that David, who's now at Adobe, worked on on my team a couple of years ago with Adam, who may be in the audience. And so the idea here was, could we kind of bounce back and forth between a viewport render, so while you're tumbling and stuff like that you get a rasterized renderer, but then as soon as you let go, a ray tracer kicks in, so you get GI and stuff like that, so you can make some good decisions. And so while we had been focused on the shading, we were also looking at this problem of, people are dealing with Alembic files. How do you bind the material definitions? How do you carry those looks around between things? So there was AbcMaterial at the time, and we heard about MaterialX, and we were like, oh, that sounds pretty interesting. And so we started talking to the dudes at Lucasfilm, and we realized they had solved one part of the problem, we had solved a different part of the problem, and the way these things meshed together was actually really, really great. And that's kind of when we decided, before MaterialX was open sourced, that this was something we wanted to commit to, which then made it fairly clear that we needed to be part of this project. Cool, so I'll pick up the story from there. So yeah, both of our companies, as Eric mentioned, recognized that not only were these two initiatives very closely aligned around abstract material graphs, but that the unique ideas in Autodesk's approach would make MaterialX much more effective and universal. So when Autodesk transferred their development resources over to MaterialX, which happened, I think, in June of 2016, they started developing a set of key extensions that were given the name ShaderX. There were two main features that ShaderX brought to the MaterialX project. So the first new feature was a set of true physically based shading nodes.
Before the ShaderX collaboration, a physically based shader like standard surface was effectively a black box, with MaterialX able to describe the patterns that fed into the shader's inputs, but not the shader itself. Now MaterialX had a rich set of physically based shading nodes to describe the different distribution functions and layering operations that compose a physically based shader, and the MaterialX repository now contains shading graphs for standard surface and for USD preview surface as initial examples of how it can be used. The second new feature is a general-purpose framework for shader code generation, which makes it straightforward to convert a MaterialX document into domain-specific shading code in a language such as OSL or GLSL. Now, this feature marks a pretty fundamental shift for MaterialX, as it means that an application no longer needs to encode either the rules of MaterialX or the details of the node set that it's using. By converting a material directly to shader code, it can render any content that the material contains. Even a document that's using completely custom nodes can be rendered in this way, so long as the definitions of those custom nodes are available at shader generation time. And then, early in the ShaderX collaboration, it became clear to us that combining these two features would enable the construction of a MaterialX viewer, where code generation would be used to convert the content into GLSL for the application viewport. We started on a public prototype of this project in 2019, building upon the in-progress work at Autodesk, and it's now been published back to MaterialX master. One important advantage of having a standard viewer in the repository is that it provides a ground truth reference for renders of MaterialX content, which can then be compared to other implementations.
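To make the code-generation idea concrete, here is a minimal toy sketch in Python, not the actual MaterialX generator: it walks a tiny invented node graph and emits one GLSL-style statement per node, resolving each node's inputs to the variables already emitted for its upstream nodes. The node set and the emitted strings are hypothetical, purely for illustration of how a graph-to-shader-code pass can work.

```python
# Toy graph-to-shader-code generation (NOT the real MaterialX generator).
# Each node becomes one GLSL-style line; inputs resolve to upstream variables.

GRAPH = {
    # name: (op_template, inputs) -- an invented mini node set
    "uv":       ("texcoord", []),
    "base_tex": ("texture('base.png', %s)", ["uv"]),
    "tint":     ("constant(vec3(1.0, 0.9, 0.8))", []),
    "mixed":    ("%s * %s", ["base_tex", "tint"]),
}

def emit_shader(graph, output):
    lines, emitted = [], {}

    def visit(name):
        if name in emitted:                      # emit each node only once
            return emitted[name]
        op, inputs = graph[name]
        args = [visit(i) for i in inputs]        # upstream nodes come first
        expr = op % tuple(args) if args else op
        var = "v_%s" % name
        lines.append("vec3 %s = %s;" % (var, expr))
        emitted[name] = var
        return var

    visit(output)
    return "\n".join(lines)

print(emit_shader(GRAPH, "mixed"))
```

Because nodes are visited depth first and cached, definitions always appear before their uses, which is the essential property of any shader code generator.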
And additionally, we strive to keep this application simple enough that it provides a good reference for how MaterialX shader code generation can be integrated into other applications. So in the two images up on the screen, you can see ILMxLAB production materials for BB8 and R2D2 rendering in the MaterialX viewer. And we've shared these two characters with our colleagues at Adobe and Allegorithmic for the demos that you'll be seeing later in this session. Thank you. So one of the things that was really interesting about this is that when we realized that we wanted to collaborate on this project, it turned out that all the legal framework that we needed at Autodesk didn't really exist. Autodesk hadn't really participated in open source projects before. We'd clearly used open source products in terms of some of the components we have in our applications, but we hadn't actually been actively involved in co-developing one. And if we look at today, we've really come a long way. We're really a strong supporter of open source now. We understand that our customers' pipelines are heterogeneous, so we're trying to really add value where it makes sense, but obviously stay as close to standards as we can where we don't need to have our IP in there. And so we're also providing funding and technical expertise to the Academy Software Foundation, and we sit on some of the strategic boards to sort of help drive the strategic direction of things. And so in that short amount of time, we've actually contributed to quite a few open source projects. So there's OpenColorIO version two, which you may have heard about; I believe there was a BoF, or maybe there is about to be a BoF, I'm sorry, I don't remember exactly when it was, or is. There's obviously MaterialX, which we're talking about here today. We're very active with USD. And there's a few projects that we've since open sourced on our own, so those are things that were completely done internally at Autodesk that we've kind of opened up.
So AnimX is one; AnimX allows you to calculate animation curves the way that Maya does. A lot of people were always trying to reverse engineer that, and we realized that's something we should just open source so everybody knows how it works. There's ShaderX, which we've talked about here. Softimage to Arnold, we did open source that. And then, most importantly, the one we're gonna talk about now is standard surface, which Iliyan's gonna come up and talk to us about. Hello. Right, so I'm gonna talk very quickly about standard surface. What is standard surface? It's an open specification for a surface uber shader. Uber shaders are convenient because they allow you to encapsulate the logic of the potentially complex, intricate ways the internal components of the shader work, and expose a very minimal, well-chosen set of parameters for intuitive control to users. So the shader follows the design of the standard surface shader that has been shipping with Arnold for a number of years now and has been successfully used by many, many studios around the world, so it's production proven. It's currently supported in Autodesk products, Arnold, Maya, 3ds Max, but the goal is to have support for it across the board. We also hear that other renderers have expressed interest in supporting this standard, which is great. Right. So one of the main goals of this shader is to provide a representation that is capable of accurately modeling the vast majority of materials used in visual effects and feature animation productions. To do this efficiently, it packs a carefully chosen set of scattering lobes that are mixed together to produce a wide range of material appearances. Another main goal is to provide simple, logical and intuitive behavior. So rather than providing parameters for every conceivable case, we intentionally try to boil the set of parameters down to what's really most useful in practice.
We also aim to provide guidelines on simplification for modeling and look dev tools, and also for real-time applications. And here's a nice example where on the left we see the Maya viewport that you're using during your workflow, but on the right side you see the rendering in Arnold, and it's a fairly good match, so it's nice to have this close approximation. So, standards are only useful if they're widely adopted, so we have a white paper, released on GitHub. It's open to other software vendors and content providers. We also provide reference implementations in MaterialX and OSL. These are currently works in progress, but the paper itself and the reference implementations are versioned and will be updated as the specification evolves. So it's not just this rigid thing; we aim to work on and improve it over time, ideally with the help of other parties. Of course, as I mentioned, the shader follows the design of the standard surface shader in Arnold 5, which itself has strong spiritual predecessors in Anders Langlands' alSurface shader, which used to be the de facto standard for a long time but is no longer supported, and also in 3ds Max's physical material and Disney's and Substance's PBR models. And basically, we try to approximate the behavior of some idealized surface model that is composed of ten components combined by statistical mixture and layering. In practice, this idealized model is approximated by a mixture of closures. I'm not going to go into detail on this, it's all in the white paper; I'm just going to go quickly over the individual components to give you an idea of what they are. And yeah, the images I'm gonna show are not the most spectacular ones, like Jonathan's, but I can assure you that a lot better images can be produced by this model, right?
So we have the transparency, and here in the red box above you should see where the specific component lies in this model. So we have the transparency; we have the coat layer that sits basically on top of everything. Interestingly, the emission is under the coat, which can be quite useful for simulating some film and some light sources that have a coating on top. We have a metal component that is mixed with everything that's to the right of it. We have thin film, which you can layer on top of all the specular components to give you these spectral coloration effects. We have the usual specular reflection and refraction, and a sheen layer to model various types of cloth. Of course, diffuse reflection and transmission, and last but not least, subsurface scattering. So as I said, it's not a fixed, rigid model; we're working on improving it. One thing is that currently it's not reciprocal. We'll be looking into making it reciprocal, which means it will be better suited for more advanced rendering algorithms like bidirectional path tracing and so on. Also, we want to have a better layering model. Currently we represent all these components as just a mixture of BSDFs, but there's been a lot of research work lately showing that real, proper layered models can produce very intricate, interesting effects when the scattering between the layers is simulated properly, and such proper simulation of scattering between layers is becoming more and more practical, so this is another major thing we plan to look into. These are just two examples, and if you want to see something more, please join the conversation. We welcome pull requests and issues on GitHub, and yeah, hope to see you there. Thank you. Okay, thanks, Iliyan.
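As a rough numeric illustration of the "mixture of closures" idea described above, here is a toy Python sketch. The lobe values, the fixed dielectric reflectance, and the coat blend are made-up placeholders, not the actual standard surface closures or weights; real coat layering attenuates the base rather than linearly blending over it.

```python
# Toy "statistical mixture of lobes" sketch (NOT the standard surface spec).

def lerp(a, b, t):
    # Linear blend of two RGB tuples by weight t.
    return tuple(x * (1.0 - t) + y * t for x, y in zip(a, b))

def toy_surface(base_color, metalness, coat, coat_color=(0.9, 0.9, 1.0)):
    # Hypothetical lobe results, standing in for real BSDF evaluations:
    diffuse  = base_color              # diffuse reflection lobe
    metal    = base_color              # metallic lobe (color lives in specular)
    specular = (0.04, 0.04, 0.04)      # ~4% dielectric reflectance placeholder

    # Metalness statistically mixes the dielectric stack (diffuse + specular)
    # with the pure metal lobe ...
    dielectric = tuple(d + s for d, s in zip(diffuse, specular))
    base = lerp(dielectric, metal, metalness)
    # ... and the coat sits as a layer on top of everything (crudely blended).
    return lerp(base, coat_color, coat)
```

The point is only the structure: each parameter selects between lobes, so a handful of intuitive controls spans a wide range of appearances.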
So yeah, I was talking about joining the conversation, and Davide is one of the people who did join the conversation. We were kind of circulating the spec before we published it on GitHub with a bunch of different people, and Davide is one of the people that gave us a bunch of feedback. I do want to point out, too, that I think we might be one of the first specs that's actually on GitHub that can accept pull requests, so you should definitely consider that. If there's things that you think we should be changing in this spec, why not send us a pull request and we can review it that way? So I'm gonna let Davide go through and give some of his feedback. Hello, everyone. So I work at Substance, as you might notice, and I head the labs department. So I'm going to share why we care about uber shaders in general, why we think you should care too, and then how we got to standard surface. You may know that one of the things we do at Substance is materials, and to sum it up in a nutshell, they are procedural packages, and they come from a compositing graph that generates textures on demand, which get rasterized at some point before rendering. And they get rasterized to very specific channel names or usages, like albedo, roughness, height, and specular level, you name it, and that is because they rely on standard shaders. We count on the fact that the shader will know what to do with these textures. We've taken it for granted over the years that we know what we're exporting to, right? And not just that, but we also know how to approximate it for speed, for preview purposes. And that is important because once you know the model you're shading for, you know how to convert it to other models. We have clients that use all sorts of things, for many good reasons, and sometimes the conversion is lossy, but as long as you know what to expect, what's gonna come out of it, that is a good place to be. Now, uber shaders are not the only way to do this.
They serve us pretty well for the majority of cases. And we like to split the two strategies; we nicknamed them Playmobil and Lego. This nomenclature actually came from a friend at Weta, but I snatched it. Anyways, the Playmobil approach, as you can imagine, means larger uber shaders. They're a little more rigid, but they're good for a majority of assets. They make it easy to exchange, because they have all the data you need to know and they're clearly documented. On the other hand, you can use the Lego approach, which is combining your own lobes, maybe adding some pattern modulation in there, and that makes it more powerful. You can really do whatever you want, and even break the rules of physics if you want. And that will cover the final 10%, and I came up with that number randomly, but a small number of look dev cases, though there are studios that use it exclusively. There are very good reasons for that, and we're gonna go back through it in the next talk. But yeah, MDL, for example, has been thinking long and hard about how to make this work across platforms and renderers. Well, thanks to the recent additions to MaterialX (thanks, Autodesk), MaterialX can now support both the Lego approach and, if you package it nicely, also the Playmobil approach. And that's exactly how standard surface is represented. So, generally speaking, we like the portability of uber shaders. If our textures were not portable, we wouldn't have a very good business model. So it's easy to rely on a standard to simplify the export to other models when you have a known conversion. So we had our baseline model with the metal-roughness workflow, which we used for years. It doesn't cover all the cases that are becoming more and more common, especially as Substance becomes more used in VFX and animation, but also in design and architecture. So we started building extensions: one for clear coat, one for, you know, anisotropy.
They're all standard now, but then there's other ones. There's sheen, there's subsurface, and so on. And they were a little bit ad hoc. So we started thinking, maybe we should draft a white paper and document our own little shaders so we can come up with some standard. But coming up with a standard is daunting and risky, and we all know how standards can proliferate. So, as much as it would be fun, we were very aware it's hard to do it right. Standards are hard. And yeah, uber shaders need strong standards to be successful, for all the reasons that Iliyan already talked about. They really need good documentation, not just for justifying the choices that were made, but also to be able to do partial ports. So at the time when we were thinking about this, you know, there was no standard. So definitely one standard is better than none. But in the meantime, other thoughts came up, and one standard is also better than too many. So we need to have very good reasons to come out with a new standard. And it also takes time to gather support from other vendors and studios. And if only we could find someone to collaborate with right from the start. This was last year at SIGGRAPH. Just as we were starting to look into that, Autodesk came and showed us a draft white paper, looking for feedback. And the timing was perfect. We had plenty of good use cases to test, so we flooded them with our feedback. And while I don't think the word perfect should be in the same sentence as the word standard, no matter what we do, we really like standard surface. It strikes a really good balance, with a more complete feature set that captures a vast majority of use cases, I think, but it doesn't overcomplicate the spec with many parameters and arbitrary rules. It also considers standard rules for simplification and partial ports, which will be key for extending it. It's also well thought through, with documentation that keeps improving. And the documentation itself is on GitHub.
So I think this is a prerequisite for success. It's important for all the reasons I mentioned. So partial ports are important; I keep mentioning that successful standards have always had partial ports. Like, the Disney 2012 model was successful as an inspiration, not as a literal implementation. Not very many studios implemented it exactly as it was conceived, but it inspired pretty much every uber shader ever since. Anyways, we like that it's an ongoing collaboration. It was the first of its kind we'd heard of at the time. And we really liked the fact that it involved top experts in the field, many of whom I've worked with across the years, who were all included in this conversation. So it's really collaborative. So to sum it up, we do plan to grow our existing uber shaders, starting in Substance, to align with it. I'd like to fully embrace it at some point; that is not where we are today quite yet. Also because, no matter what happens, we will still need to export to all the other upcoming standards, because that's our clients, right? So we'll go more in depth in the next talk, but we have implemented standard surface, maybe 80 to 90% of the spec, in both GLSL and MDL for our MaterialX shading graph prototype, which David is gonna show. I bet we're not the only ones in this room who have attempted that. So please follow and participate in this project on GitHub; we're really excited. And without further ado, thank you. Hello, everyone. So I'm going to talk about the MaterialX prototype we built in Substance Designer. So as Davide mentioned, working with and sharing materials is at the core of what we're doing at Substance. And we're typically producing maps: the base color, normal, roughness, those kinds of things. And one of the things is that the world is generally in agreement when it comes to how we talk about images. If I give someone a PNG file, they know how to interpret it. When it comes to shaders, it's a very different story.
So very often when you set up a material, it might have certain things that don't really fit into just another map that you directly apply in there. The uber shader might not represent it correctly or know how to consume it. So the solution to this is something we call shaders, which can mean everything and can mean nothing. And the thing is that portable shaders are something we haven't really had. There are so many standards out there; every renderer seems to do things a little bit their own way. There was MDL, there was OSL, there's GLSL, HLSL, all of those things. And there are different abstraction levels. And in Designer, we do have an MDL editor, which is great; it's actually a really cool and good shader editor. The problem is that MDL is not largely used by the VFX or games industries. And also, we don't have support for it in the OpenGL viewport in our application. So if you build something in the MDL editor, you can't actually see it in the real-time renderer, like in the viewport renderer. So it means that for a large part of our users, they can produce pretty renderings using Iray, but they can't really take whatever they authored out and use it in their production environment. So supporting shaders would allow a bunch of good features for our users. So here's an example of the BB8 droid. In this case, we have one set of textures, and using various procedural effects, we have made three different versions of it. So the left one is very close to what we got from Lucasfilm, and on the right one we have added dirt and added oil to this thing. And it means you can get many looks from one set of maps. Another thing that can be very useful is breaking the resolution limits of unique texturing. So especially things coming out of Substance Painter tend to be very limited by the resolution of the texture set you're working in.
So in this case, on the left, in the dirt, we have tiled a high-frequency normal map, which is a pretty trivial thing to do in a shader, but doing that in a way that you can transport between applications is surprisingly hard. And also, the procedural mask for where the dirt is applied is at a higher resolution than the textures that we received from Lucasfilm here. And perhaps the most important thing is basically about getting parity between applications. You build something in Designer and you wanna make sure that it closely resembles what it's going to look like in your production environment. And sometimes it's not even about transferring the shader; it's about implementing the same thing on two sides and being able to make sure the viewport in Designer roughly matches whatever renderer you're using or whatever tool you're doing the modeling in. So MaterialX: why did we build a prototype around that? So it's an open standard that is rooted in the VFX industry, and it also generates OSL, which means that the people who are not using MDL are, to a very large extent, able to consume OSL. It's an open source library, and it's an open source library that I really like, because it's focused on the data representation. So it's not a runtime for rendering shaders; it's a way you can consume and edit and transform these shaders. And it means it compiles out of the box on every major platform. There are zero controversial dependencies: there's no Boost, no LLVM, no custom string class, no thread pool; it's basically just standard C++. It comes with Python bindings, and the graph-based representation is very nice for a programmer to interact with. Also, it generates code for GLSL and OSL, and we're heavily using the GLSL code generation for our viewports. So we decided to build a prototype in Substance Designer, because the node-based workflows in Designer map very well to the MaterialX data model.
Also, Designer contains a Python interpreter and a plugin interface, so we can load the MaterialX Python bindings natively and do all the MaterialX manipulation from in there. So our solution is actually largely piggybacked on our MDL editor. So basically what we did is we took the MaterialX standard library and converted it to MDL, largely in an automated fashion, and basically it gave us these nice workflows where you can author your textures and your shaders together. So you can generate a mask in Designer and then use that mask in the shader in order to get the effect that you want. And by having this MDL implementation of everything we're doing, it means that it works in Iray, and we get the GL support from MaterialX. So our focus in this prototype has been on procedurals and not the actual BRDF editing, and we have also had a focus on trying to export useful data. What we wanted to avoid is this walled-garden thing where you make something in Designer, and it might work in Painter, but then if you want to take it further than that, it can't be used. It was very important for us that it works in MaterialX View and Maya and Arnold. So now we're going to show some demos. The first one is a little overview of the editor we built. So this is pretty much the MDL editor, but if you look at the list of nodes here (I'm scrolling through a little bit too fast), it's actually the standard library from MaterialX that we have there. And what we have here is a standard surface and a bunch of textures we have imported. So we have an ID map, and we also have some metallic and roughness maps, the usual suspects. We're also doing RGB-to-monochrome splitting in the shader, just because we can, and it all goes into this standard surface node here. So in this case, we're going to use the ID map in order to color this object. So we're creating some new color nodes. Then we're going to extract some masks from the ID map.
So we're basically taking the ID map, and we're using a smoothstep in order to select the part of the range that gives us a mask that we can use. We're also creating another mask here, where we're getting the bolts and the tracks for the teapot or tank or whatever it is. So now we have two masks, and we're going to mix them using the MaterialX mix node here, in order to create a new base color channel for this object. So for the second mask, we're going to use the original base color map, and here we're wiring it into the base color input of the standard surface uber shader, and now we have the new look. So here we're going to expose these color parameters, and that means that these are going to be externally available for anyone who's consuming this MaterialX document. So now we can go in and tweak the exposed parameters and see the results in the viewport. This is the OpenGL viewport here, but we're going to switch to the Iray renderer, and again, since we have MDL implementations for all these nodes, it runs nicely in Iray too. We also added a button for previewing it in the MaterialX View application. So here you can see it, and again, all the properties and textures have been transported into MaterialX View through the MaterialX document. And finally, we also added a button for exporting all of this, so that you get a directory with all the dependent textures and the dependent MaterialX documents and a master one. So for the next one, we're going to show a procedural shader we built for the BB8 droid. So for this one, we have built a procedural graph that allows us to add dirt and oil to the model, using some masks that we generated in Substance Designer. So by tweaking these parameters, you can get different levels of dirt on the model. We can also control things such as the color of the dirt. So also, as I was talking about before, we wanted to break the resolution limits of the unique texturing.
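The mask-extraction and mixing steps from the demo can be sketched numerically like this. The smoothstep is the standard GLSL-style one, but the `id_mask` helper and its band thresholds are hypothetical, just to illustrate pulling a mask out of an ID map and then mixing with it.

```python
# Toy sketch of extracting a mask from an ID map with smoothstep,
# then blending two values with a MaterialX-style mix.

def smoothstep(edge0, edge1, x):
    # GLSL-style smoothstep: 0 below edge0, 1 above edge1,
    # smooth Hermite ramp in between.
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def id_mask(id_value, lo, hi, soft=0.05):
    # Hypothetical helper: select the band [lo, hi] of an ID channel by
    # ramping up at lo and back down at hi, with soft edges.
    return smoothstep(lo - soft, lo + soft, id_value) * \
           (1.0 - smoothstep(hi - soft, hi + soft, id_value))

def mix(a, b, weight):
    # MaterialX-style mix: weight 0 gives a, weight 1 gives b.
    return a * (1.0 - weight) + b * weight
```

An ID value inside the band yields a mask of 1, values outside yield 0, and the soft edges keep the selection antialiased when the ID map is filtered.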
So we have tiled a high-frequency normal map in the dirt here, and we also exposed a control for how much of these normals you actually want to see in the dirt. Another thing that is useful about MaterialX is that you can use it to convert between different uber shader standards. The Lucasfilm content we got used a different representation of the specular color than standard surface consumes, so we built a subgraph that does the conversion for us, and that allows us to use their maps in a shader that doesn't natively support that representation. And here we're going to show the same model, but now with the OpenGL viewport next to the Iray one, to see how close we are when it comes to matching the result between the two renderers. The left is Iray and the right is our OpenGL viewport, and you can tweak parameters and see the results in both viewports. We can also show this model in MaterialXView, which recently gained support for showing UDIM assets. So here we can actually apply all the procedural effects on the entire model, with all the UDIMs and all the geometry components. All the properties have been transferred from Substance Designer into the MaterialX document and can be tweaked from MaterialXView as well. The textures have also been transported, so you can see the normal map we're using in the dirt, and we can control its intensity from this user interface too. Another quick video I want to show you is when we applied the same material to a different droid. Here we're using a different set of maps and a different geometry, but the rest of the network is identical between the two models. You can see the procedural effects still working on the other model, and you can control the dirt level on the second droid as well. So the final demo I'm going to show is about droid washing in Substance Painter.
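The talk doesn't say which specular representation the Lucasfilm maps used, but a classic example of this kind of adapter subgraph is converting an F0 value (reflectance at normal incidence, which many shaders store as a map) into the index of refraction that an IOR-based specular model consumes. A minimal sketch of that conversion:

```python
import math

def f0_to_ior(f0):
    # Invert F0 = ((ior - 1) / (ior + 1))^2 to recover an index of refraction.
    # Clamp away from 1.0 so the division stays well defined for f0 near 1.
    r = min(math.sqrt(max(f0, 0.0)), 0.9999)
    return (1.0 + r) / (1.0 - r)

# The common dielectric default F0 of 0.04 maps back to an IOR of about 1.5.
ior = f0_to_ior(0.04)
```

In MaterialX terms, the same arithmetic would be expressed once as a nodegraph of math nodes and reused wherever those maps are consumed.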
So we decided to also generate the GLSL format for Substance Painter. Again, all the parameters have been transferred into Substance Painter and can be tweaked at the shader level. And since this is a Painter demo, we have to show the particle brushes. We added a special paintable mask to represent wetness on this one: the more intensity you have in this mask, the more it first tints the dirt darker, and then afterwards removes it completely and replaces it with a layer of water on the model. As we're spraying particles on this model, you can see the water kind of running down and giving it a wet look. I also want to point out that we did not go as far in trying to match the shader in Painter, because we ran out of time, but there's nothing preventing us from getting an equally accurate representation here. And of course you can use your ordinary brushes here too, to paint on the model, and it's really cool to see how the paintable masks can interact with the shader in various ways. I think it's a very powerful thing that I'd be very excited to give to our users. So with that, I'm going to hand it over to Nikola, and he's going to show MaterialX in Maya and Arnold, and also some of the stuff we created in Substance Designer, which is really what this is all about. Hey, Ron. So thank you, awesome demo; it's gonna be hard to match that right now. Anyway, my name is Nikola Milosevic. I'm a principal product owner and product designer for Lighting and Rendering in Maya. Next we're gonna see a video that shows exactly the current stage of development in Maya regarding the implementation of MaterialX into the Maya workflows. What do I press? Green button, thank you. So anyway, we're gonna have two familiar characters here on screen.
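The two-stage wetness behavior described above (darken the dirt first, then wash it away) can be sketched as a simple blend driven by the painted mask. The 0.5 split point and the darkening amount are made-up values for illustration, not the ones used in the Painter demo:

```python
def apply_wetness(dirt, clean, wetness):
    # First half of the wetness range darkens the dirt; the second half
    # fades the dirt out toward the clean, water-covered surface.
    # The 0.5 threshold and 0.6 darkening factor are hypothetical.
    darken = min(wetness * 2.0, 1.0)
    darkened = tuple(c * (1.0 - 0.6 * darken) for c in dirt)
    removal = max(0.0, wetness * 2.0 - 1.0)
    return tuple(d + (c - d) * removal for d, c in zip(darkened, clean))
```

A real implementation would also swap in a water-layer BRDF response as the dirt is removed, but the mask-driven staging is the core idea.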
Both of them have an aiStandardSurface assigned, and as you can see, if you right-click, you can assign a MaterialX file, this MaterialX file existing in the scene. However, this MaterialX file is actually a file on disk, so we're loading the file from disk. The way we did it is nothing more than creating the SurfaceX material and then loading the material from disk, for R2-D2 again. So again, this has nothing to do with Hypershade; you're just loading the material from disk. And for assigning materials, you can do it all kinds of ways: right-click and assign, drag and drop, or you can actually create the material assignment within the MaterialX file itself. That would be one of the popular ways of doing it: you load that MaterialX file inside Maya, and the assignments are gonna work as long as the names match between geometry and materials. On the left side, we can see the Arnold render, so we're gonna render that in Arnold, and on the right side you see the viewport render, and they look pretty much the same, minus the lighting in the viewport, because we don't support shadows with IBL. Okay, so I guess you wanna edit this; you wanna edit the MaterialX file in Maya. So we're gonna select one of the geometries, see exactly what kind of shader is there, and then edit it. Again, this is a prototype, and some workflows are gonna be much better, but anyway, we're gonna click the edit button and LookDevX is gonna be called up. I'm gonna just quickly increase the diffuse by 50% to see exactly what's happening with the same material. I'm gonna overwrite the MaterialX file on disk, and when I click reload, we're gonna reload that material, and you're gonna see how things change in the viewport and in Arnold. Okay, that was a simple workflow, just to show you exactly what we can do.
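The name-matching assignment described above can be sketched as a simple lookup: a look declares which geometries each material binds to, and bindings only take effect for geometry names that actually exist in the scene. The data shape here is hypothetical, not the actual MaterialX look schema:

```python
def resolve_assignments(scene_geometry, look_assignments):
    # look_assignments: material name -> list of geometry names it binds to.
    # Only names present in the scene produce a binding.
    bound = {}
    for material, geoms in look_assignments.items():
        for geom in geoms:
            if geom in scene_geometry:
                bound[geom] = material
    return bound

bindings = resolve_assignments(
    {"R2D2_body", "BB8_head"},
    {"chrome": ["R2D2_body"], "paint": ["C3PO_arm"]},
)
# Only R2D2_body gets bound; C3PO_arm is not in the scene.
```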
But we can raise the stakes here and edit a different type of material. David sent us a material that was actually authored in Substance, so we have that complex surface shader there, and you have this graph here, and all these properties that were promoted from Substance are right now available here. So we have that translation, that transport model, working really, really nicely from Substance to Maya. This is the MaterialX file we're editing in LookDevX in Maya, and as you can see in the Material Viewer, there's lots of detail. I can actually check this oily effect and just change it. Pretty much whatever we can do in Substance, we can do here: we can alter the look and create some beautiful stuff. Anyway, the Color Picker is gonna look better, I promise you; this is just a quick pass. So we're gonna finish altering the material and save it again, overwriting the same material that exists on disk, and let's see exactly what we get. Again, the viewport reload is gonna pick it up immediately, and for Arnold we just reload and render again, and you get quite a good look here. So just because of MaterialX, we actually have something amazing happening here: one shader, one graph, showing us two pretty much identical looks. On the left side, again, you have Arnold; on the right side, you have the Maya viewport. I was thinking, when I saw David editing in Substance, about whether I could match that in Maya using the Material Viewer, and I was able to play with this and get to some really, really nice edits here. That's quite important, because the Material Viewer looks almost exactly like our software render, in this case the offline renderer, Arnold. Save it again, reload, and we get the new look here in Arnold. There we go.
So if you zoom in, you're gonna see exactly those details that David was talking about. Think about it: right now we're in Arnold, we're in Maya, and we can actually edit these beautiful procedural textures created by the Substance guys. Cool. Okay, I didn't talk much about LookDevX yet, so I think it's a good time to talk about that and see how far we went there. LookDevX, again, is in progress. When you load LookDevX, you're gonna see a few features. One of the features we added is that you can load any kind of geometry into the Material Viewer. Again, it's in progress; you saw how we loaded BB-8, and you can load anything. Eventually you're gonna have color spaces there that you can change, and so on. Regarding the library, the whole library from MaterialX is actually there: all kinds of math nodes, compositing nodes. If you wanna create your own uber shader, do it; there are lots of BRDFs here. But if you don't wanna do that, obviously you can use our standard surface. These are just a few of the BRDFs I'm showing you, but standard surface is there, so anyway, use it. Regarding the graph, you can see that right now we're pretty much copying the whole logic that we applied in Bifrost; this graph is coming from Bifrost. Eventually it's gonna be slightly different, as we try to conform this graph to look dev workflows. You can zoom in and out, and you can see exactly how this graph shows you your nodes. We're adding some extra small things, like selecting upstream and downstream nodes, and when you do that, you can act on the selections. You can add a backdrop, nothing new, but it's really helpful. Obviously, compounds exist, so you'll be able to alter those compounds and publish them.
You're gonna be able to publish the attributes you want from a specific graph, and so on. This checker graph here is actually built from scratch, which is an interesting thing to note; obviously we're gonna give you a checkerboard node, but anyway, pretty much, that's LookDevX so far. Thank you. Thanks, Nicola. In fact, there were a ton of people who worked on this project over the years, so I want to thank some of the others who were involved. I was really tempted to do a Star Wars scroll on this one. Thanks. So with that, maybe I'll invite all the speakers up, and we can open it up to questions if anybody has any. I'm wondering what support you'll have for light baking tools. I'm gonna figure out which one of us should answer. Do you wanna answer from the Substance perspective first, and then I'll take the Maya one? For light baking tools, we already have tools in Substance that let you bake illumination into maps, and that's one way, but we also have additional IBL nodes in Designer. I'm not sure that's what you're asking for, though. Yeah, because the light baking in Substance Painter is good, but it doesn't have as many sources as I would want. There will be more things coming for lights. Thank you. And just to add to that, from Arnold obviously you can do light baking as well, so we will support that through MaterialX transport eventually. Any other questions? Great, well, thank you very much for attending.