Okay, I'm trying to be fast. My name is Jan Walter. I work for The Mill in London, and I want to talk about a multi-exporter, which is an add-on for Blender that I developed and that I'm using to do render comparisons.
First, I want to introduce the web page, because basically you don't have to write anything down: the slides can be downloaded from the website. There's a link to a render forum, which I will talk about, and you'll find a bunch of blog posts. Of course there will be more over time, and they are all rendering related. In the link section you will see links to topics I already talked about in the blog posts, so they will be reorganized over time. This is an example of such a blog post. It basically has an image and a bit of text, so it's not overwhelming. It tries to boil down things I am currently working on and talk about them only briefly. Most of the time this is reflected in repository changes; there are two repositories, which we will see later, for scenes and one for the source code. The system I'm using allows me to share code snippets. In this case, this is a shader from Radiance which I ported to Arnold, so I get the same procedural patterns as in a Radiance rendering. And of course, because it's my web page, sometimes there are off-topic posts, but most of it will be rendering related. The forum is really for you to read from time to time; that's what most people do, and it's been online for a couple of years already. But I really would like you to participate. In case you want to post something, you have to be logged in. During registration you will be asked who owns the web page; that's just my name, and there you go. You can of course reply to existing topics, and you can create a new one. One thing you cannot do is create a new category. There are two main sections: a general section where you can, for example, talk about scenes or about technical papers, and a second section which is just a list of renderers. If you want to add a renderer, you can't do it yourself, but you can ask me.
There's an "other renderers" topic where you can say: okay, I know another renderer, why don't you add it to the list of renderers? But most of the time I just talk about the things I can really get my hands on. So the forum is there for you to share your own experience.

There are two repositories for scene descriptions. The Radiance scenes repository is the older one. It started with an HTML page which got out of hand; back then I was writing about many more renderers, and I used just about any method to get a scene and create images with arbitrary renderers. The multi-exporter scene description repository is the one that really uses my own add-on. The source code is on Bitbucket.

Just two words about pictures and publications. For pictures, I don't allow people to attach them on my own website; they have to host them themselves and link to them in the forum. There are free hosting services if you want to show pictures. The HTML page under "pictures" is actually how it all started: I showed different renderings of scenes with various renderers on a single HTML page, and it just got out of hand. It takes forever to load now. So another approach, under "publications", was that I started writing a PDF file which, on a scene-by-scene basis, explains how the exporter works and explains its features based on the current scenes.

The main idea of the exporter is: you have a scene description, you read that scene description with a parser, and you don't want to lose any of its settings, so you keep them in the host. Then you want to render with various other renderers. Each renderer, I just call them A to F here, has its own scene description, and from the host you export into that particular scene description. Then you can cut the link to the host and render standalone. The decision I made so far is that I started with Radiance scenes.
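The host-scene idea above, one parsed scene exported into one standalone scene description per renderer, could be sketched roughly like this in Python. This is a minimal sketch, not the add-on's real code: the class names, the `export_all` helper, and the trivial file contents are illustrative assumptions; only the file extensions follow the talk.

```python
# Minimal sketch of the multi-exporter idea: one host scene, one
# exporter class per target renderer, each writing its own standalone
# scene description.

class CommonExporterInterface:
    """Base class every renderer-specific exporter derives from."""
    extension = None  # scene-description file extension

    def export(self, scene, path):
        raise NotImplementedError

class ArnoldExporter(CommonExporterInterface):
    extension = ".ass"

    def export(self, scene, path):
        with open(path, "w") as f:
            f.write("# Arnold scene description\n")  # real code writes nodes

class RenderManExporter(CommonExporterInterface):
    extension = ".rib"

    def export(self, scene, path):
        with open(path, "w") as f:
            f.write("# RenderMan RIB\n")  # real code writes RIB statements

EXPORTERS = {"arnold": ArnoldExporter, "renderman": RenderManExporter}

def export_all(scene, basename):
    """Write one scene description per supported renderer; return the paths."""
    paths = []
    for cls in EXPORTERS.values():
        exporter = cls()
        path = basename + exporter.extension
        exporter.export(scene, path)
        paths.append(path)
    return paths
```

Once the files are written, each renderer can pick up its own description with no link back to the host, which is exactly the "render standalone" step.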
Radiance is a very old renderer, but it's very accurate regarding global illumination. I wrote an importer, and once the scene was in Blender I could use my multi-exporter to export, for example, for Arnold, which uses a .ass scene description. Indigo uses .igs, LuxRender .lxs, mental ray .mi, Maxwell .mxs, and RenderMan .rib. RenderMan is basically an open standard. The most famous implementation is obviously Pixar's PRMan, but 3Delight and Air are just two other examples of commercial renderers; 3Delight gives you a free license, and there are many other implementations of the RenderMan standard.

In this picture I show images which were rendered with Radiance. The top three actually come from a book about Radiance; the lower three are not related to the book, but their scene descriptions are still available somewhere on the internet. You can download them, and I want to compare them with pictures I rendered with other renderers.

Let's talk very briefly about the top three. The left one is basically just a room with a light-emitting sphere, and you see that crystal sphere on a blue box; outside there's just a disc as a ground object and another building, which is reflective, in a sun-and-sky simulation. The middle one does something you do very often in Radiance: you have a common geometry and then you do variations of either the material or the lighting, and you ask how the mood of a scene changes just by using different lighting. In this case there are two different setups; obviously the picture shows only one. The most complex scene in the book is this gallery scene, which you will see in another slide. It's basically one room without windows. It has an opening on top, the light comes through that opening and hits a triangular structure, which doesn't allow the light to go directly to the floor. It has to bounce against the ceiling and come back.
So that's basically the worst case for a global illumination renderer, and in some other slides we will see how that affects a scene. The other scenes are more complex. The most famous one is probably the conference scene on the lower left. That conference room actually existed; even 20 years ago they took a photograph of it, compared it with a Radiance-rendered image, and it matched very well. The middle one is a case study of a theater which was actually never built. If you compare it to the rendering here, which I think was done with Indigo, a couple of things are missing. I just take the base materials, because the patterns are procedurally generated and I cannot write them for all the different renderers; most people would use textures anyway. On the lower left is another perspective of that conference room, rendered with Arnold, and for that case I put some effort into recreating all the Radiance patterns for Arnold; they actually matched very well. With the one on the right, I just want to draw your attention to the staircase, because in Radiance you can say: that staircase is grey. You can tell the renderer: I want to have the global illumination right, but don't worry about that particular object. If you compare it with a LuxRender rendering, that looks much more realistic, but being able to do this is something particular to Radiance. The top right image is a kind of flower, a lotus-shaped shade of glass, and here you see a LuxRender rendering: all the other lights are turned off, you just get light from above, and the shade acts like a lens, bundling the light so you get these nice caustics.
The middle top image you haven't seen yet; that's part of a ship, and it was rendered with Maxwell. You can see that it's using depth of field: some of the renderers force you to use a physically correct camera, and in this case it's pretty obvious. The scene on the left, the bathroom, is taken from BlendSwap. It was already set up for Cycles, and I modified it slightly so it can be used with my exporter for all the other renderers. All these scenes are in a repository, with probably a couple of camera perspectives each, and you can render them with at least six different renderers with my exporter.

I want to talk briefly about commercial packages, because that's the part I cannot share with you today, but I want you to know that it exists. On the right there's Blender again with Cycles: a very basic head with a diffuse texture, an HDR image for the lighting, and something to catch the shadows. Once I exported it from Blender, I can render with standalone renderers, in this case Arnold. For commercial packages, in this case Maya, I wrote a translator, and it does not just export, it also imports scenes, so I can take the scene I exported from Blender and import it into Maya, and it picks up everything correctly.

So why would I do this? Here's another example of that conference scene. The focus with Blender was really: I don't want anything renderer-specific in the Blender user interface. I just want to have a scene description, bring that scene into Blender, and export to as many renderers as I can. I support only a couple of things, like glass or a dielectric material, something diffuse or glossy (plastic-like), and maybe some metal, but I don't want to see anything specific to one particular renderer in the user interface. Once I bring the scene into the commercial package, in this case with Arnold, I really want to play around with additional options.
There's an official MtoA (Maya to Arnold) plug-in, but with my Maya translator I basically have access to all the third-party shaders ever written for Arnold, so I can play around with the same scene and try various shaders other people have written.

So let's talk about splitting up the image into several images on purpose. Light groups are available, I think, in Indigo, in LuxRender, in Maxwell, and maybe other renderers. What I show on the right side are screenshots of Blender where I try to highlight the light-emitting geometries and sometimes, like here on the lower left, what they should illuminate in the scene. What you see on the left side is, from one particular camera angle, the effect that each light group has. I just want to mention that you cannot simply add up the images on the left side, because each of them is nicely lit. What happens in LuxRender, for example, is that you turn off the daylight and then switch through the light groups of the electric lights in the scene, and the auto-exposure adjusts the camera settings so the image is always well lit. What you would have to do instead is switch, for example, from auto-linear to linear tone mapping; then the camera settings are frozen, with all the lights on except the sunlight, and you have one good exposure for all the lights. Then you switch through the groups and see the real contribution. If you added all the lights up again that way, you would get the beauty pass.

Another way you're probably familiar with is AOVs, which stands for arbitrary output variables. Basically you can render anything, like normals, positions in space, or UV coordinates, into separate images. With Arnold there's a workflow where you start on the top left with a noisy image and you want to end up with a noise-free image on the bottom right. I will come back to that slide; I just want to show you the noisy image on the top left and the noise-free image on the lower left.
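The reason the auto-exposed light-group passes don't add up can be shown with a few lines of arithmetic. This is a sketch, not LuxRender code: each pass has been brightened by its own exposure gain, so the gain has to be divided back out before the contributions are summed into the beauty. The gain values are made-up illustrative numbers.

```python
# Sketch: why auto-exposed light-group passes cannot simply be summed.
# Dividing each pass by its own exposure gain recovers the linear
# contribution; only those linear contributions add up to the beauty.

def undo_exposure(pixels, gain):
    """Recover the linear (unexposed) contribution of one light group."""
    return [p / gain for p in pixels]

def combine_light_groups(groups):
    """groups: list of (pixels, gain) pairs. Returns the linear beauty."""
    beauty = [0.0] * len(groups[0][0])
    for pixels, gain in groups:
        for i, value in enumerate(undo_exposure(pixels, gain)):
            beauty[i] += value
    return beauty
```

Freezing the camera settings, as described above, is the practical way of making all gains equal, after which a plain sum of the passes is already the beauty.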
What you do is split the beauty into different components, like diffuse, specular, reflection, refraction and so on. The two images on the top right are your diffuse, but with global illumination you don't have a single diffuse: in the middle you have the direct diffuse and on the right the indirect diffuse. You can do the same with the specular, but that's kind of boring for this scene, so I don't show it. The middle one on the bottom is the reflection, and on the right there's a very bright spot on the floor.

So let's talk a little about the diffuse, the two on the top right. That very bright spot on the floor is what I mentioned before: the light comes in from above and is supposed to bounce off the triangular structure, but some rays bypass that structure and hit the floor directly. Only that spot comes directly from the sun. So where is the other light coming from? A sun-and-sky simulation is basically a hemisphere: you position the sun in the hemisphere, and the remaining hemisphere still gives you light from the sky. The sky color reacts to the position of the sun, for example at dawn or sunset. Beside the camera, to the right, there are two windows, and behind the camera there are two glass doors; that's where the direct contribution from the sky in that direct diffuse AOV comes from. The image on the right is basically everything after the first hit. Think of a ray that has already gone through the glass: that ray doesn't count; only the first diffuse material you hit ends up in the direct AOV. All the rest, the bounced light, is in the picture on the top right.

What you do in Arnold is look at the first decision and the last decision. Each says: if this particular AOV is noisy, go to the global options and increase some samples, and those samples are found on the left in the blue box.
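The split described above has a useful property: with a path tracer, the beauty pass is (up to clamping and filtering differences) the per-pixel sum of the lighting AOVs. The sketch below checks that identity; the AOV names and pixel values are illustrative, not taken from any particular render.

```python
# Sketch: the beauty pass should equal the per-pixel sum of the
# lighting AOVs (direct/indirect diffuse, specular, and so on).

def sum_aovs(aovs):
    """aovs: dict of name -> flat list of pixel values (same length)."""
    any_pass = next(iter(aovs.values()))
    total = [0.0] * len(any_pass)
    for pixels in aovs.values():
        for i, value in enumerate(pixels):
            total[i] += value
    return total

def matches_beauty(beauty, aovs, tol=1e-6):
    """Check that the AOVs actually add back up to the beauty pass."""
    return all(abs(b - s) <= tol for b, s in zip(beauty, sum_aovs(aovs)))
```

This is also why denoising per AOV works: you can clean up each component separately and sum them back into a beauty image.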
There are several parameters: AA stands for anti-aliasing, GI diffuse is the first decision, GI glossy is the last decision, and single scatter is for subsurface scattering. I want to talk briefly about the middle decisions, because one of them says: look at the direct specular AOV, and if it's noisy, increase the light samples. The other says: for the direct diffuse, look at the shadows, and if they're noisy, increase the light samples.

So, if you remember the first scene, the one with the light-emitting sphere: I could use a point light with a radius to get the lighting effect, but with a point light I could not see that light-emitting sphere itself, either directly or indirectly. What you would normally do is use a sphere primitive with a standard material with emission and an emission color, but then you don't have anything to tweak; that standard material doesn't have any samples to adjust. So what you do instead is use a mesh light. You cannot use a sphere primitive anymore, you have to tessellate it, but then you can use an arbitrary mesh, and you have samples to tweak again.
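The decision tree on the slide can be summarized as a simple lookup from "which AOV is noisy" to "which sample control to raise". This is a sketch for illustration: the option names loosely follow classic Arnold global options, but both the exact names and the mapping are simplifications, not the renderer's actual API.

```python
# Sketch of the noise decision tree: map a noisy AOV to the sample
# setting worth increasing. Names approximate Arnold-style globals.

NOISE_DECISIONS = {
    "everything":        "AA_samples",              # noise in all AOVs: raise anti-aliasing
    "indirect_diffuse":  "GI_diffuse_samples",      # the first decision
    "indirect_specular": "GI_glossy_samples",       # the last decision
    "sss":               "single_scatter_samples",  # subsurface scattering
    "direct_specular":   "light_samples",           # middle decisions: raise
    "shadows":           "light_samples",           # per-light samples instead
}

def which_samples_to_raise(noisy_aov):
    """Return the sample control to increase for a given noisy AOV."""
    return NOISE_DECISIONS.get(noisy_aov, "AA_samples")
```

The point of the mesh-light discussion is visible here too: a point light has no sample control to raise, while mesh lights expose light samples again.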
I'm mostly done with the camera settings; I just want to say that every renderer has a very different user interface. Here you see LuxRender on the top left, Maxwell on the top right, Indigo on the lower left, and Blender's depth pass on the lower right. You can use basically the same settings, film ISO, shutter and f-stop, for all these renderers; you just have to know where they go in each user interface. Maxwell, Indigo and other renderers might force you to use proper depth of field, and then you need a focal distance. You can either take the Blender depth pass, hover over it with the mouse, read off the value and put it into the other renderer, or, like in Indigo, pick a point in the image while it's rendering and it automatically puts the distance into the camera UI; then you can use that value for Maxwell or others. With Maxwell I had a problem matching the other renderers, so what I do is use the camera settings I found for the other renderers, which gives me an underexposed image, which you can see here on the left, pretty dark. Then I apply the inverse gamma to brighten the image up, and it more or less matches what I get with the other renderers.

So let's talk briefly about the source code. The multi-exporter class on the bottom does basically all the work, and it has a list of supported renderers. That list grows and shrinks depending on the rendering mode. The most basic rendering mode has no lights at all; you just use AOVs, for example to show the UV coordinates and things like that. Then there's direct lighting, and then there's indirect lighting, which is global illumination. Every supported renderer implements a class which derives from the common exporter interface. The common exporter interface, which you see at the top, is a base class with a bunch of functions, so if you want to add another renderer, you derive from the common exporter interface and implement those functions. So what are the
future plans? Obviously I want to support more than those six renderers. I might do the same thing for a commercial app, or I might even do this for Blender again but in C or C++. I could use help with more public test scenes, but then please make sure that all the textures and everything else used are under a Creative Commons license; I'll happily host them in one of those two scene repositories. So far I started with importing Radiance scenes, so as soon as I go to BlendSwap or grab a scene from any other source, I need a Radiance exporter if I want to compare against that renderer. The materials are very basic, as I said: diffuse, glossy, some dielectric glass. Maybe some skin or subsurface scattering would be nice. Animation I just didn't have time for yet; it already takes too long to render interior scenes with all these renderers, but it should be easy to support. OSL is already used by Cycles and by the Arnold implementation at Sony Pictures Imageworks; maybe that's the future of shading languages, something you could discuss on the forum. Or maybe it's MDL, the material definition language which NVIDIA is trying to make a standard. I'm looking forward to cooperating with you; the source code is out there. And here again are all the links. The company I work for has a Facebook channel, and I have two sites: the .com one is in the US, the .org one is hosted in Berlin, and I'm moving more and more stuff out of the US back to Berlin. It started with that HTML page which got out of hand, because it's one page with a lot of pictures. The slides can be found on the .org site already, the render forum as well, and the source code is on Bitbucket. That's it, thank you very much.

Before we go to the next talk, Jan, can you say one thing about the conclusions? That's what I missed a little bit. Is Cycles really that slow, or is it not that bad? Do you know that kind of statistics?

I don't think it's that slow.

Can you quantify that?

I mean, there are a couple of odd things, like for example with the sun and sky
simulation. What I do for all of these renderers is just say: okay, that's the direction of the sun, and all the rest is done automatically by the sun-and-sky model. If I do the same thing for Cycles, the light somehow doesn't get through into the room. I mean, I tried it even with Lagoa and others. What you probably have to do in Cycles is put an area light in front of the windows, and then you get the same effect. For the future, you know, if that would be the only thing... Yeah.

So the next speaker is Vladimir Alistakov.