Hello everybody, my name is Paul Melis. I work at SURFsara, which is the Dutch national supercomputing center here in Amsterdam, and I head the little visualization group we have there. We do scientific visualization and VR for users of our big compute systems. I'm a computer scientist, so this will be a slightly different type of talk than the last two ones; no visual effects, unfortunately for you.

I'm a long-time Blender user, and yesterday I actually tried to figure out when I used Blender for the first time. I went through the files on my computer and some backups and tried to find the oldest Blender file that I had. It turned out to be this silly cow that I made in December 2000. That didn't make any sense to me; I figured, what the hell, why would I model a cow? And then I remembered that I once did a Christmas scene in the year 2000. Kind of silly how that went; you completely forget that kind of stuff.

At work we use Blender a little bit. I would like to use it more, but it's not always a good fit for scientific data. This is one of the things that we did: a scientific rendering of some data, a little piece of artery near the heart. These days we also use it to prepare models for 3D printing. But actually I don't want to talk about that, and I don't want to talk about cows or Christmas either. I want to talk about path tracing, and Cycles in particular.

As many of you will know, path tracing is everywhere these days. It has become the default rendering method for visual effects, for movies, all kinds of stuff. Every major studio has some kind of in-house renderer or buys a commercial path tracer. You can get really great images from it, and path tracing has become so common that about a year ago Disney released a little movie on YouTube, "Disney's Practical Guide to Path Tracing". It's about 10 minutes long and it explains to the lay person how path tracing works: what it is,
what happens when, for example, light bounces around, and how the camera rays work. But if you have worked with Blender or Cycles, this movie will not tell you anything, because there simply isn't any information in it at a technical level. I figured, well, maybe we can get some more information from the manual. There's quite a lot of information there, but still, it's text, and I'd like to see stuff happening when I render something in Blender.

So then I figured, well, maybe we can open up the black box a little bit, the black box being Cycles: show what is going on when you render something, and focus on the paths and the rays that shoot through the scene. It's a fun little project to hack the code to get that done. We might get a little better understanding of how rendering works, show the different kinds of rays, how the material influences the rays, et cetera. We could also check things: for example, if you render something and it has a firefly somewhere, or a caustic that doesn't really work, you can try to find out what's happening there. And it might even be useful for, say, Cycles development or debugging.

So the fun thing is: we need to hack the Cycles code. We basically want to add code to save the paths and rays to disk and then look at them later. The computer scientist in me said that at this point of the talk I should tell a little bit more about that, so I hope this part is not too technical.

I want to store the paths and the rays that are being generated during the rendering. That's quite a lot of data: even for a 1080p scene with only 16 samples per pixel, which is very low, you get around 33 million paths shooting through that scene. And we don't just want to store that data; I want to be able to explore it, to do queries on it, to search for certain types of paths, et cetera. The data also comes from multiple render threads.
So we need to handle that. Basically, the solution I chose was to use something called SQLite. If you're into computer science you might have heard of it: it's a software library, basically an SQL database like a MySQL database, but in a single file. You can link it into your program and use it from there; it's really cool. The most important part here is that you can query it with the SQL language. It's fast enough for what I want to do here, and it comes included with Python, so you can script it from Blender's bundled Python modules.

It's a pretty simple database setup: just two tables, one for paths and one for rays. For the paths we know, for example, which pixel each one was shot through, and which sample it is; so with 16 samples per pixel you have samples 1 to 16 there. We also know what amount of light each path contributed to the image, so we can find the paths with a very high light contribution, for example. For the rays we have slightly more data: we know where each ray started in the scene (its 3D coordinate), we know what direction it was shot in, and we know whether it hit something, and if so, at what point, so you can check that against the geometry, for example. Well, there's a little bit of the code I wrote in this figure, and that's enough of that, actually.

The current hack is pretty limited; it's a first start, and lots of specialized things are not handled right now. Ambient occlusion, subsurface scattering and stuff like that is all not being saved to disk.
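As a concrete illustration of that two-table setup, here is a minimal sketch of how such a schema could be created with Python's sqlite3 module. Every table and column name below is my own guess for illustration, not the actual code from the hack.

```python
import sqlite3

# Hypothetical schema: one row per path (pixel, sample index, light
# contribution) and one row per ray (origin, direction, hit point).
conn = sqlite3.connect(":memory:")  # the talk writes to a file on disk instead
conn.executescript("""
CREATE TABLE paths (
    id        INTEGER PRIMARY KEY,
    pixel_x   INTEGER,   -- pixel the path was shot through
    pixel_y   INTEGER,
    sample    INTEGER,   -- sample index within the pixel (1..16)
    contrib_r REAL,      -- light contribution to the image
    contrib_g REAL,
    contrib_b REAL
);
CREATE TABLE rays (
    id      INTEGER PRIMARY KEY,
    path_id INTEGER REFERENCES paths(id),
    org_x REAL, org_y REAL, org_z REAL,  -- 3D origin of the ray
    dir_x REAL, dir_y REAL, dir_z REAL,  -- direction it was shot in
    hit   INTEGER,                       -- did it hit geometry?
    hit_x REAL, hit_y REAL, hit_z REAL   -- hit point, if any
);
""")
```

With a layout like this, finding the high-contribution paths mentioned above is a single ORDER BY over the contribution columns.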
There's also no support for the branched path tracing integrator that's in Cycles. And it's CPU rendering only: it's quite hard to get output from the GPU, so CPU only. Perhaps the biggest limitation is me, because I'm not a Cycles developer, so I'm only doing this based on what I know from one or two path tracers that I've written myself, and I might be doing some stuff wrong at some point.

Just to show an example, here's a very simple scene: five spheres with different materials, a diffuse one, a glass one, and three glossy ones with varying roughness. No light sources, just a white background, and a very low render quality to keep the database size down. So then we use this hacked version of Blender and load the scene; the script does a little bit more than what you would normally do. We let that render, which takes a bit more time because all the data is saved, and out comes a single file of about 145 MB containing all the information about the rays and paths. If you look at the raw data it's not that interesting in this view, but this is more or less what it looks like.

We can do some queries on that, just to make sure that what's in the database makes sense. You can check the resolution, for example; you can see the SQL query to do that, and you see it matches what the resolution was set to. The number of paths matches the resolution times the number of samples, et cetera. We've seen that the maximum path length in the scene was 10, so we know how many bounces there were. And we see there's about 1.3 rays per path, so there was a bit of branching, but not even that much.

Another quick check we can do on this data is to see if we can reconstruct the image that Blender normally produces. Here we basically query over all the pixels; for each pixel we find all the paths for that pixel, get their color contributions, sum them up, and store that as the pixel color. Out comes an image, but look at that image. Well, this is what Blender, or Cycles, renders.
This is what I produce. You can see a little bit of the same thing: at least the shapes are more or less in the right spot. The colors are not very good, but that's actually to be expected, because a lot of what is used to produce the real image is missing: all the parts that Blender does afterwards, for example converting the light values to RGB, gamma correction, the pixel filter, et cetera. That's all left out, so the images don't match, but at least it's in the right direction.

With this little database you could, for example, look at path lengths for different scenes. Over here we've got the classroom scene from the Cycles benchmark set, a very nice scene: lots of reflections, lots of lights, lots of shiny things. The middle image shows the maximum path length per pixel. These blue spots are paths of length one: they basically hit the background and then nothing happened anymore. In the classroom scene you can see that the paths that hit the light sources were immediately terminated, and there's also a little piece here where the rays managed to escape the scene. The lighter the green in this picture, the longer the paths in that pixel, so you can see it's more or less even green, with some areas where the paths are longer, and the red spots are where the longest paths actually occur. For both scenes there's even a path of length 10, even for that simple scene; I don't know how that happens, but you get a lot of bouncing around, up to 10 bounces.

But this is all statistics, and that's not very interesting; it's much more fun to look at it within Blender. There's a Mac next... a Mac? Who did that? I only know Linux, sorry.
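The sanity checks and per-pixel queries from the last few paragraphs might look roughly like this. This is a sketch against a tiny stand-in database (a 2x1 "image" at 2 samples per pixel); all table and column names are assumptions, not the actual ones from the hack.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE paths (id INTEGER PRIMARY KEY, pixel_x INT, pixel_y INT,
                    sample INT, contrib_r REAL, contrib_g REAL, contrib_b REAL);
CREATE TABLE rays  (id INTEGER PRIMARY KEY, path_id INT);
""")
conn.executemany("INSERT INTO paths VALUES (?,?,?,?,?,?,?)", [
    (0, 0, 0, 1, 0.5, 0.0, 0.0), (1, 0, 0, 2, 0.3, 0.0, 0.0),
    (2, 1, 0, 1, 0.0, 0.2, 0.0), (3, 1, 0, 2, 0.0, 0.4, 0.0),
])
# Path 0 has 3 rays (some branching), the other paths have 1 ray each
conn.executemany("INSERT INTO rays (path_id) VALUES (?)",
                 [(0,), (0,), (0,), (1,), (2,), (3,)])

# Resolution check: highest pixel coordinate seen, plus one
w, h = conn.execute("SELECT MAX(pixel_x)+1, MAX(pixel_y)+1 FROM paths").fetchone()

# Total number of paths should equal width * height * samples-per-pixel
n_paths, = conn.execute("SELECT COUNT(*) FROM paths").fetchone()
assert n_paths == w * h * 2

# Average number of rays per path (how much branching there was)
rays_per_path, = conn.execute(
    "SELECT 1.0 * COUNT(*) / (SELECT COUNT(*) FROM paths) FROM rays").fetchone()

# Reconstruct one pixel: sum the contributions of all its paths
# (sums to 0.8 in the red channel here, from the two red contributions)
r, g, b = conn.execute("""
    SELECT SUM(contrib_r), SUM(contrib_g), SUM(contrib_b)
    FROM paths WHERE pixel_x = 0 AND pixel_y = 0""").fetchone()

# Maximum path length per pixel (the heatmap): count rays per path,
# then take the per-pixel maximum
heatmap = conn.execute("""
    SELECT p.pixel_x, p.pixel_y, MAX(n) AS max_len
    FROM paths p
    JOIN (SELECT path_id, COUNT(*) AS n FROM rays GROUP BY path_id) c
      ON c.path_id = p.id
    GROUP BY p.pixel_x, p.pixel_y
    ORDER BY p.pixel_x, p.pixel_y""").fetchall()
print(w, h, n_paths, rays_per_path)   # 2 1 4 1.5
print(heatmap)                        # [(0, 0, 3), (1, 0, 1)]
```

On the real database the same queries run over millions of rows, which is exactly the kind of workload SQLite handles comfortably from a single file.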
Sorry fail say Just quickly stop it So I was asked not to do a live presentation in blender because that might run out of time or I think my question something So this is a movie that I cut together from some screenwraps yesterday I hope the pace is not too high of this and I tried it a couple of times yesterday So here we have to the simple scene in blender For example, we can look at this green pixel in the rendering. We know the coordinates where it is so we enter it in In the script and here you get the past that shot through the scene for that pixel 16 paths you can see them hit the sphere and because this is a diffuse sphere You can see that our paths go everywhere more or less what you expect We can also look at At the glass sphere next to it the top of it is blue, which is of course a reflection of the blue floor So if you find the past for that pixel in the blue You can see that there are the past They go in the sphere They go through and they hit the floor and then they reflect all the way. So that's why that pixel is blue Now this is best seen from the side. You can see the refraction happening You can also see some crazy past that Exit the sphere come back again refract and then go all the way around Something that even reflects internally And there's because the glass shader in cycles is a little bit reflective. There are some past that Immediately reflect instead of refracting into the sphere. So You can pick the number of samples that you like in your image and the number of sample pixel We can show the pixel grid on top of the camera So this is at the near plane. You can see basically where the position of each pixel is within the image and This is more or less what each pixel sees of the little scene. 
that's there. And we can query the sample positions within a single pixel; you get these little dots. Now, you see that the samples are actually outside of the pixel boundaries, and that's not a bug; that's to be expected. There's this setting called the pixel filter within the render settings, which is wider than a single pixel. It's over here, set to one and a half, so that's one and a half pixels, and that's why those samples spread one and a half pixels around the center of the pixel. Samples close to the center of the pixel have a higher contribution to the image than the ones that are further away, so they influence the image more.

We can take the sample locations of all the pixels and put them on top of each other. Here you can see, in the middle, a box one pixel wide, and the box on the outside is the one-and-a-half-pixel filter width. You can see that all those samples fall more or less inside of it. There are actually a few that seem to be outside of the box; that's a bug, something I'm doing wrong. What is also interesting to see is that over here there seem to be fewer samples than over there. That's something I wouldn't have expected; I'd have expected more of a falloff distribution.

We can also look at the way light sources interact, for example with caustics. We have a simple scene here: a diffuse cube, a diffuse floor, a glass sphere over there, and two lights, a point light and an area light. This is actually a very hard scene to render, because light will go through the glass sphere and should then form caustics, but because of the type of path tracer that Cycles is, it's really hard to get this effect right, as many people know; you need a lot of samples. But we can look at, for example, this very bright pixel, this firefly, and check what's going on there.
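Digging into a firefly like that could be as simple as ranking the pixel's paths by their contribution. A sketch against the assumed schema, with made-up pixel coordinates and contribution values:

```python
import sqlite3

# One path with a huge light contribution dominates the pixel's sum;
# rank the paths to find it. Schema and values are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE paths (id INTEGER PRIMARY KEY,
                pixel_x INT, pixel_y INT, sample INT,
                contrib_r REAL, contrib_g REAL, contrib_b REAL)""")
conn.executemany("INSERT INTO paths VALUES (?,?,?,?,?,?,?)", [
    (0, 10, 20, 1, 0.1, 0.1, 0.1),
    (1, 10, 20, 2, 40.0, 40.0, 40.0),   # refracted path that hit the light
    (2, 10, 20, 3, 0.2, 0.2, 0.2),
])
best = conn.execute("""
    SELECT id, (contrib_r + contrib_g + contrib_b) / 3.0 AS lum
    FROM paths WHERE pixel_x = 10 AND pixel_y = 20
    ORDER BY lum DESC LIMIT 1
""").fetchone()
print(best)   # (1, 40.0)
```

The path id found this way is then what gets drawn in the viewport.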
We find the paths, and what you see is that some of the paths that go through the sphere refract, and one or two of them manage to hit the light source; that's where the white spot comes from. And actually it's just one of the paths that has a very high contribution; that is why that pixel is white. Then we can look at the black pixel next to it and see what happens there, and you see more or less the same thing, but these two paths just miss the light. So this is why caustics are hard: enough paths have to go through the glass and then hit the light source, and that's really tricky for a path tracer like this. There are also path tracers that start at the other end, that shoot rays from the light and then combine them, which is much better for caustics.

There are also shadow rays being traced, which are the dotted lines shown here. You can see that they go straight toward the light; they don't go beyond the light. But they don't help in this case, because all the shadow rays basically hit the sphere, and even though it's transparent, the shadow ray stops at that point, so no light gets contributed to that location. The same goes for the cube, basically.

So then I figured, well, maybe we can try to model a small piece of optical fiber, just like you have in a network cable or an audio cable, and see what happens to the rays within this thing. We've got an emissive red plane at one end, and we've got the camera looking down the other end. If you start to render, you get the red going all the way through, but as you can see there's a little bit of black and red mixed together, at least some black pixels and some red pixels, and that doesn't really make much sense. So we can look at one of the red ones.
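As an aside: counting how many of a pixel's paths actually escape the fiber could be done by looking for paths whose final ray records no hit. Again a sketch with an assumed schema, where `hit = 0` stands for a ray that flew off without hitting geometry.

```python
import sqlite3

# Paths whose last ray misses everything have escaped the fiber;
# paths that only ever record hits got stuck inside.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rays (id INTEGER PRIMARY KEY, path_id INT, hit INT)")
# Paths 0 and 1 escape (their final ray misses), path 2 gets stuck
conn.executemany("INSERT INTO rays (path_id, hit) VALUES (?,?)", [
    (0, 1), (0, 0),
    (1, 1), (1, 1), (1, 0),
    (2, 1), (2, 1),
])
escaped, = conn.execute(
    "SELECT COUNT(DISTINCT path_id) FROM rays WHERE hit = 0").fetchone()
print(escaped)   # 2
```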
You can see a lot of the paths coming out. They start at the camera, they bounce around within this little piece of geometry, just like in a normal optical fiber, and they come out. But actually not all of them come out; some of them take a really strange path. We've got this one, which just bounces off immediately due to the reflectivity of the material, and we've got this one, which almost manages to make it and then gets reflected back. All in all, when you look at the number of rays coming out, about 13 of the 16 come out, and the other ones get stuck within the geometry, basically. So if you want to, for example, render this better, you might have to increase the minimum path length, to make sure that every one of those paths gets out before it gets terminated. This is one of the gray pixels, and you see only two paths managed to get out; the rest are simply stuck inside.

Then depth of field, a nice effect to look at. Three objects; the Suzanne object is the one the camera has its focus on, and the depth of field is pretty narrow, so the sphere and the cone are not in focus, as you can see in the rendering here. We can look at the little gray area between the two objects and see what goes on there. Now you can see that the rays, which normally start at the camera position, now start in a pretty large area, and they meet at the depth that you specify as being in focus. So they start wide, focus at the right distance, and then diverge again. And if you look at a pixel that is mostly half gray, half red, on the sphere, you see that about 75% of those rays hit the sphere, but at very different places, and the rest go past it. That's where you get this combination of gray and red. Pick a point on Suzanne, which should be in focus.
You can see that the rays there converge on a single point, just like they normally would; that's why it's in focus.

And then finally: it's not called global illumination for nothing. Here we have the classroom scene again. There's the main classroom area, and there's a little area next to it, with two doors that were left open that the light can go through. For example, we can pick one of the faces in the room next to it and then see whether the light gets there or not. You pick the face, and query the database for all the paths going through that face; we just take the first 15 in this case. It takes a little bit of time... there we are. Now you can see that all those paths managed to get in through the door. Kind of funny: there are actually paths going through the glass here at the top. But it means that this piece of geometry influences the final result even though it's off-camera, which is what global illumination is all about.

And as a real final step, we can show all the hit points, where the different rays and paths hit some piece of geometry, as a point cloud. You can now literally see that basically the whole scene in some way contributes to the final image, and this is with only 16 samples per pixel; the light is already everywhere. You can see what happens here. Okay, so that's the movie, and back to the slides.

So actually, I'm already finishing up. I did this for the two simple ray types, the basic primary paths and the shadow rays. It wouldn't be hard to add more ray types to this, like ambient occlusion or subsurface scattering. It would also be nice to have not just the Python code that I run, but a little GUI with some buttons to query the stuff; something I came up with but just didn't have the time for. And it would be fun to have an HTC Vive headset and a scene where you've got the rays flying around your head; call it the Cycles Experience or something. Maybe turn it into a game where you have to catch the rays, something like that. So that's it.
I hope you enjoyed it