Thanks for joining me this afternoon here in Raum Mannheim at FMX 2013. My name is Thomas Dinges. I've been a software developer for Blender for the last four years, and I'm currently studying computer science. Today I would like to give you an overview of, and show you some scenes from, our new render engine Cycles, which has been in development for the last two years, and make you familiar with the engine, its feature set and its possibilities. Very briefly, for those of you who are not familiar with Blender: Blender is free and open source, it can be downloaded from the internet for free, and it has been open source since, I think, 2003 if I remember correctly. It's available for all operating systems, so it doesn't matter if you use Linux, Windows or Macintosh. Behind the project is the non-profit Blender Foundation, based in Amsterdam in the Netherlands, whose goal is to provide an open source tool set for individual artists and smaller teams and studios: an entire creation pipeline from modeling, shading and texturing up to rendering and compositing. We have four paid software developers at the moment and a couple dozen volunteer developers. Okay, so, Cycles.
Cycles was originally announced in April 2011, so it's roughly its two-year anniversary now, and it was originally created by Brecht van Lommel, who is one of the paid software developers of the Blender Foundation. The design goal of Cycles is a production engine for smaller teams and smaller studios that is on the one hand interactive, but also very easy to use. We have two kinds of render engines in the industry: on the one hand the RenderMan-like engines, which provide a lot of flexibility and are programmable via a shading language, the RenderMan Shading Language in that case; and on the other hand the engines which are really photorealistic, so you can do all kinds of crazy physical effects and it looks really accurate. Cycles tries to combine the best of both worlds, so it sits in the middle between those two paradigms: it is programmable, so you can write custom shaders with Open Shading Language, which I will show you a bit more of later, and on the other hand it produces a really realistic image. To give you an idea of the feature set, some of the main points of the engine: it runs both on the CPU and the GPU. On the GPU we only support CUDA at the moment; we have an unfinished OpenCL backend, but it's a bit problematic to get it to work on AMD hardware, because the AMD compiler is currently not capable of handling our big render kernel. That is being worked on, and we are also in contact with AMD, so hopefully we will see the engine on AMD hardware in the future as well. It offers interactive rendering, so you have an interactive render inside your viewport: you can rotate around and change your shaders while it's rendering and refining the image. We have depth of field and motion blur, and the entire shading system is a physically based, node-based system; behind the scenes it's built on technology which has been released
by Sony: Open Shading Language. We also have other features, of course, like subsurface scattering and hair rendering (only for a few months now, and still being worked on a bit), and we have render layers and render passes for compositing, so you can take the image, separate it into several AOVs and combine them afterwards again. We have texture, bump and normal mapping, procedural textures, high dynamic range textures, instancing and a lot more, but unfortunately I couldn't put everything on one slide, so that was a bit fast. About the roadmap: currently Cycles is almost feature complete. It has been in development, as I said, for roughly two years now, but we still miss a few bigger features, and probably the biggest one is volumetric rendering, which we don't have yet. We have just limited displacement support, although you can use Blender's displacement capability to pre-subdivide the mesh and apply the subdivision and the displacement before the rendering process. In terms of motion blur we don't have deformation motion blur, so when you have an animation of a character with bones and a rig, it's not motion blurred at the moment. One of the problems we face is that GPU technology is constantly changing, so it's very unpredictable; you have to be aware of the GPU generations which are being released about every one to two years by Nvidia and AMD. Although CUDA is very powerful and really advanced already, and you can port almost everything from the CPU to the GPU, it's not possible with everything, and that's also a reason why we don't have hair rendering on the GPU at the moment, for example, but this is something which will hopefully be addressed in a few weeks. We just very recently got subsurface scattering support, I think four weeks ago, and this is also not running on the GPU yet. So this is a bit of a problem, which also slowed down the development process during the last two years,
because instead of just developing a feature and integrating it into the engine, we also had to make sure that it runs on the GPU at the same time. The plan at the moment is to first make the engine feature complete, have some of the bigger features CPU only, and worry about the GPU implementation later on. Another thing which will still be worked on a lot is performance: it's already really fast and versatile, but especially in interior scenes, for example, it's still quite noisy and needs some improvements there. For the more technically inclined people here who want some details: it's a vanilla Monte Carlo based ray tracer, a path tracer, so we don't use fancy algorithms like bidirectional path tracing or MLT or anything like that. At the end of the theoretical part of this presentation I just want to thank the main developers of the engine: Brecht van Lommel, Lukas, Sergey, Dalai, Stuart and Mike, who have contributed substantially to the project. And if you're interested, if you're a student for example, you could join us for this year's Google Summer of Code, a program run by Google every year where you can apply to the Blender Foundation for software development and get paid over a period of, I think, two to three months this year. You can work on Cycles, but of course on other features for Blender as well; if you're interested in simulations or the interface or whatever, you can work on that, and you get paid by Google for three months. If you're interested, please feel free to talk to me after this presentation. Okay, let's come to the actual practical part of this presentation, and I want to really thank CADnetwork for providing me with this awesome workstation. As for technical specifications, it has two Tesla K20 GPUs and one Quadro K5000 GPU, and it is equipped with 32 gigabytes of RAM and a dual-Xeon CPU with 16 physical cores.
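The "vanilla Monte Carlo path tracer" mentioned above works by averaging many random samples per pixel; the core idea can be illustrated with a minimal Monte Carlo integration sketch (generic Python, not Cycles code):

```python
import random

def monte_carlo_estimate(f, samples, seed=0):
    """Average f over uniform random samples in [0, 1)."""
    rng = random.Random(seed)
    return sum(f(rng.random()) for _ in range(samples)) / samples

# Estimate the integral of x^2 over [0, 1] (exact value: 1/3).
# The estimate's noise shrinks as the sample count grows, which is
# exactly why a render refines progressively as samples accumulate.
rough = monte_carlo_estimate(lambda x: x * x, 16)
fine = monte_carlo_estimate(lambda x: x * x, 65536)
```

More samples mean a smoother result, which is why interior scenes, with their long indirect light paths, stay noisy longest.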
With hyper-threading we have 32 threads in total, and it runs at 2.6 gigahertz, I think. So it's a really nice machine; again, thanks to CADnetwork for this workstation for this presentation. You can find more information on blender.org, or contact me via my website or email if you have any further questions. That's it from the theoretical side, so I will show Cycles in action now, but if there are already some questions, I would be happy to answer them, if something is unclear or you want to know something already. Okay, then let's see the engine in action. I'm using a development build here; we are going to release the next version, Blender 2.67, in roughly one week. There are just some smaller problems which we have to address, and we will probably release the release candidate tomorrow, so if you like, you can go to blender.org and download this version by tomorrow or Saturday. I have prepared some scenes which I'm going to show you. First of all, an architectural scene. This is quite a huge one; it makes heavy use of instancing for the geometry. I will just start the rendering process here inside the viewport, so you can basically take any viewport you have and start rendering there. But I think this is using the CPU now; that's because I have to enable the GPUs first. If I go to my user preferences, I can go to CUDA and then I can set all the cards. We have three graphics cards in this computer, but I will only use the two Tesla cards and leave the Quadro for display, because Blender's interface is also drawn via the graphics card, via OpenGL, so you would get some latency if you used all three cards. So we have the two Tesla cards here, I set it to GPU compute, and then I can start the rendering process directly here in the viewport. It will initialize and load the textures, and then you see it starts rendering already. We can move around
interactively, and it will then continuously update the image until it converges. We have some scenery here in the background, but this is really only a backdrop, and we have a few other variations here as well. For example, we could take the sunset scene: I will quickly switch to that, it loads the geometry and the shaders onto the GPU and renders the image, and you can see this is quite fast here on the two Tesla cards. We could now go to the node system. We have basic support here for materials and preview rendering, but this is more for basic shaders; if you want to go with more advanced shaders, you have to use our node editor here. I could just change some stuff here, for example if I remove the image texture, or change the color or whatever, you see that it instantly updates and you get immediate feedback about your lighting and shading. That is really important in a workflow, because render time is much cheaper than an artist who has to wait a few minutes until the pre-processing and pre-passes are done; here you get immediate feedback, you can light your scene and change your scene, and then take it to the farm. Okay, let's go to the next scene. I have prepared a nice Ferrari car, so let's render this as well. You also see here this is really interactive: I can move around my scene, it refines the image, and we can change stuff like the lighting, for example, and get immediate feedback. If you have any questions during my presentation, just feel free to ask. Then, one of the CPU-only features, like I said before, is hair and strand rendering. At the moment we have several types of primitives: we have mesh primitives like planes and ribbons and so on, but also smooth curves, which can then be rendered. You can select between all of those here in the particle system, and we have some presets for that, so depending on whether you would like faster
rendering or more accurate hair, you can change the setting here and adjust some additional settings. I will also just start the rendering here in the viewport, and this is now being done on the two CPUs, so 16 cores and 32 threads in total; you can see here that we have 32 threads. Of course you can not only render the scene in the viewport, you can also do a final render, and for this Cycles uses a tile-based approach: it will allocate the same number of tiles as you have processors available, and you will see that we now have 32 threads rendering this image. This is being rendered with full GI and everything, and we have a few, I'm not exactly sure, but it's multiplied by some child particles; that's the wrong system, I guess, I'll check in a second. So you see that it rendered this image quite fast, in just about 40 seconds, and just on the CPU, so there's no GPU acceleration here. Okay, the next thing I would like to show you is our Open Shading Language integration. Maybe you heard the presentation by Rob Bredow yesterday, the open source talk: Open Shading Language is one of the projects they open sourced, I think it was in 2011 as well, and what you can do with it is basically write your own shader and define all kinds of complex things which you wouldn't be able to do with just nodes. So you have a fully programmable approach with all kinds of loops and conditionals, et cetera. This example is a parquet shader which was originally written by Larry Gritz for RenderMan, so you see that this is also quite adaptable: with a few modifications in the language itself you can port RenderMan shaders and use them with OSL. I will just make it a bit bigger here and start the rendering process, and you see this procedurally generated parquet texture, generated via Open Shading Language.
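Going back to the tile-based final render for a moment: the bucket allocation described above, splitting the image into tiles and handing one to each available thread, can be sketched like this (a simplified illustration, not Cycles' actual scheduler; the tile size and the `render_tile` stand-in are made up):

```python
from concurrent.futures import ThreadPoolExecutor

def make_tiles(width, height, tile):
    """Split the image into tile-sized (x, y, w, h) buckets."""
    return [(x, y, min(tile, width - x), min(tile, height - y))
            for y in range(0, height, tile)
            for x in range(0, width, tile)]

def render_tile(bucket):
    # Stand-in for the real per-tile path tracing work: return a
    # block of black pixels with the bucket's dimensions.
    x, y, w, h = bucket
    return [[0.0] * w for _ in range(h)]

tiles = make_tiles(1920, 1080, 256)
# One worker per hardware thread, e.g. 32 on the demo machine.
with ThreadPoolExecutor(max_workers=32) as pool:
    results = list(pool.map(render_tile, tiles))
```

Every tile is independent, which is what lets all 32 threads stay busy until the whole frame is done.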
We could change the parameters here, like the color, for example, but also the specularity or the width of the planks. Take the plank profile: we currently have four planks vertically and horizontally, and we could change it to two, for example. So this is just a procedural texture, via the OSL shading language, and you can use it by adding a script node, which is available here in the node tree. You go to Script, add a script node, and then I can select my text file from within Blender, or of course also load an external OSL shader file; just select it and it will automatically compile the shader for me. We get all these inputs and outputs, and those are specified in the OSL shader itself, inside the shader declaration here. This is the shader body, and you see we have all these parameters, like Kd, specular color, ring scale; those are the inputs and outputs which you will then find in the node. So you can hardcode a value, or you can expose it via a node socket and interactively tweak it, and you can add this node into your entire shading tree. In this case I have a fully functional shader which just takes a coordinate as an input and then outputs a closure, a BSDF, and it's ready. But I can also show you that we have some examples within the program, I think here, under Templates. Let's take a noise shader, a very simple noise texture which just calls the noise functions of OSL. We add the script node again and load this noise shader, and then you see we have the noise shader with a time input and a cell output, plus Perlin and unsigned Perlin outputs. In this case, as you might already guess from the colors of the sockets, this won't work as a shader on its own, because it just outputs a color, and we would still need to tell Cycles which shader we would like to use, so I'll just use a simple Lambertian diffuse shader.
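The socket-color rule pointed out here, that a color output must feed a BSDF's color input rather than act as a shader itself, can be sketched schematically (illustrative Python standing in for the node wiring, not Blender's API; the hash-style noise is a made-up stand-in for OSL's noise functions):

```python
def noise_color(p):
    """Stand-in for the OSL noise shader's color output: a
    deterministic pseudo-random RGB value per shading point."""
    frac = lambda v: v % 1.0
    return (frac(p[0] * 12.9898), frac(p[1] * 78.233), frac(p[2] * 37.719))

def diffuse_bsdf(color):
    # A closure tells the renderer *how* to scatter light; a bare
    # color does not, which is why the script node's color output
    # must be wired into a BSDF node's color socket.
    return {"type": "diffuse", "color": color}

# Wiring the noise's color output into the Lambertian diffuse node.
closure = diffuse_bsdf(noise_color((0.5, 0.25, 0.75)))
```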
In this case I assign the color value to it and start rendering here. And this is probably due to the scaling, I think... okay, now you can see it better. So we have this noise texture here; in this case it's a four-dimensional noise, so it outputs a color, but you can also animate it via the time factor. Or take the cell noise, which is also really interesting, but I think I'm missing something here. For example, it has a fixed size here, but I would like to be able to actually scale the texture, so I can just add another input field, call it size, give it a default of one, and then multiply the shading point by the size parameter. If we now recompile this, I get a size input and can change it. So I've changed my shader here, and I can just go in and change the size value. I could then of course customize this further, for example by adding a color ramp node here: output this as a value, and then, depending on the actual value, which sits between zero and one, it will use this color range, so we could add a few more colors and actually get some nice variations. OSL is CPU-only at the moment as well, but we don't have much influence on that, because it's a third-party project; it's still developed by Sony Pictures, and they have a public repository on GitHub you can contribute to. I think it would be possible to port the entire system to the GPU, but that would probably be a work of several months or so, so unfortunately we're not going to see that anytime in the near future. So much for the Open Shading Language implementation in Cycles. I also want to mention that we are actually the first production engine on the free market which has this technology; there's only the Arnold version of Sony Pictures Imageworks, their in-house version, which has OSL support, and outside of that it's just Blender. I mean, there are a
few very small applications which are part of the OSL library itself, for very rough testing, so you can only render, I think, a Cornell box or something like that with simple shaders; but this is really the first production engine which has this feature. I just got it confirmed by V-Ray yesterday: V-Ray will also implement it in a few months, so it will be interesting to see it implemented in more engines, because you can of course exchange the shaders from one pipeline to another and render your scenes in several render engines. It will probably not look 100% identical, but you can at least carry over your textures or your shaders. Later on, when Cycles has volumetrics, we can also use OSL to modify a volume and the density of something, or the displacement. So this is still at an early stage, because Cycles is not feature complete yet, but once it has volumetrics and things like that, it will be really powerful. And just recently we got subsurface scattering support, which is also really nice, because it is a fully ray-traced algorithm: there's no pre-processing or point-based caching or anything like that, it's fully ray traced. It also works only on the CPU so far, but I hope you can see this example here, the scattering within the nice Stanford dragon model, which I think most of you should be familiar with; it's a quite popular model which was scanned in 3D at Stanford. You see my shader graph here: it has a simple mix of closures, which you can use to create more advanced materials and layer your closures, and I have the subsurface scattering here. I can change the color, of course, or affect its scale, so how deep the scattering penetrates the surface; we can increase that, for example, and if you set it to zero it will of course do no scattering at all. And then we can change the scattering based on the three RGB components.
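The per-channel radius tweaked in the demo can be modeled, very roughly, as an exponential falloff with a separate scattering radius per RGB component (a toy approximation for illustration, not Cycles' actual BSSRDF):

```python
import math

def sss_tint(depth, radius_rgb):
    """Fraction of light surviving a scattering path of the given
    depth, computed independently for each RGB channel's radius."""
    return tuple(math.exp(-depth / r) for r in radius_rgb)

# A larger red radius lets red light travel further before being
# absorbed, giving the reddish glow typical of skin, wax or milk.
tint = sss_tint(depth=0.5, radius_rgb=(1.0, 0.2, 0.1))
```

Shrinking one channel's radius, as in the red-light demo, darkens only that channel's contribution.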
We can say how far incoming light of the red component gets scattered in the shader, or take blue; I think it's not really visible here, because we don't actually have red light, but if I go to my texture here, this is just emitting white; if I change it to red, it should be more visible. If I then again decrease the radius for the red component, you can see, especially in this part (I'll just quickly do a border render here), that this actually affects only the red part of our mesh. Really useful, of course, for materials like wax or milk, or other materials which need this kind of subsurface scattering effect. Okay, to make you a bit more familiar with our node shading system: as you can see, this is still quite a simple setup. We have a light path node, a system which is also available in OSL; in OSL you can distinguish between all kinds of different ray types, so you can ask whether it's a camera ray or a shadow ray or whatever. Maybe I'll just open up a new scene and show you that from scratch. We add a simple flat surface, and we have a cube here, and let's first separate my viewport, so we have a node editor here where I can write my shader, and this is the scene, and we can then render it. I will just enable the GPU as well, so it's really fast, and start rendering here. Okay, so we have a very simple scene, it's just gray at the moment, but now we can actually use that light path node to change the shader and create some non-physically-based results. This is also something I meant in the beginning: Cycles is by default really accurate, it can do caustics, it can do accurate refraction and reflections and everything, and it has real GI bounce light, but if you want, you can have full artistic creativity by disabling certain features, or mixing shaders depending on ray properties which would not be visible in the real world. So
maybe I'll just use this new toolbar here as well: add a mix shader, and then, for the ground plane, add a glossy surface to make our plane glossy, decrease the roughness a bit so we can see it better, and I will also increase the sampling a bit so it has a bit better quality. Okay, that was actually the wrong one; now we have a glossy flat plane. And now I take my cube and give it two diffuse shaders; one of them is, let's say, green, and the other is white. We combine them, and as the factor input we take the light path node, so now we say: if our ray is actually a glossy ray, then the material will be green. If I connect that, you see that on the actual object it uses the first shader, because the factor is zero there: on this object the incoming ray is not a glossy ray, but when it bounces off this object down to the flat surface, it is then a glossy ray, and therefore we get the second shader, and we have this green reflection here on the plane. You can do that with all the other ray types as well, of course; you could also take the camera ray, for example, so you can say that the camera is seeing one shader while everything else is seeing the other one. We can also use this with, for example, a transparent shader: if we add one (that's a translucent one, sorry, transparent), you see there's some intersection where the cube sits, so we move it up a bit, and you can see that the cube is now not visible for the camera rays, but it is still visible in the glossy reflections here, and it would also contribute to the GI bounce light of other objects, but it's not visible in the camera view. That's useful, for example, if you have mesh lights, geometry lights, and you would like to place a few of them in your scene, but you do not want them visible in the render later on.
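The mix-by-ray-type trick just demonstrated can be sketched like this (a schematic illustration of the node logic, not Blender's Python API; all names are made up):

```python
def mix(a, b, factor):
    """Blend two RGB colors; factor 0 gives a, factor 1 gives b."""
    return tuple(x * (1 - factor) + y * factor for x, y in zip(a, b))

def shade(ray_type):
    white = (1.0, 1.0, 1.0)
    green = (0.0, 1.0, 0.0)
    # The light path node outputs 1.0 for glossy rays, 0.0 otherwise,
    # and that value drives the mix shader's factor input.
    is_glossy = 1.0 if ray_type == "glossy" else 0.0
    return mix(white, green, is_glossy)

camera_view = shade("camera")  # the cube itself renders white
reflection = shade("glossy")   # its glossy reflection renders green
```

Swapping in the camera-ray output instead gives the "invisible mesh light" setup: visible to bounce light but not to the camera.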
Apart from that, we have quite a wide variety of closures already. We have the mix shader, which you've seen already, which mixes two shaders depending on the factor input; we have an add shader, which is not so physically correct, because you literally add two shaders together, which is not a real-life light effect; and then we have BSDFs like diffuse, glossy, transparent and refraction, and we have glass. I can quickly show you that: if we take a bit more interesting object, subdivide it a few times, give it smooth shading and bring back my material, and if we now take this glass material, the monkey is glass now. You could of course mix them again via the light path node and other things, if you would like other kinds of effects; so again we could mix that, in this case on the glossy ray, with a green glass. You see that this is quite node-heavy, but it's really easy to create simple materials, and we will have a so-called standard surface in the next release: one single node where you can tweak the diffuse contribution, the glossiness, the transparency, the subsurface scattering, bump mapping, textures and so on. That will be in the next release, which will be in approximately two months, because we have a quite solid and well-working two-month release cycle. We've been working that way for, I think, two years as well: at the beginning of each cycle we add some new features to the engine and improve Blender as a whole, of course, then we have like four weeks to improve those features and add new stuff, and then the other four weeks to finish it, fix bugs and make a new release. This has worked really well so far, and we get a lot of new features this way. In other packages you might wait a few months or so, but here you actually get a new
engine with bug fixes and new features every two months. And of course, as it's open source, you can just download the source code, or use the pre-compiled builds on our web server: we have a server which generates new builds every day, so in the morning you can always check out the latest build and benefit from the latest features and improvements. Okay, let's take a more interesting scene again and look at some rendering features we have. We have two shading backends, and