Hello everyone, great to see that the room is almost full. My name is Thomas Dinges, also known as DingTo online, and today I'd like to tell you something about Cycles: what we have been working on in the past year since last year's Blender Conference, some recent developments, and a look into the future at what we will be working on in the upcoming months. Can you hear me? One, two, three. That's better. Okay, thanks Ton.

So, just quickly, Cycles is meant for smaller teams and studios. That's what we are ourselves, and it's what we use to evaluate targets and priorities. Cycles is meant for production, for visual effects, for animation. It's not meant to be a 100% physically accurate renderer; you can render all kinds of fancy stuff, interiors and so on, and Cycles may not be optimal for every such case, but we really try to optimize it further and have a versatile render engine which can be used in all kinds of different production environments. We really focus on interactivity and ease of use, so we try not to expose a lot of options in the UI but instead do as much as possible in the background. The user is not confronted with all kinds of different settings and algorithm options; we try to keep that hidden.

As Ton also mentioned in his introduction talk this morning, we relicensed Cycles to Apache 2.0 last year, so it can be used permissively in commercial applications as well, and we hope to see Cycles in other applications in the future. Nathan Letwory is actually working on an integration into Rhino 3D, which is a great 3D application; he has been working on that for the past few weeks, and I think there are some screenshots online which you can check out on Twitter, for example. So we hope to see Cycles getting used outside of Blender very soon, and we also want to improve the API so it's more usable for people outside of Blender.

Before I go into details about last year's development, one thing we should quickly mention is that Cycles exists because of one man, and this man is Brecht. Brecht started a new job this year: he went to Solid Angle in Spain, the guys who develop the Arnold render engine. We are really missing him, but I think for him it's a really great opportunity and he really enjoys working there. So please, let's have a short round of applause for Brecht and all his awesome work.

All right, let's first talk about what we have been working on since last year's Blender Conference. Some of you might remember that during last year's conference people were still working on the Caminandes short film project, and the consensus back then was basically: it is too slow. Pablo Vazquez always compared it to Blender Internal and said, "we can get an amazing image in two minutes, and now it takes two hours or so." That was the starting point when we decided to focus on optimizations. One developer came along and did a lot of great work on textures and hair, which resulted in a great speedup from Blender 2.69 to 2.70. As you can see in the chart, for this particular scene at 1280x720 HD resolution, render time went down from 66 minutes to 46 minutes. So we got a nice speed boost there, which was already better, but we did some more work. Brecht van Lommel did a lot of optimizations for 2.71, especially for hair again, but also for transparent shadows, which are heavily used in scenes with hair or when you have trees and leaves with transparent textures. As you can see here with the Koro character, the render time decreased from almost five minutes down to roughly two, so it was a big improvement.

That was the beginning of all these optimizations, but we also added a lot of major new features since last year, especially volume rendering with smoke and fire. We have basic volume rendering for absorption, emission and scattering, and since 2.71 also smoke and fire rendering, so you can now use the full Cycles feature set when it comes to volumetrics. One important thing for animators has of course been deformation motion blur, which was added a few months ago. Another big feature, by Dalai Felinto, is texture baking, which is especially nice for game artists who want to bake their global illumination and use it in game engines, or to pre-calculate things so they can speed up render time by decreasing the bounces later on: they first bake the GI into textures and can then render faster.

There's a lot more that we did in the past few weeks. Some of it has just been released in Blender 2.72, which came out two weeks ago. One example is improved glossy sampling. I'm not sure how well you can see it on the projector.
On the left-hand side we have Suzanne with a glossy GGX shader with a roughness of 0.4, and you can see there was a lot of noise, especially on the edges. With the improved algorithm we have now, based on a paper that was released just this year, there is a lot less noise. Render time is slightly slower, but due to the better quality you can actually decrease your samples: you get a fine image with, say, 80 samples instead of 100. So you end up with roughly the same render time if you compensate for that, but with less noise.

One thing we also recently managed is to bring volumetrics and subsurface scattering onto the GPU, something people have been asking for for quite a long time. It wasn't trivial, because every time you add new features to the GPU code it basically gets slower: Cycles is essentially one big code blob, and GPUs don't like such complex programs. For a GPU it's better to have smaller parts that communicate with each other and share the workload, but with one mega-kernel, every feature you add risks a slowdown. So we had to be careful not to slow it down too much. Thankfully, due to the new CUDA toolkit compiler from NVIDIA, we could enable volumetrics basically for free: that compiler version made rendering a few percent faster, and by enabling volumes we were basically back at the same speed, so there was no slowdown. For subsurface scattering we actually had to split the kernel in two.
So we now compile two CUDA kernels for each GPU architecture, because subsurface scattering on the GPU uses quite a bit more memory, which is a big problem when you have a GPU with just one gigabyte of video memory. For subsurface scattering we therefore recommend a GPU with two or three gigabytes or even more, because otherwise you may hit the limit very soon and cannot render big scenes with it.

There are other things people have been working on. Martin, for example, worked on texture interpolation options: if you don't want Cycles to interpolate your textures, because you want some kind of Minecraft style for example, you can now disable texture interpolation. There were also some memory improvements: we no longer precompute and store the face normal, which saves a few bytes per triangle. And there were improvements for Intel CPUs with the Haswell architecture: rendering on Haswell is approximately five percent faster.

Something I worked on during my Google Summer of Code this year is improved clamping. That is really nice and I got a lot of feedback about it. Before, you could only clamp your image with one slider. With that you could get rid of some fireflies, but at the same time you reduced the dynamic range of the image, so you got an image which was darker than intended, and you usually don't want that. So I split it up into indirect and direct clamping.
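The effect of the split can be sketched in a few lines of plain Python. This is an illustration of the idea only, not Cycles' actual implementation, and the function name is made up:

```python
def clamp_contribution(direct, indirect, clamp_direct=0.0, clamp_indirect=0.0):
    """Clamp per-sample light contributions.

    A clamp value of 0.0 means "no clamping", matching the UI convention.
    Clamping only the indirect part removes fireflies while keeping
    bright direct highlights intact.
    """
    if clamp_direct > 0.0:
        direct = min(direct, clamp_direct)
    if clamp_indirect > 0.0:
        indirect = min(indirect, clamp_indirect)
    return direct + indirect

# A bright direct highlight survives, a firefly-like indirect spike is cut:
print(clamp_contribution(50.0, 900.0, clamp_indirect=10.0))  # 60.0
```

In Blender itself the two values correspond to the Clamp Direct and Clamp Indirect settings, exposed in the Python API as `scene.cycles.sample_clamp_direct` and `scene.cycles.sample_clamp_indirect`.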
Now you can keep the direct highlights, so a light source stays very bright and shiny, while you clamp only the indirect samples. This way you can get rid of fireflies very effectively. Another thing I added was multiple light sampling for the branched path integrator. Cycles was already taking all the lights in your scene into account for the direct contribution, but for the indirect global illumination it still picked one lamp at random; now it also takes every light into account there, which gets rid of noise pretty effectively as well.

Another thing I want to mention is the great Cycles demo reel, which was put together by Alexander Mitzkos, with great music by Jan Morgenstern. We made the demo reel for this year's FMX presentation in May. I guess most of you have already seen it, but I'll still try to play it back; I hope we have sound and the internet works. So, just to quickly show the great demo reel that we did this year for Cycles. All right.
So, that was the demo reel, and it shows how versatile Cycles is: it's being used in all kinds of productions, for commercials, for animations, for VFX. It's really great, and I hope to see more of it next year, and to have another demo reel again, because there's a lot going on right now and people are using Cycles more and more for animations, also thanks to the fact that we are constantly working on making it faster, this year especially.

All right, let's talk a bit about upcoming changes: changes already done for the upcoming 2.73 release and things we want to work on afterwards. A few improvements were recently done by Sergey, for example improved area light sampling. On the left side you see the old algorithm, which produced some fireflies and is pretty noisy; with the same number of samples, the new algorithm gives you a perfectly fine image with no fireflies, and the actual area lamp on top looks much better. Render time is roughly the same; it's a bit slower, but in a real scene with a few area lights that is just a small fraction of the overall computation time, so in a big scene there's no real difference. In test scenes like this one, with just an area lamp and a sphere, the render time is noticeably slower, maybe 10 seconds or so. But even if you rendered with the old algorithm and increased the samples until you reached the same render time, the result was still far worse in terms of noise. So you get a clean image in a short amount of time; I think that was roughly 100 samples, but I'm not sure.

Yes, sure. That's just for area lights at the moment.
We can probably still improve the sampling for other lamp types, but this one in particular is just for area lights.

Another thing that we added is camera-inside-volume support, so you can now fly through a volume object, which is nice when you have things like clouds or fire and you want to fly through them and make cool animations. That wasn't possible before, and you can achieve some great effects with it. What you see here is just a cube with a checker texture and three point lights, a checker cube so to speak.

Some other things we're looking into at the moment: Sergey has been working on improved BVH construction to speed up the rendering process and reduce memory usage a bit. One thing he implemented is the "watertight" intersection paper, which basically makes ray/triangle intersections more precise. Before that you could have situations where a ray slips through: when a ray goes directly through an edge it may not collide properly, and then you could get fireflies or more noise. Hopefully this will be merged soon, and it will make intersections precise enough that you don't have those problems anymore. There are actually quite a few related bug reports in our tracker from the past two or three years.
So quite a few precision errors will be fixed with that. Another thing we can look into is an improved QBVH, a BVH with four children per node, which should improve rendering performance a bit; but these things are pretty tricky, and we still have to check how much performance we can gain and which particular algorithms we are going to use. We're also working on further sampling improvements, just like the one you've just seen for area lights; there are still some things we can do to decrease noise, and we'll look into that a bit.

I know many people mention displacement and OpenSubdiv for Cycles, but I think that is something we will probably only see next year; I don't think we will get it this year, though I could be wrong. Cycles already has displacement support in the experimental feature set, but it's basically unfinished: some things need to be fixed and improved further. So this will probably be tackled next year; I don't think it fits into the to-do list for this year. Of course, anyone who wants to get started with Cycles and help out is welcome, so if there's a C++ coder here who wants to go crazy with our code, just talk to us and we're happy to give you a starting point.

All right, that's it for the theoretical part of my presentation. I actually didn't prepare many more slides, because I always find it boring when someone just sits in front and talks and talks. So I want to spend the rest of this talk giving you the opportunity to ask questions, or tell me your complaints, "Cycles is not fast enough", whatever it is. That's your chance now. Okay.
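As an aside on the multiple light sampling change described earlier, here is a toy Monte Carlo sketch, purely illustrative and not Cycles code, of why sampling every light beats picking one at random: the "pick one" estimator has the same expected value but carries extra light-selection noise, while summing over all lights removes that noise source entirely.

```python
import random
import statistics

lights = [0.5, 1.0, 2.5]  # direct contribution of each light at a shading point

def estimate_pick_one(rng):
    # Pick one light uniformly at random and reweight: unbiased, but noisy.
    return len(lights) * rng.choice(lights)

def estimate_all():
    # Take every light into account: no light-selection noise at all.
    return sum(lights)

rng = random.Random(42)
samples = [estimate_pick_one(rng) for _ in range(10000)]
print(round(statistics.mean(samples), 2))  # close to 4.0, the true total
print(estimate_all())                      # 4.0 exactly, with zero variance
```

In a real renderer the trade-off is cost: sampling all lights does more work per bounce, which is why Cycles exposes it as an option on the branched path integrator rather than forcing it everywhere.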
Yeah, that's something that has been on our to-do list for quite some time. He asked whether we will have light groups in Cycles, basically like we have in Blender Internal, so you can have lights which only affect certain objects. That comes up quite often, and I guess it will be implemented at some point, but I cannot give you an estimate when. If we had a second microphone that would be perfect; otherwise we can pass this one around. Okay, this one works.

Next question: I was very interested when this volume stuff came in, and then I got disappointed because I couldn't get volume textures inside. When do we really get volume textures? Right now you load a 2D texture and it shows up like this checker texture, but I have a lot of volume textures: MRI images, molecules with electron clouds, things like that. So when can I get back to rendering this stuff?

The "when" question is always the most difficult one. I know what you're talking about. There are different kinds of implementation: we could have a basic voxel texture loader, but the long-term solution, I guess, is to integrate OpenVDB to import those kinds of things. I can't tell you when, and no one is working on it at the moment, but it's definitely going to happen sooner or later; in the next half year or so it might be there. Okay, thanks.

(From the audience:) There is somebody on Blender Artists who has written an Open Shading Language shader for this. If you have your data as image slices, like MRI or CT data, the shader takes the series of image slices and creates a volume texture from it that you can render in Cycles. So it's possible; you just need a workaround.

Next question: in Cycles you have to choose whether you want to render with the CPU or the GPU. Will it ever be possible to use both in some way to render one scene?

At the moment, well, it is possible if you use OpenCL: when you have an NVIDIA GPU and an Intel CPU, for example, and you enable OpenCL for both of them, you can actually do it, but it's not optimal. When you have an NVIDIA GPU you should use CUDA to render on it instead of OpenCL, because that is faster. It might be possible in the future; we also want to have proper network rendering.

(Follow-up:) For my PC the limitation is usually the memory on the GPU cards. Would it be possible to slice the image in some way, use the one or two gigabytes on the GPU for part of it, and use the CPU to render the rest?

That's not really possible, because it doesn't matter how big the part of the image is that you're rendering: you still need the entire scene, the BVH and the textures, on the device itself. Even if you render just a fraction of the image, just a few tiles, you still need the entire data. So you can use multiple devices with OpenCL, and hopefully in a more proper way in the future, but you still need enough memory available to load the entire scene on all the devices. Okay, thanks.

Hi, how about a shadow ray pass for mesh lights? It's still missing; can we get this?
Yes, proper shadow support, also a shadow catcher and things like that for compositing, is something that comes up often. Is Sebastian here, by any chance? No? He's asking about it all the time. It should definitely be implemented, but it's just one of those items on a long to-do list, half a year or a year out. It's definitely a very important feature for VFX artists, but we need more coders.

Then a classic: the starry sky, very cheesy. Was that already included? In the old Internal render engine you had the starry sky.

Stars? Well, there's no plan to add a star feature. It's really cheesy, but some people do kind of love it. No plans.

Just a question, maybe a more specific one. When we render passes in Cycles and we have the normal pass, it renders the normals in object space or world space, I don't even remember, but would it be possible to have this pass the standard way, in camera space? When we render the normal pass, I've never heard of the world-space data being used; everybody uses the camera space, so we need to convert it somehow. I wouldn't want to change the default, and I can't do it myself, but I like to render several passes, direct, indirect, diffuse and so on, into a multilayer EXR, and for the normal output I would like to have it in camera space, because everybody uses camera space in compositing.
I've never heard of anybody using a normal pass in another space. So this is just a question, whether it would be possible, if there is any thinking about it.

Well, as was said, you can basically use the compositor and convert it into another space there. If you want to put it directly into a multilayer EXR, then there could probably be some kind of checkbox or switch that outputs the pass in a different space.

(Questioner:) My standard workflow is that I always render to passes, but I always render to multilayer EXR, and compositing is the next step. I never composite during rendering; I render, then I import those files and composite them. So I want to have the correct passes in the files.

You can still take the multilayer EXR and change it in the compositor.

No, in the multilayer EXR I just have the colors. How on earth can I convert world space into camera space in compositing? In the multilayer file you have all the passes, just like when you output them in the compositor, but I don't have the camera angles. You need math, you need a fake camera, a lot of calculations. If we simply had this, and from the coding point of view it's just a little switch, we wouldn't have to use fake cameras and drivers that drive the camera and link it to something and so on. Blender Internal uses camera space: when you output the normal pass from Blender Internal, we have it in camera space; when we output the normal pass from Cycles, we have it in world space. It's different. So maybe the question is: am I the only one who would want that?

To add to what Martin said: in Cycles the common space we are always using is world space, so I guess that's also why it was used for the passes, because we use it everywhere.

(Questioner:) Of course I have my ways of getting around this; it's not that I can't work with it, but it could be a little bit easier. Thank you.

Okay, sure. Next: just a small feature idea. Will it be possible to render matcap materials, the default matcap materials in Blender, for preview renderings and stuff like this?

Well, you don't have access to those kinds of things in Cycles materials at the moment. What you could probably do is have some kind of OSL material or an OSL script which ports those GLSL shaders; GLSL and OSL are pretty similar, so this could probably be done, but there's no way to access the matcaps from our shader materials at the moment. Okay. Thank you.
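Coming back to the normal-pass discussion: converting a world-space normal to camera space is a single matrix multiply with the inverse of the camera's rotation, which for a pure rotation is just the transpose. A minimal sketch in plain Python, without Blender's mathutils, where the example matrix is made up:

```python
def world_to_camera_normal(normal, cam_rot):
    """Transform a world-space normal into camera space.

    cam_rot is the camera's 3x3 world rotation matrix (a list of rows).
    For a pure rotation the inverse equals the transpose, so we multiply
    by the transposed matrix, i.e. take dot products with the columns.
    """
    return tuple(
        sum(cam_rot[row][col] * normal[row] for row in range(3))
        for col in range(3)
    )

# Camera rotated 90 degrees around Z: the world +X axis ends up along
# the camera's -Y axis.
rot_z_90 = [[0.0, -1.0, 0.0],
            [1.0,  0.0, 0.0],
            [0.0,  0.0, 1.0]]
print(world_to_camera_normal((1.0, 0.0, 0.0), rot_z_90))  # (0.0, -1.0, 0.0)
```

Inside Blender the same thing is roughly `camera.matrix_world.to_3x3().inverted() * normal_vector` in the 2.7x Python API, which is the kind of conversion a post-render script could apply before writing the EXR.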
Another question about passes. I usually work on stills, and there is a pass I'm always missing with Cycles: the pass where you get the object color without any kind of shadow, reflection, anything, so it's really easy to go to Photoshop or GIMP or whatever and select by it. In the old Blender Internal you had a way of getting that flat pass without the textures; I don't remember the name. As far as I know, in Cycles it's impossible to get the pass without the textures. I found a couple of ways of getting it, but it's like hacking it with weird scripts. I've been going around the Blender Artists forums and it's a thing I have seen requested many times. I'm aware of the ID mask, but when you have a scene with, I don't know, 40 objects or 40 different materials, you cannot spend an afternoon setting the ID for each one. Would that be really complicated? I'm not a coder, but it looks like probably not.

Well, we have the color passes already, but they come with the texture on top. So you basically want the material without any textures, just the plain color.

Exactly. As I remember, in Blender Internal there is just one checkbox, and you get the pass without the textures. I'm not sure if it would be really difficult to do in Cycles.

Probably not, but it's a bit difficult to extract; you would basically need to render it. You want to make it easier for the user, and having to use nodes and really complicated stuff usually scares people. I mean, I work in architecture visualization, and you know, people really
I mean I work in architecture visualization and you know people are really You you want an easy way because you you don't have much time to do this images, you know, I Have a question about the sampling and seeding Procedure is there a way to distribute the rendering using the seed the sampling seed So that you can if you have a large image and you can distribute on many notes and then Fusion the results together at the end There's no automatic way for doing that at the moment But I mean you can wait here with the compositor later on if you have like Renewed on one note with 100 samples on the other note with 100 samples of a different seed You can just combine them. So basically, I guess the correct way for that would just be a Python script I'm not sure if there is one already. Maybe There is one I think yeah, and the seed are well distributed. What is the range of the seed you can? For example, you can decide if for example, if you have thousand samples you can say I use seed seed number one to 200 every 10 samples it will be the same results as the inferior inferior should be the same result but for those kind of things, I mean when you have like a lot of a lot of Samples I actually would recommend to rather split it in in bigger chunks like split it with 100 samples and then have it on 10 machines instead of having really small small amount of samples because then you eventually get a Lower quality result out of that. Yeah, thank you Blender internal has a really really great feature. I absolutely love it. It's called halos Halo more halo material when is this coming to cycles? 
Well, Cycles is not Blender Internal. No, but seriously: halo is really a fake type of material, and Blender Internal is really great for those kinds of non-photorealistic things, wireframes, halos and whatnot. I don't think that is something we should add as a real material type to Cycles. When I look at the halos in Blender Internal, they're kind of nice, but it's not what you would expect from a physically plausible render engine.

(Questioner:) When I first heard about Cycles at FMX, I also had a talk with Ton, and he told me that Brecht was planning on implementing halos based on volumes. But now Brecht is gone.

Well, I don't know of any plans for it, but at least for me it wouldn't be a priority; I would even go further and say that I don't think this should be in Cycles.

(Questioner:) Can we at least have point density textures?

That's what the other question earlier was about: having a voxel data or point density texture to load volumetric data into volume shaders is of course something that we should add, yes.

(Questioner:) With point density I'm talking about real points, like particles, where Cycles computes spheres around the particles and merges them, like metaballs, just like the point density texture does in Blender Internal.

There are no concrete plans for that at the moment. What I think is that we should add OpenVDB first and then have a good basis for importing volume data.

One more question: will there be a standalone Cycles renderer, as a separate program, so that we can for example import files from Maya or other programs?

Cycles itself is standalone: you can compile it as a standalone application already. The problem is that the API is not really awesome yet.
So Well, as I said Nathan let worry wrote a C or I think C sharp API for cycles Which she put on github so people can look into it We have we maybe can merge some things of that and and use it in master as well So yeah long-term the long-term idea is to have a proper API Which can be used by other applications and then they can basically write an Importer exporter for their software and then you could put it in cinema 40 or Maya or whatever That is definitely something that I'd like to see because I think there's a lot of potential for cycles out there and we actually got a lot of requests for that and Every time I go to FMX for example, there are people also from studios who say that it would be great to have cycles standalone but yeah It hasn't been a real priority for us yet to to spend too much time on the API and stuff like that But yeah, it's definitely possible already. You can compile cycles without blender and you can load in smaller scenes with XML But to make it really useful we would need a proper API which can be used by the other applications then Okay, thank you I only have a small question so it's only a very small annoyance But is there a way to set the viewpoint color of a cycles material automatically to the I don't think I need to finish my sentence even the reason why this isn't in yet Is because it is sometimes not so trivial to extract the diffuse color from a node tree Yeah, I can so when you have like a huge now a node tree and you have like a few diffuse shaders glossy shaders Subsurface getting whatsoever which of the diffuse shaders is the color you want to see in a few port Yeah, I know it's difficult to find an algorithm which which picks out that that right color Um Random one every time you open the file Of course when when you're when you just have one diffuse node in the scene then we could just take that but when you have That would be great because I'm using it for research purposes. 
So I Often have various similar characters and it's just the blue one or the red one Yeah, and that's the only identifying trait and if I then switch up cycles and go to the regular viewport They're all white Oh, well, I think that that can be done But yeah, it's just complicated when you have a more a bigger node tree But for for those basic things that it should be very well possible. Yeah, great Okay, well at least one step forward thanks Maybe it's a simple question, but I've got some scenes which I can GPU Render, but if I put them in a Scene which is built up of linked Models the GPU crashes. Is there a way of working out? when where the limit is or Possibly that there's some sort of warning that says if you go over this and your GPU is not gonna accept it Well, basically your program will always crash when it runs out of memory It doesn't matter whether it's CPU or GPU when you just run out of memory it will crash So you should keep an eye on the information on in the image editor Which has how much memory is being used on the GPU? In in practice This is most of the time the number that you see there is is below the actual amount that is allocated on the device But when you have like one gigabyte it might actually use 1.1 or 1.2 gigabytes So you should see if that value is Near to the physical amount of memory you have on your card and that is then and usually you also get a Warning in the console terminal of blender when you all when you when you're on Linux for example When you launch blender from a terminal you should get an error message which has and which says that it's out of memory So you should check for that and then and the render works it out through all the linked scenes So if it's made up of 10 different scenes It works out how much memory for the rendering this is irrelevant for rendering still gets all the data from all the different Landfiles and then builds the scene in cycles. 
So it doesn't matter whether it's linked or not linked; it will still need to get all the data first and then calculate it. The memory information should be accurate for that case as well. Any more questions?

Hi. A few weeks ago you talked about Cycles becoming the default render engine in Blender, and it raised a big debate in the community. I would like to know how far we are from having Cycles as the default in Blender.

Well, what did Ton say today? Defaults are stupid. So we shouldn't really worry too much about this. Yes, I had a pretty strong opinion about that, because it was basically planned for a long time; Brecht was already talking about it a year ago, and we always said: when we have volumes and deformation motion blur, we'll consider it. The thing is, there are two things that I'd rather see finished before we evaluate this again. One is import and export of data: when you import or export an FBX file or a COLLADA file, you also have material information, and with Blender Internal that is more or less well converted on import and export. We don't have that for Cycles yet, again due to the fact that we have a node tree here, and it's a bit difficult to extract textures and diffuse colors and so on into such a format. So I'd like to see that fixed first, or at least have some workarounds that deal with it. The other thing is that I'd like to have some sort of presets, or some kind of uber shader, which was already planned. So when people open Blender, they just have a few sliders; they don't need to add certain nodes to get a simple shader first, but rather have a node group, for example, where they can adjust the diffuse color, the amount of glossiness and so on. That makes it easier. I'd first like to see those points addressed, and then we can talk about it again. But yeah, I've seen the discussion as well,
and I think it's not a big deal. In the end it's just one click, and if you always use Cycles you can already save that in your startup file. So first I'd like to fix those problems, and then we can check again and see whether it's feasible or not; and if not, that's also not a big deal. Martijn, do you want to add something to that?

Okay. I wanted to clarify the GPU memory issue. One of the big problems we have is that the NVIDIA API actually doesn't tell us the total amount of memory used; we can only measure what we send ourselves, in textures and other data. But as soon as you start the render, the program itself is copied to your graphics card, and that can allocate a few hundred megabytes which we cannot measure; we cannot know this. There's no good way to measure it (well, there is on enterprise cards, but not on GeForce cards). So our measurement can be off by 100 MB, or by 500 MB, depending on the card and kernel: if you use an experimental Cycles kernel on an old card, we can be off by 500 MB at worst. So your scene might report one gigabyte and we might be using 1.6 in the worst case, and then you get a crash because we cannot allocate memory. And we say: we only need a gig, and you have a 1.5-gigabyte card, but it still won't work. There's no good technical way around this, unfortunately. We also cannot reserve memory, because memory is so precious: if you have a two-gigabyte card, you want to use as much of it as possible. And because we cannot know how big this is going to be, we also cannot cut you off before we crash. And your operating system, if you use that card for the display as well, will also need memory already.
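To make the numbers concrete, a rough headroom check along the lines described here could look as follows. The function and its default constants are illustrative assumptions built from the figures in this answer, not anything Blender or Cycles actually exposes:

```python
def gpu_render_fits(reported_mb, card_total_mb,
                    kernel_overhead_mb=500, display_reserve_mb=200):
    """Conservatively estimate whether a render fits in GPU memory.

    reported_mb        -- what Blender's image editor shows for the scene
    kernel_overhead_mb -- unmeasurable kernel allocation; per the talk,
                          up to roughly 100-500 MB depending on card/kernel
    display_reserve_mb -- rough guess for what the OS/desktop already
                          uses when the card also drives the display
    """
    estimated_peak = reported_mb + kernel_overhead_mb + display_reserve_mb
    return estimated_peak <= card_total_mb


# A scene reporting 1000 MB on a 2048 MB display card still fits
# (1000 + 500 + 200 = 1700 MB)...
print(gpu_render_fits(1000, 2048))   # True
# ...but 1500 MB reported may crash once the hidden overhead is added.
print(gpu_render_fits(1500, 2048))   # False
```

The only real point here is that the figure Blender reports is a lower bound, so any practical check needs a generous safety margin.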
So in practice, if you're really serious about GPU rendering, I really recommend cards with at least three or four gigabytes; everything below that is not good anymore today for real big shots.

Just one request: what about metadata on OSL shaders? You know, the metadata that describes the user interface of shaders.

Yeah, that is something I'd like to see added. I unfortunately didn't have the time to do it, but it's definitely something that would be cool to have. Without it, you sometimes use integers to fake booleans, for example, which is a bit annoying; it would be cool to have a better UI for OSL shaders. So I hope to see it added, but I cannot tell you when I'll have the time to do it; maybe someone else wants to. It's probably not a big deal: the OSL specification itself says how this should look and what information should be extracted from the OSL shader. We just need a parser to get that from the shader, make an RNA property or so from it, and put it into the node interface. So hopefully we will have this in the next few months as well.

Thanks. I have a short question. I was wondering if there's some kind of transition documentation for when you have complex scenes in the Blender Internal renderer and you want to turn them into Cycles-ready scenes. Is there some documentation with steps or guidelines on how to do that in a smooth way? Because right now, when I switch to the Cycles renderer, there's a lot I have to adapt and change, and that's a barrier, a step too far, for transitioning to Cycles.

Well, that's a very wide question, actually.
It really depends on the scene. Of course, you could have some documentation which tells you: when you have a shader with a diffuse material and an image texture, this is what you need to do to get the same thing in Cycles. But there are so many things to consider when you're comparing those two engines. For example, for Blender Internal it doesn't really matter much whether you have an interior or an exterior; for Cycles, the less memory and geometry you use, the better. I once saw a scene on Blender Artists, for example, where someone had an office desk; he rendered that desk and you couldn't see anything around it, but he had actually put a room around it, with walls and everything, and it was pretty slow. He thought: okay, I can just do the transition, adapt to Cycles, and keep the room and everything, although only that particular desk is visible in the render. And it was very noisy and slow. He provided the file for download, and I just removed the room and kept the desk, and it was much faster and less noisy. So there are a lot of things to consider, and it really depends on the scene. I guess if someone wants to make a tutorial about that, there is a demand for it. But it's really a wide topic: it depends on whether you have an interior shot, an exterior shot, characters, complex materials. You cannot really write one single guide for all of that; that's impossible.

Okay, thank you.

Thank you.
Perhaps I'm just being particularly stupid, but are there any good introductory tutorials or explanations of what some of the nodes in the Cycles system mean? I've read the wiki from top to bottom, and some of them are still completely cryptic, or worse, really just confusing; they make no sense to me at all. But then, I come from a physics background rather than a ray-tracing background, so I'm seeing things in the opposite order.

Yeah, I guess the wiki could be improved for the nodes, and maybe we could also add some examples there of what the nodes are doing. You can also look at tutorial sites like CG Cookie, for example, and watch the Cycles tutorials there. I watched the Shader Forge series on CG Cookie; it's really nice. At the moment they are building procedural textures with lots of nodes, so you can learn a lot from that. But yeah, some better guide would be nice.

I suspect I am atypical here, not typical at all. I like things systematic. I find these "we'll do a tutorial on this particular scene" approaches very hit-and-miss with regard to which particular things they cover for that particular scene. Actually, I quite like textbooks: just give me a list of all your nodes, tell me exactly what they do and how I should model them in my head, and I'll be very happy.

I just wanted to ask: in the material nodes, you can grab the texture you made, but you don't see the name.

In the node editor?

Yeah, in the node editor. You always see only text, but I don't get that, like in Blender Internal.
There you could also see that this material has this texture.

Okay, but I don't see it in the node editor.

Well, the node itself, when you add an image texture, does show the name of the image texture that you add there.

That's okay.

Maybe a preview would help there, for adding textures in the node editor. Maybe.

Okay, I guess we have to wrap it up here then. So thanks a lot for the questions and the feedback. I hope you enjoy Cycles; keep using it. And thanks for attending. Thank you.