Oh, hey everybody, hey, hey, hey. Gleb Alexandrov here, and welcome to another very exciting presentation. Today we're going to be talking about 3D nebulae and the power of the Blender community. So yeah, my name is Gleb. I'm a 3D artist and a Blender YouTuber, if there is such a thing, and together with Aidy Burrows we run the Creative Shrimp blog with tips and tricks about computer graphics, art and coffee brewing, and usually we include some kind of a music jingle before our tutorials. So I wanted to recreate this special kind of ambience just for you here. Thanks a lot. So why space, why nebulae? Because space is awesome, of course, but also in 2017 we released Space VFX, the ultimate guide for creating galaxies in Blender, and it went pretty well. But one thing that was missing was volumetric clouds and that sort of stuff. Of course, we papered over the cracks and used various two-dimensional techniques to create an illusion of depth, but in the end it was just an illusion, and we'd been wanting to play with the same toys as the folks who use proprietary software: massive particle systems with millions and billions of particles, fully volumetric objects that you can fly your camera around, that sort of stuff. And if you jump back to 2017 with me, you would discover that we could do nothing like that in Blender. Traditionally that area of computer graphics was dominated by proprietary software, and for good reason. On this pie chart we left just one percent of the Krakatoa-style nebula simulations to open-source software, but unsurprisingly most of the Krakatoa-style simulations (Krakatoa being a render engine meant for exactly that kind of stuff) were indeed made in Krakatoa. Here you can see amazing art by Teun van der Zalm, Vjekoslav Puzovic, Martin Mirol and other amazing artists, and like 99% of such stuff was indeed produced in 3ds Max and rendered in Krakatoa, or it was Houdini.
And how could we even compete with that? How could Blender catch up? Let's take a look. My hypothesis is that we could use the power of the Blender community, of open source, of open collaboration, to cross the divide, you know? So first of all, we've been donating money to make Blender better, and I mean all of us, not just us, but all of us, and some corporations too. But in 2018 we did something amazing: the Blender user base basically crowdfunded the Code Quest project to fly developers to Amsterdam, lock them in a room and make them work for the common good, hoping that would lead to improvements in all areas of Blender, and in volumetric rendering, right? Without the never-ending support of the Blender community it would be highly unlikely that, for example, 16 developers could work on improving Blender, or that Clément Foucault could come and single-handedly code something like the volumetric rendering system for Eevee, the new real-time render engine. When I think about it, that's insane. Now we have real-time volumes right in your viewport; that's the holy grail of 3D rendering, and now we have it. That, I think, is pretty damn insane. Since 2017 we got about 120 code commits directly related to volumes and volumetrics, like the Principled Volume shader coded by Brecht van Lommel, other amazing features, and most importantly bug fixes and stability improvements. So now you have, for example, an amazing improvement coded by Kévin Dietrich, the Cycles volume fast empty-space optimization, whatever, and every improvement like that made Blender render volumetrics a bit faster.
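To give a rough feel for what an optimization like that does, here is a toy Python sketch of empty-space skipping in a volume ray march. It assumes a simple density grid and a coarse occupancy mask; it is emphatically not Blender's actual implementation, just the concept of jumping over tiles that contain no density.

```python
# Toy sketch of the idea behind "fast empty space" volume optimization:
# before marching a ray through a density grid, build a coarse occupancy
# mask so whole empty blocks can be skipped instead of sampled cell by cell.
# NOT Blender's actual code -- an illustrative assumption-laden sketch.

def build_occupancy(grid, block=4):
    """Mark which block x block x block tiles contain any density."""
    n = len(grid)
    nb = (n + block - 1) // block
    occ = [[[False] * nb for _ in range(nb)] for _ in range(nb)]
    for x in range(n):
        for y in range(n):
            for z in range(n):
                if grid[x][y][z] > 0.0:
                    occ[x // block][y // block][z // block] = True
    return occ

def march_x(grid, occ, y, z, step=1, block=4):
    """Accumulate optical depth along +X, skipping empty tiles."""
    n = len(grid)
    depth = 0.0
    samples = 0
    x = 0
    while x < n:
        if not occ[x // block][y // block][z // block]:
            x = (x // block + 1) * block  # jump over the whole empty tile
            continue
        depth += grid[x][y][z] * step     # sample only where density lives
        samples += 1
        x += step
    return depth, samples
```

With density only in the last quarter of a 16-voxel row, the march takes 4 samples instead of 16; the same accumulated depth for a fraction of the work, which is why every such commit made volumes render a bit faster.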
So all in all, we can now enjoy a Blender that is about 100% faster where volumes are concerned, and if we factor in the hardware growth that occurred naturally over the years, we can say it's like three times faster or something like that, which is representative of switching from a video card of the previous generation to a mainstream gaming video card of the current generation, plus the bug fixes and the performance improvements coded by the amazing Blender developers. So is that enough for us to render amazing Krakatoa-style nebulae? I would say kind of yes, but also no; something else needed to happen. I want to show you the volumetric slides of the interactions within the Blender community, no pun intended: how people talk to each other, share ideas and generally ping-pong ideas off each other to create something unbelievable. On the right you will see the progress bar; at the top there is the Krakatoa mark, the ultimate goal. Of course I'm a little bit biased, I consider myself to be a part of this community, but I will try to keep it reasonable.
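The back-of-the-envelope math behind that "three times faster" claim looks something like this; the software gain is from the talk, while the hardware uplift figure is an assumed illustrative number, not a measurement.

```python
# Rough compounding of the speedup claims above.
software_gain = 2.0   # "about 100% faster" Blender on volumes since 2017
hardware_gain = 1.5   # ASSUMED uplift from one GPU generation to the next
total_gain = software_gain * hardware_gain
print(total_gain)     # combined speedup factor, roughly "three times"
```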
So let's start. Procedural patterns took us through the initial stages of research, I mean the stuff we designed for the original Space VFX. Procedural noises in Blender are like generators of everything; especially if you combine a few noises together, you can quickly create patterns that resemble astronomical objects like nebulae. And what I like about procedural noises is that they have built-in infinite detail, so you can fly right into the cloud, right into the noise, and see new levels of detail unfolding right in front of your eyes. The standard Blender noise, in other words the Perlin noise, gives you that opportunity to experiment with all kinds of patterns and to have built-in fractal-like detail. And right at that time, somewhere in the depths of the Blender Artists forum, Benny Goverts invented a pretty clever way of utilizing 3D textures, 3D patterns: plug one into the emission shader, then into the volume input of the material, to create a volume light in the most literal sense of the word. And it already looked almost like an emission nebula, if you think about it. So we took it a step further and, after playing a little bit with the volume light, plugged it into the volume shader instead. Here's how it goes: basically you take a cube of volume light, and you can play with the density, or rather the strength of that light, by utilizing the three-dimensional noises of Blender. You can combine a few noises together if you want to create a slightly more detailed noise, and after playing with the color ramp to colorize it, you discover that you've created something that looks roughly like an emission nebula. Let's marvel at the beauty of the universe for a second. And then you just recombine it with the volume shader instead, to get the juicy features of the volume shader: absorption, scattering, anisotropy and the other physically correct properties of the Principled Volume shader coded by Brecht van Lommel and
inspired by Disney. So you get something like a dust cloud. And we knew we were on the right track when the grandmaster of nebula design, Teun van der Zalm, gave us a tap on the shoulder and said that it looked just like particles, and I think that was the point, because we were aiming for that kind of look. Bam, the B3D factor kicks in: community-generated progress. You'll be seeing this a lot. And back to particles: Juan Gea optimized the particle generation so we could play with more and more particles on screen, and when Gottfried Hofmann noticed it, he just removed the hard-coded particle limit altogether. After ping-ponging some ideas with Gottfried, we invented a pretty interesting way of utilizing the inherent motion of the particle system to apply motion blur and create an impression that there are more particles on screen than there actually are. Not only that, but Gottfried also wrote a script that freezes time, so we can fly our camera around this constellation. And if you think about it, that's pretty insane: we had a static nebula which showed motion blur due to the motion of the particles within the cloud. That is pretty cool, I think. Yeah, Gottfried, this one's for you. But particles are fun only when combined with physical forces, so we watched many tutorials on the Blender side of YouTube; thankfully you can find anything you want there, the Blender tutorial business was and is booming. So we watched a bunch of tutorials, including the tutorials by Yago Mota, the best we could find. Yago's tutorials were insanely valuable for us; we were able to familiarize ourselves with that aspect of Blender in no time, and we were excited to try it out for our own stuff, for our own nebulae. So I would like to show you a few demos that we managed to create without actually burning our computer, because smoke simulation was fun. The shapes it produced looked pretty neat, but at the same time it had a bunch of problems. Well, the
biggest problem was the resolution of the smoke: unlike a procedural texture, you didn't have built-in infinite detail, so we needed something to push it to the next level. We thought about stuff like volume displacement, like the volume displacement in the Arnold render engine, that could help us do that. Volume displacement does exactly what it says on the tin: it takes the volume and displaces it using some kind of a texture. We even wrote to Ton Roosendaal asking whom we had to bribe to get this feature into Blender, and as it turned out, Brecht van Lommel would be willing to check on this in the coming days. Yay! But we wanted it now. So after talking to Robert Schütze, we discovered that Robert had coded a little Blender Cycles volume importer that allowed us to import volumetric data from Houdini and other applications supporting OpenVDB, as slices, as two-dimensional images that then get recombined within Blender, within the shader, to create a three-dimensional representation of the smoke, or of the nebula in our case. That was pretty cool, but we didn't know how to properly install the OpenVDB library for Blender to be able to export stuff from Blender and then import it back, and then do things like distorting the UV input and displacing the cloud to add extra detail. So yeah, we put it on hold, and instead Gottfried proposed to use the point density shader. The point density shader indeed works just like volume displacement, when you think about it. Let's see how it works, actually, I want to show you. So first you run the smoke simulation; this one, by the way, is controlled by a turbulence force field. Then you add some particles, and you make the particles follow the smoke using the smoke flow force in Blender. Then you turn these particles into tiny volumetric points in space, or voxels, using the point density shader. But, surprise, it looks cool and detailed only when combined with procedural noises. And
let's strip it down and actually see how it looks without any enhancements. Let's get back to its initial form, and you will see that it's a bunch of voxels floating in space. Then let's get our resolution back and zoom into the cloud just a little bit, and you see: bam, no details whatsoever. Let's distort it using the noise; now you have details, right? But it was pretty unstable, I would say unstable as hell. So we decided to come back to our comfort zone instead, and thankfully, right at that moment, Simon Thommes, the node genius, the math genius of Blender, released a procedural noise pack which had this amazing advanced noise group that works pretty well when combined with the volume shader. It had the right balance of big, medium and small shapes, and we don't quite understand how it works; basically it's a bunch of noises combined to create a sort of mega-noise, which produced pretty interesting patterns when plugged into the volume shader. Because every kind of noise in Blender is three-dimensional, it works perfectly with volumes, and this noise in particular helped us a lot and empowered us to keep experimenting with that technique of rendering volumetric objects with procedural noise. That is the default cube, by the way; let's zoom into it a little bit, and you will see that it holds up pretty well, because the details given to you by the procedural noise work with volumes perfectly. And Omar Emara took it to the next level by extending all the Blender noises, I think, or was it just the Perlin noise, to other dimensions, namely to the fourth dimension, which is time, so we could play with the evolution of the noise, which would shift over time. That is pretty incredible; Omar did it during the Google Summer of Code, and that is really, really great and really helpful. Boom, the B3D factor goes up yet again. Another branch of deep space research was aimed at fractals. A fractal is a never-ending pattern, an infinitely complex pattern, that could be very useful indeed for
simulating infinitely complex formations like nebulae. So Jonas Dichelle was the one who showed us how to do fractal math in Blender, and Jonas not only produced an absolutely incomprehensible, I mean great, tutorial, but also shared the blend file. We took that blend file and did practically nothing with it, because our math knowledge wasn't up to the task, but anyway, it was great, and now everyone can try to render a fractal in Blender, which is pretty neat. Robert Schütze also experimented with fractals: basically, he took a WebGL "playing marble" shader or whatever and recreated it using Blender nodes, so for example we could take this shader and render, yeah, infinitely complex stuff, like a tiny universe inside a box. It's amazing, right? Yeah, round of applause for Robert. You'll notice the spaghetti monster going on, and you don't have to understand how it works; it just works, it's a fractal. But have you ever tried art-directing a fractal? You don't have to answer, because it's nearly impossible, you know. Incidentally, Robert also coded a custom internal volumetric sampler, or whatever it is, which allowed us to calculate fractals four times faster right out of the box. Boom, community-driven progress, sweet, sweet open collaboration. Back to procedural noises: thanks to the Simon Thommes noise and just the Blender default noises and procedural patterns, and thanks to the Principled Volume shader coded by Brecht van Lommel, we managed to arrive at patterns, at astronomical-looking objects, that we really liked, that really resembled the real stuff. But, surprise, it was slow, because volumetric rendering can be slow in general, and the default Blender denoising didn't do a great job with it. So the game kind of changed when Grant Wilk released the D-NOISE add-on that brought NVIDIA's OptiX AI-based denoising to Blender. It performed much better than the default 2.79 denoiser,
especially in such insane cases as the particle system and low-resolution stuff like that, and we got a dramatic performance boost, at the cost of insane rendering artifacts, especially in animations, because the denoiser processed each frame slightly differently, so the animation was jittery in the end. But we could live with that, and thankfully, right at that time, Stefan Werner added the Intel Open Image Denoise to Blender, which wasn't hardware-specific, because OptiX only worked with NVIDIA video cards. And yeah, Intel Open Image Denoise was as magical as OptiX; it simultaneously appeared in many places, including the Theory build. So hopefully by now you can see how the Blender user base, like a hive mind, has been making the prospect of rendering amazing three-dimensional nebulae in Blender a bit more real. So for example, we can sprinkle it with the Creative Shrimp magic and render something like that; that is, by the way, Cycles, the good old path-tracing render engine of Blender. I want to show you a few demos and take a sip. Thank you so much. And then Eevee came out, and it changed the game once again. Eevee, obviously, is the semi-real-time render engine which now supports volumetric rendering, thanks to Clément Foucault and other amazing Blender developers, and so the Blender community rushed to port their space objects to Eevee. Just within a few months after the official 2.8 release with support for volumetrics, we got dozens of renders by Brent Patterson, dozens of amazing blend files shared, and sharing is at the heart of the Blender community, at the heart of the open-source movement. It's very useful indeed to have access to all the files. We got amazing breakdowns and resources by Curtis Holt and others, a clever Z-depth hack that paved the way for smoother-looking animations, Mark Kingsnorth released a Nebula Generator add-on and some amazing artworks, and Gottfried Hofmann did something again, because Gottfried was pretty active
during that research process. Stefan Wink absolutely killed it by sharing a real-world, or rather should I say real-universe, example: the Pillars of Creation rendered in Eevee. On the left you can see the Eevee render, which took three minutes, while the Cycles render took 40 minutes. What would you prefer, hey? And I want to show a few Eevee demos now. Yeah, if that is not impressive, I don't know what is: just within a few months after the official 2.8 beta release we moved from zero to Pillars-of-Creation-type nebulae, seen right in your viewport, almost in real time. But even though Eevee had been maturing very fast (boom, progress), it had yet to beat Cycles in terms of raw visual fidelity, in terms of details and physically correct light transport and that sort of stuff. The best nebulae we created so far were created in Cycles, with some Eevee help, but it took a long, long time: one frame could render for six hours or more on an average video card, and eight seconds of animation could easily take 50 days to render. That made us sad, you know, because we wanted to render animations, and on an average render farm that could cost up to four thousand dollars, something like that. That made the duck really sad. So what could we do? Yeah, we actually asked people, and people responded once again. We did our own research as well, but the Blender community was unbelievably helpful during that process, so I won't bother you with all that stuff, especially since we're running out of time. Stefan Werner suggested we use the Light Path node to kill the details in the shadows and optimize our nebulae that way. Derek Barker just sent me the link to the custom Theory build that out of the box gave us a 1.3x, whatever, a huge boost in performance. Tiago de Sul shared a pretty clever hack of disabling the ray visibility of every ray except the camera rays, because the nebulae don't interact with anything else in the scene except themselves, so we can
disable all the rays and get our much-wanted performance boost. That was very clever. We changed the aspect ratio to make it slightly more Panavision, to cut the top and bottom of the image and optimize it this way. And we removed seven, I know, that is phenomenal, I know, don't clap, I know that is genius, we removed seven light sources out of nine, because you can always optimize it a little bit more. But now hold on to your chairs: do you know what the E-Cycles add-on is? It's a wonderful add-on that is basically a supercharged version of Cycles, and the creator of the add-on, Mathieu Menuet, shared a custom build of E-Cycles specifically optimized for nebula rendering, for volumetrics. It had just one checkbox, nebula optimization, something like that, and it gave us a huge, immense boost without altering the quality. Think about it: in some scenes it was 30% faster (this is the checkbox, by the way), and in other scenes it was 16 times faster, yeah, with some loss in detail, but anyway. Incidentally, the new version of E-Cycles is released today. We hoped that Linux would bring another competitive edge in terms of GPU rendering performance, as suggested by Steve Lund and other folks from the community, but we got a mixed bag of results and decided not to hurry up too much. Crazy stuff: we hoped that Andy, aka DeepBlender, would release a super-resolution add-on for Blender that would allow us to render at 50% of the resolution and upscale everything to the original size without quality loss, and yeah, I hope that one day he will do it. And my brother Roman Alexandrov implemented an even more shameless hack: basically, he coded an optical flow plug-in for Blender. Do you know what optical flow is? Basically, it allows you to slow down a video sequence smoothly, without stuttering; it's usually used in After Effects and DaVinci Resolve. But if you can slow it down, you can skip
frames: you can render each third or even each fourth frame, speed up your rendering by up to 400%, and then reconstruct all the frames using the optical flow algorithm, and in some cases you even get away with it. So yeah, we got another performance improvement; for an average type of scene, I believe, we skipped just every second frame, but anyway, multiply it by 2, or by 5. Bam, the B3D factor goes up once again. So by squeezing every bit of performance out of Blender, by picking the community's brain, by sprinkling it with the Creative Shrimp magic, by utilizing techniques that originated within the community, performance optimization tips and custom builds like the Theory build or the E-Cycles nebula build, we managed to move from six hours per frame to much more manageable figures, like less than 20 minutes per frame. That meant we could render this stuff overnight (the preview animation, I want to emphasize, in preview quality), but yeah, we could render it, and that meant we could submit it to a render farm and it wouldn't cost us an arm and a leg, because now it became possible.
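The frame-skipping trick can be sketched like this. The real plug-in reconstructs in-between frames from optical flow motion vectors; in this toy sketch a plain linear blend on a single "pixel" value stands in for that, purely to illustrate the pipeline and the speedup arithmetic.

```python
# Toy sketch of the frame-skipping idea: render only every Nth frame,
# then reconstruct the in-betweens. A linear blend stands in for real
# optical flow here -- NOT the actual plug-in, just the concept.

def reconstruct(rendered, n_total, step):
    """rendered maps frame index -> pixel value (a single float here),
    for frames 0, step, 2*step, ...; assumes the last frame is rendered.
    Missing frames are filled by blending their rendered neighbours."""
    frames = []
    for i in range(n_total):
        if i in rendered:
            frames.append(rendered[i])
        else:
            lo = (i // step) * step            # previous rendered frame
            hi = min(lo + step, n_total - 1)   # next rendered frame
            t = (i - lo) / (hi - lo)
            frames.append(rendered[lo] * (1 - t) + rendered[hi] * t)
    return frames

def speedup(step):
    """Rendering every step-th frame cuts render time by this factor."""
    return step
```

Rendering each fourth frame gives `speedup(4) == 4`, the "400%" from the talk; skipping every second frame, as we did on an average scene, gives a factor of 2.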
Thankfully, such a thing as distributed rendering has become a viable option. Distributed rendering means that people share their computers to do rendering tasks that would otherwise take weeks and months to finish. Probably the most popular distributed render farm is SheepIt, and to give you some details about SheepIt: 100 million frames rendered since 2012; up to 700 machines simultaneously rendering each day; a thousand years' worth of Blender projects since 2012, including our nebulae. Special thanks to Brent Patterson for helping us out with rendering this stuff on SheepIt. So yeah, the seven nebula sequences could take five months to render on a single PC, and it took three days to render on SheepIt, with an approximate cost of five million SheepIt points. I'm not sure about the carbon footprint, I'm not qualified to talk about that, but anyway, it was very satisfying to watch how people connected their computers to help us render this stuff out. And did I say that I will show the amazing demo reel at the end of the talk? Did I? No, not really. I will: I will squeeze the duck, and there is a special button inside, and it will launch the demo reel, so get ready. Okay. Thankfully, not only collaborative render farms have been changing, but also commercial ones, so render farms with distributed infrastructure allowed us to have pretty competitive prices, like 0.5 to 0.75 dollars per hour, thanks to the nodes scattered worldwide; it worked kind of like Uber for rendering. So we rendered, I think it was four sequences there: 200 hours versus one hour, and it cost us something like 140 dollars, which is not bad, I think. So with the help of collaborative rendering and commercial render farms we pushed it to 11 out of 10, and I don't want to disappoint you, guys, but I don't think we quite reached the proprietary-software level, the Star Trek end-credits quality. Not that we wanted it; it's not an arms race, and I don't think the actual state of affairs is as important as
the trend, and the trend is pretty clear to me. Now I want to give you a very exciting demo reel indeed, so can you count down from 10, or let's say from five to one? Okay: five, four, three, two, one, go... Wait, wait, wait, hold on a minute. Gleb! Can you see me? Oh goodness, look at the bigger theater this year. Nice. Look, I don't mean to intrude, but since I am Captain Disillusion, world's greatest Blenderer, and outer space is kind of on-brand for me, I think I should be the one to press the button that starts the demo. Okay, it makes sense. All right, great. Of course, in order for my signal to reach all the way to your laptop from here, I had to rewire a few things again and give it more juice, so get ready for the ride of a lifetime. Here we go: one, two... Okay, how about I just show you the view out of my window instead? Yeah, thank you so much, thank you. The duck loves you all. Yeah, I'd like to thank the many people who have been helping to push this stuff forward. The duck adores you all, and yeah, thank you very much.