Hey, good morning everyone. You're all awake? Good. Anyway, welcome to the pipeline presentation of Spring. This is really a first here, because this is a little bit more technical than I normally get, and maybe it's not exactly my area of expertise, but I'll try to explain it as best as I can. For those of you who don't know me, I'm Andy, I work at Blender, and in April we published — well, premiered — a film called Spring, which we made over the course of a year at the Blender Institute Animation Studio. For me this was really a first, because it was the first film I was actually directing, and I had a number of other tasks during the production. So I was not only scrambling to make this a good film, I was also doing lighting, environment modeling, the occasional matte painting, a lot of simulation, and some rigging for the simulation — the hair grooming for Spring was also a lot of fun to do. Then shot assembly, which ended up eating most of my time, and that's most of what this talk is going to be about. And finally some occasional rig fixes, because we didn't have our rigger in-house and some things needed to be fixed — the animator had to do rig fixes too — and grading. So there's a big number of things I was fortunate enough to get my hands on, and that was really cool, because I could look into all these different areas of filmmaking, and that's why I'm giving this presentation: I want to share that with you. Filmmaking is really complex — for a short film it's a little bit simpler, but you still have to do all these steps to get to the final product. So how do you do those things, and how does everything come together?
If you read up on this stuff on the internet, you get a lot of simplified versions of it. There's this pipeline thing, and the word pipeline suggests that filmmaking is a linear operation: you go from the first concepts to the storyboards, then you make an animatic, and you end up somewhere with the final film. It's really not like that. A lot of things overlap during the whole production, so a lot of departments have to work with each other and communicate — that's a very big challenge of filmmaking, and the pipeline is really there to facilitate that, to help people communicate with each other. On the most fundamental level it's about putting all the stuff together, and that's what this talk is about. I don't have enough time to show you everything — you could literally fill an entire conference with the things we learned during this production — but here we go. Another thing is what makes us as the Blender Animation Studio different from the rest, apart from the fact that we use Blender. For Spring we used Blender 2.8 — we switched to it halfway through the production, and that was really wild, because everything was super nice and cozy in Blender 2.79, and then we had to port our entire workflow to Blender 2.8 while it was being developed. That was super challenging, and it influenced the way the film was made in many ways.
Another thing is of course that we are an open source studio, so we try to be very open about everything, and we want to use open source software for filmmaking, because we don't think filmmaking should be limited to big studios that can afford to push all this money into development. We use Blender for almost everything, there's some 2D work that has to be done in Krita, Inkscape, GIMP and that kind of stuff, and there are other open source tools that we use and develop as well. The next challenge is that we don't have a really big team — for Spring it was a bit more than a handful of people making this film, and that means everyone has to do a lot of different tasks. As you could see from my first slide, everyone had to engage in a number of things they weren't originally planning to do, because sometimes you just have to put out the fires where they start. A big thing was also that we didn't have a full-time TD on this project. We had a lot of really great developers who were putting all their energy into Blender 2.8, and the filmmaking aspect of that was sometimes on the sidelines — you want to do all this great stuff with Blender, but you can't do everything. So the pipeline ended up being very manual, very grungy, very down to earth, because a lot of the things that we would otherwise automate, we couldn't — there was simply no time for it. That's what you'll see here.
Another big thing is that we published everything on the Blender Cloud as we went. During Spring we had weeklies on Friday, and on Monday when we all got back into the office, we published them on the Blender Cloud, so everything was out there immediately for people to see. We gathered this huge library of all the concept art and the work-in-progress of the film on a shot-by-shot basis, which was really cool, but it also exposed us to a lot of input — and of course you want to follow everyone's advice and make sure you're not doing anything stupid, so that was also really challenging. So let's figure out how to make a film. I think the most important, most down-to-earth thing to start with is: where do I put everything? In the studio it's a bit of a twofold thing — a bit more complicated than that, of course, but for us artists there are basically two things to worry about. We have our own hard drives, and those house the production repository of Spring. This is where we store all the things that make up the film, and it's managed by a versioning tool called Subversion. It's fairly old, but it handles binary diffing really well, which is why we still use it, and the artists commit and get updates from that SVN. Then we also have the shared storage, which I'll just refer to as /render, because that's what we call it in the studio, and that's where all the big stuff ends up. I'll get into a bit more detail. These are the most important folders of the Spring repository: we have the libraries, we have the edit, we have all the shots and the scenes and the tools — we put all that in those folders.
In libraries, we have folders for characters, for props, for environment assets, for the sets, for some extra nodes, and maps for generalized textures. In characters, we have subfolders for each character, and that's where the actual blend files for the characters end up. It's usually a big mess of files, but there's a main file — for example spring.blend — where you find the character as a collection, and we have maps which are specific to that character. It's similar in props: you have maps again, just for the individual props, and this is where all the props end up. Sets is where we store the sets, and then there's a bunch of other things: edit contains the main edit — I'll come back to that later — and in nodes we store some general node setups. Tools are tools specific to that production: scripts, add-ons, that kind of stuff. The main thing is scenes: scenes contains the whole shot structure of the film, and in subfolders there we manage how the different scenes come together and the shots within those scenes. I'll get to that later also. On the other side of things, for the big data storage, we have /render. This is where the caches go, and this is where the final frames go. We have shots and frames, which are kind of similar: frames is more like the render farm output, so all the raw stuff from our internal farm gets dumped there, and shots is where it gets cleaned up later on and everything gets nicely put together, so we can refer to it for the rest of the process. We have plates for background plates that get pre-rendered and used in individual shots, and export is where the latest version of the film always is — each Friday we render an export, but I'll get to that later.
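To make the layout above concrete, here is a hypothetical sketch of it as a small Python manifest. The folder names are taken from the talk, but the exact spelling and nesting of the real repository are assumptions:

```python
# Hypothetical sketch of the Spring repository layout described in the
# talk; the exact folder names and nesting are assumptions, not the real tree.
REPO = {
    "libraries": ["characters", "props", "environment", "sets", "nodes", "maps"],
    "edit": [],      # edit.blend, the main edit of the film
    "scenes": [],    # per-scene folders with per-shot subfolders
    "tools": [],     # production-specific scripts and add-ons
}

RENDER = {
    "frames": [],    # raw render farm output
    "shots": [],     # cleaned-up per-shot frames
    "plates": [],    # pre-rendered background plates
    "export": [],    # latest weekly export of the film
}

def repo_paths(root, tree):
    """Expand the nested layout into a flat list of folder paths."""
    paths = []
    for top, subs in tree.items():
        paths.append(f"{root}/{top}")
        paths.extend(f"{root}/{top}/{sub}" for sub in subs)
    return paths
```

`repo_paths("spring", REPO)` lists paths like `spring/libraries/characters`; a setup script could feed these to `os.makedirs` to scaffold a new production.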
These are the two things that we as artists have to worry about in the studio, the two locations that we mainly interface with, and how this interfacing actually works I'll explain over the course of this presentation. Let's get to actual production. You've seen this earlier, and it's a bit simplified, because I don't include everything of the whole production process — concept art is not in there, and some of this stuff, like character development, might start even before the storyboarding, or happen in a different order. The main thing to take away is that there are basically two things here. There's one thing that goes on during the entire production, and that's asset production: you're developing all the things that you put into the shots later on. And then you have shot production, which is more of a cyclic operation, because you have many different shots — in Spring we had 102 — and for each shot you need to set it up again and link stuff, so there are a lot of cycles happening there. Let's start with the first bit: how did we do the storyboarding for Spring? Storyboarding is a very versatile process. We've seen a lot of great work by Tangent Animation, how they work their magic with Grease Pencil. For us, all the tools were still in development, and we were a bit uncomfortable with them, because we're not storyboard artists per se. Our storyboards were done on post-it notes. We started out using Grease Pencil and a number of other things, but it became too much of a thing where you have to worry about how the tools work, and that was really getting in the way.
For us, the best communication tool was just to put the stuff on the wall, because it was me, Hjalte and Pablico, our animator, putting these storyboards together — a wide range of different styles — and we had to communicate about it, and post-its were really the best way. We then scanned those in batches on A4 paper, cropped them, and put them into the Blender sequencer, and this is how we put the first version of the film together. This is how the first animatic was born, and it's also the genesis of edit.blend, which contains the entire edit of the film during the entire production. It changes over the course of the production, of course, and you move things around, but it's always your film — you always have a film to look at — and it'll come back during this presentation. The first animatic is a really crude thing with really crude sound effects, but it helped us get our minds around this thing, because it's really complicated. It's very good to have a version of the film, even in the simplest form, at every point of the production. While that's going on, there's also character development, and I could go on and on about character development. It's really important — it's currently our focus at the studio, with the Rain rig and the Rain tutorials, to make this process better — and it's a time-consuming process, and the most important one, because your character is the main focus of the entire story. It's how you connect with it, and it's so important. So there's a lot of back and forth between concept artists, modelers, sculptors, texture painters, animators and riggers, and that goes on during the entire production — it only ends when the last frame is rendered. So it's immensely complicated.
The second most important thing, I would say, is props, because in Spring we had a lot of props that were actually related to the characters, and we also started fairly early — I might be a bit wrong here, but we started them fairly early, because the staff, the chimes and other things were really important to the story — and that also goes on to the end. Props are usually just objects, but they can still be rigged, because they need to be animated by people and constrained to the characters. So they're the second most important thing, because they're also there on the screen, and they move and everything. Then, layout. After storyboarding is done — again, this is very simplified — you want to get a three-dimensional sense of what the movie looks like, and layout is really the best tool for that, because you're mocking up how everything looks in 3D, and you get a lot of clues about where to place the cameras, how sets are built, that kind of stuff. For us the layouts were made in big files, one blend file per scene, and all the edits within that blend file were done with camera markers. We would place the markers and put a lot of cameras in the scene to edit the film together, to find a way to put the storyboard on the screen, and of course that would also get put into the edit. You can see there's very crude animation — a lot of skiing dogs — but it really helps to put everything together, and I always think that's funny. While the layout is in progress, it's also a very good tool to get your mind around what this movie is made of: all the different assets that need to be put into shots. We need trees, we need rocks, we need plants, we need icicles — at that point it starts to dawn on you what kind of work you need to put into this.
This is also where we slowly start to work on an asset library, because it can be very mind-numbing to think of the whole movie as this very complex shot-by-shot thing; if you have a library of stuff that you can just bash together, it's a bit easier. So library work started really early on — Julian did a lot, he basically modeled the entire film — and it also goes on during the entire production, because something is never finished. Once you have a good library, or at least some assets, you can start putting the sets together. Sets are really the background for each shot, and in the set file we link all the assets from the environment asset library — nothing ever gets appended, it's all just references. Set building usually starts from the layout: in layout you get a rough sense of how everything is laid out, so we start from the ground planes and some rough models made during layout, put them into the set files, and then detail them or replace things, make collections and place assets. One of the most important things during that time is ground contact, because over the course of the production you might still be working on the set when animation starts, and the character still has to put their feet in the right places on the ground, so you have to watch your ground contacts at all times. At this point we also put the cameras from the layout into the set files, but we didn't have a good way to keep them up to date — we'll get to that later. We had six main set files for the whole film — you can see them listed here with their names — and one of the most important ones was the river bed with the pillar, which was literally used in almost 80% of the whole film. The rest of the sets were really one-offs, constructed out of environment assets for specific shots.
At some point, the layout is done — it's never really done, but at some point you have to push production forward — and this is a crucial step, because layout is a very organic and messy process, so the stuff you create there needs to be cleaned up for the rest of the production. I've said we have this edit, with all these scene strips in it, and at some point I go in there and place color strips to mark each individual shot. The cool thing is that we have the Attract add-on, which adds the shot to our internal cloud database, so we can actually access it from the Blender Cloud, and all Blender Cloud users can too. This is where we get the big picture of how the film is put together on the shot level. We need to name shots, but we can also see how long each shot is, because we get that directly from the frames in the sequencer, and that's crucial information for the next step. By the way, how are scenes and shots named in this film? We had 11 scenes — 10 plus credits. The scenes were just given individual names, because on a short film you don't have that many, so you don't have to number them; as a reminder, we give them interesting names, usually prefixed by a number. On the shot level it's a bit more complicated: we have the scene number, then the shot number, which we increment by five. Why do we do that? It's sometimes easier to add a shot in between if you have some space. On a bigger production you might want to increment by 10 or by 100, if you don't know how many shots you're going to add in the middle during the whole process. And then we give it a letter at the end for takes, because sometimes you want to do a different version of that shot.
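The naming scheme above — scene number, shot number counted in steps of five, and a take letter — fits in a tiny helper. This is a sketch; the exact zero-padding and separator are my assumptions, not necessarily what Spring used:

```python
def shot_name(scene_number, shot_index, take="A"):
    """Build a shot name: the scene number, then the shot number counted
    in steps of five (so extra shots can be inserted in between later),
    then a take letter. Padding and separator are assumed for illustration."""
    shot_number = shot_index * 5
    return f"{scene_number:02d}_{shot_number:03d}{take}"
```

So the third shot of scene 2, first take, comes out as `shot_name(2, 3)` → `"02_015A"`, leaving numbers 011–014 free for shots added in between later.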
On the blend file level, we give the blend files little dot-notation appendages to signal the task that blend file is associated with, and these are also put in the shot level of the production tree that you saw earlier. Once this cleanup is done, you can kickstart animation, basically. Usually it's a very stumbling kind of process, because animation also takes a lot of preparation — the rigs need to be done, and a lot of stuff needs to be ready by that time — but you have to put the layout into a form you can use for animation, because the animators need files to work with. They can't start from a layout file, which is incredibly messy. So we create a file called shotname.anim.blend, and this file links in all the character assets, all the props that are needed for the shot, and the sets as a background, and this is really the bread and butter of what the animator deals with. So, how does that look? We're preparing a blend file, basically: we start with a completely empty file, and then we just go File, Link, and link in collections from the rest of the production tree. For the characters we had rigs inside collections, so we used the old proxy system for that. We had a tool left over from the Agent project — which was a bit icky to get working with Blender 2.8 — that helped us prepare individual things within the file, but not a lot was automated there.
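The dot-notation suffixes mentioned above can be sketched like this. The task names `anim` and `lighting` come from the talk; `sim` and `comp` are assumed abbreviations for the simulation and compositing files described later:

```python
# Task suffixes: "anim" and "lighting" are from the talk,
# "sim" and "comp" are assumed abbreviations.
TASKS = {"anim", "lighting", "sim", "comp"}

def task_filename(shot, task):
    """Append the dot-notation task suffix to a shot's blend file name,
    e.g. shot '02_015A' with the 'anim' task becomes '02_015A.anim.blend'."""
    if task not in TASKS:
        raise ValueError(f"unknown task: {task}")
    return f"{shot}.{task}.blend"
```

The point of the convention is that you can tell from the filename alone which stage of the shot a blend file belongs to.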
Then the most important thing is the frame range — how long the shot is going to be. We get that from Attract and copy-paste it into the file. We usually start the blend files at frame 101 to give us a nice pre-roll for simulations and that kind of stuff, and we also add handles of five frames, so whenever there's an edit change — or even just for motion blur glitches, you know, when a character is moving into a shot, you don't want it to be static before the shot starts, you want it to have some movement — there is no first frame that looks out of place. Then we set up the collections for the animators, and those get prefixed with the shot name too, because we might want to link them into other files later on. Sometimes the animator also makes their own collections to manage everything, so it's not all set in stone. And then there's a crucial step: putting the camera from the layout into the animation file. We used normal cameras with object animation for the layout, but we had a camera rig for the production, because we wanted some additional controls, we wanted to use actions to animate the camera, and we wanted to give it different pivot points. So the camera rig needed to get its information from the layout camera, and as some of you know, armature coordinates and world coordinates don't always play well together, so most of the time we just had to hand-animate the camera and do our first camera pass at that level. Then you prepare everything else, like setting different visibilities — the sets are usually set up so you can hide certain things, so the animator doesn't have to see everything at once, and some things are just low resolution, so we make sure that only the essentials are there for the animator to worry about.
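The frame-range bookkeeping described above — start at frame 101 for pre-roll, five-frame handles around the cut — could look something like this. How exactly the handles sit relative to frame 101 is my assumption; this is one plausible layout, not necessarily the one Spring used:

```python
def shot_frame_range(edit_duration, preroll_start=101, handle=5):
    """Compute the working frame range for a shot file: the file starts
    at frame 101 to give simulations pre-roll, and five-frame handles
    pad the cut on both sides so edit changes and motion blur at the
    cut have real movement to work with."""
    start = preroll_start                  # first frame of the file
    cut_in = start + handle                # where the edit actually cuts in
    cut_out = cut_in + edit_duration - 1   # last frame used in the edit
    end = cut_out + handle                 # trailing handle
    return start, cut_in, cut_out, end
```

A 48-frame shot from the edit gives `(101, 106, 153, 158)`: the animator works on frames 101–158, while the edit only uses 106–153.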
Then, most of the time, we put a reference of the edit into the file: I mentioned the exports, so we always put the latest export into the file as a sequence strip, lined up with the actual start and end, so the animator gets some context of where they are in the movie — the previous shot and the next shot. Once the animator is doing animation, they do viewport renders and save them in sequential order on our shared server, and this is the kind of stuff we can critique and make notes on — you're saving that into shots, and you're referencing it from the edit. The next crucial step is naming the actions: all the actions get prefixed with the shot and then the character they're associated with. While animation is going on, you can already start doing lighting; during Spring we usually started lighting roughly after the first blocking, so you already get an idea of how the shot is going to look. We create a lighting file — this is called .lighting.blend — and that links in the characters and the props and the sets, but since we're constructing a shot, we're also linking in individual environment assets and changing the environment for this specific shot. There we have a clean file, and we can set render defaults and optimize the file for lighting.
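Because downstream files link actions by name, the action-prefix convention above is load-bearing. A small check like this — hypothetical, not a tool we actually had — could catch misnamed actions before they break linking:

```python
def misnamed_actions(shot, actions):
    """Given a mapping of character name -> action name, return the
    action names that are not prefixed with the shot and character.
    Such actions would not be found when the lighting file tries to
    link them by name. Naming details are assumed for illustration."""
    return [
        name
        for character, name in actions.items()
        if not name.startswith(f"{shot}_{character}")
    ]
```

An action named `walk_cycle` on the shot `02_015A` would be flagged, while `02_015A_spring_walk` passes.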
Then the crucial step is that we link the actions from the animation file — we have to make sure they're named correctly — so whenever the animator updates something, we get that change in the lighting file as well. So we're linking the actions from the animation file into the lighting file, and you can see that here: you can't touch any of that stuff, it's still left to the animator, so while you're doing lighting, you get the latest updates from the animator. Then of course we set up different collections for the lights to go in, and we can shuffle things around and add more objects, and those get put into different collections. The test renders from those lighting files are put into frames, and those are also loaded into the edit. While we're doing lighting, we can also do simulation, and usually simulation has to be done after the animation is finished, when you have all the movement in the shot. So we create a simulation file, and the simulation file can be a bit more messy, but it still links the animation from the animation file, and then of course all the other stuff that we had in the shot before. As I said, the file is very messy, so here we can do a lot of things: we can throw objects around and make stuff partially local, which is really cool in the new collection system — you can get into your file only what you need, and you can make objects local and still keep their mesh data linked. You do have to make objects local in this case if you want to influence simulation parameters, but since the armatures and the actions are all linked, it's all still kept in sync.
There was a lot of simulation during Spring, mainly smoke simulation — the smoke caches were really the biggest data we had to push around — so the simulation file bakes all that stuff into caches. More stuff gets put into the caches too: we have some general simulations, like falling branches, grass, growing plants, which were cached to Alembic, and all kinds of different things that we need in the background — generic caches. From the lighting file, we can reference all the data from the caches, so we link everything into the lighting file. For smoke it's a bit complicated, because you actually need the smoke domain, so you append the smoke domain and then load the VDB caches on top of it, and then you can do lighting and shader tweaks on top of that. There was more simulation in Spring: we had fur simulation, which was a big thing, but we only did it in a number of shots — we couldn't do it on the entire film, so we did it only in very specific shots where Autumn, the dog, had a big moment in front of the camera. Another big thing was particle simulation. Particle effects were handled a little bit differently on the linking side: we didn't cache them to file caches per se, they were cached within the blend file as particle data. So what we do is reference that data in the form of collections: the lighting file links in the collections from the simulation file with all the particle animations in them. Here you can see all the pebbles — they were simulated in the sim file, and that whole collection was just linked in, which means that every time we update the simulation, we also get all the new data. Finally, we can do some rendering. This is final rendering, of course, because during the whole lighting process we're rendering all the time, tweaking things, and at some point you can do the final thing with all the samples in the world.
So what were the settings? We rendered everything at 2K, and we output the render as multi-layer EXR, 32 bits — why, I'll get to later. We used 2,000 samples, sometimes 4,000, sometimes 6,000 in some shots that were incredibly noisy from smoke. We limited the number of bounces to 2 or 3, and we used motion blur as well, which was incredibly time-consuming but crucial for the animation to work. So these are the settings we used generally in all the shots; of course they vary a bit. Then we used Cryptomatte, which was really important and very handy, and we denoised as well — we put all the denoising passes into the multi-layer EXR, so we always had the noisy image and the denoising data in the EXR output. Those EXRs get very big, and they're saved from the render farm into the shots directory. Why did we use multi-layer EXR? Well, we did some cool tricks using sample merging, which is a process where you render a chunk of samples on the farm and get it back — it might be 10 samples, and you see: okay, the shot is okay, there are no pink textures, let's go with it. So we render it again with a sample offset, rendering more samples on top of that, and then on the farm we can just merge those together. This is what Flamenco, the render manager Sybren made, supports, which is super cool, because you get a really quick version at the start, and then you can add more as time allows. Then, of course, we did denoising, and the denoising we actually did in post: because of all the sample merging, we could only denoise after the fact. We had all the denoising passes in the EXRs, and we used this handy operator in Blender to denoise the individual frames and overwrite them, and here we get the denoised image. Once we have the denoised frames, the clean plates and everything, we do some compositing on top of that, and sometimes also fixes.
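At its core, the sample-merging trick above is a sample-count-weighted average of the chunks. A minimal sketch of just the math, leaving out the actual EXR reading and writing that the farm handles:

```python
def merge_sample_chunks(chunks):
    """Merge render chunks of the same frame into one image.
    Each chunk is (sample_count, pixels), where pixels is a flat list
    of float values; chunks rendered with different sample offsets are
    combined by weighting each one by its share of the total samples."""
    total = sum(count for count, _ in chunks)
    merged = [0.0] * len(chunks[0][1])
    for count, pixels in chunks:
        weight = count / total
        for i, value in enumerate(pixels):
            merged[i] += weight * value
    return merged
```

Merging a 10-sample chunk with pixel value 1.0 and a 30-sample chunk with value 0.5 gives 0.625 — the same pixel you'd get from rendering all 40 samples in one pass, which is why you can start with a quick low-sample version and keep stacking more on top.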
So we create a compositing file, and this file is usually separated from the rest of the production — it doesn't link in any animation; we link the data from shots into it. You get frame sequences loaded into a compositing setup, and in this setup we can do a number of cool things specific to that individual shot. I mentioned Cryptomatte: we can access all the objects and do adjustments based on that. What kind of adjustments? Well, for example, we had Autumn, a black dog in a black forest, and we had some problems making him visible. So with custom animated masks and Cryptomatte we could do some color correction, like brightness changes, on top. Of course, we didn't do a lot of color correction in compositing, because we didn't want to deviate too much — we did the grading at the end of everything, but I'll get to that. We also did render fixes, a lot of render fixes, so we had to border-render a lot of stuff on the farm that had to get fixed. For example, Autumn got fur simulation in the middle of production, after the shot was actually rendered, so we had to comp him in with the new simulation on top. So we used Cryptomatte, and here you can see custom animated masks, just frame-by-framing, to put the dog into the shot with the actual simulation. And we did lots of other things — for example, we had these glitches all over the place where particles were floating around, and we did some custom masking to hide our crimes after the fact. Another thing that happened in compositing was adding plates, most notably for breath fog. We rendered out these plates with smoke simulation, and we rendered them in Eevee because that was super quick — it's only a little thing added on top, so it wasn't crucial for any light interaction with the scene.
We put those plates in the plates directory and referenced them from the compositing file as well. You can see they're just a number of image strips; they get some color correction, then some translation to place them at the right point, and then they're just added on top of everything. A number of other things get added in compositing: we had vignetting, we had lens flares, we had some chromatic aberration on top of everything — very, very subtle, basically just an RGB separation with some translation. We output this compositing sequence to the server, saved as 16-bit EXR files, to save some space, but also because we don't need all that multi-layer data later on. So: 16-bit EXRs, we put them in shots, and we can do the grading based on that. The grading was actually done in the edit, reading the data from shots — here you can see the film with the grading on top. We did all the grading in adjustment layers; usually it was just small color corrections to line shots up with each other color-wise, done in adjustment layers in the edit of the film, loading all the 16-bit EXRs — which were linear, by the way, while the edit was all in Filmic. We also did some more general grading, a generalized film effect if you can call it that — it just makes everything a little bit more filmic, adding some warmth to the highlights and making the shadows a bit more contrasty. Then we also added some film grain on top of everything: we had denoising, we had real noise, and we just wanted to pull everything together a little bit, so we rendered some noise plates and put them really, really subtly on top of everything, just to mash it together and make it feel organic. Then we can output the film for further processing, and we do that as a TIFF sequence.
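That chromatic aberration — separating the color channels and translating them against each other — is simple enough to show directly. This is a sketch on a list-of-rows image, not the actual compositor node setup we used:

```python
def chromatic_aberration(rows, shift=1):
    """Fake chromatic aberration by sampling the red channel from the
    left and the blue channel from the right of each pixel, clamped at
    the row edges; green stays put. `rows` is a list of image rows,
    each row a list of (r, g, b) tuples. One-pixel shift is already a
    lot; in the film the effect was far subtler and sub-pixel."""
    out = []
    for row in rows:
        n = len(row)
        out.append([
            (
                row[max(0, x - shift)][0],        # red shifted one way
                row[x][1],                        # green untouched
                row[min(n - 1, x + shift)][2],    # blue shifted the other way
            )
            for x in range(n)
        ])
    return out
```

On a bright edge this smears red to one side and blue to the other, which is exactly the lens-like fringing the effect is after.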
It's actually 16-bit, not 12-bit — that's an error on the slide. So we output everything to 16-bit TIFF — you can see, 16, yes — and that also gets put on the server. From those 16-bit TIFFs we can process the film and render it out using FFmpeg and put it on YouTube. We also put all the data of the film on the Blender Cloud — all the characters and all the assets — and we packaged some of the shots that you can see on the cloud. We put everything together in April, and that's where we are at the end. I think I'm a bit over time, sorry for that. I just want to say thank you to our great team for making all this possible — this is not my own work, I'm presenting the work of other people, so a big round of applause for them. Thanks to all the people who watched the film, all 3 million of them on YouTube, and thank you for watching this presentation.