[Pre-talk technical setup omitted: the speakers adjust the projector and NVIDIA display settings, try a 1280 by 800 resolution, configure a second screen for the presentation, and test whether the embedded film clips will play.]
Okay, hello, and welcome to the case study: Blender as an all-in-one tool for film production. My name is Tito Sphere. I'm running an animation studio called M emotion in Zurich, Switzerland. I do CGI work for clients in different fields, and I'm also producing independent short animation films. I started using Blender roughly three years ago and would like to give you an insight into the creation process of my most recent short film production, made with Blender. I'm going to talk about why my software choice was Blender and about the film's general production pipeline, then explain one sequence in more detail. So why Blender? Over the past decade, I had noticed Blender a couple of times. As I had my doubts about the reliability of freeware in general, my attitude towards Blender was no different. Then Elephants Dream, Big Buck Bunny and Sintel came along and made me curious. I learned about the reasoning behind the open movie projects and the Blender Foundation, and I noticed a vivid web community around the software. I also realized that Blender wasn't just some specialized niche software but was designed as a set of tools to accomplish the different kinds of jobs involved in a production pipeline. I really liked Blender's all-in-one philosophy, which seemed ideal for my studio. So I decided to use Blender as an all-in-one solution for my newest short film, Size Matters. The question was: was it reliable? For that I had big question marks at the beginning, but first tests proved it reliable indeed. So I gave it a try and used it as an all-in-one tool for accomplishing my film.
So the animation film, Size Matters, is about a fish which gets pulled out of the water on a hook. As it struggles for life, its last dream begins, and with it a journey through a world of color and sound. With death approaching, the dream wears out, but suddenly the fish finds itself being thrown back into the safe water among its bigger fellows. Size Matters. The film is a six-and-a-half-minute animation rendered out in HD, 1920 by 1080 at 25 frames per second. The overall production took me roughly one and a half years. The rendering capacity involved three machines. The Blender version was 2.67 for most of the production time. The film consists of an intro and outro and seven sequences in between, each with its various scene layers. As with any such project, the creation process involved the general stages of pre-production, production and post-production, and Blender's toolset conveniently allowed me to accomplish all of these stages in one software environment. In the pre-production phase, I broke the story down into sequences and made a storyboard, collected and shot some reference material, started blocking in scenes, the technical pipeline and the folder structures, and made an animatic. The animatic is basically a film version of the storyboard, with timing and sound added to the pictures. For that, I took the storyboard pictures I had made before and put them on the timeline of the Video Sequence Editor, or VSE. Each sequence I numbered and named according to the storyboard. This numbering system was the base both for the future Blender files and for the render folders generated in production and post-production. For each sequence, I used a full three-digit number: sequence one was 100 plus a name, and so on. This allowed some flexibility if an additional element had to be added later. A transition mask between sequence two and three, for example, would get labeled something like 250 plus a name.
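The three-digit numbering convention can be sketched in a few lines of Python. The helper and the sequence names below are illustrative, not from the production:

```python
def label(number, name):
    """Build a sequence label like '100_intro' following the
    three-digit numbering convention from the talk."""
    return f"{number:03d}_{name}"

# Main sequences get full hundreds (100, 200, 300, ...).
sequences = [label(100 * i, name) for i, name in
             enumerate(["seq_one", "seq_two", "seq_three"], start=1)]

# An element added later, e.g. a transition mask between sequence
# two and three, slots in at an intermediate number like 250.
sequences.append(label(250, "transition_mask"))
sequences.sort()
print(sequences)
```

Because every label starts with a zero-padded three-digit prefix, a plain alphabetical sort of files or render folders always matches the film's running order, and intermediate numbers leave room for late additions.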
The same naming convention was applied to the compositing production. Once the storyboard and the animatic seemed to work, the production phase could start. It involved modeling the assets and characters, creating the materials, texturing, lighting, rigging, and the animation of camera, objects, masks and more. It was also about getting specific and translating the drawings into a modeled 3D world. The concept drawings weren't always precise enough for a direct translation into 3D. Here, for example, to the left, the mussel opens and the fish swims into the mussel while some transition effect is going on. So I had to decide how to interpret the visual effect, which, like in this case, sometimes meant a lot of experimenting. Where possible, I kept the elements of a particular film sequence in one Blender file, creating a Blender scene for each main part. This allowed quick swapping back and forth between scenes. For example, here you see the Blender file of the fish tunnel sequence with its scenes and their compositing, all in one Blender file. Post-production. First, the individual scenes of a sequence were put together, color-adjusted and so on in their own Blender file. Then the render output of that compositing was combined with the other sequences of the film in a global compositing. That was when the battle with the spaghetti monster began: keeping the overview with all the inputs, transitions and animated values proved to be quite demanding. Now you might ask: is the Node Editor the right tool for the final edit? Probably not. Why not line up the finished sequences in the VSE, with its linear display and therefore better overview? Well, that was the original plan. The animatic drawings would gradually get replaced by the finished film frames as the project developed, and the master edit would happen in the VSE. So much for the plan. However, this turned out not to be very practical for my project. Why not? The film had one peculiarity.
The plot dictated a one-camera shot through most of the film's dream part, which is most of the sequences. This meant hardly any clean cuts. Instead, the sequences were connected by very long transitions with animated masks, patterns, textures and colors. The reason why I ended up editing the whole film in the Node Editor was that for those transitions, the VSE offered fewer compositing possibilities, at least to me. Of course, the price I had to pay for keeping everything in the Node Editor was the loss of overview, because it is, as you might know, not timeline-oriented. The film was rendered with Blender Internal. Why not Cycles? The nature of the film, all sorts of graphical elements and underwater environments, didn't really call for a global-illumination approach with bouncing lights. Additionally, with Cycles being quite new at the time and me still a newbie with Blender, I didn't want to hit a wall somewhere way into production. That would have meant re-texturing and relighting all the scenes for the other render engine, which would have been a nightmare, obviously. One exception where I did use Cycles was a sequence in which the fish swims between light-emitting jellyfish. So the fish is Cycles, the jellyfish Blender Internal. I could have lit the fish with parented point lights in Blender Internal, but for this layer it seemed easier to swap to Cycles, give the jellyfish an emission material and get the fish illuminated by the jellyfish's geometry. Okay, let's have a closer look at one particular sequence in detail. This is the beginning of the dream part, once the fish lies on the shore. Basically, the sequence is about a bunch of fish eggs being the prey of a school of sardines, which in turn are on the menu of some tuna. Then the next sequence is introduced with an animated transition mask. The sequence is composed of several geometry layers.
The underside of the ocean, the light rays, the eggs and the baby fish, or rather the fish embryos, the fish trail, the school of sardines, the tuna, the water noise. Can one see that? Not really. Then the Freestyle layer and the animated transition mask. The water surface was created and animated with the Ocean modifier. For the skylight behind the water surface, I created a disc with a shadeless material and an opacity falloff. This way, I had full control over the highlight's shape and location, and it let me achieve, in a simple way, the illusion of that particular underwater look with stark highlights in the center and darker regions towards the edges of the frame. Later in compositing, this effect would be further enhanced with a vignette. The light rays were a similar cheat. Basically, they are spot lamps with a volumetric halo. The halo gets partially blocked off by a plane with holes, the square one. Right above that plane, a disc with some missing segments is rotating. This creates the illusion of randomly cast light beams shining through the ocean surface. For the rays' falloff towards the deeper regions, I animated a mask according to the camera movement, which would then be used in compositing to fade out the rays' image layer. The fish eggs. The fish embryo with its rig was animated along a path. The egg deformation was accomplished with a lattice, and the pounding embryo hearts were animated with shape keys. There are two versions of the eggs: an animated, high-res one for the foreground and a simpler one for all the eggs in the middle ground and background. Here, the eggs' refractions of the background proved quite tricky. Since the background wasn't physically present in the scene, but consisted of various layers in the compositing, there was nothing for the eggs to refract.
So I had to render out a version of the composited sequence without fish eggs, then feed the result into the egg scene as a 2D background and render it with the 3D eggs, which now had something to refract. But because I had to blur the background for smooth refractions, I couldn't use this render as the final frames. Now, Blender's Node Editor allows one to insert a 3D scene as a render layer, which is what I did to integrate the refracting eggs into the rest of the sequence compositing. The fish trail. The fish trail was a particle emitter with two emitting faces, which followed the fish and rotated along its path direction. Here, I ran into a problem. With Blender version 2.67, an emitter which was parented to an animated object didn't emit particles on a subframe level. As an example, the unparented animated emitter spread them evenly and gave me this result. The parented emitter, however, would give me this: clouds of particles on a per-frame basis. So in order to get a smooth fish trail, I had to unparent the emitter from the fish and animate it along the fish path nearly frame by frame, so it would follow the fish. That was quite tedious. But the good news: this issue has been solved in a more recent Blender version. Yay. The school of sardines: particle boids. For the sardines, there were two boid systems set up with slightly different settings, both chasing a target, or a leader. Some of the tunas, or rather their dummies, swimming through the school were set up as collision objects for the boids to avoid. This way, when they hit the school, they made the sardines flee in all directions. After modeling and texturing the tuna geometry, it got rigged. Tail and head movements were animated and saved as actions. The tuna rig was animated along a path. Then rig, tuna and path were copied and arranged in the scene, and the animation was randomized in the NLA editor and dope sheet. The water noise.
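The subframe emission problem can be illustrated with a plain-Python toy model, without the Blender API; the path and the sampling function below are illustrative assumptions. An emitter whose position is only evaluated at whole frames drops all of a frame's particles at one spot, producing per-frame clumps, while subframe evaluation spreads the birth positions evenly along the path:

```python
def emitter_pos(t):
    """Hypothetical animated emitter path: travels 10 units per frame."""
    return 10.0 * t

def birth_positions(total_frames, substeps):
    """Sample the emitter position at each particle birth.
    substeps=1 models the Blender 2.67 parented-emitter behavior
    (positions evaluated at whole frames only); higher values
    model proper subframe emission."""
    return [emitter_pos(i / substeps) for i in range(total_frames * substeps)]

clumped = birth_positions(4, 1)   # one clump of births per frame
smooth = birth_positions(4, 4)    # births spread along the path

print(sorted(set(clumped)))  # only 4 distinct positions: [0.0, 10.0, 20.0, 30.0]
print(max(b - a for a, b in zip(smooth, smooth[1:])))  # largest gap: 2.5
```

With only whole-frame sampling, a fast-moving emitter leaves visible gaps of a full frame's travel between particle puffs, which is exactly why the trail had to be animated unparented, nearly frame by frame.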
To make the shot more believable, I simulated some of those small drifting bits which are present in every piece of underwater footage. To do this, I set up a nearly static particle system with some Brownian motion and rendered it out through the main camera a couple of times, with different seeds, as a black-and-white mask. Then I integrated the noise into the compositing with varying intensity and color settings. The Freestyle layer. Apart from the intro and outro, there are superimposed outline animations throughout parts of the film, for a storytelling reason. For these outline renders, the newly introduced Freestyle feature came in handy. The option Mark Freestyle Edge, together with the edge type settings, gave me good control over which contours of a mesh would be drawn and which wouldn't. And finally, the compositing of the sequence. Now the individual scene renders were fed into the Node Editor for compositing. The ocean background was combined... Sorry, I missed something: the animated transition mask first. The moving pattern in the shape of tuna fish acts like a stencil, with the other sequence underneath. For this, I recycled the tuna scene by getting rid of the lighting and textures, repositioning the tuna and rendering it out as a black-and-white mask. Now the compositing. Everything was fed into the Node Editor for compositing. The ocean background was combined with the tuna and the fish flock, each with their individual adjustments in color and contrast. Then the rays, water noise and fish eggs were added. The finished sequence was rendered out and fed into another, the mother of all compositings, in a separate Blender file, then offset to the right time and blended into the next sequence with the animated transition mask. Here the Freestyle layer and the fish trail were also added. Finally, this global compositing got tweaked again in color and contrast, and a vignette was applied. So, wrapping it up.
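The water-noise trick, a nearly static particle field with Brownian drift rendered several times with different seeds, can be sketched in plain Python; the function and its parameters are illustrative, not Blender API calls:

```python
import random

def drifting_bits(n_particles, n_frames, seed, sigma=0.5):
    """Simulate nearly static particles with Brownian drift: each
    particle starts at a random position in a 64x64 frame and takes
    a small Gaussian step per frame. Returns per-frame positions.
    The seed parameter plays the same role as the particle system's
    Seed setting in Blender."""
    rng = random.Random(seed)
    pos = [[rng.uniform(0, 64), rng.uniform(0, 64)] for _ in range(n_particles)]
    frames = []
    for _ in range(n_frames):
        for p in pos:
            p[0] += rng.gauss(0, sigma)
            p[1] += rng.gauss(0, sigma)
        frames.append([tuple(p) for p in pos])
    return frames

# Different seeds give independent particle fields, so several
# renders can be layered in compositing, at varying intensity and
# color, without visible repetition.
a = drifting_bits(100, 10, seed=1)
b = drifting_bits(100, 10, seed=2)
assert a != b
```

Each render is deterministic for a given seed, which matters in production: re-rendering a pass reproduces exactly the same drifting bits, so the compositing stays consistent across render sessions.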
In my opinion, Blender's main strength lies in the combination of the various CG tools under one hood. The fact that one can make an entire film with one package is fantastic. I believe the more the individual environments blend into each other, the faster one can accomplish such a task. With this in mind, and having made my film with Blender, I had an idea for a possible feature or improvement: the integration of parts of a compositing into the VSE pipeline. A change in the timeline... sorry, a change in the compositing, maybe in a node group or something alike, would show in the VSE. But moving the strip in the VSE shouldn't screw up the starting frames in the compositing. The VSE strip would act as a placeholder, or a parent, of the compositing. Anyway, I haven't thought it through in detail yet, and I think there are already some efforts in the forums and on the web. But nevertheless, the idea seemed noteworthy here. In my view, such a feature could well improve the post-production process. No, conclusion! Now the conclusion. Computer graphics imagery is about finding the most efficient way to sell an illusion to the human eye. If it looks great, it is great, right? For creating this illusion, in my opinion, with Blender and its combined, overlapping work environments, one has a very efficient and consistent toolset at hand. Thank you. Thank you very much. That was the last presentation in this session, so please take your questions to the lunchroom, and have fun. Thank you.