Hi, and welcome. My name is Sebastian König. Before I start, maybe one question: who of you knows Blender? No one? Aha, okay. Who of you doesn't know the concept of open movies? Who of you didn't see Tears of Steel? Aha, that's good, so I don't have to redo my presentation. That's great.

So I'm going to talk about how to do short films with Blender, especially VFX short films, and I will do that by showing you how we did Tears of Steel. Tears of Steel is a short film by the Blender Foundation, and maybe, just very quickly in case you didn't see it, just the very first seconds, so that you have an impression. There is sound.

"Look, Celia, we have to follow our passions. You have your robotics, and I just want to be awesome in space." "Why don't you just admit that you're freaked out by my robot hand?"

So that's the main plot: it's a boy and a girl, and they are having troubles. Maybe I won't tell you the whole story; you can just watch it on YouTube.

So I'm going to talk about Tears of Steel and how to produce a short movie with Blender. And since this is supposed to be a workshop, but we have only one hour and you don't have your workstations with you, I'm not going to start with technical demonstrations. I will first tell you a little bit about the open movies, how our production went, what the pipeline looked like, and maybe what's special if you do it with Blender and open-source software. After that I will do some demos, just to show you how this would look if you actually did it.

So the open movies are a very interesting concept. They started in 2006. "Open movie" means you can share everything: you can watch it for free on YouTube, you can download the footage, you can download the movie, basically you can download everything. And not only the movie: you can also get the assets, all the blend files, everything.
You can look into the blend files, see how the composites work, how the robot works, what all the characters look like, and so on. So you can do everything with it. You can even use it for your own commercial purposes. So it's really completely and totally open.

And the production is open too. We had a blog, and we were posting about everything we did, also about all the stuff that we didn't know, because we were noobs, especially in terms of shooting and filming; we hadn't done that before. So yeah, we just shared everything.

The open movies are short films, and since they are free, we have to get the money some other way: they are financed by crowdfunding. I think they were actually one of the earliest projects that did crowdfunding. There's a pre-sale of the DVDs, people buy the DVD, and then we can actually start making the movie. Once the movie is done, the people get their DVD. That's how we finance the movie. And because everything is open and free, you don't really get rich by doing that. So why would you do it then? I mean, what's the point?

There are several points, actually. One of them is that Blender wasn't always open source. It used to be in-house software for a studio in the Netherlands, NeoGeo. And because it was used in production, it was really well tested: if you use the software and develop it yourself, then of course you know what you're doing, you know where the bugs are and what doesn't work. But then, for various reasons, Blender became open source in 2003, and since then the development happens at the Blender Institute in Amsterdam, but also, of course, online, so everyone can get the source code and contribute to the software. It's a very open approach to software making, and it means that there is no centralized place where you test things. That's why the Blender Foundation founded the Blender Institute in Amsterdam, and why they did these open movies.
So the idea is that you have several artists and developers in one room, or at least in one building, and they have to produce a movie within six or eight months or so. You have a real production environment; it's not just some concept where a coder says: okay, this is a nice piece of code, it works, awesome, let's put it into Blender. If you actually test it, you may find that it doesn't work in scenario A or B. So you get feedback on whether stuff works or not, and that also gives you faster development. And it's fun.

All of this happens at the Blender Institute in Amsterdam. This is the Institute; this is how we work. This is from the production of Tears of Steel, which happened last year. There are six artists, some workstations, and in the next room there are the coders. So if something doesn't work, you can just walk over and kick them really hard until they make it work, or you can say: dude, I need rotoscoping right now, do it! And then they do it, and the next day you have it, ideally.

The first open movie was Elephants Dream in 2006. Then we had Big Buck Bunny, I think in 2008 (I didn't check the numbers), then Sintel in 2010, and last year Tears of Steel. Every open movie had a certain focus for the development: one part is always the movie itself, and the other is improving Blender. Elephants Dream focused on the animation system, which was recoded, and brought us the first version of the node compositor. Big Buck Bunny was about hair and fur and cute fluffy animals. Sintel was used to develop Blender 2.5, a total recode of the software. And Tears of Steel was used to develop the tools that you need for visual effects.

So I'm going to talk about Tears of Steel. It's an open VFX short film, and it's a work in progress: as of this spring, I think they're actually still working on it. Currently the 4K version is being produced in Amsterdam.
So it's still going, and it was all about visual effects. If you start doing a visual effects movie, then even if you use commercial software, you need to have a plan. What do you do? It starts with concept art, or before that, with a plot. So: plot, then concept art, then storyboarding. There has to be an animatic, so that you know what happens. If you do an animation film, the next step isn't needed, but for visual effects you have to do the shooting, and you have to plan for that: prepare all the props, get actors, organize everything. Then, after the shooting, you have to prepare the footage. You have to do motion tracking, masking, layout for the files, keying, animation, compositing and color grading. All of this has to be done, and almost everything can be done in Blender, or at least in open software. Not everything, unfortunately, but most of it.

So that's the plan of what had to be done. And of course we wanted to have an open pipeline, because it was an open movie, so of course we didn't use After Effects for compositing; we had to use Blender. The concept art was done not with Blender but with Krita and GIMP. The concept artist David Revoy did some concepts and sketches for us. And of course you don't need Photoshop for this; you can use any painting program that you want, so we used Krita and GIMP.

The next step is storyboarding. That was the point where David and Ian, our director, came together and actually started storyboarding. They also rewrote the story a little bit. These are some storyboard concepts.

And then the next thing is that you have to have an animatic. Once you have the storyboard, you pin it to the wall and it looks very nice, but you don't see it in motion yet. For the movie, of course, you have to do some planning: you have to do the timing, and maybe even test some dialogues or get some soundtrack already.
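To summarize the steps just listed, here is my own rough mapping of the pipeline stages to the tools used, as described in this talk; this is my summary for orientation, not an official Blender Foundation list.

```python
# Tears of Steel pipeline stages -> open tools, as described in the talk.
# This mapping is the editor's own summary, not an official artifact.
PIPELINE = [
    ("plot / concept art", "Krita, GIMP"),
    ("storyboarding",      "Krita, GIMP"),
    ("animatic",           "Blender (video sequence editor)"),
    ("shooting",           "live action (Sony F65)"),
    ("motion tracking",    "Blender (movie clip editor)"),
    ("masking / roto",     "Blender (mask editor)"),
    ("keying",             "Blender (compositor)"),
    ("animation",          "Blender"),
    ("compositing",        "Blender (node compositor)"),
    ("color grading",      "Blender (sequencer / compositor)"),
]

for stage, tool in PIPELINE:
    print(f"{stage:18} -> {tool}")
```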
For that we could of course use Final Cut or Premiere, but of course we don't; we use the video sequence editor of Blender. And this is how that looks: you have your preview of the storyboard, and you have everything nicely arranged in the sequence editor. There you can arrange every picture of the storyboard and get the timing right.

Maybe I can show you how that looks in Blender. This is the sequence editor. You have a preview where you can zoom in and out, because of course you have to see the image. And then you have this timeline where you can drag and move your images, and you can do everything that you would do in a video sequence editor. That's very nice, because you don't have to leave Blender; you can stay here and do all this stuff.

And the good thing is that, since this is in Blender, you can reuse this file for the step that comes after the storyboard, or rather the animated storyboard, and that's the animatic with actual characters. Since this is just OpenGL rendering, we usually call it the "crappymatic", because the guys look like this. We didn't have the actors yet, but we knew how they should look, so we just made very simple versions of the characters that could be created very quickly and could be animated. This is how we would plan a scene: without sound, but at least it's enough to get the timing, to know what they are doing, and how this would look if you filmed it. So that's a very nice tool to plan the production.

The other thing you can do in Blender, of course, is plan your assets. If you know you're going to do green-screen filming, you still have to know how you would build your set and how it would look. So we could very quickly model this, place these dummy characters in the green screen, and see how this would look if you filmed it.
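Getting the timing right in the animatic is essentially arithmetic on frame ranges. As a sketch, laying storyboard strips back to back on the timeline could be computed like this; the 24 fps rate and the shot durations are made-up example values, not numbers from the production.

```python
# Sketch: lay out storyboard images as back-to-back strips on a timeline,
# the way you would arrange them in Blender's video sequence editor.
FPS = 24  # assumed frame rate for this example

def layout(durations_sec):
    """Return (start_frame, end_frame) for each shot, placed back to back."""
    strips, frame = [], 1
    for d in durations_sec:
        length = round(d * FPS)
        strips.append((frame, frame + length - 1))
        frame += length
    return strips

print(layout([2.0, 3.5, 1.25]))  # three example shots
```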
And the nice thing is that we can use these OpenGL renderings and put them in the sequencer as well, to get an even better timing. But not only that: we don't even have to pre-render. We can put the scenes directly into the sequencer. So maybe if I go to the edit, we have the previz edit; it takes a while to load, because it has to load all the scenes.

So again you have the sequence editor, you have your strips, you can duplicate them, move them around; you can do everything that you would do with a video. But the nice thing is that this is actually a live 3D viewport. So you can cut with your 3D scenes without having to re-render. You can even add reverse or speed effects if you want to have something faster or slower: you just add the effect and edit your 3D file live, here in the sequence editor. On a tiny laptop this is not really that fast in real time, but if you have a nice workstation, then you can actually edit your 3D files this way.

In this blend file I have all these linked scenes. So I could also go to a different screen layout where I have the 3D viewport, not only the sequence editor, and open the scenes and see how they look; you could even animate them from here. So if I look through the camera, you have all the guys here; I think I could even move them around. So I can take this, and this is an entire set, an entire scene. And all these scenes are linked, so even though you have them here and you can see the camera, I cannot edit the objects from here; I would have to go to the blend file that's loaded, or rather linked, into this scene. We have all these different scenes, so I could just open anything here; maybe not this one, I don't know, maybe this. And these are the scenes that are loaded and linked into the sequence editor, and here we can actually move stuff around and prepare the animation.
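The reverse and speed effects mentioned above boil down to remapping each output frame of the effect strip to a source frame of the underlying scene. This is my own sketch of that remapping, to illustrate the idea; it is not Blender's actual implementation.

```python
def remap_frame(out_frame, length, speed=1.0, reverse=False):
    """Map an output frame (0-based) of an effect strip to a source frame.

    speed > 1 plays the underlying scene faster; reverse=True plays it
    backwards. Source frames are clamped to the strip length.
    """
    src = int(out_frame * speed)
    src = min(src, length - 1)  # clamp: hold the last frame if we run past
    if reverse:
        src = (length - 1) - src
    return src

print(remap_frame(10, length=100))               # 10: normal playback
print(remap_frame(10, length=100, speed=2.0))    # 20: double speed
print(remap_frame(0, length=100, reverse=True))  # 99: reversed strip
```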
For previz that's really nice and useful, and it allowed us to get a really good idea of what the timing would be. And as I've said, not only the scenes are linked; everything is linked. If you have a scene, you don't really put the characters inside the scene, you link your characters. So we had all these different elements, the props, the environments, the characters, and they are all linked into, for example, shot one. Shot one is then linked into this animatic, into this previz edit that you just saw; and not only shot one, but all the other shots are linked into this animatic as well. That's a very nice and tidy workflow, and it gave us a really good idea of how the movie would look later.

Yeah, I've already shown you this. So this is how it looked ("You're a rock star"): you have the storyboards, then the crappymatic ("We should have about 10 minutes"), then the green-screen footage, and then the final result ("All systems go." "Yeah, you go."). So incredibly useful.

So this was all pre-production, and while the animators and the director were doing the crappymatic and the storyboards and all that kind of stuff, the modelers started to create the actual production models: the robot, the environment, all these things. And then we had the production. Pre-production ran until April, and then in May the shooting came along, so we also had the actors. And, well, we had no idea; I really hadn't done this before. The only person who had actually done visual effects was our director, Ian Hubert, so we had no idea, but he knew what he was doing. He had even done a feature film, Project London, also with visual effects, but I would say more on an indie level. I mean, it's a very nice film, but they didn't really use a big camera, so it was all low budget. We were also low budget, but at least we wanted to do it pro and get a kick-ass camera. So we got the Sony F65, with an 8K sensor and a real shutter. So awesome.
By the time we were doing the shooting, there were only two of these cameras in all of Europe, so we were really quite fancy. And if you have such a great camera, you don't just grab it and film yourself; we wanted to have a badass DP. So we got Joris Kerbosch, and he was very helpful. Then we wanted to have a giant green-screen studio, so we got that too. And of course this is all expensive, so all our money, the pre-production pre-sale DVD money, just burned away. But well, that's what happens if you want to have a giant green-screen studio. And we wanted to get a "wegsleepregeling": basically, we wanted our own parking space for the crane that we were using, so we had one week, or one day, reserved for us.

And of course, let's film in the red-light district. I mean, we are in Amsterdam, so of course we do it there. But we actually had a reason, because the movie takes place at the Oude Kerk in Amsterdam, which is in the red-light district. It's a very famous church, with a bridge that you can see in the lower right corner, and this is where the two main characters, Thom and Celia, were breaking up; the opening scene was supposed to be filmed here. So of course you get your Sony F65 and your DP and your lighting and the crane and all the stuff, and you put it in the middle of the red-light district. Well, we had no idea. But luckily we not only got a green-screen studio and all that; we also got an assistant director, a gaffer and the DP, and these guys weren't noobs, they really knew their stuff. So they could tell us: maybe it's not the best idea to put all this, camera and everything, into the red-light district. They suggested moving to another bridge, in a very quiet area of Amsterdam, where we also got the wegsleepregeling, so we could park the crane there. But even there, at a really quiet, nice and lonely bridge, we had to keep people from running through our set. Even Ton Roosendaal, the founder and lead developer of Blender, had to block people, and even then some managed to get behind our actors while we were filming, so we had to roto them out, which is a nice exercise, of course, but, well, it sucked.

So we had this green-screen studio, and it was so big that we said: okay, that's easy, we have the actors and the camera, and then what else? But then the people came, the gaffer, the light guy, the grip, the focus puller, and all these guys suddenly filled up the studio, putting up the dolly and the lights, and everything was so crowded. We had no idea. This is how it looked at first, and then very quickly it turned into such a mess, it was amazing. And since we hadn't done this before, we were really just sitting there: what happens? But it was really interesting.

My job was to take care of the tracking; later I would be doing the match moving. So on set I had to place the markers, and I had to note down the focal length of the camera so that I could track better, and take all this data and transfer it to our production sheet, where we knew: this scene was using this, how do you say, clapboard number, it went into this and that folder, and we had this and that focal length. All this information had to be gathered, and I wasn't really prepared for that, so I had to run around through all this mess and try to get an idea which focal length would be used for certain shots. And I had to place the markers; I don't know if you can actually see them. Behind the ladder you can maybe see some markers, then of course on the ladder there are markers, and then there is this interesting device that the actor is holding: that's the arm gun. This is our tracking device, so that we could get an object track and later replace it with a fancy plasma gun. And this was a handheld shot; everything went very quickly, so I was just panicking and hoping that all the markers would be in the right position and
that nothing went wrong, in all this mess. So that was really interesting. I had to put markers on the wall as well, and since we had one set that was on a little stage, I also had to put some markers very high up, another thing that I wasn't really prepared for. So I grabbed these wooden sticks, put marker tape on the tip, and tried to place the markers that way. But they would fall off, because this was not a wall, it was just a piece of cloth; the markers didn't stick and kept coming off between shots, so I was freaking out.

But there are lessons to be learned from this. Don't run on set; this is something that I did wrong. If you ever do visual effects supervising, be prepared for the mess. Also, hire an assistant director; without her I would have been so lost. And don't panic. Good advice.

So that was the shooting; that was really interesting. We had four days: three days in the green-screen studio and one day on set, on this bridge. Then everything was packed into little packages, everything went home, and we were done.

Then the next problem came: we had to deal with the footage. You have this camera, and then you have these cards, and you have to put these cards into a black box. No one really knows what happens in there, but it spits out data, though of course not through a cable; you have to go through the network. So we had to get our developers to look at the web interface of this black box with the footage inside, and then we had to somehow transfer that to the hard drives. And then what? You have the footage, and it's supposed to be super fancy, 32-bit float, high dynamic range, and the format was linear ACES, which is supposed to be a very big color space, so very good. But when we put this into Blender, it looked very dull and very flat and very horrible. So we just went and asked the public what to do with it, and they said: you need OpenColorIO, which is, I believe, also from Sony Imageworks. And this
was supposed to be integrated into Blender. But at the time we didn't have color management yet, so we had to use a Python script, and through Python we could then convert the footage to something that looked much better: from this, to this. That was very nice. So we converted all the footage to this color space, and it was a very tedious workflow: all the different shots, all the footage in different folders, moving them from one folder to the other through OpenColorIO. Unfortunately we also had to use the Sony image viewer software for that, which also wasn't really that nice. But anyway, somehow we managed to export all the footage, and then we could go on.

So that was the start of post-production, and for post-production we needed a few new tools. Blender already had nice modeling tools, something for texturing, the sequence editor and all that stuff, but for visual effects you need certain tools that we didn't have at the time. Tracking and match moving: when we started with the movie, we had only had that for one year, and it was very nice and usable. We did have some keying tools, but not that great, so this was something still to be developed. Masking and rotoscoping didn't exist. A new multithreaded compositor, because we were supposed to do 4K later, and if you do 4K you really need a decent compositor, which we also didn't have. And photorealistic rendering and color grading. So all these tools still had to be developed, and we succeeded; we got all these tools. That's great, because now I have something to demo.

So, maybe, any questions so far? Good, then I will do a little demo. I thought maybe I'd just show you some tracking, a little bit of keying, a little bit of masking, a little bit of compositing, a little bit of everything. We have 20 minutes, let's see what we can do. Are there any special things that you would like to see, or should I just... Roto and match moving?
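One quick aside before the demo, on the color pipeline mentioned above. The actual conversion was done through OpenColorIO with the linear ACES footage, but the reason linear footage looks dull and flat on a normal display can be illustrated with the standard sRGB transfer function. This is only an illustration of the concept, not the transform used on the production.

```python
def linear_to_srgb(x):
    """Standard sRGB transfer function (IEC 61966-2-1) for one channel."""
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * x ** (1 / 2.4) - 0.055

# Middle gray is about 0.18 in linear light but ends up near 0.46 on an
# sRGB display. Shown raw, without any display transform, linear footage
# therefore looks dark and washed out.
print(round(linear_to_srgb(0.18), 3))  # 0.461
```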
Okay, let's do it. First I have to get to the default theme, so that you don't get confused: reset to default theme. I don't know, is this actually a good projector? Can you see everything, or should I make it bigger? Good.

Well, we started everything with match moving. We could have also started with keying, but keying and tracking somehow overlap, so we always started with match moving. If you do match moving in Blender, you go to the motion tracking interface, with a big screen. This is where you open up your shots, so let me go to the folder with some footage. Maybe this one; let's start simple. This is obviously something for keying, but the camera is moving a little bit. If I go through the shot, you can see there's a little bit of movement, so you don't really need a full camera track, but at least one fixed point, so that you can put something in the background.

If you do this in Blender, first I would go full screen, maybe, so that I have a little bit more room to work with. And for tracking it's good to have everything in memory; of course it can work from the hard drive, but that's a little bit slow. So first I would press P for prefetch; by the way, this is something that is new in Blender 2.67, which is supposed to come out in two weeks or so, I think. So now I'm loading everything into memory; you can see down here this purple line, everything is caching.

Once everything is cached, I would search for one marker that I can track. I could use the lamp, but if I go through the shot, the lamp is shaking like crazy. In the background, though, there is one little red dot. We had these pink crosses of tape strips; this was one of the markers that I had to place with a stick, so it's a very high marker. But because of the motion blur, sorry, because of the depth of field, it's totally blurry and you can hardly see it. That's why the first thing would be to search for the markers, and because this is a green screen, I would just disable some channels. And if I
disable the blue channel, or maybe first set it to black and white and then disable the blue channel, there's almost no information in there, so this doesn't really change much. But if I also disable the green channel, ooh, there's suddenly this red point that I can track. So I would Ctrl-left-click here to place a marker, and make the marker a little bit bigger, maybe: we have this giant point and this tiny marker, so I would scale it up just by pressing S for scale. Currently I'm only seeing this black-and-white image in the preview, but the marker is still using the full color range, so I could also go to the track panel and disable two channels for the tracking point. Then I press Ctrl-T to start the tracking, and it will go through the shot. The footage is a little bit crappy, because this is an export to JPEG images; usually it would be OpenEXR files, so this has some compression artifacts. But still, the tracking went quite nicely, I would say. You can also lock the footage to the marker, so if I now go through the shot, it looks rather stable. Okay, this would already be enough for something that we can put in the background.

So I would go to the compositor, because now we are basically done with the tracking, and I could set up a Keying node and maybe a background image. From the motion tracking interface I go to compositing, use nodes, of course, and the default setup is: we have the render layer and the composite output. Since there is nothing in the 3D viewport to render, I can just delete this, and instead bring up my tool shelf, which is also new in Blender 2.67. From here I can drag in my movie clip. And if I look at this, this is of course the clip; maybe make it a little bit smaller. I would go to the resolution panel first, because this is not actually HD, it's a different resolution: the 4K scaled down, about 1,012 pixels high. Maybe just use 50%, so that compositing is a little bit faster. So first I would scale this down
to the render size, like so. And then let's have a look at the Keying node. Keying you would find under Matte, and the very first one is the Keying node. I just drag this in, place it on the line, and pick a color with the color picker, like so. We can actually average between different colors: if I click and drag here, this will pick up all the different shades of green and average them, so we should have a nice starting point.

If I look at the matte channel, it's a little bit crappy, so I have to clip the blacks and the whites. Ideally I would use two keying nodes, so that we keep the detail here, and then another, maybe more aggressive, keying node for the outside. But for now I would just keep it like this and increase the contrast, maybe, to get rid of some of these problems here. For this sharp line down here I could use some feathering, like so.

And if I have a look at the image, it looks totally horrible, but that is just a display problem: if I activate the alpha channel, everything looks all right again. Or, if I just want to look at the RGB channels without the alpha, I can disable "Use Alpha" in the Viewer node, and then everything also looks much better. This image is not premultiplied yet. I guess some of you know what premultiplying is, and for those who don't, I won't explain it, because we would still be sitting here tomorrow. I just drag in a converter, Alpha Convert, set to premultiplied, and then I can enable the "Use Alpha" button again. So this is an almost nice key.

Now we could put it onto some background to make it a little bit more interesting. Let's not use a plain color but an image, so I would open up an image here. This is an image of the background plates, so I could just place this on top of that. And don't worry if this is too large: once this is rendered, in the composite output it will be clipped, but the compositor will show you everything that is there. To make this look a little bit nicer, of course, I would have to blur the background to make it
look a little bit more realistic. So I go to the blur and maybe do a 1% Gaussian blur, like so. It's not perfect, but nice enough for a demo.

If I now cycle through the frames, the background is static, of course, and that's a problem. What we did in Tears of Steel was to actually do a tripod track, so that we had a moving camera, and put a plane in the background. The more hackish way would be to just move the background around with one marker, and we already did the track for this one marker. So I can bring in the Track Position node, and from this panel, from the Track Position menu, I can pick the marker that we had been using and use its X and Y coordinates. So, maybe even before blurring, or let's do it after blurring, I could move the background plate around with a Translate or a Transform node, just pipe in the X and Y coordinates here, like so, and set the position to Relative Start. Now, if I cycle through the frames, you can see the background is actually moving, just like our marker is moving. The background plate does the same movement as this one marker, and that should be enough for this scene. So that would be the composite, and after that you could of course do some color grading if you want. Usually we did the color grading in the sequence editor, but it's also possible to do it here.

So this is the very basic keying workflow. If I wanted to do masking for this shot, I could, well, let's go to mask mode and just draw a mask around his head, like so, maybe, so that we have a nice smooth outline. Shift-click on one of these points and drag out a feather area; maybe even make the whole thing a bit smaller, maybe like so. Unfortunately we can't really have a preview right there in the movie clip editor, which is not so nice, but we can again use the compositor to have a look at the mask. So if I drag out the Mask node, this is our mask, and, well, we could use it to maybe combine two different keying nodes. I think I won't do
it, because I want to do some other things as well. So maybe let's not use this mask that way; instead, let's get rid of all the crap on the left. Let's make this mask a little bit bigger, like so, and then the feather area here should maybe be a little bit smaller. Yeah, good enough. So this would be that, and now I can combine the output of this Mask node with my matte channel, or maybe even pipe it directly into the Keying node as, let's say, a garbage matte; but of course the other way around, so I just invert the color, and now everything is separated. Now we only have the guy in front of the background, without the lamp and the other guy. I wouldn't even call this rotoscoping; for rotoscoping this would have to be much more precise. But since I want to show some other things too, this should be enough for the masking demo.

Maybe just very quickly: you can have multiple masks. So I can have another mask, maybe for this guy, just in case I want to use him as well. I could draw this mask, and in the compositor I can have another Mask output and use that one. So you can have multiple masks, and this is how you would do it. For this shot this should be enough, because I want to demo some more tracking, unless there are questions on this topic. Good, then: new blend file. I'll also switch the theme back to the default.

So let's open up another shot from Tears of Steel; that would be this one. Again I press P for prefetch, everything is cached into RAM, and I can preview. Again we have a very, very simple camera movement; this would probably even be good enough for a one-point track, just for the background, but just to demo, I want to do a tripod solve from this. For a tripod, especially if you have such an easy shot like this one, I can just, maybe, let's make the pattern size a little bit larger, let's say 30 pixels. Let me zoom in here, so you can see the pattern size in Blender's match moving module: this is the pattern, and you
have the search area. There is a button for the search area, but there is also a hotkey, Alt-S, so you can just toggle it. If you have very fast motion, then you should have a very big search area; if the motion is very smooth, like our movement here, the search area can be much smaller, and tracking will be faster. In the tracking settings you can set a preview, sorry, a preset, so that every new marker that I place here will use the same properties.

For a tripod solve, I think three markers would probably already be enough, but just in case, let's put some markers here in the scene and then press Ctrl-T to track them; let's track them backwards, which is also possible. In this case this is really so easy that I don't have to do it one by one. They are all more or less doing the correct motion; one of them is jittering, and you can use the graph view for that. If there are spikes like this, I would just erase the spikes, or just erase the entire marker, because for a tripod solve this would really be enough; the movement is so small that this is very easy.

Okay, so we have all these markers tracked, and now I want to have a camera solve. To make Blender solve this camera motion, I have to tell Blender about the camera that I've been using. This was the Sony F65, and that has a certain sensor size, which is 24.33; that's the sensor size of this camera. The other thing that is very important is the focal length. That's why I had been taking all these notes on set, so that I could tell Blender which focal length this is. It was definitely not 16; I think it was something around 40. With this information, Blender should be able to track the camera rotation. So I go to tripod solving mode, and maybe let's also refine the focal length, to make it even more precise. You can just press a button, and it's done: Blender has solved the camera rotation with a solve error of 0.3. That means the difference between the reprojected 3D points and the tracked markers is less than one pixel; it's just 0.3 pixels
wide, so it's quite accurate. Okay, next thing: 3D. I go to the 3D viewport, and so far I see nothing; the camera is just static. To make this work, I first have to assign the camera solver. So I go to the properties, and in the constraints panel I can add the solver constraint, and suddenly I can see these little crosses here. If I look through the camera, I can assign the footage as a background plate, and if I press Alt+A I get playback and can see if everything works.

That's also the beauty of having everything in one piece of software: you don't have to import and export, you just press a button and it's there, and you can work with it as much as you want. No import, no export; that's really, really nice.

For example, if I want to align the camera with the 3D scene here, I can of course do some proper reconstruction setup, or I just set my cursor to the pivot point of my scene and rotate around that point until I have something that works more or less. It's definitely not perfect, but maybe it will do for now. Something like this, maybe; the cube more like this. Or let's say: let's not use a cube, let's put something on a plane on the wall in the background. This will now stick to the wall, and that's nice, but it's not really interesting. I mean, tripod solving is basic and boring.

[In response to a question from the audience:] Yes, that would be so nice. Well, you sort of can, if there's an actual 3D solve; this is just a rotating camera, so there's no real depth information. If you have a camera that is located in space and moving, then you can pick three points to define a floor plane. But it's not really a constraint: it just uses the current camera solve and puts the scene there, and if you re-solve, it might go away again, so you'd have to put it there again. That's something on the menu for the next steps; hopefully with Google Summer of Code 2013 we might get real 3D constraints. So far this is a little bit tedious, because you can put it there, but if you re-solve you might have to reposition it.
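That manual alignment trick, rotating the scene around a fixed pivot (the 3D cursor), is just a rigid transform. A minimal sketch of the underlying math, reduced to 2D with made-up numbers for brevity:

```python
import math

def rotate_around_pivot(point, pivot, angle_deg):
    """Rotate a 2D point around a pivot point, like rotating scene
    objects around the 3D cursor (here simplified to 2D)."""
    ang = math.radians(angle_deg)
    dx, dy = point[0] - pivot[0], point[1] - pivot[1]
    return (pivot[0] + dx * math.cos(ang) - dy * math.sin(ang),
            pivot[1] + dx * math.sin(ang) + dy * math.cos(ang))

# A point one unit to the right of the pivot, rotated 90 degrees,
# ends up one unit above it.
p = rotate_around_pivot((2.0, 1.0), (1.0, 1.0), 90.0)
print(p)
```

In Blender the same idea is what happens when you set the pivot mode to "3D Cursor" and rotate: translate to the pivot, rotate, translate back.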
That sucks, of course. But so far: tripod solving is super easy and boring, so let's do something more interesting.

As you can see, the captain here; if you look closely, you might find that this is a webcam cut in half, wired with some crappy cables and a metal thing. It looks totally stupid, and even after the premiere we decided that it looks too boring, so we wanted to have an object that we could place there instead. And now I have to hurry up, just eight minutes left, so this will be a very quick object solve; I hope I can do it.

For an object solve, you of course first add a tracking object; I don't want to re-solve the camera. Next, I just place a marker here, press Ctrl+T, and see what happens. Well, it fails. No surprise, because the feature it's tracking is deforming: he's moving his head. So let's set up a different preset and use affine tracking. Affine tracking is sort of planar tracking, not really like in mocha, but still the marker will try to transform along with the object in space. The next thing is Normalize, to compensate for differences in lighting, and maybe match against the previous frame instead of some keyframe. If I now press Ctrl+T, it tracks through the entire footage, and that actually looks quite usable. I will just repeat this eight times, because eight markers are needed for a 3D reconstruction. Ideally these markers are not all on one plane but cover some depth, so his biggish head, I would say, is a very good thing to track. We didn't put tracking markers on him; you would usually do that if you approached something like this, but with Blender you don't need to. You can track hair and skin, and it's just totally awesome. (Well, you can put markers on, but that's super tedious; only if you really need it, of course.)

If I set this back from previous-frame to keyframe matching, let's try this one; I wanted to demo that the markers will at some point lose their pattern. This one is not really stable.
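The "affine" motion model mentioned here means the tracked patch is allowed to translate, rotate, scale and shear from frame to frame, instead of only translating. A minimal sketch of what an affine transform does to a patch (illustrative only, not Blender's actual tracker code):

```python
def apply_affine(points, m):
    """Apply a 2x3 affine matrix m = [[a, b, tx], [c, d, ty]] to 2D points.
    An affine motion model lets a tracked pattern translate, rotate,
    scale and shear -- which is why it can follow deforming skin better
    than a translation-only tracker."""
    (a, b, tx), (c, d, ty) = m
    return [(a * x + b * y + tx, c * x + d * y + ty) for x, y in points]

# A unit square under a shear plus a translation (made-up numbers):
square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
m = [[1.0, 0.2, 5.0],
     [0.0, 1.0, 3.0]]
print(apply_affine(square, m))
```

The tracker's job is the inverse problem: given the patch in two frames, estimate the six numbers of `m` that best explain the change.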
So if I find a frame where the marker jumps off, I press G to grab it, and then I have a new keyframe, a new reference point so to say, and track on from there. But especially for object tracking I prefer the previous frame as the reference, because that usually works better; with skin like this, it really helps. So, how many do we have? This is actually already four; maybe I can do one more here. You can mute the footage to look at just the markers, and if you play back: yeah, that looks nice. You can also check the motion curves, and it all looks quite solid.

So now the interesting part happens: I have to try to solve this. The most important thing for solving is usually to set the keyframes right. The keyframes are the two frames that are used to establish the perspective, and if I look at the markers, it seems to me that around frame 17 or 19 there's some nice depth, and after 17 it stays there until frame 30. So frame 30 might not be the best; keyframes are usually set to 1 and 30 by default, which is why I'm explaining this. You might want to set keyframe B to frame 17. Now I solve this; the solve error is 2.7, which is not great. In my 3D view, these markers, the markers of the head, are somewhere in space. So now I'll start looking for better keyframes; maybe frame 21... that doesn't really help, and this is the horrible demo effect... ah, now the error is 1.0, that's nice.

Okay, so they are all moving with the head. Ideally they also move along the correct axis, because sometimes with object tracking you get an inverted matrix, so the view axis is flipped, and that's something you can only tell if you use a test object. Of course we use the Blender mascot, Suzanne, and this object gets an Object Solver constraint. Three minutes left, so fingers crossed... yes, it works. Awesome, so this is how we tracked the head. Perfect. In the last remaining three minutes I wanted to show you how to insert an object and how to render that, but there's no time for that.
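The solve error reported here is essentially an average reprojection error in pixels: how far the solved 3D points, projected back into the image, land from the 2D tracks. A minimal sketch of that idea, with made-up coordinates (this is not Blender's API):

```python
import math

def average_reprojection_error(tracked, reprojected):
    """Mean 2D distance (in pixels) between tracked marker positions
    and the reprojected positions of the solved 3D points."""
    dists = [math.dist(t, r) for t, r in zip(tracked, reprojected)]
    return sum(dists) / len(dists)

# Made-up example: four markers whose reprojections are slightly off.
tracked     = [(100.0, 200.0), (340.0, 210.0), (220.0, 330.0), (150.0, 280.0)]
reprojected = [(100.3, 200.0), (340.0, 209.6), (220.5, 330.0), (150.0, 280.2)]
print(f"{average_reprojection_error(tracked, reprojected):.2f} px")
```

So a solve error of 2.7 means the reconstruction misses the tracks by almost three pixels on average, while 0.3 means sub-pixel agreement, which is why hunting for better solve keyframes is worth the effort.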
But just to finish: object tracking in Blender is something that I really like. It works very nicely; this supervised tracking approach with affine tracking is very good for tracking skin and hair, and you don't need tracking markers. That's why I really like it a lot.

Okay, that's this, and now I'll quickly finish my talk by showing one breakdown. So that's one breakdown; you can also find these on YouTube.

And then, very quickly: currently the 4K project is happening. The Blender Foundation has a great 4K screen, sort of, and it's a very nice project. The difference between HD and 4K is quite something; I think this is just the presentation program, so it's not showing you real 4K... this is 4K... oops, this is 2K, this is 4K. So there really is a lot more resolution, and it's a lot less forgiving. Now we see all the crap: all the bad keys, all the bad masks. So 4K is horrible if you have to do it; it's really a nightmare, especially because Blender is still a little bit slow. For the 4K project we got prefetching in the clip editor, so previewing the footage is faster; we have GLSL playback and GLSL grading, which is a nice improvement; in the compositor there's a region of interest, so you can define a little border in which to do the compositing; and buffer groups, so it's also a little bit faster.

And there are other benefits: there's now tons of stuff free, the models and also the footage. The footage is available on the web: we have the original 4K footage, the cleaned footage that's already been masked, the rendered frames; everything is there and you can use it, even commercially, and test with it. For example, you might know this guy: this is the Nuke site, and they are demoing Nuke 7 with Tears of Steel. Awesome. So that's a very nice benefit for the entire VFX community too.

So, all in all, I think it could have gone worse. Thanks.