I'm Ivan Cappiello from Mad Entertainment. Which one of you was here in 2013? Anyone? Yeah. We were here too; it was our first Blender Conference, and we presented a workflow for a CG animated series made in a 2D/3D hybrid style, with lots of material about how to do it in Blender, mostly about rigging. We were coming from previous productions. Our studio is based in Naples, in Italy, and was built up around our first animated feature, The Art of Happiness. You can find it on Blu-ray, I think; I don't know which distribution it has outside Europe, so if you want you can check. And this is where we work. Our last presentation was in 2013, about our first project entirely done in Blender. From time to time in the animation business you start a series, you go into pre-production, you do teasers, and then the production stops because the co-producers have to discuss international rights and things like that. So, in the meantime, the studio can test new things when spin-off work comes in. This is important for us, and there is a quote from a Woody Allen movie that fits: Virgil receives a cello as a gift, but he has no conception of the instrument and keeps blowing into it. This is how we felt initially with Blender, because we tried to replicate the workflows we had in other software. Then, with some side works, we had the chance to test what else we could do, with less pressure and no producers at our side. We tried it on this kind of work. It's a bit difficult to explain, because it was commissioned by a polyphonic choir — they sing all together, with no instruments — and they wanted live animations playing behind them while they were singing. We created these ten images, inspired by existing artworks, and we calculated that in the end the total length was that of a short movie.
Each of these runs a few minutes, an average of about three minutes each. The problem was how to manage this, because it had to be played as a single continuous show. I'll let you see the movie. Can you please turn down the lights? This is a reel of the various animations; I cannot play it all because thirty minutes is too long, so I condensed it into the best animations we've done. Probably we could have done it in any 2D software, but it was an experiment to see what we could do with Blender. Technically it was not very intriguing, and I will go through things you may already know. This is how a simple set is done. For us it's very important, because the people you have seen in the slides are not typically 3D animators. They are classical animators, illustrators, background artists — we are a melting pot of talents, and it's important for us that each one of them is capable of taking something from beginning to end. So even the backgrounds you see are very schematic, but they are done by people who don't always use Blender for modeling or for creating artwork; they are just animators. It was very convenient, because they also started rigging. I will come to rigging a bit later, but here is the interesting thing we discovered — because, as I was saying, you should use Blender as Blender, not as other software. There is this cool feature you probably already know: you can link the scenes from other blend files into an empty file. This is very practical. That file, you can see I named it "editing"; we used it for live-editing the sequences, because they were split into shorter cuts — it was not possible for one animator to animate three minutes in one go in a practical time. The cool thing is that when you link these scenes, you can use them in the Video Sequence Editor and they behave just like film strips: you can edit them and put them together. This is a simple preview of how we did it.
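Laying linked scenes end to end on a master timeline is essentially what the Video Sequence Editor does when scene strips are placed back to back. A minimal sketch of that bookkeeping in plain Python, outside Blender, with hypothetical scene names and frame ranges:

```python
# Sketch: place linked scenes back to back on a master timeline,
# the way scene strips line up in the Video Sequence Editor.
# Scene names and frame ranges are made up for illustration.

def layout_strips(scenes):
    """Given (name, frame_start, frame_end) per scene, return
    (name, global_start, global_end) positions on the master timeline."""
    strips = []
    cursor = 1  # Blender timelines conventionally start at frame 1
    for name, start, end in scenes:
        length = end - start + 1
        strips.append((name, cursor, cursor + length - 1))
        cursor += length
    return strips

cuts = [("cut_01", 1, 120), ("cut_02", 1, 96), ("cut_03", 1, 200)]
print(layout_strips(cuts))
# [('cut_01', 1, 120), ('cut_02', 121, 216), ('cut_03', 217, 416)]
```

This is also why the start and end frames must be set correctly in each single scene: the master timeline stretches according to those ranges.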
I think it's not particularly fancy, but we found it very useful, and as far as I know these are features I have only seen in Blender. You can see this is an empty file; there are, in this case, I think ten or twelve separate scenes linked inside it. You can see the "L", which stands for link, and the practical thing is that the timeline stretches according to the sequence you are using. If you set the start and end frames correctly in the single scenes, then you can have something like this. This is live playback from the Video Sequence Editor of the edited scenes, and you can switch to the OpenGL preview mode. This was pretty new for us — which other software can do this? I don't know. But we tried to go a little further and also ran these frames through the compositor via the linked scene, so everything is still composited in the same file, using the same scene as a reference. When it is fully composited it's a bit tough, because the frame rate drops to one or two frames per second, but if you just want to test how it looks, it's good. Moreover, we used it to export the whole sequence directly from here, and the animators could at any time open the edited sequence and see how their cut matched with the others, and so on. I don't remember. Oh, okay. This is after... As I was saying before, sometimes production stops and sometimes it starts suddenly — you never know when. In the meantime we tested rendering further for this teaser for our upcoming movie, which should be in theaters next year and is our second feature film. At that moment we were still doing pre-production, and we did this to show producers how the film would look. And while we were doing all this, something big happened. We are a small studio, and our budget for The Art of Happiness was around one million euros, something like that.
That's not big for a full feature animation. And we were invited to this wonderful show — which one of you knows it? It's like the European Oscars. We were at this beautiful ceremony with Wim Wenders, the president of the European Film Academy, and we were up against Jack and the Cuckoo-Clock Heart and Minuscule — big European movies with big budgets. And incredibly, we won. We won. It was the first Italian movie to win. I want to share this moment with you because, for a little studio like ours, it's very important to have software like Blender: you can focus more on developing tools, you don't spend too much money on licenses, and you can pay more artists, which is very important for low-budget movies. This was a great achievement and gave us strength for the next movie. This is the prize we keep in the studio. Okay, coming to the actual presentation: this, as you have seen from the teaser, is our next feature film. This is Alessandro Rak, the director of The Art of Happiness, which won the EFA. These are the producers, and this is me. This is Dario Sansone; he is also the author of the song you heard in the teaser. And this is Marino Guarnieri, the fourth director with us on this project. Yeah, four directors. The story, summarized as fast as I can: it comes from an original novel that is not so well known internationally; it was written in Naples around the 17th century, and the original story has some resemblance to the version you know from Disney. We have centered the story on two separate times. One is the time of light, which happens before Cinderella actually becomes Cinderella, and one is after — the time of cinders, because a burning volcano keeps covering the city in ash. This is Cinderella's father. He is an inventor and has great projects for the city.
This is the woman who is supposed to become his wife, but she is going to become Cinderella's stepmother. And this is Cinderella as a baby, as a child, and then when she becomes Cinderella, for the reasons you already know. This is the charming prince, set a bit apart from the story: he acts as a sort of personal policeman for Cinderella's father and knows her from the beginning. When Cinderella's father is killed, he does all he can to find Cinderella again and save her. And this is our bad guy. He's called the King and he's a singer; in time he becomes a mob boss, and together with the stepmother he is responsible for the killing of Mia's father. These are the six sisters — they are six because in the original novel they are six, and the stepmother already has them from a previous marriage. Okay, these are some pictures of the voice actors from the movie, and I think I can show you one minute of it. The actual movie. This is coming to theaters next year; we are about to finish it, and it was entirely done in Blender, except for the compositing, because we had some problems with the compositor itself — it tends to be slow in some cases — and we cannot teach every animator how to composite, even though, as I said before, each of us does everything in the production, so we try to keep things together. This is — I want to show you, maybe I'll skip it if I have no time — how we did the basic backgrounds. This time it's all 3D and we don't use textures; it's vertex color, to get this flat, cel-like look. It tends to be less memory-intensive, and you can zoom in as much as you want: it will look the same in the final frame. I don't think I have time to show everything, but I want to show that there is a scene in the movie with a sort of hologram aquarium, where you can see fish swimming all around, and I'm waiting for that to happen here. Okay, this is it.
This background element was done with something we asked ourselves: how do we do this in Blender? Let's try. The thing is, we can link one rigged character — one fish — into the scene. You will see in a few seconds. We have five or six individual fish for each group, and we just duplicate them, animated as they are. It works. I never asked myself if it was possible, but it was: we randomize and offset the group itself with the proportional editing (soft selection) tools. It worked for us. We can also re-parent the fish to some empty and do all kinds of stuff. I don't think I have time to go through all of this; if you want, you can contact me afterwards and we can see some more. I think the slide makes pretty clear what we did. Okay, this is how we export for compositing. It's very, very basic. We have the color pass, which is just vertex color; a shade pass; sometimes a Z pass, which is not listed here; and a Freestyle line pass. That's all. The background artists receive this shading in their paint application, stack the layers together and paint over them. They have control over how the elements are stacked, and then, with simple masking, ground by ground, it is composited with the characters to get the final frame. That's the final frame from the movie. How many people know what Rigify is? Who has used it? Oh, lots of people. Okay. Why did we choose Rigify? For a start, I have to say that we have, so far, 110 characters to rig in the movie. There's a wedding, there's an audience at a show — it's very, very complicated to rig all these characters. So we used the bundled Rigify project. When we were here in 2013, Nathan Vegdahl's original had just been updated with this wonderful overhaul by Pitchipoy, a studio in Israel, I think. They presented it here, and since we were doing lots of complicated rigging ourselves, we thought: why don't we try the included one?
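The pass recombination the background artists do by hand — flat vertex color multiplied by the shade pass, with the Freestyle line layered on top — can be sketched as simple per-pixel arithmetic. Plain Python, values in 0..1; the pass names follow the slide, everything else is an illustration:

```python
# Sketch: combine the exported passes into a final value for one pixel.
# color: the flat vertex color; shade: multiplied shading;
# the Freestyle line is layered on top with its own alpha ("over").

def composite_pixel(color, shade, line, line_alpha):
    base = color * shade                         # shaded flat color
    return line * line_alpha + base * (1.0 - line_alpha)

print(composite_pixel(0.8, 0.5, 0.0, 0.0))  # 0.4  (lit pixel, no line)
print(composite_pixel(0.8, 0.5, 0.0, 1.0))  # 0.0  (fully opaque black line)
```

In the real pipeline the stacking happens in the painting application, per layer rather than per pixel, but the order of operations is the same.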
So, the benefit of Rigify, if you don't know it, is that it's a modular rig system: each part of the rig is built by itself. You can do an arm, a leg, a spine, and so on. In our pipeline this rig is shared across characters, which means — you will see later — that you can paste animation onto whichever character you want, because the controls always have the same names, same positions, and so on. The metarigs can be updated and regenerated, so you can add two arms, or one more face if you want, and you will not lose your previous controls. How it works, for those of you who are not using it: there is a custom attribute on each bone chain, and every chain type has a required number of bones to work. Then, when you assemble them together, you can build new metarigs. This is actually one used in the movie; I will show it in a while. And these are the rig types you can use. This one was not working; we fixed it, and I will come back to it later. Then all the mechanics are generated for you, with all the mechanism bones and so on. It's not so difficult to use, but it's not very well documented, so you need a bit of practice. This is how we used it in the movie for crowd scenes. I cannot play the audio because the film is not yet in theaters, but you can see there are lots of cats; they count toward the 110 characters. So we did some testing, and it seems to work. I'm teasing a bit more of the movie — hope I have no problems. One kind of metarig we had to build is birds. Again: you create a metarig, and it works. This is the actual bird we used, and this is a scene with the bird — here he is actually singing. Even the fish are made with Rigify. These were made directly by the animators: they put in a spine and a tail, and it generates the rig for us. Oh, even strange animals — this one is for the Venice Film Festival. So, our work on Rigify is divided into two parts. The first part is fixing existing code.
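The "custom attribute per chain, required number of bones per type" idea can be illustrated with a small metarig validator. Plain Python; the type names and bone counts below are hypothetical placeholders, not Rigify's actual tables:

```python
# Hypothetical bone-count requirements per rig type.
REQUIRED_BONES = {
    "limbs.arm": 3,           # upper arm, forearm, hand
    "limbs.leg": 4,           # thigh, shin, foot, toe
    "spines.super_spine": 4,  # a short spine chain
}

def validate_metarig(chains):
    """chains: {chain_name: (rig_type, [bone, ...])}.
    Returns a list of problems; empty means the metarig can generate."""
    errors = []
    for name, (rig_type, bones) in chains.items():
        need = REQUIRED_BONES.get(rig_type)
        if need is None:
            errors.append(f"{name}: unknown rig type '{rig_type}'")
        elif len(bones) != need:
            errors.append(f"{name}: '{rig_type}' needs {need} bones, got {len(bones)}")
    return errors

ok = {"arm.L": ("limbs.arm", ["upper_arm.L", "forearm.L", "hand.L"])}
print(validate_metarig(ok))  # []
```

Assembling a new creature (a bird, a fish) is then just a matter of tagging chains with the right types and giving each chain the bone count its type expects.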
The paste-pose feature I was mentioning was there but not finished; the fix was tiny, and we did it. This one was a big error — I don't know why it was there: floating-point position errors on the super_face bones, which meant controls were offsetting and rotating. I don't know why; it was a quick fix, and we did it. Rotation mode on tweaks: even this is strange to me, coming from other software, because in Blender you have quaternion and Euler rotations, but the Rigify rig always uses quaternions, even on things that should spin on a single axis. That gives you some ways of messing things up, because you can get rotation on other axes. It's a fix you won't see, but it's working right now. Tail option on super_torso: this was working or not depending on the case. We fixed it, and now we can have tails for cats, fish, birds, and so on. About the IK/FK snapping, I have two videos here. IK/FK snapping was missing in the Pitchipoy version of the rig, and this is how it's working now. This is all auto-generated by Rigify from the metarig. It simply just works, no fancy stuff: we just fixed the existing code, starting from the original Rigify code by Nathan Vegdahl. Then, the IK limb follow modes. These were requested by other people, and since we effectively used them in production, we implemented them in Rigify itself. It works like this: basically, in IK mode the limbs tend to stay where the root is. You can see it here — you pose it, and then, when you move the torso, the hands stay put. That's not always practical. Yes, you can use FK, but in some cases reaching a pose that way is not practical for animators. With this follow-parent option you can decide whether the limb follows the root or its direct parent. And if you set IK follow to zero, then the end is free: it no longer follows the root, and you can use your own constraints to do things like this.
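The follow option behaves like a constraint-influence blend between two candidate parent spaces. A minimal sketch of the idea in plain Python — not the actual rig code, just the interpolation it implies:

```python
# ik_follow = 1.0: the IK control follows the rig root;
# ik_follow = 0.0: it stays with its direct parent (or is left
# free for the animator's own constraints).

def ik_control_position(root_space, parent_space, ik_follow):
    """Blend the control's position between two parent spaces."""
    return tuple(r * ik_follow + p * (1.0 - ik_follow)
                 for r, p in zip(root_space, parent_space))

# With follow off, the hand tracks the torso (its direct parent):
print(ik_control_position((0.0, 0.0, 0.0), (2.0, 1.0, 0.0), 0.0))
# (2.0, 1.0, 0.0)
```

In the rig itself this is driven by a custom property on the control, so animators can switch behavior per shot without re-rigging.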
It's very handy if you're in production and you don't want to switch modes. This is already committed in 2.78. Okay, bone groups and colors — I won't explain too much. The controls are created with just a black color by default. We added a feature that classifies the bone groups — fingers, face, and so on. You can select via the bone groups menu, and moreover it adds colors, which you can customize afterwards. This is not yet in master. Then, character utilities. I have no slides for this, but basically you can regenerate the rig: if you update some feature, you have the metarig and the rig, and you can press a button to update the rig from the metarig. One of our characters does some stuff under the hood: the rig has an ID number, used to recognize it and run the Python UI in the viewport. When you duplicate a character, the rig ID does not change, so we have a utility that lets you change the rig's name and rig ID, and also update only the bones you are changing, not the whole rig. Maybe you just want to fix the face: you don't want to recreate custom properties and so on, and it works like that. This will go to master after we finish the movie. Oh, bendy bones. Yes, bendy bones — automatically. I don't know if everybody knows, but Rigify in the Pitchipoy version already uses bendy bones, especially for limbs. You can see them in action, created by Rigify itself without having to touch anything. For cartoon animation it's very handy: if you look at our 2013 presentation, we were doing this through our own scripting, but since it's already implemented in Rigify, that wasn't needed anymore. What else can I show you? This is the main feature, the reason we wrote this add-on: we wanted some automation for background animations. I repeat: we have 110 characters.
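Why a per-rig ID matters: the viewport UI script looks its rig up by that ID, so a duplicated character needs a fresh one or both copies answer to the same UI. A toy sketch of what such a utility has to do, in plain Python; the property name `rig_id` and the data layout are hypothetical:

```python
import itertools

# Arbitrary counter standing in for "generate a fresh unique ID".
_new_ids = itertools.count(1000)

def duplicate_character(rig):
    """Copy a rig's data and give the copy its own rig_id, so the
    per-rig viewport UI targets the copy instead of the original."""
    copy = dict(rig)
    copy["rig_id"] = next(_new_ids)
    return copy

hero = {"name": "hero_rig", "rig_id": 7}
clone = duplicate_character(hero)
print(clone["rig_id"] != hero["rig_id"])  # True
```

The partial-update side of the utility (regenerating only the face, say) follows the same principle: touch just the bones being changed and leave the rest of the rig, and its custom properties, alone.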
You cannot make eight or ten animators animate clapping people and working people all day long, month after month, for 18 months. So we had to develop some kind of low-cost motion capture to work with. The basic idea is that you import the BVH animation onto the metarig, and then constrain the Rigify rig to the metarig, because they are the same thing: they have bones in the same positions, but the metarig is much simpler. The important thing is that we don't want motion capture and nothing else; we want the animator to interact with it. So we built this interface where the animators choose which poses to keep, instead of keeping everything. When you're ready, you say: this one I want, this one I want, this one I want. Then you can disable the constraints, see which poses you are keeping, and refine the ones that don't work for you. And you can add any pose you want at any time. Animators really love this, because they get a kind of layout already done, and they can tweak it. Obviously, this is what imports the BVH. The motion capture comes in at 30 frames per second, and we are animating at 12 because we want this classic look. And this is how you keep frames: if you want all the animation, you take it all and the animator cleans it up the standard way, or you can specify a step for the extraction here. One means every frame; three means one frame in three; and you can customize it. This is the animation extracted without refinement. Obviously, we are not using this for the main animation; we use it for additional animation. In any case, you always have to animate hands and face by hand — that's the most important part of animation. That part is done with the pose library manager: you have hands in various positions, you can just click and use them, also for facial expressions. This is a comparison of what the extraction did, in this case with no refinement. This is the standard take playing here.
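The step extraction and the 30 fps to 12 fps retiming described above boil down to two small operations. A sketch in plain Python under those stated assumptions — this is an illustration, not the add-on's code:

```python
def extract_keys(keyed_frames, step):
    """Keep one key every `step` frames: step 1 keeps everything,
    step 3 keeps one frame in three, as in the talk."""
    return keyed_frames[::step]

def retime_frame(frame, src_fps=30, dst_fps=12):
    """Map a 30 fps mocap frame number onto the 12 fps timeline."""
    return round(frame * dst_fps / src_fps)

print(extract_keys(list(range(0, 10)), 3))  # [0, 3, 6, 9]
print(retime_frame(30))                     # 12
```

Keeping fewer keys is what makes the result cleanable: the animator refines a sparse set of poses instead of fighting a key on every frame.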
As you can see here, this is a standard extraction: there is no manual keyframing, just a test of how accurately it captures. Usually we take one frame in three, because we don't need much more than that. This is another cool feature for making animations: we use it for walk cycles, because animators tend to lose a lot of time building lots of custom walk cycles. Since our movie is basically realistic in feel, we decided to give motion capture a try to provide these to the animators. You can see different characters and different walks done in, like, 12 to 30 minutes. How many poses do you need to make a walk cycle? Here you will see all the poses the animator chose — just six keyframes. This is our look after the refinement and some cloth animation; it's all done with bones through Rigify. This is how it looks. Here I show you what I mean by rigs that intercommunicate: you can paste the animation from one character to another, because the characters share the same controls in the rig. Bones that were added on one rig are not considered — if this character doesn't have the dress, it will not be considered — but the controls that are shared are used. And again, you can also replicate a character after you have done this. She doesn't want to stay here... I know, Olivier is rushing me; it's almost over. Yeah, we are still testing it a bit further in production. It will work, for now, only with the Pitchipoy version of Rigify. Next steps: we have something on the code side and something on the UI side. On the code side, a general cleanup is needed, because there are lots of typos — for example, "stretch" is misspelled in the controls. We fixed the display, but the operator underneath is still named with the typo; it should be fixed. And the code is not PEP8-compliant.
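Pasting animation between characters works because matching is done by control name: shared controls receive the keys, extra bones on either rig are skipped. A minimal sketch in plain Python; the data layout here is hypothetical:

```python
def paste_animation(source_keys, target_controls):
    """Copy per-control keyframes onto another character's rig.
    Controls missing on the target (e.g. dress bones the second
    character doesn't have) are simply skipped."""
    return {ctrl: keys for ctrl, keys in source_keys.items()
            if ctrl in target_controls}

walk = {"torso": [(1, "pose_a")],
        "hand_ik.L": [(1, "pose_b")],
        "dress_root": [(1, "pose_c")]}
cat = {"torso", "hand_ik.L"}               # this rig has no dress
print(sorted(paste_animation(walk, cat)))  # ['hand_ik.L', 'torso']
```

This only holds because Rigify generates the same control names and positions on every character, which is exactly why the shared-rig discipline pays off in production.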
Then we need to unify the metarig and rig-sample systems, because now we have the basic version and the Pitchipoy version, and there are lots of samples; it's not very handy if you don't know where to find all the features. We also need a new method for parenting and constraining, because now the metarig is the key to all of this, and all the parenting is done through the metarig: if you parent one bone to another, it knows whether this is a chain or basic parenting. We want to replace the metarig system with a widget system, so you can place the hand and the fingers just by moving a widget — or the shoulder, or the face — instead of placing the single bones, which tends not to be very user-friendly when you have to calculate bone rolls and so on. Interactive selection and keying-set tools: I saw a presentation yesterday, by Julien Duroure I think, that already does some things like that. Maybe we'll keep in contact and try to implement this, because keying sets are really needed for Rigify to work efficiently — you may want to select all the fingers at once, and so on. Julien already has some cool stuff; maybe we can integrate it. Also a rig samples UI: rig samples are now stored in just a flat list — basic copy, super copy, super torso turbo, super face, and other things. I would like to unify this into a more compact list with handier selection: maybe you want an arm, so you select the arm icon, and so on. And then a character mesh collection. This could be very handy for wrapping up the add-on, because right now the only way to update a character is to have the metarig and the rig in the same file. What happens is that Rigify keeps the armature object itself, deletes all the bones inside, and refills it — that's the way to keep the meshes parented.
But if we could have a mesh collection to store this data, that would be very handy. Okay, the discussion is still in progress, here: we have made a proposal for these changes. Some are already in master, some may come or not, and we can discuss with you what else we could do. If you have any questions, we can talk outside later. You can talk to me, or to Gusha Rossi, who is here — he is the coder of this work. Thank you.