Hi. Hello. Thanks for coming. We are going to talk about interfaces for animation. The things you are going to see here are not new. I don't know if it's right to call them revolutionary, because here the revolution happens continuously; revolution is a common thing, we do it all day. It's more about the freedom of creating new ways of interacting with our 3D characters, or 3D entities in general, because 3D is not just about characters, even if that is the part that fascinates me the most. It's about thinking of new ways without throwing away everything we have. So it's kind of a conservative revolution.

I see many faces — I'm kind of surprised. My name is Paolo. I am a computer programmer and I work in 3D: I write plugins for 3D applications, and some standalone applications, but everything comes down to animation, VFX, sometimes prototyping. In the past I used to be a 3D artist. That's how I started, and that's very common in our industry: most of us are former generalists. It's an aspect people don't think about when they say, "Oh, you're a programmer, but you are also an artist." Most of us are, and most 3D artists have an analytical set of skills that overlaps with programming. I don't think the two roles are that separated. And I am also a very sophisticated person. Thank you.

I work as a developer for a company called Binary Alchemy. It is based in Germany, but it's a fully remote company, so I get to stay in Italy. We work on a render farm manager called Royal Render, which is used in many countries around the world. The very nice thing is that we get feedback about the pipelines used in our customers' studios, and the use of Blender is increasing more and more. When I joined Royal Render, Blender was optional — an application that had to be installed from a different repository.
Now it's interesting to see how Blender has become a must-have in a pipeline. Even if it is not actively used, you can no longer avoid having Blender in your roster. That's a good thing, because we are getting feature requests that are specific to Blender; that used to happen only for the main commercial platforms. Now we are starting to develop Blender plugins, and we are thinking of releasing them as standalone, freely available plugins. That's very interesting, and very good for me, because Blender has a special place in my heart.

I started doing 3D in Blender when I started doing it seriously. I don't remember the exact year, but I remember this bird who welcomed me into the wonderful journey of 3D. I have kept using Blender for my personal work: whenever I can choose what to use for a personal project, or whenever I want to test an idea for a new approach, I always use Blender. I also have a YouTube channel where I show my add-ons and the way I handle tasks in 3D. It's called Balls and Ninjas, after a joke about my first portfolio: when I started working in 3D, my portfolio consisted mainly of sport balls that I had modeled for commercials.

This character, my first character, I modeled after a concept that another Blender user posted on the forums of what is now Blender Italia, the Italian forum of Blender artists. This was my first fully working character, and I was very proud of the incredible features I could put into it: IK/FK controls, reverse-foot setups, control shapes. It's stuff we take for granted now, but back then it looked like cutting-edge technology. When I needed characters for my videos, I imported it into Blender 3 and gave it new materials to make it work with the new renderer. It was very interesting to see how much has changed: rendering is now real-time and physically based, and we model in Sculpt Mode, like modeling clay — we don't box-model anymore.
And when it comes to rigging, we still rig by putting bones inside the model. It's still the same stuff. If you think about it, the job of a modeler has changed so much, and so has the job of a texture artist or a look-dev artist. If we could take a time machine back to the 90s and start working in 3D there, it would be completely different; we would need to readjust our workflow. But rigging, apparently, is a time-travel-compliant skill, because it hasn't changed that much. That's the feeling, at least. So how come everything changes and rigging stays the same? Perhaps rigging is the pinnacle of evolution. Or maybe it's a living fossil — something that is still around because Mother Nature forgot about it. So I want to have a look at it.

Let's look at it in action. This is a wonderful model, in my opinion: it's the Doge from the memes. My Twitter friend Pierre Schiller sent me this. Here he is shadowboxing — that's why he's in such good shape. We have these bones, these octahedral widgets, which are a good approximation of our locomotor system. They are, of course, inside the body — I know you know this already. It makes sense that they are centered inside the body, because that's where the fulcrum of rotation is. But you need X-ray in order to visualize them, or to interact with them.

So we do another thing you know well: we draw wireframe shapes that go around the model, so they can be selected because they extend outside of the mesh. And since we are there, we create an additional system that can be used for inverse kinematics, so every animator can choose. This is a Rigify rig — most of us are familiar with it. It has these widgets, these displays, that allow animators to work. We put both sets of widgets in every character because you might need both: you want forward kinematics to animate actions that start from the body and propagate to the limbs.
And we require inverse kinematics when the action starts from the extremity — from the hand, or from contact with something like the floor. You cannot know in advance which one you will need for each shot: if the character is standing on his hands, then the hands must be inverse kinematic, because they have a constraint, and so on. So we put both inside the same rig. This has been called the "über rig": an all-purpose rig that can switch between IK and FK. This is a very simple rig, but it still embodies the knowledge handed down to us by very competent people who made sure we could find a way to animate our characters. And it is the bona fide hero of the entertainment industry, because it has allowed us to get things done for more than 20 years. I think it's a very interesting technical achievement.

But even on this simple rig, we can see that it gets a little crowded with all those widgets. And even though those wireframe curves are an improvement over the octahedral display, you lose the direction of the limb. For instance, this is a very easy pose to read, but sometimes it's very hard to find the right control for the limb you want to tweak.

So we also design interfaces to interact with all that. This is an early experiment of mine: I tried to design a picker, but I wanted to draw my own custom widgets and implement some advanced behavior. So I tried to integrate the PySide library into Blender, and it was difficult, because PySide is itself a Python wrapper around an external library called Qt. It works very well in applications that already use the Qt library, like Maya and most commercial packages. But Blender uses its own graphical interface, and that means the Qt library has to run its own application loop internally. So it's going to be very heavy and computationally expensive in Blender — mind you, the Qt library itself is actually very efficient.
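To pin down the FK/IK distinction from a moment ago, here is a standard textbook sketch (not the rig's actual code) of a two-bone limb: FK drives the pose from the root outward, while IK solves the joint angles from a target at the extremity.

```python
import math

# FK: the pose is driven from the root outward — given the two joint
# angles, compute where the hand ends up.
def fk(l1, l2, a1, a2):
    elbow = (l1 * math.cos(a1), l1 * math.sin(a1))
    hand = (elbow[0] + l2 * math.cos(a1 + a2),
            elbow[1] + l2 * math.sin(a1 + a2))
    return hand

# IK: the pose is driven from the extremity — given a target for the
# hand, solve the two joint angles (classic two-bone analytic solution
# via the law of cosines).
def ik(l1, l2, tx, ty):
    d2 = tx * tx + ty * ty
    cos_a2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    a2 = math.acos(max(-1.0, min(1.0, cos_a2)))  # clamp for safety
    a1 = math.atan2(ty, tx) - math.atan2(l2 * math.sin(a2),
                                         l1 + l2 * math.cos(a2))
    return a1, a2

# Round trip: solve IK for a reachable target, then check FK agrees.
a1, a2 = ik(1.0, 1.0, 1.2, 0.8)
x, y = fk(1.0, 1.0, a1, a2)
print(round(x, 6), round(y, 6))   # → 1.2 0.8
```

This is why the über rig carries both systems: the same limb needs the FK solver when the motion originates at the body, and the IK solver when the extremity is pinned.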
So that was a very nice experiment, but it couldn't really work; it required too much work just to exist. When I came to this conference in 2017, I met Christophe Seux, a French TD who made something very clever: he was using the OpenGL buffer to draw his own shapes directly on Blender's interface. This was interesting because you could script advanced behavior without requiring any external dependency. It made quite a sensation. I'm sad Christophe is not here this year, but everybody is talking about him — Christophe left a mark, because he is the author of the classroom scene that is used in many benchmarks. So thanks, Christophe.

With that in mind, I started to work on my own widget library that used the same principle, but extended it into a much more general-purpose library that could be useful to everyone. I never told anyone about it — I worked on it in secret, and this is the first time it has ever been shown anywhere. Then I stopped working on it, because there were a few things I didn't like. Artifacts could happen when I drew on the OpenGL buffer over 3D objects, and it had to run inside a modal operator, which is a problem because that can interfere with other operators or with some of Blender's routines. So this went back to the bin.

But then I searched the Blender API documentation — something I don't do as much as I should — for something that could handle events. And something was there: gizmos, a long-neglected entity that can be used to draw widgets and act on events. There is even a very nice example in the Blender Python templates, and you can see that you can input 3D coordinates there, so it's like having 3D objects that are not part of the scene. They are just there for display, and you can write custom behavior for these objects.
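To give a feel for the pattern (without the Blender API itself), here is a minimal, dependency-free model of the hooks a gizmo exposes — Blender's `bpy.types.Gizmo` has callbacks with similar roles (`draw`, `test_select`, `invoke`, `modal`), but the class and numbers below are purely illustrative.

```python
# Toy model of the event pattern behind viewport gizmos: a display-only
# object with a hit test, a press handler, and a drag handler.
# Not real Blender API code — names mirror the roles of the gizmo hooks.

class Widget:
    """A screen-space handle that is not part of the scene."""

    def __init__(self, x, y, radius):
        self.x, self.y, self.radius = x, y, radius
        self.value = 0.0          # the property this widget drives
        self._start = 0.0         # value when the drag began
        self._press_x = 0.0

    def test_select(self, mx, my):
        # Hit test: is the mouse inside the widget's circle?
        return (mx - self.x) ** 2 + (my - self.y) ** 2 <= self.radius ** 2

    def invoke(self, mx, my):
        # Called on mouse press: remember the starting state.
        self._start = self.value
        self._press_x = mx
        return "RUNNING_MODAL"

    def modal(self, mx, my):
        # Called while dragging: map horizontal motion to the value.
        self.value = self._start + (mx - self._press_x) * 0.01
        return "RUNNING_MODAL"


w = Widget(x=100, y=100, radius=20)
assert w.test_select(105, 95)        # a click near the center hits
assert not w.test_select(300, 300)   # far away misses
w.invoke(105, 95)
w.modal(205, 95)                     # drag 100 px to the right
print(round(w.value, 2))             # → 1.0
```

The point is that each widget owns its own event logic, which is exactly what makes the custom behaviors described later possible.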
So it's an impressive level of control we are given. I wasn't aware of it, and many of the people I spoke with weren't either — I don't know why. Anyway, with that in mind, I thought I could do something similar to an old proposal. Anybody remember this? This is from the Gooseberry project. We got very excited when we saw those wonderful widgets — "oh, I can't wait to use this" — but they never came, because sadly they were never put into production.

So I tried to build something similar in Python, with moderate success. This is an add-on that I named because I thought I was good with names — maybe I'm not. It did what I wanted: it displays these overlaid widgets instead of curve shapes, and I could use them to select the controls on my character. But no action was implemented. I tried to work on that, but then I stopped trying. The same fate that befell the face map widgets happened to my gizmos as well.

So I thought that was the end — but it wasn't, because Mets from the Blender Institute expanded on this and other ideas into something that can actually be used, something that can trigger actions. These are the Bone Gizmos, shown here on a CloudRig rig. I'm sorry I had to resort to animated GIFs rather than videos, so what I can show here is limited, but: with a mouse click you can select bones, and by dragging you can translate or rotate them; the behavior can be configured in the interface. It's very nice. The very clever idea in his implementation is that he resorts to Blender's native operators to handle the transformation, so you get all the existing machinery — autokey and all the Blender utilities — with a new widget. The only con, in the words of the author himself, is how much time it takes to set up, because for every widget you have to choose which vertex group is going to be shown and which behavior you want to associate with it.
So yes, it is time-consuming, but we tend to forget that setting up those wireframe curves is time-consuming too. For those we rely on a set of tools that has been built over decades and has made the job easier and faster; perhaps we could do the same for these new widgets.

Here comes another add-on of mine. I called it Expy Kit — same reason as before for the name. I use it to convert characters from one package to another; here I am converting an Unreal rig to a Rigify rig. This is not the only tool that does this — there's plenty — but the idea behind it is that once you convert one rig to another, you can convert everything that comes with it, like all the animations. Well, kind of: that's what would ideally happen; sometimes some fixes must be done by hand. Once a character has been described to this add-on — it supports Rigify and other known rigs, and it allows you to define custom rigs — it can convert stuff. So if I have this tool, and I throw Mets' widgets into the mix, I can have a fast setup for my widgets: a Rigify rig becomes a Bone Gizmos rig. Now I could have all the characters I wanted using this new system.

And then the problem with this new system became evident to me. The reason it never took off, maybe, is that this setup suffers from a real-estate problem: some areas of the character have more widgets than available surface. For instance, how would I handle the hips, where you have two controls, one for the torso and one for the reverse hips? It becomes very hard to let the user select either the torso or the hips. Perhaps I could split the region in two, but then each part would be harder to select. And I really, really wanted to use this new system. Then I thought that perhaps it was wrong to think I had to find a way to select the hips. It's not that I say, "I want to select the hips widget." What I want to do is move the torso, or move the hips.
The artistic choice comes first; it's a matter of design rather than just mechanics. So what I did was resort to hotkeys. In these examples — okay, thank God the videos are playing — I only select one region for the hips, the first control of the spine. When I do that, I can rotate the entire torso; when I want to switch and move only the lower part of the body, I press Tab. That's how I worked around the limited space I had to accommodate all the widgets.

The same happens for the feet. When you have a reverse-foot setup, you can move the whole foot, or you can select the heel to raise it, or you can spin on the forefoot. Again, we can toggle — we can loop through these different behaviors using function keys. That's possible because you can implement any logic in a custom gizmo. When the old system was designed, there was no way to do this: you could not make a control act in different ways. But now you can make context-aware controls, because you can script whatever you want — or almost — inside those event functions.

And not just that: you can use pie menus. Again, my aim was animating from the viewport. I don't want to use menus; I don't want to use panels. Everything I need — or at least everything I need ordinarily — must be readily available. In this example, I have added an IK/FK pie menu that can be triggered with the Alt key when clicking on the control. And all of that is not hard-coded; it can be configured. But these options in the bone panel are already too crowded, and I would like this behavior to be as flexible as possible. So I think that rather than having these option fields inside the property panels, these gizmos should have their own node tree: a graph in which we can create our own widgets using visual scripting.
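The hotkey trick described above — one widget, several behaviors, cycled with a key — can be sketched as a tiny state machine. This is an illustrative model, not the add-on's actual code; the mode names are invented.

```python
# Hypothetical sketch of a context-aware foot widget: a single screen
# region whose behavior loops through several controls on a hotkey,
# as described in the talk (whole foot, heel raise, forefoot spin).

FOOT_MODES = ["foot", "heel_raise", "forefoot_spin"]

class FootWidget:
    def __init__(self):
        self.mode_index = 0

    @property
    def mode(self):
        return FOOT_MODES[self.mode_index]

    def handle_key(self, key):
        # A hotkey loops through the available behaviors, so one
        # widget can stand in for several controls.
        if key == "TAB":
            self.mode_index = (self.mode_index + 1) % len(FOOT_MODES)

    def drag(self, amount):
        # The same mouse drag is dispatched to a different underlying
        # control depending on the current mode.
        return f"{self.mode} += {amount}"

w = FootWidget()
print(w.drag(5))        # → foot += 5
w.handle_key("TAB")
print(w.drag(5))        # → heel_raise += 5
w.handle_key("TAB")
w.handle_key("TAB")     # wraps around
print(w.drag(5))        # → foot += 5
```

Because the gizmo owns its event logic, this kind of per-control state is exactly what the old curve-widget system could not express.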
So, for instance, "display a widget over this region of the mesh" could become this node setup, and "when the mouse hovers over a widget, display a highlight effect" could be this one: an event node, a color node. This is a quick mockup done with Geometry Nodes, just to convey the idea. There are opinions about nodes — everybody loves nodes, I do — and sometimes they can become hard to read. But for this kind of stuff, I think they are the right balance between flexibility and clarity. I don't think this is going to become too messy — but that's what everybody thinks, so I can't be sure. I would like to implement that in the future. Maybe I will, or maybe someone else will.

Because the next thing about what I'm describing is that all this stuff grew across the work of different people. We cooperated; we worked together without being together — most of the time without even writing to each other. And by the way, I love our online meetings; I like exchanging opinions and ideas. But when it came to developing this stuff, we were able to minimize the required communication. This is something I love about open source: there is a significant human factor. We are living it right now when it comes to growth — personal growth, learning new things, sharing what we do. And when it comes to carrying out our tasks, we can just look at each other's code, build on each other's solutions, and be happy about it.

So this is not ending here; it's going on. Someone will do this. If you want to play with this code, you will find an experimental branch of Expy Kit that has this implementation. There are some hacks here and there — it's very experimental — but I'll be happy for you to try it. I want to thank everyone for coming. I don't know how much time we have — oh, we have 20 minutes, so we can have a Q&A.
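To make the node-tree idea a bit more concrete, here is a hypothetical sketch of what such a widget definition could look like as data — a tiny event-to-action graph evaluated in plain Python. None of these node types exist in Blender; everything here is invented for illustration.

```python
# Hypothetical data model for a "widget node tree": events are routed
# through links to action nodes. Node names are made up; nothing here
# is part of the Blender API.

widget_graph = {
    "nodes": {
        "region":    {"type": "MeshRegion", "vertex_group": "DEF-spine"},
        "on_hover":  {"type": "Event", "event": "HOVER"},
        "highlight": {"type": "SetColor", "color": (1.0, 0.8, 0.2)},
    },
    "links": [
        ("on_hover", "highlight"),   # the hover event drives the highlight
    ],
}

def evaluate(graph, event):
    """Return the action nodes triggered by an event, following links."""
    actions = []
    for src, dst in graph["links"]:
        node = graph["nodes"][src]
        if node["type"] == "Event" and node["event"] == event:
            actions.append(graph["nodes"][dst])
    return actions

triggered = evaluate(widget_graph, "HOVER")
print([a["type"] for a in triggered])   # → ['SetColor']
```

A declarative definition like this is what would let riggers configure widget behavior visually instead of filling in crowded property panels.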
Or, if you failed to grab lunch, perhaps there's still something left — I don't know. Thanks.

Oh, yeah. Okay — you were first. Yeah. So the aim is — maybe I should repeat the question. Oh yeah, sure. If I understood correctly, it was about using Tab for multiple selection. At the moment, most of what happens is hard-coded, but I left many options for configuration. What I did is use associated bones: among the options, you have which widget is associated with a control, and which controls are associated with that widget. It's kind of weird, and the reason is that all these gizmos, as they are implemented now, must be properties of the bones themselves: one widget must belong to one bone. That's how they are designed. When I started to link widgets to multiple bones, what I did was add an additional property where you can have "friend" bones. One bone owns the widget — in that case, the torso is the owner — and then you can associate other bones as friends of the torso, and the Tab key lets you switch between them. Same with the foot: the IK foot is the owner of that widget, and the heel and the forefoot-spin control are friends of the foot; that's why Tab can loop among them. That's how it works now. It's a tentative solution, because I would like to have that wonderful node graph, I hope, which would allow a more egalitarian assignment of the widgets. Does that answer your question?

Well, at the moment, no, but that can be done. The overall idea is that the last resort is writing some Python code to implement advanced behavior. In theory, you could have a modifier key that sends emails when you select the head. The idea is making it as flexible as possible.

Okay — oh yes, please. Yeah. So the question is: is it mandatory that it is an overlay on the character model, or can they be independent widgets, head-up displays showing on the screen?
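The "friend bones" workaround just described can be modeled in a few lines. This is an illustrative sketch, not the add-on's implementation; bone names are hypothetical.

```python
# Hypothetical model of the "friend bones" association: each gizmo is
# owned by exactly one bone, but extra "friend" bones can be attached
# so that the Tab key cycles which bone the widget acts on.

friends = {
    "torso": ["hips"],                     # the torso owns its widget
    "foot_ik": ["heel", "forefoot_spin"],  # the IK foot owns its widget
}

def cycle_target(owner, current):
    """Return the next bone Tab should activate for a widget."""
    ring = [owner] + friends[owner]
    return ring[(ring.index(current) + 1) % len(ring)]

print(cycle_target("torso", "torso"))      # → hips
print(cycle_target("torso", "hips"))       # → torso (wraps around)
print(cycle_target("foot_ik", "heel"))     # → forefoot_spin
```

The one-owner constraint in the mapping mirrors the current gizmo design, which is exactly why a node graph would allow a more even-handed assignment.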
So the answer is yes, you can have independent widgets. It is something I had done, but I removed them because I couldn't focus on both tasks. In fact, head-up-display widgets are the intended use of gizmos; the weird use is making them an overlay system for your character. So yes, that was intended from the beginning — it's the more natural use of gizmos, as they were conceived in the first place.

Oh, hi. Sorry — okay. So the question is how computationally expensive this is, and whether it requires a powerful computer. Okay. I look for the vertex groups of the deforming bones: I look into the rig, find the control, then find the deformation bone controlled by that shape, and I pick its vertex group as the one used to draw the gizmo. But the vertex group can be changed. That was a problem on occasion, because sometimes you don't have clean weighting — you have some vertices from the wrist, or even the forearm, with a weak weight assignment to the hand, for instance. My first solution was to add a threshold, so only weights from 0.3 upwards are considered. And you can still set a different vertex group for that gizmo; it's something that can be configured.

Hello. Not yet — that's not possible. That's one of the intrinsic limitations of the system: gizmos are only aware of what happens once the gizmo has been clicked, so it is not possible to have rectangle selection inside the logic of the gizmo. But gizmos don't take that away: rectangle selection will work as usual, and once you have selected your controls the way you always did, you will see the highlight on the selected gizmos.

Okay, we don't have to do this right now — you can just find me around. There is still time left. Thank you a lot.
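The weight-threshold fix mentioned in the answer above is easy to sketch: keep only the vertices whose weight in the bone's vertex group clears a cutoff (0.3 in the talk). The group name and weights below are illustrative, not taken from a real rig.

```python
# Sketch of the weight-threshold filter described above: build the
# gizmo's display region from only those vertices whose weight for the
# hand's group is at least 0.3, discarding stray wrist/forearm weights.

THRESHOLD = 0.3

# vertex index -> weight in a hand vertex group (illustrative data)
hand_weights = {
    0: 1.0,    # palm
    1: 0.95,   # fingers
    2: 0.25,   # stray wrist vertex, below the threshold
    3: 0.05,   # stray forearm vertex
}

def gizmo_vertices(weights, threshold=THRESHOLD):
    """Vertices that should be part of the gizmo's display region."""
    return sorted(i for i, w in weights.items() if w >= threshold)

print(gizmo_vertices(hand_weights))   # → [0, 1]
```

Filtering like this keeps the clickable region tight around the limb even when the skin weights bleed into neighboring areas.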