So, this presentation: I'm Dalai Felinto, and here is Benoit Bolsee. Hello. And we're going to be talking about drones controlled by a Blender animated file, pretty much.

So the project starts as something that Puy du Fou wanted to implement. Puy du Fou is a medieval theme park. Think about a Disney World, but with castles and... Vikings. Vikings and fire and trebuchets. And they also have what they call the Cinéscénie, this open-air show around the lake of a castle, where they're always trying to push the limits of what you can do with entertainment. So they have synchronized lights, a synchronized water show, video projection. They have 2,000 actors acting at the same time. They have a dynamic set, and they were thinking: what can we do next? How do you run a place which is at the same time very low-tech — animals, per se — and at the same time state-of-the-art technology?

So this year they decided to come up with the Neopter project, which was a joint venture between Puy du Fou and ACT. Yeah, ACT is a Belgian company that provides the lighting for Puy du Fou, and they were asked to provide a solution that would allow putting a light show in the sky. And the solution was to use drones carrying lights. But many lights, so it's not possible to consider the usual situation where you have one pilot for one drone: when you have 50 drones, you cannot have 50 pilots, and all of them must be completely synchronized. So the solution was to have drones doing autonomous flights, to guarantee safety and synchronization. And of course you need to prepare the path of all these drones; you need to prepare the mission. And the hard part here is not only to make drones and make them fly in synchronization — they need to be entirely autonomous. And it's really hard to get legal authorization to have 50 drones flying with an audience watching them. That's why I have those drones here.

And this is something that no one is doing in the world, apparently. We have a few people playing with drones. There is the Dronecode open-source initiative by the Linux Foundation, but people are still having to control those manually. Amazon can probably navigate drones intelligently, but they're not allowed to do that over civilians. So the main challenge here is what we're going to be talking about: how do we go from a single hand-controlled drone to a show where you get all these flying, animated objects? What you see here — you don't see the drones, but you see the payloads, which are lights, objects with animated lights. And how do we do this from within Blender?

Yeah, to do that, first you need drones. And these ones are big ones, big beasts. They can carry multiple kilos. They have 15 minutes of autonomy. They've been built on purpose for this project, because no commercial drone would fit the requirements. We did not build the drones, but we made the tool to control them. So the requirement for our part was to allow visual editing of the path of each drone, then automatic safety verification of the drones: for instance, making sure that they never get closer to each other than six meters, never fly above each other, never get too close to the audience, and so on. All of these things need to be checked before exporting the file.
A specific requirement also was the possibility to play the show in Blender — in the tool — synchronously with the real show, so that you can compare whether the drones are doing what they're supposed to do. And if not, push the button to kill them. Yeah, and of course you're not flying them live with Blender, but Blender — or the tool that was built for this mission — had the responsibility of telling you: this is a safe path, you can animate these drones, no one should die. Yeah. And somebody had that one button where everything just falls from the sky.

So I was first contacted, and obviously to me it was clear that Blender would be the tool for that. And then I joined with Dalai, who is the add-on expert, and together we made a team, and we delivered the tool within one year.

Why Blender for these drones? I think we don't have to explain that. The critical point is that we have access to the source code to make further customizations if needed — and we needed it. We needed it a lot, as we're going to show. And more than that, Benoit was for years the main developer of the Blender Game Engine. He's the person everyone goes to when there's a big low-level question. And more recently I've been more involved in Blender coding as well. So for us it was really easy to use Blender, in the sense that we knew that whatever the limit was — and we saw this in a few of the other add-on presentations — you could go and extend it further, either by talking with the developers or by getting our hands dirty.

So we made this tool. Here is a view of the tool. You can explain that. Yeah. Basically we tried to concentrate all of our tool in these panels here on the left, right? And we even have our own panel with a few of the options there. And the important aspect is that we tried to hide Blender from the user, but at the same time we're actually exposing a lot of the Blender built-in features as well. So it's a bit different from the microfluidics project we saw: in this case we wanted the user to use the timeline, we wanted the user to use the dope sheet, but we also wanted our own controlled environment, with our own handling of deleting, our own handling of moving. And what else do you see here? You can see this environment, which is also part of the tool. So our tool has a few templates — the mesh of the place where your show is going to be is part of the tool, and everything is loaded from external library files, so if you have a different... I'll show the next one. Yeah. Another payload, yeah. But the next one, yeah. So whenever we had a new payload you wanted to make fly, you simply create a new file to be used. So you even have the separation between the operator — the person creating the animation — and the technical person preparing the tool. Yeah.

So, the workflow. We are not going to present the tool in great detail, because we don't have the time and we want to concentrate on the Blender side, but we'll still do a quick tour of the tool, of course. The first step is to define the payloads. It's a simple mesh, but we have defined what we call a payload protocol: by giving specific names to the meshes — to the different parts of the payload — the tool recognizes them as LEDs, as being lightable, so it can offer the tool to put a color on them (a sketch of this naming idea follows below). Once you have your payload, you instantiate the drones in the scene. We have, of course, buttons and menus for that. You can see here the drones.
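To make the payload protocol idea concrete, here is a minimal sketch of how an add-on could recognize the lightable parts of a payload by name. The `LED_` prefix and the `Payload` object name are hypothetical stand-ins; the talk doesn't give the actual naming convention.

```python
import bpy

# Minimal sketch of the "payload protocol": parts of the payload mesh are
# recognized as LEDs purely by how they are named. The "LED_" prefix and
# the "Payload" object name are hypothetical, not the real convention.
def find_lightable_parts(obj, leds=None):
    """Recursively collect child meshes that the tool treats as LEDs."""
    if leds is None:
        leds = []
    for child in obj.children:
        if child.type == 'MESH' and child.name.startswith("LED_"):
            leds.append(child)
        find_lightable_parts(child, leds)
    return leds

payload = bpy.data.objects["Payload"]  # hypothetical payload root object
for led in find_lightable_parts(payload):
    print("lightable:", led.name)
```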
And then you go to the editing part, and there we use a visualization of the path. We didn't start from scratch there: we used a powerful tool called Motion Trail, from Bart Crouch. Thank you, Bart Crouch. We contacted him; he said he was done with it but had plans for a new tool, so we just took it where he left it, fixed a few bugs — quite a few, actually — and customized it for our own needs. So what you see here in red is an example of the drone exceeding the velocity limit in our animation. It is a visual feedback tool that really fits this task.

Yeah, what you see in this image: you see the path of the drone. You see the collision cage, which surrounds the drone and the payload, in yellow. And the red dots here are parts of the path where there is a problem — the problem being, in this case, inter-drone collisions. So the drones collide on this path. And then the designer is able to click on the frames — the keyframes of the path — even create new keyframes, and drag the path a little bit. We can show how it works here. The most important thing to keep in mind during the presentation is that this is a tool within a tool. It's an example of a compromise — again, a bit different from what the microfluidics people did. We're not hiding Blender from the user; we're trying to find a compromise between a customized experience and an extended experience, but within Blender.

Once you have your path set, you must control the color, the light, because that's really the target of the whole thing. We provided a small tool to set the color of the lights, so that the user doesn't have to go through F-Curves to set the color animation. There is room for improvement there, but at least it works.

And then, when you're all okay and you have checked your animation, you export. That's a big XML file that contains the paths of all the drones, and each drone will take the path that is relevant to it. Two things to note here. We only export the keyframes — what we call the waypoints — and that assumes the drones have built in the logic to recreate the path from the keyframes. It's again one of the big advantages of Blender: because we knew exactly how Blender was doing the interpolation, we could have the same logic in the drone. Kind of — we're going to show that we actually had to change both of them together. And for the lights, we export a condensed textual format that will be interpreted by a low-level firmware.
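As a rough illustration of the waypoint export, here is a hedged sketch that writes only the location keyframes of each drone to XML, assuming the drone firmware rebuilds the in-between path from them. The element and attribute names and the `Drone` prefix are illustrative; the project's real schema isn't shown in the talk.

```python
import bpy
import xml.etree.ElementTree as ET

# Hedged sketch of the waypoint export: only the keyframes (waypoints) of
# each drone's location F-curves go into the XML, on the assumption that
# the drones themselves rebuild the full path from those keyframes.
# Element and attribute names here are illustrative, not the real schema.
def export_waypoints(filepath, drones):
    root = ET.Element("show")
    for obj in drones:
        if not obj.animation_data or not obj.animation_data.action:
            continue
        drone_el = ET.SubElement(root, "drone", name=obj.name)
        action = obj.animation_data.action
        # One F-curve per location channel: x, y, z.
        fcurves = [action.fcurves.find("location", index=i) for i in range(3)]
        frames = sorted({kp.co.x for fc in fcurves for kp in fc.keyframe_points})
        for frame in frames:
            ET.SubElement(drone_el, "waypoint",
                          frame=str(int(frame)),
                          x="%.3f" % fcurves[0].evaluate(frame),
                          y="%.3f" % fcurves[1].evaluate(frame),
                          z="%.3f" % fcurves[2].evaluate(frame))
    ET.ElementTree(root).write(filepath)

export_waypoints("/tmp/show.xml",
                 [o for o in bpy.data.objects if o.name.startswith("Drone")])
```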
This slide here shows that the tool is compatible with high-level animation techniques. What you see here is a chain starting from the top level with an armature that has a built-in symmetry. The armature controls the NURBS, the NURBS controls the spline, and the spline controls each individual drone. So you don't need to manipulate each drone individually: you can manipulate them as a whole and create a very nice animation. The tool is made to be compatible with this kind of technique. And here, just before I go to the more dynamic part, we see the castle, the venue, within Blender. What you see in yellow is actually the individual drones with their payloads, in a safe region. And since it's Blender, you can look through different cameras at the same time: you can see how the director looks at it, how the audience would see it, how different seats would experience it. And also, this is a picture I took from one of the demo flights. So this is how the whole thing fits together.

Now, again, I think you're more interested not in the specificities of the project, but in the workflow — how can anyone do a project like this within Blender, and what kind of problems did we face? So, one of the main problems: we mentioned that we are very happy and grateful to use the Motion Trail add-on, but its color scheme was quite different, and there were a few limitations. But just like Blender, most of the add-ons that you can use in Blender are also GPL, so we could just take it and change it. In our case, we actually changed the color scheme to follow our limits, so it just turns red whenever a drone gets close to its physical limits. And you can also analyze an individual frame: if you click on a frame within the viewport, it gives you the speed and acceleration at that moment. Yeah, which implies solving cubic equations — a quite complicated formula, but it's there.

This one is something interesting. Since we're exposing Blender to the user, the user could be doing anything. And originally Motion Trail had no way to update the motion trail — which is basically the motion of your object — if you just changed something on the dope sheet or on the F-Curve. So what we did: we added a bunch of callbacks in the Blender source code to tell the Python script that something had to be updated. This is the kind of extension to Blender you end up doing for this kind of project.

Okay, this one is for me. We heavily use the auto handle algorithm in Blender when we do the animation. Auto handle means that the user only enters the position and the time of the keyframe; the handles themselves are automatically calculated by Blender to make some kind of curve between the keyframes. The auto handle of Blender is okay — kind of okay — but it's not good enough for drones. For instance, in this case you see this handle here crosses the curve. This means that just before the keyframe the drone will slow down, and just after the keyframe it will accelerate again — which for a drone means tilting, and tilting is not good. So that wasn't good enough for us. What we did was put an additional constraint on the handles: the acceleration before and after the keyframe must be identical. And thanks to the Bézier formula, which is quite simple, it turns out to be a set of linear equations that can be solved recursively (sketched below). So we set up a formula which gives this. You can see now that the curve has a much, much smoother form.

And by the way, there is a nice plus with that: if you add a new keyframe, for instance here, the curve will not change — which is an interesting feature, and not the case with the old algorithm. Of course, there is a price to pay, and the price is, I think, quite small, but it needs to be mentioned. With the old algorithm, if you change one keyframe, the handles of the neighboring keyframes are modified, but not the ones after — you see the curve here remains exactly identical — while with the new algorithm, a change to a keyframe propagates further. But the difference is small, and for us it was not an issue. And this allowed us to have the XML representation of the path mimicking the Blender keyframes one-to-one, because it was the same controlled algorithm all over the system. You can see here the difference in 3D: the new algorithm gives the shape a kind of global smoothness. I don't know if you can... okay. Next one.
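The "equal acceleration" constraint can be written out directly from the Bézier formula. A minimal sketch of the math, assuming unit-time segments (real keyframes at unequal times add duration weights; the talk doesn't show the derivation itself):

```latex
% Cubic Bezier segment $i$ between keyframes $K_i$ and $K_{i+1}$, with
% right handle $H_i^{+}$ of $K_i$ and left handle $H_{i+1}^{-}$ of $K_{i+1}$:
\[
  B_i(t) = (1-t)^3 K_i + 3(1-t)^2 t\,H_i^{+} + 3(1-t)t^2\,H_{i+1}^{-} + t^3 K_{i+1}
\]
% Second derivatives (accelerations) at the two ends:
\[
  B_i''(0) = 6\,(K_i - 2H_i^{+} + H_{i+1}^{-}), \qquad
  B_i''(1) = 6\,(H_i^{+} - 2H_{i+1}^{-} + K_{i+1})
\]
% Equal acceleration before and after keyframe $K_{i+1}$ means
% $B_i''(1) = B_{i+1}''(0)$:
\[
  H_i^{+} - 2H_{i+1}^{-} + K_{i+1} \;=\; K_{i+1} - 2H_{i+1}^{+} + H_{i+2}^{-}
\]
% Each such equation links the handles of consecutive keyframes, so the
% auto handles together satisfy a tridiagonal linear system -- the same
% system that natural cubic spline interpolation solves.
```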
I mentioned that it's very unsafe to have drones flying around, and you always need this one red button: if you press it, everything just falls. But sometimes there is one drone which is not necessarily putting anyone at risk, but maybe its battery is not so good and is going down, maybe its GPS is failing. So sometimes you want to shut down one single drone. But the drones are expensive — I mean, we mentioned we're building them ourselves, so it's not like you can just go to IKEA and buy a new one. So what we actually do: we provide that, at a few moments in your path, in the animation, you have what we call escape paths. So if a drone is failing — the battery is very low — close to this point, we just say: the next time you reach the next escape point, take the escape route. So if you see here: this is the main path, and this one is the escape path. It's just an example showing the principle of a drone taking an alternate route. So we have one drone — it's only one drone — but two alternate routes. This drone is the escape drone; it's a ghost, a copy of this one for when it takes the alternate path.

The problem is, since this drone is kind of connected to the original drone, we had a few constraints that we needed and that Blender wouldn't allow us to do. First thing: you want this new path to have the same initial velocity as at the original point of the path. That's one requirement. The other one is: we cannot let the user change, on the dope sheet or anywhere else, the original keyframe, because otherwise you'd need to update everything — too much hassle to implement. So what we did was create a new keyframe type in the Blender source code: a locked keyframe. This keyframe cannot be moved, cannot be scaled, cannot be changed, cannot be updated. The moment you set it as a locked keyframe, you can only change it via Python. So it can change, but only via Python. Or you can delete it — but if you do, you know it's deleted: we have a callback, and we just delete the escape-route drone.

Okay, let's go a bit quicker. Collisions. You can have as many as 50 drones in the sky, and at no moment in time may they collide with each other. Impossible to do in Python alone, obviously. So we used Bullet, which thankfully had been integrated into Blender by Sergey. Where's Sergey? Where's Sergey? The other Sergey — our Sergey. Thanks very much, Sergey; a round of applause for Sergey. He did a wonderful job getting Bullet integrated within Blender, and other developers helped as well in the past. So Blender could actually use Bullet simulation just as we have in the Blender Game Engine. But this was not accessible via Python, and we needed, in our add-on, to be able to access these collision shapes and the collision simulation.

So, three things we did, quickly. First, we added a new type of body called a sensor: a body that can collide but does not produce any force. It's a ghost, which is perfect for detecting collisions. Second, we gave access to the collision groups and collision masks. This is typical for a physics engine, and it basically means you can control, in a very refined way, which object collides with which object. And then, on the Python side, we gave access to the collision pairs: after a simulation step, after a frame step, from the collision-pairs property of the rigid body world you have access to which objects collided in the previous frame. And also a function to cast a volume in the scene — this we used to verify that a drone never flies above another drone: we cast the volume of the drone downwards, and if it hits nothing, it's okay. So we were basically just extending Blender — not even adding much C functionality, but just making it accessible, to make Python faster. People sometimes say Python is slow. It's not that slow: it's slow if you have to do all the collisions purely in Python, but in Blender you can count on the C side to provide a more efficient method.
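For illustration only, here is a naive, pure-Python version of two of the safety rules mentioned earlier (minimum 6 m separation, no drone directly above another). The production tool did these checks through Bullet via their patched rigid-body API precisely because a pairwise Python loop over every frame is the slow path; the `Drone` name prefix and the 1 m column radius are made-up values.

```python
import bpy
from mathutils import Vector

# Naive sketch of the safety rules described in the talk: >= 6 m
# separation, and no drone directly above another. The real tool used
# Bullet (sensor bodies, collision groups/masks, volume casts) because
# checking every pair on every frame in Python alone is too slow.
MIN_DISTANCE = 6.0     # metres, from the talk
COLUMN_RADIUS = 1.0    # hypothetical horizontal clearance for "above"

def check_frame(drones):
    violations = []
    for i, a in enumerate(drones):
        for b in drones[i + 1:]:
            pa = a.matrix_world.translation
            pb = b.matrix_world.translation
            if (pa - pb).length < MIN_DISTANCE:
                violations.append(("too close", a.name, b.name))
            # "Above": horizontally overlapping columns at different heights.
            horiz = Vector((pa.x - pb.x, pa.y - pb.y, 0.0))
            if horiz.length < COLUMN_RADIUS and pa.z != pb.z:
                violations.append(("stacked", a.name, b.name))
    return violations

scene = bpy.context.scene
drones = [o for o in scene.objects if o.name.startswith("Drone")]
for frame in range(scene.frame_start, scene.frame_end + 1):
    scene.frame_set(frame)
    for v in check_frame(drones):
        print(frame, v)
```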
Okay, thanks for that. We can go through the export pretty quickly — this one is simpler. I mentioned the LTC clock source. It's a wire clock, and there is a board that detects it and turns it into a digital value. But there was no Python API for it, so we made a Python module to get the clock, and we simply set Blender's time from that clock to synchronize Blender with it — as simple as that. So you could have the real show happening — everything synchronized: the sound, the music and the drones in the real show — and Blender could get the same input, so you could see in the simulation, within Blender, what was happening in the real place.

Another add-on — sorry, feature — which has been proposed as a patch, I think, is SMPTE support. When you're doing editing or special effects — editing, lighting, audio mixing — it's very common to work not with frames but with what we call SMPTE timecode. It's a format which gives hours, minutes, seconds and frames. Blender does support SMPTE on the timeline — if you press Ctrl+T, I think — you get to see all the time of your movie in SMPTE. But Blender doesn't let you input SMPTE anywhere, and it doesn't show the value of a button as SMPTE either. For us it was a requirement, because the show itself is two hours long and the drones only play a part in a few chunks of time, so you need to know exactly the time mark to start and to end. So what we did: we patched Blender to allow any time-related input to be shown and entered as SMPTE. Okay, there it is. And it is submitted as a patch as well; it was under discussion along with this other topic.
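Since the patch is about letting users type timecode, the underlying mapping is worth spelling out: SMPTE `HH:MM:SS:FF` is plain integer arithmetic on the scene frame rate (ignoring drop-frame timecode). A minimal sketch:

```python
# Minimal sketch of the SMPTE mapping discussed above: Blender can
# already *display* time as HH:MM:SS:FF, and the patch let time inputs
# be entered that way too. Drop-frame timecode is ignored here.
def frame_to_smpte(frame, fps):
    seconds, ff = divmod(frame, fps)
    minutes, ss = divmod(seconds, 60)
    hh, mm = divmod(minutes, 60)
    return "%02d:%02d:%02d:%02d" % (hh, mm, ss, ff)

def smpte_to_frame(code, fps):
    hh, mm, ss, ff = (int(part) for part in code.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

print(frame_to_smpte(217230, 30))        # '02:00:41:00'
print(smpte_to_frame("02:00:41:00", 30)) # 217230
```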
Light programming. Just something very specific — an example of how a lot of things in this kind of project can benefit other projects. In our case, the drone has, I think, 10 to 20 lights on it. We needed a very specific way for the user to have what we call selection groups, so you could very easily go back to a specific drone and very quickly navigate within the selected lights. So here we can select one drone, or three drones, or all the drones, and store that as a selection group, and you can play back all the selection groups. You can record any of them, you can rename them, and you can also play the elements one by one — so you go through all the lights one by one and make changes. This is a generic feature — it's not specific to us; it can work with anything in Blender. It simply records selections into buttons and stores them in the file, and when you reload the file, the selections are still there. And as I said, since it is a standalone add-on, we're even releasing it on GitHub for anyone to see. Same goes for the next feature.

Just an example of things that can be done that helped us, and maybe help somebody else learn from it, I don't know. This is a choreography tool. So even after we built all the low-level tools and all the requirements, they were having a good time recording and designing choreographies, but they were having a terrible time making specific formations: circles, squares, lines. So we built, again, a standalone add-on that just helps you select a few objects and quickly create a circle or a square. And what you do is: you create this circle, then you go back a few frames, and you see how they go from the landing position to the take-off position, all the way to the circle. They can rotate, then go to another shape, and so on. That's it.
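As a flavor of what such a choreography helper might do, here is a hedged sketch that keyframes the selected objects onto an evenly spaced circle at the current frame. The radius, the horizontal plane, and the use of the 3D cursor as the center are assumptions; the real add-on's options aren't detailed in the talk.

```python
import bpy
import math

# Hedged sketch of a "circle formation" helper: distribute the selected
# objects evenly on a circle around the 3D cursor and keyframe the result
# at the current frame. Radius and the horizontal (XY) plane are
# assumptions; the real add-on's options are not shown in the talk.
def keyframe_circle(objects, center, radius=10.0):
    frame = bpy.context.scene.frame_current
    for i, obj in enumerate(objects):
        angle = 2.0 * math.pi * i / len(objects)
        obj.location = (center.x + radius * math.cos(angle),
                        center.y + radius * math.sin(angle),
                        center.z)
        obj.keyframe_insert(data_path="location", frame=frame)

keyframe_circle(bpy.context.selected_objects,
                bpy.context.scene.cursor.location)
```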
So all of these things we just mentioned are publicly available on our GitHub, and there's a branch we made with the Blender changes. There is a demo that I think is really... Sorry? Yeah, now let's see how it works — how it turns out to be. Please, can you dim the lights?

[Video] This is Puy du Fou itself. The Cinéscénie is the biggest show in the world by the depth and width of the stage, but also by the number of actors. But the Cinéscénie has been missing a dimension: the sky. Nothing happens above a certain height — the sky stays dark. The idea was to give relief to the Cinéscénie by suspending lights in the air. Koert then told me about the idea of the drones, and that's where we made our way together to give birth to Neo. Nicolas was, I think, mesmerized — or at least enchanted — by the idea, because the next day Nicolas came to me and asked me if it was really possible to get this technology here. A couple of weeks later I saw him again and I said: yes, I think I found a team — and we are where we are today. The machine didn't exist. There are commercial drones available, but they're not waterproof, they're not able to carry the type of payload that we want to use, and they're not made to work in an event setting, so they don't fly synchronously with sound, with video, with light. So there was a very large set of problems that still needed to be solved. The audience doesn't see any drones — it just experiences the show. In the coming years, technological developments will offer new artistic possibilities. [End of video]

He says it's only the start, and they want to put the drones everywhere in the world. We just have a few more pictures before you leave — and you can later get the presentation online, with the link to more information about the project and the YouTube video. I don't know which one it is... F5 maybe? This one. Yeah, that's fine. Just some final images, and if you have any questions, you can start thinking — though I probably won't have time, so no. Just some final images, some daylight demos. And the last thing: we have this very high-end technology, but we are in a castle, in this medieval park, and sometimes things will just crash. But overall, it was very good. What's your name? This is Ruben, who was in the team doing the controls — reading our XML file, making sure that the drones implement it. Oh, and this is the control room where you get to see everything happening. That's me right there in the end. This is Luke, the main operator for Blender. And they're using Blender there — it's pretty cool.

Thanks everyone — thanks Puy du Fou and ACT for the opportunity to work on this project, and thanks everyone for coming. Hope you liked it. Sorry. Go on, guys. Yeah? Why is it still playing? It was a summer — we did one last summer, controlling the drones for the first year, for the main flights. There were some troubles, but in December they're going to start again; they're going to make the second flight. We were one of the first. And the handling, the managing, the implementation — everything was going on there. And that was Blender — all of it, Blender.