Hi everyone. I guess we're going to start with the presentation. We're going to talk about the Breachers level editor. First of all, thank you for joining. We're going to go over how we have used Blender to create levels for our VR game Breachers. It is something that is not done traditionally: usually, 3D assets are created in tools like Blender, they get exported to Unity, and then the technical setup is done in Unity. That is something we didn't like when we started working on Breachers and started making levels for it. In this talk, we're going to go over how we ran into those pain points while making those first levels, and how we then worked towards a new way of working all within Blender — and specifically how we have done that. A little bit about myself and the company and team I work with. My name is Yal Soudonis. I help create levels for our games at Triangle Factory. Triangle Factory is a game development studio in Belgium, in the beautiful city of Ghent. We have been focusing on developing games for a few years now, with two major VR titles published. One is Hyper Dash, which is a fast-paced, Quake-like shooter — so quick response and fast combat. More recently, we released Breachers, which is a more tactical five-versus-five shooter. I was part of it from the beginning, worked on those first levels, and saw that the way we were working was not really efficient, so we probably needed to do something about it. A quick introduction to Breachers. It's an economy-based shooter where two teams fight against each other: the Revolters and the Enforcers. Because it's economy-based, if one team wins a round, they have money they can spend on better guns, so they have an advantage for the next round.
In a way, you could say that if Rainbow Six Siege and Counter-Strike: Global Offensive (or Counter-Strike 2, the new one) made a baby, and it would be in VR, then you would basically have Breachers. It's easier to show you what Breachers is all about by just showing you the trailer, so I'm going to do that. So that was the trailer. From the trailer, you can see that we have a few gameplay mechanics going on. First — can we actually switch to the other mic, if that's possible? Hello. Perfect. — we have the breakable windows that you can see on the left. As you've seen in the trailer, there is the inside and the outside of our game; basically 95% of the gameplay takes place indoors, and those breakable windows separate the two. During the buy phase of the game, they actually separate the two teams, and once the round starts, players can shoot those breakable windows and go inside the building. Next up are the breachable walls that you've seen in the game as well: you can buy a gadget that allows you to spray some foam on a wall and destroy it, so you have an advantage over the other team. You also have rappelling, which is basically being able to scale a building — if you want to get to a second-story window, you can just walk up there. And lastly, like most games, we have vaulting, which allows you to jump over a chair, for example, or on top of a table. So those are the main gameplay mechanics we have in our game. I want to take a quick step back towards level editors, especially for the people that are not in the gaming industry or don't play a lot of games. A level editor is basically a tool that allows level designers and environment artists to create a level for a specific game. They come in all shapes and sizes. For example, on the left, you have the Far Cry 4 level editor.
It's a level editor that ships with the game, and it's more simplistic in a way. As you can see — well, I can actually laser point — you have those platforms, and those platforms you can place in one go. If you want to take one of those planks and extrude it, that's not really possible, because you can only place certain objects in there. You can also see they have object limitations for performance reasons: the level editor limits how many objects you can place in the scene, because otherwise it becomes sluggish and slow, and your game would run at 10 frames per second, which is not good. On the other end of the spectrum, you have the Source 2 Hammer level editor. It ships with Half-Life: Alyx, Dota 2, and now, more recently, Counter-Strike 2 on Source 2. It's a tool that Valve actually uses as well to create levels, so it's not only for the community. Compared to the Far Cry 4 level editor, this one is way more advanced: you can do environment art in it, you can set up gameplay, you can do scripting — there's a whole lot of stuff that goes on in there. That brings us to the level editor called Hammer 4. This is actually the level editor I spent the most time with as a hobbyist. It looks ugly and old — it is really old — but it was the level editor that was still driving Counter-Strike: Global Offensive until literally a few months ago. The thing that makes it beautiful is that you can take this tool, spend days and months in it without ever leaving it, and build a complete new level. That is different compared to engines like Unreal Engine or Unity, where you need to take a tool like Blender, model the environment, and then export it towards the game engine. This brings us to the Breachers level editor. So this is our scene in Blender with one of our levels, called Skyscraper.
You can see that we have a lot of our environment in there, but you also have those green and blue boxes, those cubes, and then those breakable windows that you've seen before as well. It can do a lot of the functionality that most level editors can. It can't do some of the more complex stuff, because we do that on the Unity side for specific reasons. For example, these are the kinds of features a level editor can have. The Source 2 engine level editor can do all of these, and our Breachers level editor can do all the things that are marked in green. Lighting and culling are things that we do on the Unity side, because it's much easier to do them there. Audio we could technically do in Blender as well — we could set up our audio sources there — but our audio engineer prefers to do it in Unity, so there's really no need to set it up. And lastly, scripting. Scripting is, for example, like in Counter-Strike, where you could set up a collider here, and if you walk into it, that door is going to open. We don't really need that for Breachers, so we also didn't set it up in Blender — and it would be easier to do in Unity anyway. Why do we work the way we do? Just to give you some context on why and how we work in Blender, or how we used to work. This is an example of an asset kit. You can download or buy these, and certain game development studios use them. You have environment artists or 3D modelers who make those kits, and you can take a little piece of wall, for example, place pieces next to each other, put a roof on top, and now you have a very quickly built house. The issue with this is that you lose a lot of control over your level, and that's something we didn't want for our games, because we are building VR games, and VR games give us some technical challenges as well. So we actually optimize our games from the get-go.
We build for the Meta Quest 2 device first. We don't build for PC VR and then downscale the whole thing; we build a performant game on the Quest 2 and then eventually upscale it if needed. The reason, for people that haven't tried VR, is that you have two lenses attached to your face, so you basically need to render the game twice — once for one eye and once for the other. The game also needs to run at 90 frames per second, which is a lot, because if you drop below those 90 frames per second, players can actually become nauseous, and that's something you don't want. You also don't want to overload the GPU of the Quest 2, because then the battery drains faster and your players can't play the game for very long. So there's a lot of technical stuff that we have to keep in mind. The way we do it, as you can see in the bullet points, is that we optimize our polycounts quite a lot. For geometry that you never see in the game, we delete the faces, so our lightmaps stay smaller — we don't need as much lightmap space. That's basically why we build the way we do. This is, for example, our Skyscraper level in Blender — all of the red stuff. I'm actually going to start a video. In here, you can see that all of our level geometry is attached together; we have a very limited amount of geometry sticking out of the level. This prop, for example, still has a face on the bottom, but it's very minimal. In general, you can see that there is very little waste. Here we have some Z-fighting; if we wanted to work really optimized, we would have to delete it, but it's such a small thing that there's really no need. And here you can see all of our geometry is attached to something else, so we have a watertight mesh, and there's very little sticking out. Sorry for the OBS overlay — I actually recorded this a bit earlier.
That's because I was going to demo this live, but it's not a good idea. That brings us back to our level and how we work. The problem is that we build a lot of stuff in Blender. In our previous game, Hyper Dash, we would model the whole environment in Blender and export it to Unity, and then in Unity we would set up the gameplay elements that you need to make your level work properly. Then we would play it, test it, iterate on it, and so on. This is an example of one of our levels in Hyper Dash; all of this is from within Blender. In the next screenshot, if you think away the lighting, this is almost the exact same scene. The things I've marked in yellow are those gameplay elements — for example, here there's a pickup point where you can pick up a better weapon, and you have the rail for the payload level. But in general, there's very little going on: you can export it to Unity, set up your stuff there, and be done with it. When we started working on Breachers, however, we wanted the game to be more ambitious and more complex. That meant we would have many more systems, more complex systems, and the setup of our levels would become a lot more difficult. During the prototype and the first level we made, we were still working in the Hyper Dash way: doing all of the modeling in Blender, exporting, then setting up our gameplay objects in Unity. You can see those breakable windows from earlier — that was a prototype view of how they looked back in the day. What would happen is that I was modeling in Blender, poking holes through the whole building, and then in Unity I would have to place those windows there. But you're working a whole week on your environment: you add a window here, or a doorframe, a gap; you remove one there; you move one window from the left to the right.
And then, after a whole week of working in Blender, you have to go to Unity and think: okay, where did I have to place those windows again? Those vaulting things — where do I have to add those? You basically have a mismatch between Blender and Unity. That was the issue: all of the work you did in Blender over those five days, you would have to redo once again in Unity, and it made the whole experience a bit boring. And when it's boring, you start to make mistakes. It would happen quite often that we did a playtest and there was no window inside a wall — so both teams can see each other, your playtest is kind of ruined, and you disappoint 10 people. It was really not a good way of working. At the bottom here, there are also some examples, which I can go into a bit later, of those technical systems that we had to set up as well. There's also an issue in the way that Blender talks to Unity. For example — if I play this... if I play this... so here you can see, if I can pause it, we have a ventilation unit over there with a particle system attached to it. And also — if I go back a bit; this is why I'm not doing the demos live right now, because it's quite difficult — anyway, there's a script attached to it in Unity that says: if you shoot this object, it's going to give a metallic spark, and if you walk on top of it, it's going to make the metal sound as well. But the issue with how Blender and Unity communicate is this: say I'm in Blender and I decide, hey, I don't want this object to be called Cube005 or 006 or 007 — I want to work in a bit more of a clear way. So you rename your object and export it back towards Unity. You'll see in a minute: we export it back towards Unity, and now Unity is actually going to lose that particle system that was attached to it.
We're also going to lose the script — the metallic surface script. Basically, all of the setup work you have done in Unity is just lost because you've changed the name of an object. That's unfortunate, and it's also a scary thing, because in this case it's a metallic vent, but in our game you can also shoot through certain objects, like this wooden table. If it's not tagged as wood anymore, that doesn't work anymore, so your level changes. So we had to do something, right? We sat down together with the programmers to find a much better way to do it. We came to the idea that all of the stuff we previously did first in Blender and then in Unity — we wanted to add all of those Unity elements back into Blender as well. So we moved from the top to the bottom: all of those gameplay-specific objects that you need, like I said, we could do in Blender as well. The way we actually did that was through a feature in Blender called custom properties. Custom properties are basically a way to attach extra data to an object. It's a bit hidden in the Blender UI, but we'll go over that a bit later. If you export your FBX file towards Unity, you can see we have a layer — this Unity thing — and the static flags as well, but we also have the surface "wood" attached to it. And in Unity, we have a class called the asset post-processor. It's basically code in Unity with which you can check your FBX file: is there some custom data added in here? If there is, then I probably want to do something with it, like attach a script to it, or that particle system from earlier. And the beautiful thing is that even if you rename your object or something, those custom properties stay attached to it. Here's an example as well. Is it going to play? Yes. So here we have a wooden wall.
The resolution is quite crappy, but it's wood, right? In Unity, by default, it's not actually loaded in yet. We have to click a little script on top of it, and now Unity reads in all of that custom data, and now we actually have the wood script attached to it. That's basically the whole technical setup we had to do for all of our gameplay elements. Most of it is actually on Unity's side: it reads this as a wooden object and attaches a script, but a lot of setup had to be done in Unity as well to make that work for other systems. That brings us to the Breachers level editor. For example, this is part of our Skyscraper level as well. This is the static environment: when you play the game, you can shoot at it and walk on it, but you can't interact with it. In the next one, all of our gameplay elements are added in this editor as well, in Blender. You can see those sound portals on the left, those blue vault colliders that we've seen in the earlier screenshot that allow you to jump on the red couch, the breachable wall with the X on top of it, and those breakable windows. So if we're working in Blender and I want to move that window over there, I see that there's a gameplay object there as well, and I move or delete it too. The next time I export, five days later, with 50 changes, we no longer have to redo that setup in Unity. That's what makes this way of working beautiful. The next one is the screenshot in Unity. One of our goals is that what you see in Blender is what you get: if I go back a slide, there's very little difference between Blender and Unity — obviously, those colliders are actually loaded in and provide functionality within Unity. And lastly, the same thing within Unity, but with the whole scene light-baked. This is basically how you would see the game when you're playing it.
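The idea can be sketched in a few lines of Python. This is not the studio's code — the real import step is C# inside Unity's asset post-processor — and the property keys and component names here are illustrative assumptions, but it shows why a rename is harmless: the lookup keys off the custom properties, never the object name.

```python
# Hypothetical sketch of the custom-property lookup the importer performs.
# SURFACE_COMPONENTS and the component names are assumptions for illustration.

# Map the "surface" custom property to the gameplay component(s)
# that should be attached to the object on the Unity side.
SURFACE_COMPONENTS = {
    "wood": ["WoodSurface"],                        # penetrable, wood footsteps
    "metal": ["MetalSurface", "MetallicSparkEmitter"],  # sparks when shot
}

def components_for(custom_properties):
    """Return the component names to attach for one imported object."""
    surface = custom_properties.get("surface")
    return SURFACE_COMPONENTS.get(surface, [])

# A renamed Cube005 keeps its custom properties, so the setup survives:
props = {"surface": "metal", "unity_layer": "Default"}
print(components_for(props))  # → ['MetalSurface', 'MetallicSparkEmitter']
```

Because the mapping never touches the object's name, renaming in Blender can no longer silently strip the particle system or surface script on reimport.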
Now, all of this is driven in a way by the Blender Asset Browser. You can see here an example of our Asset Browser for Breachers. We have a lot of stuff in it: all of our props at the top of the screenshot, and all of our dynamic stuff to make the game or the level run properly at the bottom. I actually have a video for this as well that I recorded earlier. Here you have all of our Breachers assets. We have our blockout stuff — these are the things we use when we create a new level. Everything is in dev textures, and we can quickly place a stair in there, or add some cover, just to make the whole level play correctly and make sure it's balanced. So that's our blockout stuff. Then there's the dynamic stuff I talked about earlier: all of our types of breachable walls, et cetera. We also have our material library. That's something that's common for the Asset Browser, but the way we use it, it's a very useful way to prototype the look and feel of your level. You can quickly slap a yellow color in there and see if it looks good; if not, you Ctrl+Z it. It's a really good way to prototype the look. And then lastly, you have our props. These are the types of props we can just slap into our scene, but these ones we can actually edit. For example, if you want a blue cable connector, you can make it longer depending on your hallway length. Which leads us to props. A prop, for the people that don't know, is a term for a reusable object. For example, all of these chairs could be one model, and then you just duplicate it the whole time. This is just a test scene for our props — I also have a video prepared. In here, we basically have all of our props; again, we can quickly drag them into our scene. Something about the way Blender works is what happens if you have a collection as an asset.
You can have an object marked as an asset, but also a collection. If we drag a collection into our scene, we don't get that nifty preview that you've probably seen in Blender. Sorry — now I'm opening a different blend file. I'm showing a collection here, and the reason we use collections is that we have different objects in them: marking only a single object as an asset doesn't really work for us, because our props are made of multiple objects within a collection. And — I think I'm going to go over here — the way Blender works, if you link an asset object into your scene, you can't move it; if you append it, you do get that nifty preview tool you've seen. That's why in Blender we work with our collections, and we have a little script set up that snaps your prop on top of geometry so you can quickly drag it into your scene. The next slide is about our gameplay objects. I'm going to try to demo this through the screen, but it's a bit laggy. Can we switch to the other microphone for this, please? This is an example scene of one of our levels, called Ship, and here you can see all of those gameplay elements set up as well. I'm going to try to do a quick demo of how it works. For example, here — I'm going to try to work this way a bit — we have a lot of objects set up in our scene, like those breakable windows you've seen already; we have some props in there; we have the static environment. If I export this towards Unity, they actually load in like this: we now have a static white cube there, because Unity doesn't yet know that this is supposed to be the breakable window, for example. We still need to load in those custom properties first. It's something we tried to do automatically when you export the FBX file towards Unity, but for some reason it didn't work, so we always have to do it manually, unfortunately.
As you can see, we have that script attached to it, and when we click it, it reloads all of those custom properties, and now we have all of the gameplay elements set up in our game very quickly. And there you have it. The beautiful thing that we also do — it's probably hard to showcase now because I didn't set up the video — is that in the previous way of working, we were manually cutting the holes out of our walls. Now, with the way we're working, we just slap those cubes into our scene, in between the wall, and we boolean them out. So if we move a window or gameplay object two meters to the right, the boolean operation in Blender just follows it, and we no longer have to manually add holes everywhere in a level. That's something we also improved — and we're actually still looking at improving it further. Next up is volume placement. It's one of the technical things we had to set up in the beginning as well, during our levels. For example, this is one of our levels, called Hideout. As you can see, there are some yellow and orange objects in there; this is an isolated view. Each one of those cubes is basically a room in our game or our level. We have up to 20 or 30 individual rooms — a living room, a garage, a basement; it depends on the level. Originally we would have had to set all of those colliders up in Unity, but the way our workflow works, we can do it in Blender as well: we add those custom properties on top of them and then export towards Unity, and this is how it shows up. In Unity, if you want to add those colliders to your scene, it's quite difficult to do. And as 3D artists, we're very used to working in Blender, so that's also a reason why we try to push it towards Blender.
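To make the room volumes concrete, here is an illustrative model — not the shipped Unity code; room names and coordinates are made up — of what those boxes buy you at runtime: each room is an axis-aligned box, and the game can ask which room a position falls in.

```python
# Hedged sketch: rooms exported from Blender as axis-aligned boxes,
# queried at runtime to answer "which room is this position in?".
from dataclasses import dataclass

@dataclass
class RoomVolume:
    name: str
    min_corner: tuple  # (x, y, z)
    max_corner: tuple

    def contains(self, point):
        """True when the point lies inside the box on every axis."""
        return all(lo <= p <= hi for lo, p, hi in
                   zip(self.min_corner, point, self.max_corner))

# Invented example rooms for a hypothetical level.
rooms = [
    RoomVolume("garage", (0, 0, 0), (6, 3, 8)),
    RoomVolume("basement", (0, -3, 0), (6, 0, 8)),
]

def room_at(point):
    for room in rooms:
        if room.contains(point):
            return room.name
    return None

print(room_at((2, 1, 4)))  # → garage
```

Keeping the boxes in Blender means resizing a room and its volume happens in the same tool, in the same edit.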
A difficult thing as well: a lot of these colliders in our game right now are cubes, but not all of the rooms in the game are a cube — like this room here, which is convex. If it's a convex shape in Unity, you can't actually edit it; you need a separate tool for it, called ProBuilder, for example, and you need to add that to your scene and then edit it in there, which makes the whole setup a lot more complicated and more annoying. That's why we prefer to do it in Blender as well: maybe in Blender you're changing the size of a room, and now you can also quickly change those colliders along with it. Something else we can do with our workflow is collision overrides, for people that have worked in Unity before. If you export a mesh towards Unity, by default it gets a mesh collider. In Unity, you have a mesh collider, a convex collider, and a box collider, and the mesh collider is the most expensive one — and keep in mind, we're building VR games, so we have to watch performance everywhere we can. For example, one of our props in the game is a container. Our props are also very simplistic in a way — this is literally just a cube with textures applied on top of it. As you can see in the Blender hierarchy, we are actually naming that container with "no collision" at the end. So when we export it towards Unity, that container is not going to have any collision; you could walk through it. But above it — can we actually switch back to the microphone? There you go, thank you — you can see that we have overridden that collider with a box collider. So basically, we can control all of those collisions from within Blender.
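The name-based override could be parsed along these lines. This is a hedged sketch: the exact suffix spellings weren't shown in the talk, so `_nocollision` and `_boxcollider` are assumed conventions, not the studio's actual ones.

```python
# Sketch of a name-based collision override, assuming suffix conventions
# like "container_nocollision" or "container_boxcollider" (assumptions).

def collider_for(object_name):
    """Pick a collider type from the Blender object name.

    The default is the expensive mesh collider; suffixes override it,
    so artists can see the override directly in the Blender outliner.
    """
    name = object_name.lower()
    if name.endswith("_nocollision"):
        return None            # walk-through prop, no collision at all
    if name.endswith("_boxcollider"):
        return "BoxCollider"   # cheap stand-in, good for VR performance
    return "MeshCollider"      # default assigned on import

print(collider_for("container_nocollision"))  # → None
print(collider_for("crate"))                  # → MeshCollider
```

The point of doing this by name rather than by custom property is visibility, which the talk comes back to next.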
This is actually one of the things we don't do with custom properties, because we wanted our artists to see that a collision is being removed and that we are overriding it with a box collider. You'll see in a bit that the custom properties are hidden a bit within Blender, and we don't want artists to accidentally export something towards the game, then discover it doesn't have collision, and ruin our playtest, for example. So that's the only thing we do on a name basis. Next up is collider placement. Collider placement — or a clipping pass, as it's called in Counter-Strike terminology — is something we do in our levels as well. For our level here, Skyscraper, you want to prevent players from jumping off the building. We also have a drone in the game, and you don't want the drone to fly all the way out of the level and see all of the back faces of our level. All of those collider placements we can also set up in Blender, and I have a little demo on that too. Here you can see all of our colliders set up. One important thing here: we have our invisible walls, and all of these objects have custom properties attached to them — the Unity layer "invisible wall", and for the player-exclusive ones, the player-exclusive custom property as well. That's important for a competitive game. I was going to zoom in on that staircase, but I can probably show it from the screenshot instead. On the left, you can see those stairs. It's a competitive game, and if you're walking up stairs in a game, you bump up step by step, so your crosshair keeps moving up and down. You don't want that — you want a smooth transition on those stairs, so we place a ramp collider over them. Those are the kinds of things we can set up in Blender that way.
And we have some other colliders as well that guide the player while playing our level. Now, error handling is also a feature that a lot of level editors have. This error handling was something we did on the Unity side for our first three levels. But we started noticing a problem with the way we work. For example, each room in our level is a separate FBX file. We do that for culling reasons within the game — and for people that don't know what culling is, it's basically that, if you're looking forward, the whole world behind me is not rendered, for performance reasons. But like I said, you have 20 or 30 separate FBX files. We export those; that takes a minute or two for a complex level. We have to reimport those custom properties I showed earlier; that can take up to a minute as well, because it's literally reading all of those FBX files. And the way we work for our game, if you have, for example, five rocks, you have five different draw calls; you want to merge those into one single mesh so you have one draw call, and that also takes a few minutes. Only at the end of that whole process can you check in Unity: hey, are there any errors in my models? So after the whole workflow is done — three to five minutes — you find out there are errors, you go back to Blender, you fix them, and then you have to redo that workflow. It's a bit annoying, and you lose a lot of time. So we made a QA bot in Blender: a lot of the error-checking functionality on the Unity side, we actually moved to Blender. I also have a little demo of it here. A colleague jokes that all the things I used to say — like "don't do this" or "you're making this mistake" — our QA bot is now saying for me. And we're checking quite a bit.
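Stripped of the Blender API, the shape of such a QA bot is simple: a set of toggleable checks that walk the scene and collect error messages. A minimal sketch — the real add-on inspects `bpy.data` inside Blender; here the scene is a plain dictionary, and the check names and messages are assumptions for illustration:

```python
# Hedged sketch of a QA-bot check loop over a mock scene description.

def run_checks(scene, enabled):
    """Run only the enabled checks and return a list of error strings."""
    errors = []
    if "empty_mesh" in enabled:
        for obj in scene["objects"]:
            if obj["vertex_count"] == 0:
                errors.append(f"{obj['name']}: object has no vertex data")
    if "dev_textures" in enabled:
        for obj in scene["objects"]:
            if obj.get("material", "").startswith("dev_"):
                errors.append(f"{obj['name']}: dev texture still applied")
    if "control_point" in enabled and "control_point" not in scene["collections"]:
        errors.append("no 'control_point' collection found — maybe add one")
    return errors

scene = {
    "objects": [
        {"name": "wall_a", "vertex_count": 8, "material": "brick"},
        {"name": "leftover", "vertex_count": 0, "material": "dev_grid"},
    ],
    "collections": ["props", "colliders"],
}
# Shipping pass: dev textures forbidden, Control Point colliders required.
for error in run_checks(scene, enabled={"empty_mesh", "dev_textures", "control_point"}):
    print(error)
```

Because the checks run inside Blender, the three-to-five-minute export round trip is no longer needed just to discover a broken model.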
For example, I just deleted an object that didn't have any vertex data attached to it. This is also quite an annoying error: instead of linking your model, you appended it by accident. That means that if there's a gap in the model and the community notices — hey, there's a gap here — and you fix the model, it's not actually going to be updated the next time you reload your blend file. There's a lot of error checking here, like for dev textures as well: when we want to ship a level, we can enable that check to see if there are any dev textures left in there, and it will find them for you. At the top, you can see we have quite some options to enable specific checks, just to make sure. If we enable a certain check — say, Control Point, which is one of our game modes and requires colliders to be set up — it's going to say: hey, I can't find any collection named "control point", maybe you should add it. That way you can turn all of those errors and warnings on and off. This brings us to custom tools. For our whole workflow, we have made one custom tool in-house, called the Blender Triangle Tools. It's a tool that lets us show those custom properties in a more efficient way, compared to having to go into the Blender UI, check them in there, and manually type them in — which is not very efficient. The second one is the bundle exporter. It's an add-on written by the community; it was then forked by somebody else to make it compatible with Blender 3.0, and then we forked it again to make the user experience a bit better. The Blender Triangle Tools is basically this. We have a few components to it. At the top left, you can see the toolset. It gives us buttons for things our artists do all the time. For example, texel density: in Breachers, we have a specific texel density.
You don't want to go into the UV editor each time, scale those objects, and check the texel density — it's quite annoying. So we have a button for it. You can see those assign-collider buttons from earlier as well, so we can assign those names more quickly. Vertex colors too, for specific shaders that we use: instead of going through the vertex-coloring tab in Blender, we just have a button for it, which makes the whole experience better. In the second row, you can see all of the custom properties that our workflow offers. In the beginning, we didn't have this tool, and we would have to manually type in what kind of surface we wanted; now we have a nice and easy overview of everything. The third thing is the QA bot, which I've shown earlier, that nags at you if your level is set up incorrectly. Then there's our bundle exporter — or not really ours, but the one we improved a bit. On the left, you can see the tool we originally used. The issue with it is that, at the top, there are quite a few settings, and it would sometimes happen that our artists set those settings up incorrectly; you'd export towards Breachers, and then your level is not set up correctly and it's not going to run correctly. So we removed all of that bloat and kept only the settings that we need. And lastly, at the top, you can see we actually have two paths for exporting. That was something we noticed: if I work at my computer at home and export to a specific location, and the next day I go to the office and try to export again, I'm still exporting to my path at home. That was also very annoying, so we split that up as well. The nice thing about this is that our bundle exporter talks to our TF tools.
So if you have errors in the TF tools, it plays a super annoying sound and you're not able to export to Unity until you fix those errors. There is an override for it, which you can see enabled in the right screenshot, that you should only use in exceptional cases. Conclusion: does our workflow have any downsides? Nothing is perfect, so it definitely does. What we noticed is that when you're trying to troubleshoot something, you've seen something in the game that isn't working as it should, you have to go back to Blender, change the gameplay-specific data there, export to Unity, reload the properties, and only then can you start troubleshooting. So for troubleshooting it's quite annoying. We also sometimes run into cases like our vault colliders: we have one type for jumping on top of objects, like a table, and another for jumping over something. In Unity that's literally just a dropdown, but in our Blender way of working those are two separate objects. So if somebody says, "Hey, you're using the wrong vault type," you have to go into Blender, change that object, and export it again, to make sure the Blender side is always the most up to date. Those are the main downsides we found, but we are making levels for a competitive game, and in a competitive game the level has to be set up in a balanced way. Our levels are also quite complex, usually three floors, and during the blockout of our levels this is a very fast way of working that eliminates a lot of the setup you previously had to do in Unity after five or so days of work. We actually don't have all of our gameplay objects set up in Blender yet, which is unfortunate.
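The handshake between the exporter and the QA tool described above boils down to a simple gate. This is a hedged sketch of that logic with made-up names, not the actual bundle exporter code:

```python
def export_level(qa_errors, override=False):
    """Refuse to export while QA errors exist, unless the rarely-used
    override is explicitly enabled."""
    if qa_errors and not override:
        raise RuntimeError(
            "export blocked, fix QA errors first: " + "; ".join(qa_errors))
    # ...actual FBX/bundle export would happen here...
    return "exported"

print(export_level([]))                                    # → exported
print(export_level(["missing colliders"], override=True))  # → exported
```

Surfacing the failure at export time, rather than after a full round-trip through Unity, is what keeps iteration fast.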
For example, our player spawns and the bomb location, which is a mesh that is shown in the game and that you can also use for cover: those are the two things we have not set up in Blender, because our workflow grew quite organically and those things were already programmed. If we had to go back now and change all of that code in Unity, it would take a lot of time. So these are the two things where it regularly happens during a playtest that somebody says, "Hey, let's playtest," and then, "Oh, I'm suddenly spawning inside a table." That's the unfortunate part; those are the only two things we still have to check in Unity. There is also something we want to give back to the Blender community, because it's an open community; as one of the previous speakers here said, that's one of the nice things about Blender: if you Google something, you can always find an answer, compared to other tools. For example, on the left, for the Unity side, there is a code example of how to do this with an asset postprocessor. And on the right side there is our Blender QA bot setup; you can take it and extend it the way you want. Probably next week we are also going to put those property lists I showed in the tools earlier on GitHub, so you can add your own custom properties in a more efficient and easy way. So, our workflow: it took quite some time to set up, but in the end it was a really good way to eliminate all of that annoying setup we had to do in Unity, and it made us iterate on our levels a lot faster. Therefore we also make fewer mistakes, which is very useful, because prepping a playtest for a level takes quite some time.
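One way the published QA bot could be extended by the community is through a small registration hook; this decorator sketch shows the idea only and is not the actual GitHub code:

```python
# Sketch of an extension point for community-written checks: a tiny
# decorator registry that the QA bot iterates over.

QA_CHECKS = []

def qa_check(fn):
    """Register a function as an additional QA check."""
    QA_CHECKS.append(fn)
    return fn

@qa_check
def no_default_names(scene):
    """Example custom check: flag objects still carrying Blender defaults."""
    return [f"'{name}' still has a default name"
            for name in scene["object_names"]
            if name.lower().startswith(("cube", "plane", "suzanne"))]

def run_all(scene):
    return [err for check in QA_CHECKS for err in check(scene)]

print(run_all({"object_names": ["Crate_01", "Cube.003"]}))
# → ["'Cube.003' still has a default name"]
```

Adding a new check is then just writing one function and decorating it, which is the kind of extensibility the talk encourages.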
If you prep everything and all ten people get together to play your level, and somebody notices there are errors here and there, it's quite an annoying feeling. Doing everything on Blender's side was a good way to fix that. Our workflow is fully set up now and all of our tools are developed, so we can take this into our next project and keep working in Blender. I would like to thank you for listening, and if there are any questions, I'm happy to try to answer them.