 our game development company. My name is Kenan Beruk-Başı. I'm from Istanbul, Turkey, and I'm a CG generalist and a developer. I work at a company called X-Series Studios, also in Istanbul. We employ a completely open-source technology stack, and we have a fully in-house production infrastructure and game engine. We are a small but very ambitious team. There are some buzzwords here: we make mobile-first, multiplayer, cross-platform games with 2.5D sprite graphics. We use Blender for all graphics production, but also for prototyping and optimization, which are very important factors in game development. First, the Blender game engine. We don't use the Blender game engine as our game engine, but we use it extensively, and I'll show some examples of how right now. (Sorry, I'm having some trouble switching windows here.) This is the map editor I wrote for one of our games, because creating the scenes was really repetitive and extremely hard labor, and most of the objects we used in a scene were already generic assets — street scenes and so on. So I made everything modular first, and then I created this map editor in the Blender game engine. You use it like this: you can place pavement here, some more pavement here, and when you are finished with the first layer, you go up a level and place things like street poles, bollards, billboards, and so on. For example, I have already made one here. The rotations are a little problematic, but you can see you can rotate the objects like this. Anyway, when you are finished with your scene, you just export it.
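An exporter like the one described — dumping the placed props to a file that a second script later reads back inside Blender — can be sketched as a simple JSON round trip. The field names below are hypothetical, not the studio's actual format:

```python
import json

def export_scene(objects):
    """Serialize placed props (asset name, grid position, rotation) to JSON.

    Each entry records only what the Blender-side importer needs to
    re-link the real, high-detail asset in place of the editor proxy.
    """
    return json.dumps([
        {"asset": o["asset"], "pos": list(o["pos"]), "rot_z": o["rot_z"]}
        for o in objects
    ])

def import_scene(data):
    """Read the exported layout back; the importer would then link the
    actual asset library objects and place them at these transforms."""
    return [
        {"asset": e["asset"], "pos": tuple(e["pos"]), "rot_z": e["rot_z"]}
        for e in json.loads(data)
    ]

layout = [
    {"asset": "pavement", "pos": (0, 0, 0), "rot_z": 0.0},
    {"asset": "street_pole", "pos": (2, 0, 1), "rot_z": 90.0},
]
restored = import_scene(export_scene(layout))
```

The point of the indirection is that the editor scene only carries lightweight proxies, while the render scene links the heavy assets from a library.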
With another script, you can import it into Blender. What you saw is just a representation of the actual scene, not the actual graphics. When you import the scene, the actual graphics are loaded into a Blender file, and if there is anything unique about your scene — any unique buildings and so on — you just add that there. When you are finished, you ray trace it with Blender Internal, and the result is something like this. I'm not sure if it's visible enough; the resolution is really problematic here. All the effects — the fog, the compositing effects — are added to the scene automatically, so all you need to do after using the map editor is add the unique elements, like this building here and other details. So that's the first example of how we use the Blender game engine. We also use the Blender game engine for prototyping. For example, for this scene we really wanted ray-traced graphics, but we had memory restrictions, so we couldn't isolate every vertical object on the map from the map itself — that would have meant huge memory use. For things like street poles to be ray traced like this, lit by the billboards, you either need to have them baked into the scene or ray trace all of the street poles separately. So we needed some kind of hack in the OpenGL shader, and we came up with one, but we needed to test it. The Blender game engine was really useful for testing, because it's very easy to create a default scene and try the idea, even though it's not your own game engine — you can always test the GLSL shaders you come up with in the Blender game engine. We also use the compositor for prototyping, again mostly for GLSL shaders.
For example, we were developing a football game, and in the game we needed many different club colors. You can't just render the same sprites for every club color, because then you again have huge memory use. So we had to do something on the OpenGL side, and we came up with the idea of using some kind of stencil buffer for the primary and secondary colors of the jersey. But then we realized that even with a stencil buffer you need to double the memory footprint of the assets, because you have to render both the original animation and the stencil animation. Then we said: we don't actually use the alpha channel of the images we are rendering — we use it almost like a binary mask. There is no partial transparency except at the edges of the players, which are anti-aliased with the alpha. So we can use the alpha channel as the stencil buffer. The idea seemed to work, but we needed to test it. Let's look at the file for that — sorry for the scaling problems. So this is the guy, and let's hit render. The first render is done using this compositing group: the alpha values of the secondary jersey color are dropped to 0.5, and the alpha values of the primary jersey color are dropped to 0.75. I'll just zoom out a little. That way, in the OpenGL shader, we can determine where to use the primary color and where to use the secondary color. But there were some possible problems, like the partial transparency at the edges of the animations. We needed to test it, and I didn't want to just ship the rendered animations to our engine programmer and wait to try it — I wanted to try it myself first. So I created this node group to imitate the behavior of the actual GLSL shader.
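The compositor group imitates what the GLSL shader does at runtime. The per-texel selection logic can be sketched in plain Python — the 0.75 and 0.5 alpha thresholds are the ones from the talk, everything else is a hypothetical reconstruction, not the studio's actual shader:

```python
def shade_pixel(texel_rgba, primary, secondary):
    """Pick the output color for one sprite texel.

    The alpha channel doubles as a stencil: ~0.75 marks the primary
    jersey area, ~0.5 the secondary area, and anything else is treated
    as ordinary (possibly edge-anti-aliased) alpha and left untouched.
    """
    r, g, b, a = texel_rgba
    if abs(a - 0.75) < 0.05:      # primary jersey region
        return (*primary, 1.0)
    if abs(a - 0.5) < 0.05:       # secondary jersey region
        return (*secondary, 1.0)
    return (r, g, b, a)           # untouched: keeps the AA edges

red, white = (0.8, 0.1, 0.1), (0.95, 0.95, 0.95)
print(shade_pixel((0.5, 0.5, 0.5, 0.75), red, white))  # primary area
```

Note the risk mentioned in the talk: anti-aliased edge texels whose alpha happens to land near 0.5 or 0.75 would be misclassified, which is exactly why the compositor prototype was needed before committing to the scheme.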
So let's switch it on, and as you can see, it works — it changes the colors. Of course this is slow in the compositor, but it's really fast in the GLSL shader. Again, this is another benefit of Blender being an integrated package: it has everything in it, so you can prototype for many different stages of your game development workflow. In this next example, we were using the viewport for prototyping. This thing is actually a player — I don't know if you can tell, but it's the player we just saw. We can't use actual 3D here, but we thought we might be able to use layered billboarding in the asset to fake the effect of 3D, and we did that for the scenery. We also needed to try the same thing for the players, so I fragmented the player into layers and rendered it to test the effect. As you can see here, it's perfectly visible from the top, and from other angles it looks a little weird. Let me show you what we created with this. It's a really subtle effect, so it's not too visible, but if you look at the edges of the frame, the upper layers are shifted just a little more toward the left on the left side. It's really subtle on the players, but combined with the actual pitch and the stadium atmosphere it's much more visible, so we needed it — and we could test it in the viewport directly. That's another cool thing about using Blender for game development.
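The layered-billboarding idea can be sketched as follows — a hypothetical reconstruction, not the actual shader: each slice of the player sits at a different height, and its on-screen shift grows with that height and with the distance from the screen center, which produces the subtle parallax described.

```python
def layer_offset(layer_height, screen_x, strength=0.05):
    """Horizontal shift for one billboard layer.

    layer_height: height of this slice above the ground plane.
    screen_x: horizontal position relative to screen center, in -1..1.
    Higher slices shift more, and the shift grows toward the screen
    edges, faking 3D parallax on flat sprite layers.
    """
    return strength * layer_height * screen_x

# A slice at height 2.0 near the left edge shifts left; at center, not at all.
print(layer_offset(2.0, -0.9))
print(layer_offset(2.0, 0.0))
```

The `strength` constant is the tuning knob: small values give exactly the "really subtle" effect mentioned, while larger ones make the layering visibly split apart.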
In this example, in one of our games, we had about 30 different units, and each unit has 10 levels, so we had a huge number of assets. The map was hand-painted, and we first tried ray tracing the units, but they didn't fit in, because ray-traced specularity doesn't really look like hand-painted specularity. So we decided we needed to bring the ray-tracing step a little closer to the hand-painted textures, and since the number of assets was huge, we didn't want to hand it all to a texture artist and ask them to paint every corner and edge so that it looks like hand-painted specularity. So we came up with the idea of using Freestyle for that. I just set a crease angle in Freestyle, make Freestyle draw white lines on the edges, then blur the white lines and multiply them with a Voronoi texture — and you get a kind of hand-painted-specularity look. That was another cool trick. And now the main thing in my presentation: the sprite automation system I wrote for one of our games. Sebastian König said two days ago in his workshop that if you don't optimize Blender for your own workflows, you are doing it wrong. He's a professional, so you should listen to him. We did that for our whole sprite generation stack, and the result is something like this — I'll show it to you first. Probably most of you are familiar with the concept of sprites: they are basically pre-rendered PNG sequences to be used in games. An atlas is a collection of sprites packed into one image, because the GPU is much more efficient when you use them as an atlas — at least it was until recently; I'm not sure about right now. So why did we need such a system? The process was really hard and problematic to maintain manually. There were too many assets.
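The Freestyle trick boils down to: blur a binary edge mask, multiply it by a noise texture, and add the result on top of the flat render. Here is a toy one-dimensional version of that compositing chain in pure Python, just to illustrate the math — the real setup is Freestyle plus compositor nodes, not code:

```python
def box_blur(mask, radius=1):
    """Simple 1-D box blur of an edge mask (a list of floats)."""
    n = len(mask)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(mask[lo:hi]) / (hi - lo))
    return out

def fake_specular(edge_mask, noise, base):
    """Blurred white Freestyle lines, multiplied by a Voronoi-like
    noise pattern, added to the flat hand-painted base color."""
    blurred = box_blur(edge_mask)
    return [min(1.0, b + e * n) for b, e, n in zip(base, blurred, noise)]

edge  = [0.0, 0.0, 1.0, 0.0, 0.0]   # a single hard Freestyle edge line
noise = [0.3, 0.8, 0.6, 0.9, 0.2]   # stand-in for the Voronoi texture
base  = [0.4] * 5                    # flat hand-painted diffuse value
print(fake_specular(edge, noise, base))
```

The blur spreads the highlight off the edge, and the noise breaks it up so it reads as brush strokes rather than a clean ray-traced highlight.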
It was really easy to make mistakes in the scene setups and the lighting setups. There were too many varying settings, like the grid sizes of the units we were rendering and the number of directions they had, so we needed some kind of automation system. It was also very repetitive work: concept revisions were really common, and when something is revised, you have to do it all over again. We didn't want that. And there were some other reasons too. So we said: we are going to automate as much of our workflow as possible. The original workflow was something like this: you have concepts, then you do the modeling, texturing, materials, environment setup, lighting, render settings, rendering, then you generate the atlas, and then it's in-game. We reduced it to the diagram on the right: you just do the concept, then the modeling, texturing, and animation, of course, and the sprite system handles the rest — the environment setup, the lighting, and the render settings are all handled by our in-house sprite system. So it's much easier. The settings are also dynamic: they are not hard-coded into the blend files themselves. They are applied as edits when you render the model and are never saved into it, so when you need to change something like the lighting setup or the render settings, you just change it in the code — you don't have to touch the actual blend files at all. I'm just skipping this slide; I put it here so that anyone who wants can hit pause on the live stream and take a look at it. Okay, this is the interface of the system. It's a batch processing system, because for this many assets that is much more efficient, so the interface is the command line. You just say `sprite` as the name of the command, and after the dash, the letters are the command parameters: N means create a new unit.
I means import, which we don't use right now because we create all the files in Blender now — but when I created the system, we also had other modellers in the company who used Maya, so we had to use some kind of exchange format. What import basically does is import the Collada files and apply some changes to them. R is for render, and A is for atlas generation. Every other name you see after that is the actual name of one of our units. So when you enter this command, it creates a unit skeleton in your asset folder, imports the actual Collada files into the models, renders them, and creates the atlas — and then you have the new assets in game. It generates something like this; this is the actual image produced by that command. Maybe, if we have time — let me just start a terminal here. So, this is the system. In the asset directory we have the model directory, the render directory, and the atlas directory. This data.json is the data file that holds the settings for each unit: what the grid size will be, how many directions should be rendered — for a character, for now, it's eight directions; for a building, normally just one — and so on. These are the blend files. For example, Barracks is one of our buildings, so you have this barracks blend file here, and the only thing in the file is the model itself. None of the environment setup is actually here; it's all added dynamically. These are just the default Blender file settings.
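A hypothetical reconstruction of the command-line front end, to make the flow concrete: one dash-letter block selects the pipeline stages (N/I/R/A), the remaining arguments are unit names, and per-unit settings come from data.json, with an eight-direction fallback for units not listed there, as in the demo. All names here are assumptions, not the actual in-house code:

```python
import json

STAGES = {"N": "new", "I": "import", "R": "render", "A": "atlas"}
DEFAULTS = {"directions": 8, "grid": 1}  # unknown unit: assume a character

def parse_command(argv):
    """Parse e.g. ['sprite', '-NRA', 'barracks', 'foobar'] into
    (ordered stage list, unit names)."""
    flags, units = argv[1], argv[2:]
    stages = [STAGES[c] for c in flags.lstrip("-")]
    return stages, units

def unit_settings(name, data_json):
    """Look up a unit in data.json; fall back to the defaults for
    units that are not defined there."""
    data = json.loads(data_json)
    return {**DEFAULTS, **data.get(name, {})}

data = json.dumps({"barracks": {"directions": 1, "grid": 2}})
print(parse_command(["sprite", "-NRA", "barracks", "foobar"]))
print(unit_settings("barracks", data))  # building: one direction
print(unit_settings("foobar", data))    # unknown unit: eight directions
```

Keeping the settings in data.json rather than in the blend files is what makes the "nothing hard-coded" property possible: the renderer applies them as edits at batch time.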
Okay, so, for example, if I want to create a new file, I just say sprite N, which is for a new file, and then foobar, say, as the file name. And here I have something called foobar, with all the skeleton files in it. The materials are linked to a central material library; the compositing node groups are also linked to a central library, along with the static textures and so on. So I can just create a mesh here, modify it a little, like that, then link the materials and assign one — say, the white one. I save the file and close it. Now if I say render foobar — since foobar is just a name I made up, it's not defined in the data file, and when a unit is not defined there, the system assumes it's a unit to be rendered in eight directions, which seemed the more sensible default. So now it has finished rendering, and in the render directory we have foobar_01, which means level one of the foobar unit, rendered in eight directions like this. You also get all the little modifications, like the hand-painted-texture imitation. And yes, these are all different directions of the same model, as you can see. Then if I say sprite A, which means create the atlas, in the atlas directory I have something like this — and the thing I just made is in the atlas. Okay, I think that's all I will be showcasing. These are my contact details — if you want to learn more about any of the tricks we did or any of the systems we developed, you can just mail me or ask me on Twitter. So, thank you very much for listening.
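Packing the rendered frames into an atlas amounts to laying them out on a grid and recording each frame's pixel rectangle so the engine can look it up. A minimal sketch with hypothetical names — a real packer would also handle varying frame sizes and power-of-two padding:

```python
def build_atlas(frame_names, frame_w, frame_h, columns):
    """Assign each frame a slot on a fixed grid and return its
    pixel rectangle (x, y, w, h) keyed by frame name."""
    atlas = {}
    for i, name in enumerate(frame_names):
        col, row = i % columns, i // columns
        atlas[name] = (col * frame_w, row * frame_h, frame_w, frame_h)
    return atlas

# Eight directions of one unit level, packed four frames per row.
frames = [f"foobar_01_dir{d}" for d in range(8)]
atlas = build_atlas(frames, 128, 128, columns=4)
print(atlas["foobar_01_dir5"])  # second row, second column
```

The engine then samples one big texture and indexes into these rectangles, which is the GPU-efficiency argument for atlases made in the talk.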