Hello, thank you all for coming. Out of curiosity, did anyone hear about this project, Armory, before? Wow, okay, great. Hello, thanks for inviting me.

So, what this project is about: the goal is basically to bring Cycles and Armory into the same space and workflow, and use that for creating real-time experiences. For this we reflect a subset of the Blender API that makes sense for real-time, so basically, if you know Python and bpy, you will more or less know how to work with Armory, because the data is more or less the same. Also, similar to Cycles, the core of the engine is independent, so we can take it and integrate it into other tools. I recently saw the Cycles implementation in Cinema 4D, so potentially we could take this solution and apply it to other places as well.

What this project is not about: we don't want to fork Blender. It all works mostly like an add-on, and optionally you can take the Blender version which has the player built in, but it's a separate player that does not interfere with the rest of Blender. It's a single patch that you can apply to Blender and build it yourself if you want; the source is already up on GitHub. This way we fight fragmentation: we would not want multiple versions of Blender, as eventually it would be a nightmare to keep that up.

Another challenge is that some people think this is a full Cycles renderer in real time, which cannot happen, because path tracing will always produce better results. What we do instead is sacrifice some of the quality to shorten the render times. Lastly, this is also not a PBR viewport for Blender, so there is no duplicated work going on, because the renderer is kept separate to keep things simple.

I've been working on this for maybe over two years, and everyone told me this was the dumbest idea ever, because building a full rendering engine means hoping to get funding for it and getting it
useful to people, and not just being a toy. So it needs to be a lot better, not just as good as other solutions, and that's where Blender comes in.

Number one is editing. This is one of the first shots that I got, from August two years ago. Everything was really simple; it was basically just exporting the assets and running a simple scene. After thousands and thousands of hours of frustration and joy, but mostly frustration, we got to something like this. It's still not perfect, but what this brings us, and what I don't think other tools have, is a completely unified workflow, where everyone works in the same space and everyone operates the same interface. You don't export anything, there are no new programs for artists to learn, and you also use the Blender ecosystem: if you already know the basics of Blender, you will have a very easy time learning how to use Armory.

Like I said, the built-in player is a separate renderer, but to keep things running in synchronization we send messages between the two processes: we listen to the operators from Python and translate those operators for Armory. So if you move an object in the Blender viewport, the change is reflected immediately in the player, and if you import new assets or make model changes, the changes stream to the viewport while it is running. To the user it should feel like a completely seamless experience.

To actually make it live, since it's a real-time engine, it's recommended to use Haxe, which I will explain later, but it is also possible to use Python, not yet for all targets, though eventually it should be possible for all of them. One more random screenshot. You can also use JavaScript, and if you don't like coding at all, there are logic nodes, and they look just like you would expect from other tools.

The thing that I did not anticipate before was that since this runs in the Cycles mode, we can basically render the scene in Armory side by side with Cycles.
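To illustrate the two-process synchronization mentioned a moment ago, here is a minimal sketch in plain Python. The message format, field names, and functions are invented for illustration; Armory's actual protocol may look quite different:

```python
import json

# Toy sketch of operator forwarding between Blender and the player.
# All names and the JSON schema here are made up, not Armory's real protocol.
def encode_transform(obj_name, location):
    """Blender side: turn a viewport move into a sync message."""
    return json.dumps({"op": "transform",
                       "object": obj_name,
                       "location": list(location)})

def apply_message(scene, raw):
    """Player side: apply an incoming message to its copy of the scene."""
    msg = json.loads(raw)
    if msg["op"] == "transform":
        scene[msg["object"]]["location"] = msg["location"]
    return scene

# Moving "Cube" in the viewport streams the new location to the player.
scene = {"Cube": {"location": [0.0, 0.0, 0.0]}}
scene = apply_message(scene, encode_transform("Cube", (1.0, 2.0, 0.0)))
print(scene["Cube"]["location"])  # [1.0, 2.0, 0.0]
```

The point is only the shape of the idea: the add-on serializes each operator as a small message, and the player applies it to its own scene copy, so neither process blocks the other.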
And what this brings us is that we can compare these scenes, basically see what's wrong with the real-time renderer, and very easily improve the output based on that. I don't think there's a thing like this built into other tools out of the box: you can align two viewports and see exactly where the reflections are off or the lighting is bad. Another advantage is that you can render like you would in-game with Armory, and render cutscenes with, I don't know, Cycles, and get much more realistic graphics in the cutscenes. Eventually, and it's not impossible, the goal is to take any Cycles scene and render it in real time, at reduced quality of course. So this is an example of comparing the two; I'm not sure which is which, but this one is Cycles and this one is Armory. The lighting is still off, but the reflections are kind of ahead already. To improve this further we need a solution for global illumination, and that moves us to part two, which is about the renderer.

This is, I think, my favorite thing about Armory. Like I said, the renderer is written from scratch, because I wanted to keep it very small, and the less code the better, so it will be easy to maintain going forward. And since it's a specific kind of renderer, I needed to build it in a format that matches what Blender gives us. There are now lots of nodes built in, but I picked five that I think are maybe the most important. The first is the draw world node: if you insert this into the render path, you can reference nodes from your world setup. What this means is, if you drop in the sky nodes, it should just work; if you drop in an environment texture, Armory will take it and generate pre-filtered maps, so again, it should just work. And if you don't like the built-in shaders that translate these nodes, you can also
rewrite whichever shaders you want. You can also reference compositor nodes, and reference all the lamps in the scene. This way it was also super easy to add new features like stereo rendering for virtual reality, where again just a new node was added, and only the parts of the scene that make sense to render twice were put under the draw stereo node. So it is very easy to share, for instance, the shadow maps or other data, and do only the work that you really need to.

Another easy addition was Grease Pencil integration. Can we play the video? Playback doesn't work, all right, that doesn't matter. Basically, you can use the Grease Pencil data and integrate it with the rest of the Armory features, so it will run in the browser or on all the other targets.

You can also insert code into the render path, and this way, for example, it was super easy to achieve an effect like dynamic resolution scaling: if the frame time is getting too high for complex scenes, we can lower the viewport resolution. It will look a bit worse, but that's better than getting big stuttering.

I also think it's a great place to prototype new 3D rendering techniques, because you mostly don't need to care about anything apart from the shaders; you don't need to set up the render code for all of this. And while you're doing this, you can easily replace the nodes and see what happens when you connect them differently. Another point that I think is quite nice is that it's good for learning the high-level workings of renderers, because again, you can run it side by side with the nodes, modify them, and immediately see what happens to the output. For example, if you disable depth clearing, you will see that the depth is wrong. This also allows you to completely throw away the rasterizer and, maybe, I don't know when that would happen, jump to path tracing, because there are no ties to forward or deferred rendering; it's all dynamic.

My experience with this
was that there was a great presentation about rendering in the new Doom, where they did not use a deferred renderer but a forward renderer with a thin G-buffer: basically, you render the scene as usual, but you only store normals and depth. This way I was able to prototype it in just a few hours, and for some scenes it actually turned out to be faster than deferred rendering. So when you are building a project, you can experiment and see which one is best for your type of experience.

By default there's also a classic PBR node included, so it is very easy to import PBR-ready assets. These are the Megascans, again with standard PBR textures. I think there are only about three types of palms, and they are rendered over and over again with geometry instancing. If you connect a node to the height connector, you can very easily get tessellated displacement going on. It's not yet adaptive like in the latest Cycles, I think 2.78 is the latest, but eventually I hope it will also be adaptive.

And this is a hotel room by Pietro, who was very kind and shared the model he was working on with me. It turned out quite good, but the lighting under the table is still quite off, and the shadows over the bed are a little bit off too. To improve this we need better global illumination, and there are two ways. There's the offline solution, where, since again there is this big advantage of having Cycles running alongside, we can render the probes using Cycles, pre-filter them, and use them for real-time rendering. This was one of the tests, and you can actually see the red tint on the balls on the right and the blue tint on the balls on the left. But to make it fully dynamic I started working on voxel cone tracing. There's a video, but I'm afraid it doesn't work. It's again very simple right now: the scene is voxelized in a single pass, and then the lighting is calculated in one more pass.
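To give a feel for the cone tracing step, here is a sketch of a common cone set used for the diffuse part: one cone straight along the surface normal plus a ring of tilted cones, with weights summing to one. The exact count, tilt, and weights are a generic illustration, not necessarily what Armory uses:

```python
import math

# A common 6-cone set for diffuse voxel cone tracing: one central cone
# along the normal (+Z in tangent space) and five tilted around it.
# The 45-degree tilt and the 0.25 / 0.15 weights are illustrative values.
def diffuse_cones():
    """Return (direction, weight) pairs in tangent space, normal = +Z."""
    cones = [((0.0, 0.0, 1.0), 0.25)]            # central cone
    tilt = math.radians(45.0)
    for i in range(5):                           # five surrounding cones
        phi = 2.0 * math.pi * i / 5.0
        d = (math.sin(tilt) * math.cos(phi),
             math.sin(tilt) * math.sin(phi),
             math.cos(tilt))
        cones.append((d, 0.15))
    return cones

cones = diffuse_cones()
print(round(sum(w for _, w in cones), 6))  # 1.0
```

Each cone would then be marched through the voxelized scene, accumulating the injected lighting, which is why no baking is needed.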
Again, nothing is pre-baked: you can move all the objects and change the materials, and the reflections are quite good, though not yet as good as I would like.

The last part is how to deploy and run this, and that's where a technology named Kha comes in. It's a very small library, but it gets rid of all the annoying stuff that you want to get rid of. There are very few dependencies and nothing graphics-API specific, so we don't touch any OpenGL code or any DirectX code; this is all handled by Kha. My favorite part about Kha is that there is no recompilation: if you want to modify something in the engine, it's like modifying something in Python, you change a line of code and you can immediately see the results. I think this is quite important for contributing, because you can iterate easily.

So, as mentioned, by using Haxe we can target any language, like C++ for desktops and JavaScript for the web, and for performance-critical code we can still use C++ libraries. To get away from the graphics APIs we use Kha, and to not lean on any single rendering technique we use Armory. This is one of the demos you can play online; thanks to Kha it runs everywhere, and it actually fits under one megabyte.

And yeah, it's almost done. The release, which I call preview zero because it's still far from where I would like it to be, will be out today. There is some documentation, but really not much, and it may not work at all. Also, it will be paid for now, as I would like to keep the project small and have time to look into all the problems properly, but if anyone has ideas how I can make it free, then please get in touch with me. There are some examples, and that's it. Thank you.