Oh, thank you. Just let me get my stuff going here. Yeah, so as already mentioned, my name is Christian Stonna. I work from the Oslo office of the Qt Company, where I've been for the last 12 years. For the last four years, I've been working on the 3D offerings of Qt. But that's enough about me, let's just dive in.

These are the topics I want to go through today. I'll start by looking at the 3D offerings in Qt and try to answer some questions around that. Then I'll take a look at what Qt Quick 3D is and what it provides. This is not going to be a deep dive into 3D; I'll keep it very simple, using a very simple example and building on it, just to show how easy it is to get started with Qt Quick 3D. I'll also take a look at mixing 2D and 3D content with Qt Quick 3D and Qt Quick. That's one of Qt Quick 3D's strong points, so I'll look at two ways of doing that, which I think will be interesting. After that, I'll take a look at the new features coming in Qt Quick 3D 6.2 and say something about what we think will come post-6.2.

So these are the 3D offerings in Qt today: we have Qt 3D, we have 3D Studio, and we have Qt Quick 3D. All of these provide 3D support in some way, but each in a different way. First off, we have Qt 3D, which is a very flexible solution with fairly low-level APIs. What I mean by that is that it has a considerable barrier to entry. It has new concepts like the entity component system and a fully configurable frame graph. It's extremely flexible in what it can do, but I think it targets more advanced users, with requirements that fall outside the two other offerings we have, 3D Studio and Qt Quick 3D. There's also a fair amount of abstraction in Qt 3D. The reason I mention it here is mostly historical, because of how we ended up with Qt Quick 3D.
We actually tried to put Qt 3D under 3D Studio, but at the time we ran into severe problems scaling down to the entry-level embedded hardware we saw our users using. That got us into a bit of a problem and we had to backtrack a little. We then came up with Qt Quick 3D, which, together with Design Studio, will be the replacement for 3D Studio in the future.

3D Studio is a very design-centric solution. Its engine is tied very tightly to OpenGL. It also has concepts that feel fairly foreign to Qt, or that Qt already had better solutions for, like states and components. It also had its own declarative languages to describe its scene format, materials, and effects. We felt that didn't really fit with the unification story we were trying to tell for 2D and 3D. In a similar way, Qt 3D is very self-driven, so extra steps are needed to integrate and sync it with Qt Quick. I know there have been improvements to Qt 3D lately on this front, but this is more historical context. We also wanted something even more high-level and easy to use, something that just fit into the Qt Quick story and integrated well there.

The result of all that is Qt Quick 3D. It has very high-level concepts, like camera, light, and model, and those can easily be inserted into existing Qt Quick applications to add support for 3D. Qt Quick 3D is a very passive module in the sense that it only does what it's instructed to do; it normally gets its drive from Qt Quick. So it's more of an extension of Qt Quick than a separate thing.

Then, which one should you choose? Well, as I already said, Qt Quick 3D together with Design Studio will eventually replace the old 3D Studio. That leaves only Qt Quick 3D and Qt 3D, and I think in most cases the right choice would be to pick Qt Quick 3D.
It covers most 3D use cases and integrates really well with Qt Quick. I think the more advanced users will recognize their own requirements, and they can judge whether Qt 3D is the right choice for them compared to Qt Quick 3D.

Christian, a short comment: we still see the start slide, so you need to advance the slides. I don't know if you need to press the arrows.

Okay, so, oh, it's not... okay, there it's changing. It wasn't changing before. Sorry about that, I didn't actually notice. Yeah, I think this is where I'm at.

So, the primary goals for us when we created Qt Quick 3D were that it should be simple and easy to use. It should have high-level concepts like camera, light, and model, and you should just be able to put those into your Qt Quick scene and get 3D content. We also wanted no prior knowledge about 3D to be required from the user. Of course, we had to back that up with excellent documentation, which I think we have been very good at adding. We also wanted it to be lightweight, similar to how Qt Quick initially was: very embedded-centric. We always keep an eye on having really good performance on embedded, and I think that benefits the desktop in the end as well. Another thing I feel very strongly about is that we needed code and a renderer that are easy to understand and reason about, where you can easily see the flow of data and code. That's also something I think we've been able to maintain, even while adding new features.

Another key thing was that it needed to integrate with Qt Quick. Mixing 2D and 3D content should be easy and inexpensive, so we didn't want to always go through off-screen rendering, and we didn't want the renderers to be completely detached. We needed the renderers to be able to talk to each other. That's what I mean by unified rendering.
One benefit of getting that is that we can get really good-looking text, and not just text: any Qt Quick item that's rendered directly into the scene with 3D transforms will look really good. The last important point for us was tooling. We needed excellent tooling, at least something as good as 3D Studio for designers. It needed to be good for both designers and developers, not just one of them. That's been in the back of our minds and it's something that's important to us. It's also much more important when you're doing 3D: there's much more involved, with lighting, positioning, and so on, and artists are usually used to working in very visual tools.

If you look at the architectural overview of Qt Quick 3D, you can see it sits on top of the RHI together with Qt Quick, it has its own scene graph with special items, and there is a tight coupling between the two renderers. We don't do 2D rendering in the 3D engine or 3D rendering in the Qt Quick engine; we didn't want to add more complexity to either of them. Instead, we made sure they have really good communication and can do inline rendering, jumping from one to the other.

Another point worth bringing up is assets and asset conditioning. This is also something that's more important, or more visible, when you're doing 3D. Often the content comes from creation tools like Blender, Maya, or Substance Painter, and those will export to some format like glTF or FBX or their own formats. We need to condition those assets into something that's efficient for our real-time renderer, because when they are exported, they are exported with fidelity in mind, not necessarily efficiency. That's where the conditioning comes in.
We have a tool called Balsam, interestingly enough, which is our main conditioning tool. It converts assets to QML components, and it also generates the textures, materials, and everything else that's needed to put the asset into your application. It can also do other things, like texture compression.

In addition, we have the shadergen tool, which is an experimental tool. (Again, the slide hasn't changed. Sorry about that.) So, the shadergen tool generates materials at build time by inspecting the scene and trying to figure out which materials will be needed in the application, so that we don't need to do that at runtime, thereby saving a lot of time. That's a fairly complex operation, which is why it's still experimental. We haven't seen that much use of it yet, but it's something we're going to invest more in. I also want to mention that it doesn't generate driver-specific shader binaries; it generates SPIR-V bytecode that can be consumed by the Qt RHI later, so in essence it generates materials.

The next item here is the runtime loader, which is new for Qt 6.2. As the name says, it can load the same assets at runtime. At the moment it supports glTF 2, and what it does is a bit the opposite of the Balsam tool: the scene is not generated at build time, but at runtime. It creates all the objects at runtime, so you can load and reload different models on the fly, and you can use the RuntimeLoader, which is a special node, to move and interact with the model, or even whole scenes. It's fairly new, so the API is not that extensive yet.

Let's look at a minimal scene and how that looks in Qt Quick 3D. This should be fairly familiar, at least the beginning. We have a window, and we have a View3D, which is a view into our 3D scene. We have a camera, a light, and a model, and that's it.
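The slide code itself isn't in the transcript, but a minimal scene like the one just described might look something like this sketch, using the standard Qt Quick 3D types:

```qml
import QtQuick
import QtQuick3D

Window {
    width: 640; height: 480
    visible: true

    View3D {
        anchors.fill: parent

        PerspectiveCamera {
            position: Qt.vector3d(0, 0, 300)
        }

        DirectionalLight { }

        Model {
            source: "#Cube"   // built-in primitive
            materials: PrincipledMaterial { baseColor: "green" }
        }
    }
}
```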
I've set the source here to be a cube, which is a built-in primitive, and of course I've given it a material with the color green. That's how easy it is to get started. It doesn't take much to get something on the screen, and it doesn't take much more to get something more impressive like this. I stole this model from the glTF 2 sample library; I'm not well versed enough in the 3D tools to create something like that myself yet. But again, this is the same example. I've only changed the model to be the runtime loader, here on line 30, and given it a source, which is a glTF 2 binary format file. For some extra effect, I've added a light probe, which is an HDR texture that will be used to light the model, giving a more realistic lighting condition here. Just to show it's the same texture, you can see its reflection in the shiny parts of the helmet. I also set the background mode to SkyBox, which means it's going to use that light probe to build the skybox as well. The message here is that you can get something fairly nice like this fairly easily.

Let's move over to mixing 2D and 3D. We have two modes for this: there's a textured path and a direct, or inline, path, for both 2D and 3D. Scenes can still be rendered into textures; that's still a flexible way to do it if you need to apply that scene to a model that's not flat, say a cylinder or something like that. But both 2D and 3D items are defined in the same scene. If we look at the example here, we have a button on line 27, which is just defined inline in our 3D scene, and that's pretty much all there is to it. This item, which is a Qt Quick Controls button, will then be rendered inline by the 2D engine, Qt Quick. What happens is that when we get to this item, we branch off to the Qt Quick engine with the 3D transform, and that will position and draw the item as expected.
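A sketch of this inline path, assuming a Qt Quick Controls button placed directly in the 3D scene (the positions and text here are illustrative, not the slide's exact code):

```qml
import QtQuick
import QtQuick3D
import QtQuick.Controls

View3D {
    anchors.fill: parent

    PerspectiveCamera { position: Qt.vector3d(0, 0, 300) }
    DirectionalLight { }

    // Wrapping the 2D item in a Node makes it easier to position
    // in 3D coordinates; the 2D item is still rendered by Qt Quick,
    // inline, with the 3D transform applied.
    Node {
        position: Qt.vector3d(0, 100, 0)
        eulerRotation.y: 30
        Button { text: "Press me" }
    }
}
```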
The nice thing about this is of course that there's no need to go through off-screen rendering, so we don't pay any extra cost for that, which we have discovered costs a lot on some devices. The drawback is of course that the item is flat. As I said, the textured mode you can apply to any model; you cannot do that here. You can position the item so it sits on a flat surface, but it's still going to be a flat item. I also highlighted, on line 31, that the coordinate systems of the 2D world and the 3D world are slightly different, with the Y axis flipped. The consequence is that if you want to position a Qt Quick item in the 3D scene, you either need to account for this, or, maybe better, wrap the item in a node, and then you can position it more logically in the 3D scene. Another thing to be aware of is anchoring. Anchoring actually works, but when anchoring to a 3D item it will use the center of that object as its reference point, so you might need to apply an offset to get what you want.

Then we have the textured path, which is a bit more involved, as you need to set a material on some model that you want to render the source scene onto. You can see here, on line 33, we have a texture with a sourceItem property, which you can point at your Qt Quick item. That item will then be rendered into a texture, and you can then apply that texture to a model.
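The textured path just described might be sketched like this, with the button rendered into a texture via sourceItem and applied to a built-in rectangle model (the names here are illustrative, not the slide's exact code):

```qml
import QtQuick
import QtQuick3D
import QtQuick.Controls

View3D {
    anchors.fill: parent

    PerspectiveCamera { position: Qt.vector3d(0, 0, 300) }
    DirectionalLight { }

    // The 2D item is rendered into a texture, which is then used
    // as the diffuse map of a material on a model.
    Model {
        source: "#Rectangle"
        materials: DefaultMaterial {
            diffuseMap: Texture {
                sourceItem: Button { text: "Press me" }
            }
        }
    }
}
```

Because the model here is a flat rectangle, the result looks similar to the inline path, but the same texture could just as well be applied to a curved model like a cylinder.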
This is very flexible, but again there are some caveats, especially when it comes to zooming in and getting close to items that go through a texture. They will not appear crisp; they will get blurry, the same way as when you scale an image. You'll get some of that whenever the size of the item is not exactly right, and that's usually not easy to maintain when you're constantly moving the camera or scaling items.

Here I've put both modes side by side. Of course I've exaggerated the left side, which is the textured path, a little bit, but it shows the kind of effect you get when you go through a texture. This texture, as you saw in the example, is on a rectangle, and when you get closer, the texture gets stretched to fit as the rectangle gets bigger. As you can see, the inline, or direct-rendered, item stays really crisp: the text, but also the lines of the button, look really nice and sharp. It will do that at whatever distance you view it from, because it's rendered by Qt Quick with the correct information and 3D transform.

So, new features in 6.2. Instancing, which I've shown here, was a tech preview in 6.1; we lifted the tech preview for 6.2, so it should be fully usable, and I don't think we made any big changes to it. I came up with this horrific example: again, it's the exact same example I started with, the only difference being that I put the push button as a texture on the cube and added instancing, using random instancing with random positions and colors. I think there are about a thousand cubes here. Instancing is of course about drawing the same object multiple times in a single draw call, which makes it very efficient. The downside, of course, is that only one of these cubes is the real one that you can actually push.
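The random instancing just described might be set up like this sketch, assuming the RandomInstancing helper from the QtQuick3D.Helpers module (instance counts and ranges here are illustrative):

```qml
import QtQuick
import QtQuick3D
import QtQuick3D.Helpers

View3D {
    anchors.fill: parent

    PerspectiveCamera { position: Qt.vector3d(0, 0, 800) }
    DirectionalLight { }

    // Generates a table of per-instance transforms and colors.
    RandomInstancing {
        id: randomInstancing
        instanceCount: 1000
        position: InstanceRange {
            from: Qt.vector3d(-300, -300, -300)
            to: Qt.vector3d(300, 300, 300)
        }
        color: InstanceRange { from: "black"; to: "white" }
    }

    // One Model, one draw call, a thousand cubes.
    Model {
        source: "#Cube"
        instancing: randomInstancing
        materials: PrincipledMaterial { baseColor: "white" }
    }
}
```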
Next is particles: we added 3D particles, the 3D version of the particles you might be familiar with from Qt Quick. This was also a tech preview in 6.1, but that has likewise been lifted for 6.2. It uses instancing for the model mode, which you see here; there are two modes for doing particles, a model version and a sprite version. Both instancing and particles have excellent blog posts, and I recommend looking at those; I'm not going to dig deeper into them today.

Runtime asset loading I already mentioned. In the picture here I've also enabled instancing for it, which is also supported. I don't remember how many models were used here, but it's the same thing: we're doing instancing with the whole model, with random positions and colors. As I also touched on, the API for this is not that extensive yet; it's completely new for 6.2, and we're investigating ways to give better access to the sub-components at runtime. That's something that will come in later versions.

The last feature added for 6.2 is parallax occlusion mapping. That's a more advanced technique for doing bump mapping, or normal mapping, for those familiar with that. Basically, it tries to fake the appearance of depth in a model using textures, without adding extra vertices to the model. It's pretty nice; it gives a really realistic look.

So, what's next? Tooling is going to be a big point for us. We're going to spend some time getting Design Studio up to speed with the new stuff in Qt Quick 3D 6.2, and hopefully close the gap with 3D Studio soon, so that those using 3D Studio get a nice transition over to Qt Quick 3D and Design Studio. We're also going to look at the asset pipeline and add better and broader support for formats there. I'm not completely sure what we've planned there, except that it's a topic we're going to work on in parallel with improving Design Studio. And of course, getting feedback: we want more feedback about
what's working well, what's missing, and what people would like to see. I think that's it. Thanks for listening, and if there are some questions, maybe we can take them now.

Yeah, thanks Christian, and sorry about the slides, I don't know what happened there. So, there's one question from Hannah. She's asking: are there any plans for Qt Quick 3D and AR/VR applications, or SteamVR or OpenXR?

Yes to all, I would say. We're experimenting with this. I don't think we have a version set for it yet, but there are prototypes floating around testing this, so it's definitely something that will be coming.

Okay, and Nuno asks: can you show an example of loading the glTF at runtime and interacting with it? And do you import the cameras, lights, and animations at runtime, and how do you expose them in QML?

I cannot show that here, but the code is available already, and there's an example showing it off; that's the one I showed. It does load the whole scene, so it loads cameras, animations, and the whole lot. The problem is that, as I said, the API is still fairly limited. I'm investigating ways to expose those things; of course you can have multiple cameras and animation tracks, and we need some way to expose that, but at the moment we don't have it. It would require using private APIs, but it's fully possible if you're willing to dive into those.

Okay, that's it for questions. Thanks again, Christian. Yeah, thanks.