Right, let's talk about VR, AR, MR, and all the other R's — or XR for short, I guess. I'm going to start with some excuses: I was traveling for the last week and I'm totally exhausted, so I had to slap the slides together really quickly. But it's going to be fine. Let's start with the most important thing, which is always how I start my presentations — actually, let's start with the agenda first. I'm going to begin with a little introduction to OpenXR, because that's the brand new standard and exactly what we're going to talk about. I did a GSoC project on exactly this topic: getting OpenXR support into Blender. At least the basics — it's a little limited, but intentionally so. We'll look at that in a bit. Then we have sort of an unofficial XR team now; there should be some people here. It's basically a group of people who already did lots of experiments with VR in Blender, and we want to combine forces and finally get something started. I've been in contact with some of them here. And then we'll talk about further work. So it's pretty much an introduction to OpenXR, and then a past, present, future thing. Now we come to my favorite slide. I've been doing Blender development for five and a half years now, I think. I've been working full time at the Blender Institute for two months now, and for a month also in Amsterdam. I guess you've all seen and used my work; I focus on the usability side of things.
So it's really visible stuff, and I guess you've all seen it — I'm not going to show you everything. Not going to show anything, in fact. My personal interest in VR is pretty much non-existent. I don't think I've ever played a VR game — well, we do at the Institute every now and then — and I don't watch VR movies or anything like that. So I'm not really the most qualified person to work on this kind of stuff. But I care about usability, and I think this is a really interesting and important technology to support. Also, if I don't do it, nobody really does it currently, because people are working on all kinds of other stuff, and I thought it would be a nice GSoC project — and I really enjoyed it. So my personal interest isn't that big, and even though I do enjoy working on it from time to time, we have a great team of people who are more qualified to talk about VR than I am, and I think we complement each other nicely. But we'll get to that. OpenXR. So OpenXR is this brand new royalty-free specification by the Khronos Group, and it has great support from the industry, whatever that means. If you visit the homepage for the OpenXR specification — or standard, what is it — you can see all the cool quotes from industry people about how everybody wants to support it. It seeks to solve the fragmentation in the XR industry. We all have this issue with different devices from different vendors: horrible compatibility with the software, and things are just really fragmented. There's this graphic from Khronos that they always show — that's the current situation, everything is a bit of a mess.
You don't really have any structure in there, and OpenXR tries to solve this by putting standardized layers in between, so that eventually — hopefully — you can just use whatever device with whatever software and it works out of the box, also in Blender. That's the idea behind OpenXR. And it's really not just about VR; it's also about AR and MR and, again, all the other R's. With this we should be getting support for things like AR and MR, so even HoloLens stuff should be doable in the near future. I don't see a big challenge there — say two days of work or so, from what I can tell now, and I'm on stage being recorded, so this is a stupid thing to say — because OpenXR does a really good job at unifying things. Of course there are all sorts of things that still have to be figured out; as I said, this is a brand new specification. The 1.0 release just came out at SIGGRAPH this year, and I actually started implementing against the provisional 0.9 spec, but now we're at 1.0, there you go. Another really important thing: there's lots of proprietary software in the VR world. There's not much that's GPL compatible, or free and open source in general. OpenXR allows us to still use this technology, still use the modern proprietary code, but through a standard interface, which is much less troublesome — we can actually connect to the best software for the VR devices. With OpenXR you have this thing that I always compare to OpenGL.
In OpenGL you have the specification, which is OpenGL itself, and then you have the driver, which actually implements the specification. It's pretty much the same with OpenXR: OpenXR is the specification — basically just a description of what the API looks like, all the types and so on — but you still need a runtime installed on your system, which sits a bit higher level than a driver. That means something like SteamVR, Oculus, or the Windows Mixed Reality platform; those are the currently available OpenXR-compatible ones. The Windows Mixed Reality one is the one I used the most for testing and development; it's the most mature one when it comes to OpenXR support, I guess. Then Oculus released one recently, just a few weeks ago — or rather, they bundled it with their software, so it's available and you can basically use it, but it's pretty much in a beta state; I don't know the exact state of what they publicly announced. And then there's Monado from Collabora, which is a free and open source runtime that runs on Linux, and only on Linux right now. Tomorrow there's a conference here on free and open source XR stuff, so if you want to hear more about that, give them a visit; I'm also going to hold almost the same presentation there, just much longer, I'm afraid. So this runs on Linux, and it's pretty limited right now. They have a great network and lots of people who can help them out, so I do think it's going to be pretty great, but right now it's just not ready for real use yet. For example, if I want to close the VR session with the Monado runtime in Blender, I have to actually close Blender, because there's no way to exit the session.
It has those kinds of problems, which aren't solved yet, but I guess that's going to come, and then we'll finally have really usable, free and open source VR experiences on Linux. Too long, didn't read: OpenXR allows Blender to access state-of-the-art XR technology without being dependent on the device, software, platform, whatever. It allows us to just plug and go. All right, let's talk about how I worked on the OpenXR stuff. This year I took a bit of a break from Blender development and wanted to get back in, and I thought GSoC would be a nice opportunity to really get myself to focus on it again. So that's what I did. Pretty much from the get-go, the idea was to focus on making it solid and well performing, not so much on having all the features, tools, or an initial UI design implemented. I should also add: I wanted to do a bit of demoing, and initially this was planned to be just a demo presentation where I show stuff. But that's difficult, because I'd need a machine and would have to set it up; I tried to set one up, then we had driver issues, then I had to leave for traveling, and so on. So I can't demo anything now. I think I can set something up for Sunday at the open office day at the Institute, so just come by and I can show you stuff. The thing is, right now it's kind of disappointing, because all it really does is: you put on the HMD and you can see the viewport in VR, which is kind of cool, but it's really limited — intentionally so. I didn't want to work much on the viewport itself. I wanted to make sure that the overhead we add with the VR-specific code is minimal, and I think I did a decent job at that: we have an overhead of less than 1.5 milliseconds on pretty much any scene, just for the VR stuff.
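To put that overhead figure in context, here is a rough frame-budget calculation. This is just my sketch: the 90 Hz refresh rate and the helper names are assumptions for illustration, not numbers or code from the project — only the "less than 1.5 ms" figure comes from the talk.

```python
# Frame-budget arithmetic for a VR viewport (illustrative sketch).
# Assumption: a 90 Hz HMD; the 1.5 ms figure is the XR overhead quoted above.

def frame_budget_ms(refresh_hz: float) -> float:
    """Total time available per frame at a given refresh rate."""
    return 1000.0 / refresh_hz

def scene_budget_ms(refresh_hz: float, xr_overhead_ms: float) -> float:
    """Time left for actual scene drawing after the XR-specific overhead."""
    return frame_budget_ms(refresh_hz) - xr_overhead_ms

budget = frame_budget_ms(90.0)       # ~11.11 ms total per frame at 90 Hz
left = scene_budget_ms(90.0, 1.5)    # ~9.61 ms left for viewport drawing
```

In other words, at 90 Hz the XR code eats well under 15% of the frame budget, which is why the viewport performance on its own is the dominant factor.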
So we can actually get to 100 FPS in decently sized scenes pretty easily, and I think that's fine. The solid and well performing VR viewport has some implications. I had to define the initial core architecture to get the OpenXR stuff hooked up and working, and a carefully designed error handling strategy. That one I really paid attention to, because what I basically want is: if some kind of error happens — the device isn't plugged in, OpenXR doesn't find a device, the runtime has some issue, whatever — there should be no side effects on Blender. The session just closes and you can continue working in Blender, but you get a really useful error message, not just nothing happening. So I made sure that this works well. And as I said, we really want to avoid any VR-specific draw overhead and basically keep the viewport performance on its own, not add more on top. So let's talk about some results. We do have well performing VR rendering — ish; it's always hard to say these things, and again, I'm on record. For solid mode I think it's pretty fine now. As I said, we can easily push out 100 FPS, in theory — it's still limited by the device refresh rate, so we're not actually showing 100 FPS; the runtime makes sure it's at 60 or 90 FPS, whatever matches the hertz of the device. But in theory we can do quite decent frame rates for reasonably complex scenes. I don't have any concrete numbers here, unfortunately.
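The error-handling goal described above — tear the session down cleanly on any failure, leave the rest of Blender untouched, and surface a useful message — can be sketched as a context manager. This is a hypothetical illustration: `VRSession` and `report` are my own names, not Blender's actual API.

```python
# Hypothetical sketch of the error-handling strategy: any failure inside the
# VR session closes the session, reports a useful message, and has no side
# effects on the rest of the application.

class VRSession:
    def __init__(self, report):
        self.report = report   # callback that shows a message to the user
        self.active = False

    def __enter__(self):
        self.active = True     # the real code would create an OpenXR session here
        return self

    def __exit__(self, exc_type, exc, tb):
        self.active = False    # always release the session, error or not
        if exc is not None:
            # Tell the user what happened instead of failing silently...
            self.report(f"VR session failed: {exc}")
            return True        # ...and suppress the error: Blender keeps running
        return False

messages = []
with VRSession(messages.append) as session:
    raise RuntimeError("no HMD connected")
# Execution continues here: the session is closed and the error was reported.
```

The key point is that the cleanup path is the same whether the session ends normally or with an error, so a failing runtime can never leave Blender in a broken state.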
We can dynamically connect to OpenXR runtimes, which basically means there isn't one specific runtime that you need to set up. If you want to use the SteamVR runtime, you just install the SteamVR stuff and maybe enable it as the active runtime on your system. That's pretty much it — at least, that's how it should be eventually, once OpenXR is more widely adopted. Then you just open Blender, start your VR session, and you're good to go. Currently we hide the feature — this is not in master yet, it's in review, and the goal is that it goes into 2.82. That's our goal at least, right? Sorry, it still has to be reviewed. Anyway, right now we sort of hide the button to start the session in the UI, because if we had this in master and said "toggle VR session", somebody would press the button and it's just boring — there's not much fancy in there, you can just view the scene. So we wrapped it into an add-on, which says it's more or less a preview. Again, the error handling is something I find important; I'm quite satisfied with it, although there are some issues to iron out. Then there's this big issue: the Windows Mixed Reality platform, for example, is DirectX only, and Blender is OpenGL only currently. So if we want to use the Windows Mixed Reality platform, we somehow have to give them DirectX data, basically. That was a bit of a hassle, but I got something to work, so we have compatibility between OpenGL and DirectX: we basically convert textures at a pretty low level, and the overhead is really minimal — I'm totally not concerned about that. So that turned out to work quite well.
This uses some OpenGL extensions, and it may be a bit troublesome on some hardware, but it turns out OSVR, which is another runtime, used the same extension, and they said it's working quite reliably for them — so, good. I've also added all kinds of debugging utilities; OpenXR is hard to work on and hard to debug, so that kind of stuff is important for further work. And then there's all kinds of stuff in the architecture you have to watch out for to make everything work. All in all, I'm quite satisfied with the outcome. Of course, there's always stuff you'd like to tweak and improve, but I think it's quite nice the way it turned out. So let's at least see some screenshots, since I don't have anything to show as a demo. This is the Windows Mixed Reality portal, basically drawing the Blender viewport. This is the same for Oculus; this screenshot is by LazyDodo, our Windows maintainer, who always tested the Oculus stuff for me because I only had the Windows Mixed Reality kit available. You can check the main patch of my Google Summer of Code project on Phabricator, developer.blender.org/D5537. I also have the final report — it's lots of text, and I tried to explain things really well there, so if you want any further information on the GSoC project, check that out. And of course, there's a demo video by Simeon. He made it for me because I did all the development for GSoC literally on my laptop, since my machine at home is pretty old. He was kind enough to make this little video that I tweeted, which people have probably seen.
So, on the left side you have Blender, and he just enables the add-on; on the right side you have the Mixed Reality portal again, and all he has to do then is toggle the VR session, and we're in the VR viewport. As you can see, this is not a totally trivial scene; there's quite some geometry in there — I think around 1.4 million vertices, three million triangles. So the performance is doing quite well, in solid mode at least. I didn't look too much into Eevee, because there are lots of other issues to iron out there, especially with reflections and all that, but that's something we can look at any time. Now, the totally unofficial XR team — and I'm saying totally unofficial because it is totally unofficial. It's just a bunch of people I'm connected to, plus further contacts Dalai gave me, and we started writing about this stuff. First of all Dalai, of course; he's one of the current coordinators for the development team. Then there's Daniel, who's also here. He has been doing quite some stuff with everything painting and grease pencil related, and they have the MPX project, which is basically about drawing — mainly grease pencil strokes — in VR. He really cares about that workflow, and I think it's a great one to support. Then Jama — he's an awesome guy; I haven't met him yet. He's been doing concept art for Jurassic World and more, he's really into the VR stuff, and I've only heard good things about him. This is me. Then there's Max Krichenbauer.
He worked on BlenderXR, which was sort of an implementation to get VR working in Blender, with some tooling and all that. If you want to check it out, just search for BlenderXR; it's actually quite cool. I'm not too sure everything he has in there is something we want in Blender — I think we'd do some things a bit differently — but it's a really cool project and I recommend checking it out. And then of course there's Sebastian. Where is he? There he is. He's also been doing lots of industry work with VR, and I think that kind of input is really important: somebody who actually uses this stuff in the industry, for professional work. He needs a clean pipeline to go from Blender into — what are you using, Unity? Unity. So that kind of stuff has to work well too. I'm glad to have him here, and also Simeon, who works together with him. That's going to be handy. But as I said, this is all totally unofficial; if there are more people who are capable and have a lot of experience with VR, and maybe with VR in Blender, we can totally extend the team. We didn't do much yet, but I've tried to lay the foundation to kick work off, and that's what we're going to talk about now. I have a couple of thoughts on how we continue this work. I really don't want to just take the UI and put it into VR or something like that; that's not going to work. William Reynish — who was sort of the lead designer behind 2.5 and 2.8, I think that's fair to say — said that if we do that, we're just going to create a new interaction mode for Blender that is much more difficult to use. I really don't want VR to be a more difficult way to use Blender; I want it to be great, and I do think there's lots of potential in VR workflows. So let's design an XR UI from scratch, and not just take stuff and put it into VR.
My proposal was to make this really use-case focused: think about what the use cases for VR are, where VR can really shine, and start from there. Flesh out those use cases and then say, all right, we want sculpting to work, we want this kind of modeling to work in VR — start from there, and don't just start in the void and see where things go. But — and this is important — keep things consistent. Daniel pointed out really early on that he wants the workflow to be smooth when you switch between VR and the mouse-and-keyboard or pen workflow. So there has to be consistency, or not just consistency but interoperability, between the different ways of working with Blender. I think that's a really important point, and it gives a quite nice boundary that defines where we want the project to go. I think we can take it from there. Tomorrow we'll have a little closed meeting — which is also a stupid thing to say — but we're just going to meet up, talk things through a bit further, and see if we can get things started. And there's this task: basically a big design task that I wrote, essentially the same as the document I wrote towards the end of the GSoC project. If you want to see the details, all the open questions we're thinking about, and the current higher level stuff we work on, that's the place to go. We have all those questions, like how navigation will work, keymap bindings, reference spaces — all the things that have to be figured out early on. That's pretty much what I hope to get addressed over the next two or three months, maybe, and then we can start implementing stuff.
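To give one example of why the reference-space and navigation questions above matter: the runtime reports the HMD pose relative to a tracking-space origin, and a navigation transform then has to map that into the scene. Here is a minimal translation-plus-uniform-scale sketch of that composition — the transform model and all names are my own illustration, not the actual design.

```python
# Illustrative sketch: composing a navigation transform with a tracked pose.
# The runtime gives HMD positions in tracking space; navigation (teleporting,
# grabbing the world, scaling) only changes the base transform, and the
# composition yields the viewer's position in the scene.

def compose(base_offset, base_scale, hmd_pos):
    """Scene position = navigation offset + uniform scale * tracked position."""
    return tuple(o + base_scale * p for o, p in zip(base_offset, hmd_pos))

# Navigation placed the viewer at (10, 0, 2) at double scale; head tracking
# then reports a local HMD position of (0.5, 1.7, 0.0) in tracking space.
world = compose((10.0, 0.0, 2.0), 2.0, (0.5, 1.7, 0.0))
# world == (11.0, 3.4, 2.0)
```

Because tracking keeps feeding local poses, navigation never has to touch the runtime; it only edits the base transform, which keeps the two reference spaces cleanly separated.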
I think my role is mostly getting the framework in there so people can actually start developing tools, so we have the foundation to build on. Then my role kind of fades away, and I hand it over to people like Daniel and Jama, but also Max Krichenbauer from the BlenderXR side. I think they can do a much better job at implementing the different tools and thinking the workflows through in detail than I could. I'm happy with the team we have now; I think we complement each other nicely, and I think that's the best way forward for this project as a whole. So that's it for the status update. I hope we'll see much more in the future, and that next year we can have another presentation with lots more cool eye candy. For now, this is where we stand. Thank you very much and enjoy the conference. Thank you.