Hello and welcome to my talk "Moving KDE to Another Reality", about our project xrdesktop. My name is Christoph Haag, I am working for Collabora, and the xrdesktop project was sponsored by Valve. So why the name xrdesktop? Let's go over the terminology quickly. These are the types of reality systems that exist. You have probably heard about virtual reality: you have a headset like this, put it over your eyes, you don't see anything of the real world anymore, and instead it renders a virtual world for you. You have probably heard about augmented reality. This is kind of the opposite: you still see the real world, but virtual objects are inserted into it, like labels over objects, or maybe Pokémon that you have to catch. And maybe you haven't heard about augmented virtuality; that's not something people usually talk about. It's kind of the opposite of augmented reality again: you see a virtual world, but real objects appear in this virtual world. For example, a physical keyboard that is represented in your virtual world; Logitech is building one of those, so you can have an easier time typing in VR. MR, mixed reality, is something you have probably heard about; it was once used as an umbrella term for all of this, but in more recent times Microsoft named their product line of mixed reality headsets "Windows Mixed Reality", so the terminology kind of had to move on. The new neutral umbrella term is extended reality, abbreviated XR, and that's why our project is called xrdesktop. So what does XR mean for desktops like KDE?
It first means that we have new hardware classes. To quickly go over the hardware classes that we currently have for our normal desktops: we have a monitor, which is basically a rectangle that goes from 0,0 to 1920x1080, or maybe more if you have a better monitor, but from your operating system's perspective that is just a coordinate system, and it doesn't care where this coordinate system really is in the real world. You have a keyboard, which is a lot of digital buttons, about 100, and you have a mouse that has a few digital buttons you can press and also analog input from its movement. So what is different about XR devices? Here are a few examples. On the left is the Google Cardboard: you insert your smartphone into a cardboard box with lenses, and this is a very simple VR headset. Then there is the Vive, or the Valve Index that I have here, which is tethered to a PC with HDMI and USB. Another example is a standalone AR headset, like the DAQRI one shown here. What they have in common is that they have a screen mounted on your head, hence "head-mounted display" (HMD). This screen covers your field of view, and if you just render your desktop on it without any kind of tracking and look around, you will probably get dizzy and probably even motion sick. That's why these headsets have tracking built in. Pretty much all headsets in all price ranges have an IMU, an inertial measurement unit, that measures rotation. So if you look up, the headset can render the world rotated down; if you have ever done any graphics programming, you know that looking up is the same as rotating the world down around you. To render correctly when you take two steps to the left, so that the world moves two steps to the right, you usually also need an external tracking system. The Vive has base stations for this that I don't have here right now; you can see them later.
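The "looking up is the same as rotating the world down" point can be made concrete with a small sketch (Python here purely for illustration; it is not xrdesktop code): the view transform applied to the world is the inverse of the tracked head pose.

```python
import math

def rot_x(a):
    """3x3 rotation matrix about the x axis (pitch)."""
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0],
            [0, c, -s],
            [0, s, c]]

def mat_vec(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

# The IMU reports the head pitched up by 30 degrees.
head_pitch = math.radians(30)

# To render correctly, the world is transformed by the INVERSE of the
# head pose, i.e. rotated down by the same angle.
view = rot_x(-head_pitch)

# A point that was straight ahead along the view direction (-z)...
p = [0.0, 0.0, -1.0]
p_view = mat_vec(view, p)

# ...now appears below the center of the view: looking up moved the
# world down.
assert p_view[1] < 0
```

The same logic extends to position: taking two steps left applies a two-steps-right translation to the world, which is why untracked rendering makes people motion sick.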
Another thing is that you don't want to just extend your desktop to a headset. If you connect this headset with HDMI to your PC, it is presented as a single HDMI monitor. It's actually two displays, but the firmware lies and says it is one monitor, and if you just mirror your desktop onto it, you get the left side of the desktop on the left eye and the right side of the desktop on the right eye. You don't really want that. So early VR applications rendered a fullscreen window on the headset, but now we have VR runtimes that handle this for you. If you connect a headset to your PC today, it doesn't show up in X11 at all. The VR runtime opens this headset directly and renders to it, and your VR application only has to care about rendering the two views. These are the names of the Vulkan extensions that handle getting this display in direct mode. There's one that sounds very X11 specific, VK_EXT_acquire_xlib_display. That is what we are mainly using, because on Wayland this is kind of missing. Drew DeVault from wlroots is working on a Wayland equivalent of this extension, the one you see there. So please support this in KWin on Wayland so we can have direct mode there. Another thing is input classes. You don't have a mouse for VR; you have controllers like these. You can maybe see it: they have a touchpad in the middle, there's a thumbstick, two buttons, another button, an analog trigger, and so on. So you can have roughly the same inputs as with a mouse using these controllers. And of course they also have a tracked position in 3D, so you get a 3D position, where your mouse just gives a 2D movement. Desktops cannot really use these input devices directly; there has to be some mapping going on. So to summarize: a standard monitor is a fixed rectangle in 2D, while a VR display is like a monitor with a dynamic position, because you can look everywhere and the VR runtime will track your headset. You have basically infinite space where you can place your windows.
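The mapping from a tracked 3D controller to 2D desktop input can be sketched like this (a hypothetical simplification, not xrdesktop's actual code): intersect the controller's pointer ray with the window's plane, convert the hit point to window-relative coordinates, and hand those pixel coordinates to an input synthesizer.

```python
def ray_window_hit(origin, direction, window_z, win_min, win_max, size_px):
    """Intersect a pointer ray with an axis-aligned window plane at
    z = window_z.

    win_min/win_max are the window's corner coordinates in world units
    (x, y); size_px is the window size in pixels.  Returns pixel
    coordinates (y = 0 at the top, as desktops expect), or None on a miss.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz == 0:
        return None                       # ray parallel to the window plane
    t = (window_z - oz) / dz
    if t <= 0:
        return None                       # window is behind the controller
    hx, hy = ox + t * dx, oy + t * dy     # hit point in world space
    u = (hx - win_min[0]) / (win_max[0] - win_min[0])
    v = (hy - win_min[1]) / (win_max[1] - win_min[1])
    if not (0 <= u <= 1 and 0 <= v <= 1):
        return None                       # hit the plane outside the window
    return (round(u * (size_px[0] - 1)), round((1 - v) * (size_px[1] - 1)))

# Controller at head height, pointing straight at the middle of a
# 1920x1080 window hanging 2 meters away.
hit = ray_window_hit((0, 1.5, 0), (0, 0, -1), -2.0,
                     (-0.96, 0.96), (0.96, 2.04), (1920, 1080))
assert hit is not None
```

In this toy setup the ray lands in the middle of the window, so the synthesized mouse position would be around the center of the 1920x1080 surface.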
You can also place them further back, nearer, up, or down, where your standard desktop only gives you a fixed rectangle. These controllers also have more input that I forgot to mention; for example, Valve calls this finger tracking. It's a little sensor, so you can make gestures with single fingers. A lot of this is of course something desktops cannot handle right now. So our project was bringing windows to XR, or VR, with our project xrdesktop. A short interjection about our VR runtime: we use OpenVR/SteamVR, which is Valve's runtime. What does it even do? Since there is no standard for these devices, you can plug them in over USB, but there is no standard service that reads from these devices. A VR runtime is something that calculates the position in space and provides it with a simple API, in this case OpenVR, which is Valve's API. This API is publicly documented; there is a header that you can download and a library for it, but the API is controlled by Valve. It has a plugin API, so you can write your own drivers if you build your own HMDs, and it handles the final rendering for you: you only have to render two textures, and the runtime handles putting them on the HMD. There's also a new standard, OpenXR, made by Khronos. Valve also helped create the standard, but this will not automatically make everything open source. You can think of it as the OpenGL of interacting with these HMDs: you will still need a VR runtime, just like you need an OpenGL driver, to interact with the hardware. So how did we bring KDE windows to XR? A simple approach would have been to make a new window manager that you start, which only renders to the headset and doesn't show anything on your monitor. But this is not what we wanted. We explicitly wanted the windows that you have on your KDE desktop to also show up in the VR headset. To this end we have made a KWin plugin.
We kind of abused the effects interface for this; we can talk later about whether that is a good idea and how we can improve it. GNOME Shell doesn't have a plugin interface like that, so we have patches for GNOME Shell. And of course we didn't want to push everything into the window managers, so we implemented this in libraries. On the left side you have libinputsynth. If you have these controllers and want to generate input for windows, you somehow have to get that input into the windows. On X11 this is hard to do directly, so we use the XTest extension, which you may know; it's an API where you can generate input events. For example, you can say: move the mouse to these coordinates, generate a left click, or print this character for keyboard input. And for Wayland we can have different backends; we are hoping that maybe we can generate input for windows directly without having too much trouble. On the right side we have our main library, xrdesktop. This library provides the VR windows that wrap the desktop windows. A VR window has a 3D position in space, it can be rotated in any way, it can be scaled, and the interaction of the controllers with these VR windows is also handled in this library. Below that we have GXR, which is our wrapper for VR runtimes like SteamVR, so that in the future we can have an OpenXR backend and run on an open source VR stack. Currently we only run on SteamVR, but we have already started work on supporting the OpenXR standard. And at the bottom we have Gulkan, which is a wrapper for Vulkan implemented with GLib; that's why it's called Gulkan. All of these libraries are implemented in C with GObject, which is not ideal for KDE, but that's what it is: we wanted a C interface for everything and didn't want to have too much trouble. The implementation in GNOME Shell can use them directly, and the KWin effects plugin can use GLib just as well. So there is that.
So what does the KWin plugin even do? It gets textures from every window that you have on your desktop and puts them in one of our VR windows. It does that without copying any textures; we use OpenGL/Vulkan interop for this. This requires the extensions that you see here: the Vulkan external memory extension and the OpenGL memory object extension. With those you can create a Vulkan image, which is basically a texture, export a file descriptor for this texture, and then import it into an OpenGL texture. Then you have an OpenGL texture and a Vulkan texture that share the exact same memory on the GPU. So you only have to render with OpenGL to the OpenGL texture, and you can read the same texture from Vulkan without having to do any copies. Unfortunately, this is not available on Intel right now; the OpenGL extension is not implemented in the Intel driver. So we can only use this on NVIDIA and on the AMD open source driver. On the VR side, the VR windows are implemented as overlays. That is something the SteamVR runtime provides us. Overlays are basically planes that you can place with a position and a rotation in space, and they can run on top of VR applications like VR games. So what you can do is run a VR game and play it, and on top of that have maybe a Twitch window where you stream and read what your viewers are writing in the chat, or something like that. OpenXR has a similar concept, the quad composition layer, so this will also work on OpenXR. We also have a dedicated desktop application that is a full VR application rendering the complete desktop itself, also using Vulkan directly. So how does it look? How can you take input from this and put it into desktop windows? ...by segfaulting LibreOffice, apparently. It doesn't really like GStreamer. Let's see. Nope, segfault. It's kind of annoying. Let's see if this works. Yeah, this works. So LibreOffice was segfaulting on this video for some reason.
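As a rough pseudocode outline (not the actual KWin plugin code, though the extension entry points named are real), the zero-copy texture sharing described above goes through the external-memory extensions roughly like this:

```
# Vulkan side (VK_KHR_external_memory_fd):
vk_image  = vkCreateImage(...)          # with an exportable FD handle type
vk_memory = vkAllocateMemory(...)       # with VkExportMemoryAllocateInfo
fd        = vkGetMemoryFdKHR(vk_memory) # export a file descriptor

# OpenGL side (GL_EXT_memory_object_fd):
glCreateMemoryObjectsEXT(1, &mem_object)
glImportMemoryFdEXT(mem_object, size, GL_HANDLE_TYPE_OPAQUE_FD_EXT, fd)
glTexStorageMem2DEXT(GL_TEXTURE_2D, levels, format, w, h, mem_object, 0)

# Now the GL texture and the Vulkan image alias the same GPU memory:
# the compositor renders the window with OpenGL, the VR side samples
# it with Vulkan, and no copy ever happens.
```

The missing piece on Intel mentioned above is the GL_EXT_memory_object side of this handshake.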
So you can see there is a pointer ray that comes out of this controller, because it's inconvenient if you always have to walk up to your window in order to grab it. You can use the trigger button to grab your window and move it around. As you can see, the relative position of the intersection point between the pointer ray and the window remains the same, so you can position your controller as you like and move your window around as you expect. This is not so great if you take two steps to the left and your window ends up oriented away from you, so you can press the trigger button all the way through, and then the window will turn to face the controller. Very annoying, LibreOffice really doesn't want to work; I think it's worse with two monitors. Nope, let's go to the video. So here's our video demo. As you can see there are two controllers; I only have one here right now. You can create left clicks with the controller. As you can see, there is shaking involved when you press a button, and we have compensation for that in xrdesktop, so you can actually draw single points or do precise clicks, and you can also draw with it. And as I said, it can run in front of a VR application. So here I will start one of the most popular VR applications, which is Beat Saber. You may have seen it; it's the rhythm game where you have to hit blocks. You can also see we have some issues there, because we have our dark blue pointers that are drawn by xrdesktop and light blue pointers that are drawn by the game, and we have to think about how to coordinate this so that only one of them shows at a time. As you can see, the game is running in the background, and in the foreground I have a browser window that shows Twitch, and I have opened the SteamVR keyboard and typed input with it. So I can watch a Beat Saber video while playing Beat Saber, and as you can see the window is usable while it is displayed on top of a video on top of a game.
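The grab behavior described above, where the intersection point stays fixed on the window while you wave the controller around, can be sketched as follows (my own toy model with hypothetical names, translation-only for brevity; the real thing also carries rotation):

```python
class Grab:
    """Keeps a window's offset relative to the controller fixed.

    At grab time we remember where the window sits relative to the
    controller; every frame afterwards we re-apply that offset, so the
    point where the pointer ray hit the window stays put on the window.
    """
    def __init__(self, controller_pos, window_pos):
        self.offset = [w - c for w, c in zip(window_pos, controller_pos)]

    def window_pos(self, controller_pos):
        """New window position for the current controller position."""
        return [c + o for c, o in zip(controller_pos, self.offset)]

# Grab a window hanging 2 meters in front of the controller...
grab = Grab(controller_pos=[0.0, 1.5, 0.0], window_pos=[0.0, 1.5, -2.0])

# ...move the controller half a meter to the right: the window follows,
# keeping the same relative offset (and thus the same hit point).
assert grab.window_pos([0.5, 1.5, 0.0]) == [0.5, 1.5, -2.0]
```

Pressing the trigger all the way through then simply replaces the stored orientation with one that faces the controller.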
This is another feature we have: if you have many windows, you are cluttering your game, so we can select a few windows and then hide all the others. In the video I have hidden all the other windows and only show this Twitch window. The last feature is disabling input for this window, so you can play your VR game uninterrupted. You will still see the window, but you cannot click on it anymore, so you don't accidentally type something or click on your window if you just want to play your game. This is basically one use case that was important to us: that you can see a window while playing a VR game, because up until now, if you played your VR game, you were completely cut off from your desktop; you didn't see anything. SteamVR does have a desktop view, but it's not great. It just shows the complete desktop, like VNC, while xrdesktop shows each window individually in VR. Let's see, so what's in the future for xrdesktop? There are many more UX concepts that we want to implement, for example a parabolic pointer. You can see this example here from the SteamVR Home environment: the pointer is not straight anymore, it arcs down to the floor, and if you point your controller up, the pointer reaches further out. These controllers can also vibrate, so they have haptic feedback, and you can also give acoustic feedback with the headphones that are on these headsets. Of course hand tracking, and we are also thinking about a native 3D UI: what we could do is take GTK widgets and port them into VR. We have an example here, but it's not really that great. We can also talk about using Qt for this; Qt 3D would be interesting, maybe. Font rendering could of course be improved, because it is optimized for pixel displays, and if you rotate your windows in VR, they won't correspond one-to-one to pixels anymore.
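The parabolic pointer can be modeled as simple ballistics (my own toy sketch, not the SteamVR Home implementation): treat the pointer as a projectile launched along the controller direction at a fixed speed, let gravity bend it down, and use the floor hit as the target. Aiming the controller higher then makes it land further out.

```python
import math

def landing_distance(height, pitch, speed=5.0, g=9.81):
    """Horizontal distance at which a 'projectile' pointer, launched from
    `height` meters at `pitch` radians (0 = horizontal, positive = up)
    with `speed` m/s, hits the floor (y = 0)."""
    vx = speed * math.cos(pitch)
    vy = speed * math.sin(pitch)
    # Solve height + vy*t - (g/2)*t^2 = 0 for the positive root.
    t = (vy + math.sqrt(vy * vy + 2.0 * g * height)) / g
    return vx * t

# Pointing the controller up makes the arc land further out:
flat = landing_distance(1.2, math.radians(0))
up   = landing_distance(1.2, math.radians(30))
assert up > flat
```

This is why the parabolic pointer is comfortable for selecting spots on the floor: small wrist angles sweep a large range of distances without ever pointing "through" the floor.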
There's a project from Mozilla called Pathfinder that could be interesting to look into. And quickly at the end, I want to change topics to what open source stuff there is in XR. I already mentioned OpenXR, the Khronos standard. As you can see on the left side, that's the current situation: every application has to implement support for all of the runtimes on the bottom, which is kind of a mess. OpenXR comes to the rescue on the right: every runtime will implement this standard, so, like with OpenGL, all applications will be able to run against one standard. At Collabora we have Monado for this, which is an OpenXR runtime, and our next project will be porting xrdesktop to run on Monado. This will enable xrdesktop on a completely open source stack. So Monado is basically the Mesa of OpenXR. It's also hosted on the freedesktop GitLab; I encourage you to check it out. We use headset drivers from OpenHMD, which is another project that is friendly with us, and I think I can wrap it up here. If you want a demo of xrdesktop, I will be upstairs; there should be some signs out there. You can try the demo of xrdesktop, or if you want to try VR in general, you can also do that. Okay, thank you, Christoph. Unfortunately we don't have time for questions. Eichel, can you set up?