Okay, hello everyone. Welcome to my talk, game development with OpenXR. My name is Christoph Haag. I am a software engineer at Collabora, and I will present a little bit of what we have been doing at Collabora. In this talk we will go over the basics: What is OpenXR? What can it do for us? Why do we need it? Then I will give a little overview of how you can get started with OpenXR. I have a list of example code that you can look at. There is a simple C example that I wrote; I will show the code a little bit and go over it. And lastly, I made a Godot engine plugin for OpenXR. I will show this plugin, and hopefully a live demo works. The last point is what still needs to be done in the Godot engine plugin. Let's get right into it. OpenXR: why do we need it? On the right you can see a lot of VR devices that are currently on the market. Those are very different. The controllers are very different: they have very different inputs, different buttons. Some have touchpads, some have trigger buttons or finger tracking, and traditional input APIs can't really deal with that. These controllers are also tracked in space: you can move them around, and inside your game engine the position in space will be reflected. There is no traditional API for that either. And you also have these headsets. They have lenses, you have stereo images, so you have to render two images. There is a whole bunch of stuff you have to do for VR headsets specifically. This is currently done with vendor VR runtimes: every vendor of such a VR headset makes their own runtime, or makes a driver for an existing one. Inside one of those runtimes there is a hardware driver. You connect the headset with USB to your computer, and the runtime interprets this data and calculates the position, the button inputs and all that.
It also handles rendering, because if you have a headset on your head and it fills your entire field of vision, you really do not want your image to freeze when your game has a hiccup, and you want the image to be very responsive to your head movements. This is something that VR runtimes do for you. Each of these runtimes has their own API, and recently Khronos released OpenXR, which is a standardized API that hopefully soon every vendor will implement. I will go into a little bit of detail about the current situation on a later slide. So we can think of a vendor VR runtime as the rendering and input stack, like libinput, Xorg or a Wayland compositor, but specifically for VR hardware. What VR hardware and runtimes can you use today with OpenXR? On Windows you have the Oculus Rift and the Oculus Quest standalone headset. I believe you have to opt into a beta for this, but I'm not sure; I don't use Windows. But you can use OpenXR on these headsets. For Windows Mixed Reality headsets you can enable OpenXR and run OpenXR applications, but there is a caveat: you can only use Direct3D applications and not OpenGL. This might have changed, but this is my latest information. On Linux you have the runtime that we at Collabora have been developing. It is called Monado. It uses headset drivers on the one hand from OpenHMD, a community project that has reverse engineered headsets and written open source drivers for them. Unfortunately many of those drivers are still incomplete, but you can help develop them. There is a BoF session for them, I think tomorrow, where you can have a look if you are interested in open source driver development. There is also a talk about the Monado runtime if you are interested in that. And Lubosz will give a talk about xrdesktop, which is a VR desktop solution for Linux, and he will go into the history of VR runtimes too. Monado has also developed native drivers, for example for the PSVR headset or for the OSVR HDK 2.
And I have been working on integrating libsurvive, which is another third party library, based on code that was actually first developed at Collabora. It implements tracking for headsets like the Valve Index or the HTC Vive with their base stations, and I will show a live demo of this later. So what do we need to know as game developers about OpenXR? The specification is derived from the Vulkan specification, so if you know that and read through the OpenXR specification, it will look familiar. It inherits some API concepts, and the concept of having a loader and headers that are provided by an OpenXR SDK from Khronos. Your application only interacts with this loader and these headers, and the loader is responsible for finding and using a VR runtime. That's similar to how you would use a Vulkan driver: if you write a Vulkan application, you don't link directly to a driver, you link to the Vulkan loader and use the Vulkan headers from Khronos, and the Vulkan loader is responsible for finding a Vulkan driver. This works similarly. The API is extensible: there is a core specification, but each vendor can make their own extensions. For example, if you have an unusual display configuration, like four displays in your headset, you would make an extension. Or if you have an unusual input device, or one of those cave systems where you have monitors all around you instead of a headset, that would be implemented with extensions. So this is our list of examples; I will put the slides on the FOSDEM site, so you can find all of these. Noteworthy is GXR, which is a library we have developed for the purposes of our xrdesktop project, which Lubosz will talk about in his talk. It implements both an OpenVR and an OpenXR backend. xrgears is a standalone OpenXR application that is basically like glxgears, but in VR. And the example that I will be presenting is the openxr-simple-example.
That is really plain C. There is absolutely no abstraction; you can read it from top to bottom and see directly what's going on. It's currently Linux only and OpenGL only, but pull requests are welcome if you want to implement Windows support. This is how you would compile an example for OpenXR: there is a pkg-config file provided by the OpenXR SDK, and you just use the linker and C flags from that. For running the application, either you give it a JSON file that describes which runtime should be used, or there is a mechanism where an active_runtime.json can be installed system-wide, and then you just start the application and everything happens automatically. For the simple example, I thought we would go through the code a little bit. As you can see, this is just plain code that goes all the way from top to bottom. If you look at this code and think there should be an abstraction in it, you are absolutely right. You should probably not program like this; you should put things into functions and so on. The first thing you want to do is create an XrInstance. An instance is, like in Vulkan, basically a handle to your runtime. The way you do this is you usually create an info struct. All these structs have suffixes: if you want to create something, the struct is called CreateInfo, and you put some info in there, for example which extensions you want. In our example we want to use OpenGL, so we enable the OpenGL extension and call xrCreateInstance. The next thing would be xrGetSystem. A system basically describes a form factor.
In the core specification, the form factors are a VR headset like this one, or, as you can see in the header, a handheld display, which would be something like an Android phone that you can move around, basically a magic window into a virtual world. But this example uses a head mounted display, so we get that system. The next step is enumerating view configurations. These are kind of tied to the system, but not exactly the same. As I showed, a head mounted display usually has a stereo configuration with two displays, but there are unusual configurations too: for example, in the header you can see there is a quad view configuration. That is for a special headset that has a very high resolution display in the middle and a lower resolution display around it. So you have four displays, and that is not a stereo configuration anymore, but a quad configuration. Another concept in OpenXR are spaces. The space that you really want for your application is either a stage or a local space, and this basically describes your physical space; all the coordinates you have are relative to this space. The reference spaces that the OpenXR specification defines include a stage space for a system like this, where you can move around in a stage. There will be other configurations for augmented reality systems where you can move around in the entire physical world, but currently you probably want the stage space as a reference. Let's skip the rest because time is short. The last important concept I want to touch on is actions. This is a concept that was introduced by OSVR and SteamVR, but it has basically been perfected in OpenXR. Typically, input APIs pretend, for example, that everything is an Xbox 360 controller, and you can get the state of the A button, the state of the B button, the state of the joysticks and so on. But VR hardware is so diverse that this isn't cutting it anymore.
So instead there is a new model, where your application defines a set of actions that it wants to have. For example, it says the user can jump, and jumping is a boolean action: either you jump or you don't, and the runtime will bind this to a button. For example, the A button on this controller would be jump. Then all your application has to worry about is whether the jump action is triggered or not, and the VR runtime will do the rest. That is what happens here; this is a grab action where you would take something. In OpenXR there are a bunch of known controllers that are specified exactly. This controller is in the OpenXR specification, and your application can say: for this controller, I want the jump action to be on the A button. But it's not really binding. XR runtimes are supposed to provide an interface where the user can see that the application has a jump action, see which buttons their hardware has, and then connect the two. So this is basically a suggestion; the runtime doesn't have to do what the application says. Let's skip the rendering part because of time. The application, when you run it, looks like this. I have to say that this is currently running on our Monado OpenXR runtime with the libsurvive driver for this headset, so this is an entirely open source stack; there is no closed source software anywhere in it. And this is basically the application: it gets the position from the headset and represents that in the world, and it gets the position and the rotation from the controllers. That is basically the example. It is on the freedesktop.org GitLab, so if you want to look at the code, you can check it out. Maybe a bit more interesting is the Godot plugin. What I have on the right side here is the Godot OpenVR FPS, an example that a Godot developer wrote for the SteamVR system, and I ported this example to my OpenXR plugin.
Basically, what you do is delete the OpenVR add-on and copy the OpenXR plugin into the addons folder. Then, in the Godot project, you have to edit one line in a script that says to load the OpenVR plugin; I already changed it to load the OpenXR plugin instead. Then you run the project, and you get your input: you can basically use your OpenVR project on OpenXR. Here is some teleport input that was already in the project. You can teleport around and go in there, with some example content that I did not make; Bastiaan Olij made this demo. You can pick up one of those objects and throw it, though not very well. But basically you can develop a game with it. As you can imagine, this is all very much work in progress and there is a lot to do. The code of this plugin is also on the freedesktop GitLab if you want to check it out. So, what is to do? First, I want to package the OpenXR loader and headers in distributions, basically like the Vulkan loader and headers are packaged. I mentioned the action bindings when discussing the C code; in Godot there is currently no UI to define these actions. There is an open issue for the OpenVR plugin, and I have commented on it too. We should build a UI inside Godot where you can say there is a jump action, and where you should also be able to suggest that, for the Valve Index controller, the jump action should be on the A button. This UI is really needed before the plugin is really usable, because right now it is all basically hard coded inside the plugin code and not easy to edit.
There is also some low level stuff that should be improved in the Godot code base. Here is the definition of a struct from the OpenXR specification, and you basically have to give this information to the OpenXR runtime; this is, for example, a GLX context. Currently I call glXGetCurrentContext, which is maybe more of a hack. It would be nice if you could get the GL context that Godot uses from Godot and plug it in there. And of course a Windows port would be very nice, because currently this plugin only works on Linux: that basically means adding some defines like XR_USE_PLATFORM_WIN32 and filling in the respective platform specific graphics struct; there is a Windows specific OpenGL context and so on. And of course we need some testing.

That basically concludes my talk. I have some promotion to make for an XR conference in 2020 that we at Collabora will organize in Amsterdam; if you want to know more, you can go to the website. Thank you for listening. Questions?

Am I involved in the development of the OpenXR specification? Personally, a little bit. We at Collabora are a member of Khronos and have been working with Khronos to develop the OpenXR specification. Some of my colleagues are more involved in OpenXR; for example, the specification editor for the OpenXR specification is from Collabora, Ryan Pavlik, if you know him.

What is the best way to get involved in developing features for OpenXR? Basically, the official way is to join Khronos and participate in the discussions inside Khronos. A maybe less official way is to contribute to our Monado OpenXR runtime: make an implementation of a feature and prove that it works, and then maybe we at Collabora can help upstream it into the OpenXR specification.