Hello, and welcome to this talk about free and open source virtual reality and augmented reality. My name is Jakob Bornecrantz, I currently work for Collabora, and I basically do open source XR. What I'm going to do now is explain a little bit about what AR and VR are, a little bit about OpenXR, about Monado, the project that we started, and its status, then show a few demos and wrap the whole thing up.

So, what is AR? As the name says: augmenting, adding things to reality. One thing that probably most of you have run into or seen is Pokémon Go, where you can place Pikachu or any other Pokémon in the world, and that's AR: you add something to the world and get a new experience. But it can also be used for other things, not just games, but for work as well. Here's an example with a work headset, and let me explain the setup. The person you see in this video call is sitting somewhere else, and the operator wearing the headset is streaming what he sees to that remote person. The remote person can see the scene, and he can also draw or highlight things. That little ring you see is not something I scribbled on the slide; it's what the remote expert drew to say "pull this hard disk" or "tighten that screw". Exactly like I highlight things on the slide with my pointer, he can highlight things in the real world. That's a very powerful thing you can do with AR, and again, it's not just games.

So, what is virtual reality? I really like this quote. I thought Adam Savage said it first, but it's actually from a movie: "I reject your reality and substitute my own." That is kind of what virtual reality is: you just replace everything. Often what you think of is a VR headset where you just see things.
But it can be a lot more than that; I'm going to limit myself to that definition, though. There are of course games, where you play and get a different experience. But by that technical definition, where you have replaced the whole of reality, this is also virtual reality: even though it's a limited experience, it still completely replaces all of reality, so to speak.

So what do you get when you add AR and VR together, or how do you talk about both at once? That's where XR comes in: extended reality, where the X stands in for either augmented or virtual.

So, let me talk a little bit about OpenXR. How do we do XR? Here is a very simplified stack: you have your program, some sort of platform that the program runs on, and hardware underneath that the platform abstracts. For the program to talk to the platform, you need an API. In the past, as history likes to repeat itself, there were various different APIs, kind of like the picture on the left here, where everybody had their own thing, and if you wanted to support everything you had to write against multiple APIs. Not a good situation. Where OpenXR comes in is that it's a royalty-free, collaboratively designed API that everybody can implement. You only have to write your code once, and then you can support a wide variety of platforms and hardware.

So what's so cool about OpenXR? It's made by Khronos, just like OpenGL and Vulkan, so it's open in that sense, and everybody can implement it. It's actually out now. It's action-based, and I'll go a little bit into that. And there was no open source implementation of it, which is what produced us.
One thing that is kind of hard when you're targeting different platforms is that every controller looks a bit different; think of how a GameCube controller and an Xbox controller differ. So instead of the programmer saying "I have an Xbox controller, just give me the input, is the X button pressed", they give context to the runtime. You say "I have an action, and it is teleport", and then you suggest: you might want to bind that to the trigger on this type of controller, and to a button on that type of controller. But the runtime ultimately decides what actually feeds that action. The runtime then has a lot more context about what the application is doing, and it can also show that to the user, either before or during runtime, so the user can more easily remap. You can see an action like "shoot arrow" or "shoot cannon" and remap it to another button. Say you have a twitchy finger and want to move it to another button, or the controller you have doesn't match what the developers tested on, so the default mapping isn't the best for you. I think that's the second best thing about OpenXR, the first being that it's open: it's action-based and abstracts input in a completely new way. It's a big step up from just getting the buttons and the game having to do all the mapping itself.

So, let me talk a little bit about Monado. Again, how do we do XR? You have your program, your platform, and your hardware. Monado is the platform. To expand on this a little, here is the highest-level diagram of what Monado is; all of the purple bits are Monado. We have our own compositor, not to be confused with a Wayland compositor or an X compositor.
It's a compositor that takes what the game has rendered and turns it into something that looks good on the screen, because you often have a whole bunch of lenses, and some devices even have mirrors. It deals with all the nitty-gritty bits of fairly advanced optics and physics to turn pixels into something that looks good to the eye. Then you have a couple of drivers, and that's the platform part of the picture sorted. On top of it you have the OpenXR state tracker, which, if you're familiar with Mesa and Gallium, is built around the same principle, except we also have the compositor and the rest of the platform in the same package, so it's not just drivers. It should feel familiar to Mesa developers and graphics people, but it's a slightly more complex thing. Another part of Monado is a whole bunch of helper libraries. We try to extract things out of the drivers, so the drivers are mostly about bit-twiddling the data that comes from the hardware, and then we take helper components and build up a functional system. The compositor is quite large, and there are wrappers around it and all of those things.

Here's another diagram, slightly more complex but still scaled down, so you get a feel for how it all fits together. You might notice a couple of things named xrt_-something. Those are the interfaces, which I'll talk more about later. Something we plan for the future is an IPC layer, so we can have one compositor or daemon run all of the drivers in one process, and clients connect to it from outside. The daemon can keep running while you switch games or switch whatever you run on top of it.
You could run, for instance, the xrdesktop project, if you've heard of it, at the same time as you're running games and things like that. So let me talk a little bit about all the xrt_ things you saw in the diagrams. Those are interfaces, similar in spirit to the Gallium interfaces. It's pure C, which is all nice. They are not stable; they are internal interfaces, just like the internal interfaces of the kernel or of the Gallium drivers.

The basic thing we start with is xrt_device, which is used for both HMDs and controllers. It follows a sort of aspect pattern, if you've ever heard of that: instead of an inheritance hierarchy, we have side-structs that you allocate and point to. When I designed this interface, I figured some devices might have cameras and some might not, and how do you deal with "I have a controller with a camera" versus "I have a controller without a camera"? An inheritance hierarchy doesn't handle that well, so instead we have these aspects that you add to a device. There are functions for getting input and setting output, like making the controller vibrate, and a whole bunch of information about how the compositor should drive the pixels to the screen on the device.

Then there's the compositor. This one, on the other hand, has a base class with specializations on top. There's some glue code to support both GL and Vulkan client interfaces, then there's the real compositor, and they talk to each other with file descriptors, DMA-BUFs. All of this is basically a mirroring of the OpenXR interface functions; you can look them up in the spec and see how they map.
Then we have the prober. Everything that isn't anything else goes here: how do we find devices, how do we take care of that from code; it glues everything together and is basically the policy holder that sticks it all together.

So where are we? We are mostly complete on OpenXR. We currently only have an in-process compositor; we hope to fix that soon so we can run multiple programs at the same time. We have built up a video processing framework, which I'll show later on. We can track the PS Move controllers, and we're working on improving that as well. We have a really nifty debug UI, which I'll also show off. We support multiple pieces of hardware out of the box: we have the North Star and the Daydream controller on branches, but also the OSVR HDK, and drivers for the Vive Pro and Index. We have an experimental branch for positional tracking of those through the libsurvive project. There's also the PSVR and PS Move, the Razer Hydra, and we have a wrapper around OpenHMD so we can use all of the drivers that exist there, to make sure nothing gets left behind.

On code size, I thought this was a little bit fun: about 35K lines of code in total. You can see a single driver is between 500 and 1,500 lines, so I think we have extracted a lot of what goes into a driver out of it, and all of the heavy stuff lives outside of the drivers. Writing a driver is not really hard once you understand the bits; the hard parts are the reverse engineering, and the tracking and the math that goes around it, but those live in helper libraries, and that's what Monado actually gives you.
Our short-term roadmap is: finishing up OpenXR support for the conformance tests, improving the tracking, adding PSVR tracking, and the two-process compositor. But that's just what we want to do; if anybody wants to do anything else, they can do it on their own. And we haven't, how do I put it, called dibs on any of these things, so if you think "oh, I really wanted to do that", just talk to us and we can help you out; these are just our own plans. Even further out: all of the things SteamVR has, if you've ever tried it, like their own system UI and setup UI, the sort of safe space you come back to if the game crashes; incorporating Lighthouse tracking; and also looking into AR and SLAM tracking.

So, a couple of demos; let's pray to the demo gods. First and foremost, I'll show just basic tracking. The cool part, and I won't point this at you so I don't accidentally film you, is that this is just two basic webcams, a stereo webcam, giving you some of the best tracking available to everybody. [Invokes the demo gods.] This is not actually shown from inside a headset; it's running in what we call headless mode, but that makes it easier to understand. You can see both controllers here as I move them around. It's all a work in progress; you can see it's a little bit jittery, some of the values quite jittery, and I'll now show off the debug UI for this.

This is our cool and nifty debug UI, and I can explain what's going on. This is what the camera is seeing. It's a very short exposure; I can turn that off to show what the scene actually looks like. You can see the exposure is very short, but the balls are over-exposed, so I can tell them apart. If I lower it again, you can see there's one that's red and one that's purple. I can also look at the filters: this is the red filter, and you can see there's a bit of noise from that one, a
little bit. I know why that is: the white balance is off. And now there's no noise on that one; it's only picking up the ball. That's fed into the trackers, which find the ball and draw where it is. It's all a bit hard, but that's where we are, and that's what I have been doing. Christoph had a nice talk where he showed some device tracking, in his talk about game development. That's it for demos.

Now, a little bit of rants. If I hot-plug non-desktop displays, GNOME Shell really likes to reorder all my windows, which is super annoying, so if there are any GNOME developers here, please look into that, because I plug and unplug HMDs all the time. And Intel gets a special call-out, because we're using an extension for interop between OpenGL and our compositor that's not available on the Intel driver. That sucks.

There's lots and lots of work left. I actually had the same status page back in October, so are we actually doing anything? Yes. Tracking is just really, really hard. We do have some improvements to the Move tracking that we couldn't get in in time, and we've made some progress on the PSVR; it just takes a lot of time. And how do we do settings? Is GNOME Shell interested in making their own UI, or do we have to make our own?

We're also really interested in building up a FOSS XR community, not just for driver developers but also higher up in the stack, though our focus is there. Last year we had the first FOSS XR conference, and we're planning to have one this year as well. The link there is to the FOSS XR Twitter account, which will announce it once we actually know where it is; it will probably be in Amsterdam.

How about some links? Once I actually manage to get them up: we do have internships for XR stuff, so if you're interested in this and finishing up your studies and looking for an internship, definitely follow the link. There are also regular job postings; I don't
think we have anything XR-specific there right now, but I do think you should send in your CV anyway, or talk to me if you're interested. There are also the pages for Monado and OpenXR, so you can follow what's going on. And if you see me around, come over and talk or ask questions, because I apparently blew through all the slides; I didn't think I would go that fast. So talk to me about just about anything: my own little hobby programming, Amigas, voxels, and especially about joining Collabora. Thank you very much! And any questions? We have lots of time. [Applause]

[Question about Oculus and OpenXR.] Oculus actually do, yes, they have. Sorry, the question was: when do we see more support for OpenXR? Microsoft was one of the first to ship support, so they are there. Oculus has also shipped support on their desktop runtime, I think, and there's also us, and more are coming. Valve has their Valve time, so I can't really comment on when they'll do it, but hopefully at some point; they are in the working group, so they are definitely interested. It's still early, but there is definitely a shift, and I think all of the main players are on board. So: soon™, or Valve-time soon, but runtimes are already out there and people can use them.

[Question:] Could you please elaborate on the fragmentation? Sure. In the past, everybody had their own API: Oculus had their own API, SteamVR had OpenVR, Samsung and Daydream had theirs, and even the open source community had its own API with OpenHMD. Most applications targeted just OpenVR, maybe Oculus, or some Windows stuff, and then there were all the Android applications. Especially for us in open source, nobody wrote to OpenHMD unless they were a crazy open source person, so it wasn't an ideal picture. What OpenXR gives us is that there is one API, and
you can write portable applications. The API is, I would say, slightly more complex than, say, OpenVR, but OpenXR also handles a lot more types of devices and is designed to be future-proof. Right now our target is just VR, but it's definitely designed with AR in mind, with all of the knowledge Microsoft has from doing AR on HoloLens, so it's ready for it. As time goes on, extensions will be released and then folded into the core, just like OpenGL, where early versions didn't have shaders, then version 2 added shaders, and more was built up from there. Just like that, OpenXR will be built up as we go along.

[Question about a device plugin interface:] Do you have any knowledge you can share about that? It's not there yet. There's probably a face-to-face next week, and there will probably be discussion about it, but it's designed by committee: it takes a long time and everybody has to agree, so things move slowly. There's definitely a will for it, and it will hopefully show up at some point. In the meantime, you can write Monado drivers, and you'll support OpenXR that way.

Next question. [About devices that aren't driven over HDMI but expect a compressed stream sent to them.] Yes, devices that don't plug into DisplayPort or HDMI or anything like that. The stack is modular, so we have definitely thought about replacing the compositor output stage: sending the frames to an encoder and over Wi-Fi or 5G, or over a USB link, which is basically what Oculus Link is. If we had an infinite amount of time and an infinite number of code monkeys we'd do it, because it's fun and a cool feature to have, but priorities: first we have to get over the initial hump of what we want to do. It's open and everybody can hack on it, so if you think that's fun, please come talk to us and hack on it, and then everybody can have it. That's the power of open source. Next
question. Oh, actually one more? Sure. [About how OpenXR was started.] OpenXR was started by, I think, Valve, Epic, Unity and Oculus; they were there from the beginning. Microsoft joined after a while, and, which surprised us all a bit because we didn't really expect it, Microsoft has been the one actually pushing it the most. Everybody is fine with opening up the front layer, the application-facing part, because that solves a problem for them: game developers don't want to target all of these APIs, they just want to solve the problem once. Everything that goes on behind that is their IP, all locked down and so on, and there's probably patent protection going on behind the scenes as well, but the front part they're very open about, and they are releasing runtimes that people can use.

[Follow-up.] Yes, and that's where Monado comes in: we are a completely open source stack, so you don't have to deal with any of that. We do then have to reverse-engineer the hardware and figure out what the bits we get from the device actually mean, and do all of the complex math for the tracking and SLAM tracking: tons and tons of university research and a lot of time. So it's always a question of the resources we have; we want to do everything, and our primary goal is to make sure that when the AR revolution comes, open source has the solution, and also to provide you a platform for VR as well.

And anything like political questions, I think that's fine too; keep asking anything even remotely related to XR. I think that's it. Thank you very much!