Okay, so welcome back, everybody, and welcome to the newcomers. If you weren't here for the first talk, quick information: entrance here, exit there. Safety-wise, just go through; we're at the front, one of the safest rooms here. If you have any questions or suggestions, because this is your dev room even if you help organize it, do come to us, we're mostly friendly. If you have friends who didn't manage to get up at whatever AM and fight the rain to get here, it's all live-streamed, including the slides; we double-checked, it does work, so please do share. Now, one of my most favorite, exciting topics. Well, we'll see what XR is, I don't want to give any spoilers. But Rabimba, as you can also see from his magnificent t-shirt, is a Mozilla Tech Speaker and a supporter of Mozilla overall. So what are all those words? Decentralized social WebXR, please!

Thank you, Fabien. Good morning, everyone. Today I'm going to talk a little bit about virtual reality, a very little bit about augmented reality, and also a little bit about IPFS. So, without further ado, who am I? The ego slide. I'm a student at Rice University, where I'm doing my PhD. I have also worked with Mozilla as an intern and as a volunteer, and I'm a Mozilla Tech Speaker. I've also worked at IBM, a little bit on blockchain, as part of my internships, and I'm a Google Developer Expert in web technologies. So that, in a nutshell, is who I am. Let's delve into what AR, VR, and MR are, or, collectively, XR. How many of you have at least tried one form of these technologies, maybe on mobile? Awesome. So, as you can see, this is a pretty old picture. That's Ivan Sutherland, and that was how he envisioned VR: a whole-room setup. From there we have progressed to something like this, and this. So how can we create content in VR? How can we actually bring it into the web browser, and how can you create content there?
I'm going to briefly touch on how you can create them, and then we'll get to how these can be decentralized. So first, the thing we're going to talk about is the frameworks and how you can create content with them. As I can see, most of you have at least played with this. So the main thing I will talk about is A-Frame and how you can use A-Frame to create WebVR. Have any of you tried playing with WebVR or A-Frame? Okay, so that gives me a chance to actually show how you can create VR content. Before you start: which devices support WebVR or WebXR right now? These are the devices where you consume the content, and you can see there's a range, from the $3 Google Cardboard, which you can also print or build at home, to the very expensive Oculus Quest or HTC Vive. Now, the experience on all of them is not exactly the same, so when you are building content, you also have to take into account where you're building it, or who your audience is. Now, why the web? Why not build in something like Unity, or a gaming engine that supports VR? The primary argument is that the web is open and it's instant. If I create something right now on this stage and I share a URL with you, you can just tweet it or share it with your friends, saying, okay, see, this is a nice, cool example, and they can open it in the browser on their mobile, and if it's supported (and we'll see whether it's supported or not), they can immediately get into that content. They don't have to download anything from any Play Store or App Store, and they don't have to wait for the content to load. Now, where is it supported right now? These are the browsers where you can play with WebVR today: Firefox, Microsoft Edge, Chromium, Chrome for Android, Oculus Carmel, the Oculus Browser, Samsung Internet, and something that's not on here, the Firefox Reality browser, which is running here right now. All of these support WebVR out of the box.
Now, one of the questions is: where is Safari? Safari support is not completely there right now. We have a mobile polyfill that kind of makes it work, but the performance is not yet comparable to the native performance of these browsers. Apple is on board, though, so you will eventually get Safari support. Now, what is A-Frame? How many of you have at least heard of or played with Three.js? Quite a few. Three.js is a graphics library with which you can build cool graphics demos or any kind of animation. A-Frame is built on Three.js, and it makes building those things much easier from a VR perspective. How easy? For our first Hello World example in VR, these are all the lines; hopefully you can read them. These are all the lines you need to create a basic VR scene. Let me show you what I mean by a basic VR scene. This is in CodePen, and it's the exact same code. What I have here is something called a scene, and inside it I have four or five elements: a sphere, a cube, a cylinder, and a sky element, and that's it. There is nothing else here. What it produces for me is something like this. This is a VR scene running from the browser right now. If I press this, and my browser supports a VR device, it will just transport me to the device: if I had an Oculus Go, the portable one, or a regular Oculus Quest plugged into my laptop, I would be inside the scene. To make this scene, all I needed was those few lines of code. Now let's see what happens if I, since this is the web, make it way bigger. Okay, where did it go? Okay, it was so big it covered the whole scene. So this is how you can build very simple VR scenes. There is very good documentation on the A-Frame website, which I'm not going to go into; I just wanted to show you how easy it is to build VR scenes using A-Frame.
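The Hello World scene described above looks roughly like this; the version number in the script URL and the exact positions and colors are illustrative:

```html
<html>
  <head>
    <script src="https://aframe.io/releases/1.0.4/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <!-- Each primitive is just an HTML element with position/color attributes -->
      <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
      <a-cylinder position="1 0.75 -3" radius="0.5" height="1.5" color="#FFC65D"></a-cylinder>
      <!-- The sky wraps the whole scene in a background color (or a 360° image) -->
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```

Opening this file in a WebVR-capable browser shows the scene immediately, and the "enter VR" button hands it off to a connected headset.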
Now, since this is essentially a static web page, you can use your regular tools, like the Firefox dev tools or Chrome's inspect tools, to inspect and also edit it right from the browser, and it will work. So this is essentially the whole scene: this is the scene, this is the sphere, and I can change something here, just like in any web page. That's how easy it is to create a VR scene. And since this is essentially a website, it works very well with all of the web technologies you've already been playing with. D3.js, React, Angular, everything works pretty well. What that gives you is something to build with: you can take a D3 chart, load it in A-Frame, get JSON data from somewhere, load it inside, and just play around with it. I think I have time to at least show what that looks like. There's a huge community around A-Frame; just go to the blog and you can see a lot of cool, interesting demos, and it's very well documented. I'm not going to go too deep into this, but I'm going to show you what you can do with it. These are a few demos created by various community members, by the Mozilla team, and also by Google, on the Google Experiments website. This is a painting application in HTML where you can just paint something, share the URL, and anybody can actually go inside. If you have played Google Tilt Brush, this is kind of the HTML version of that. You can use 360-degree videos, or you can see what you can do with D3. Now, what I mostly want to talk about today is how you can use this technology for social interaction. This is a project called Mozilla Hubs. It's a social hub: it gives you a shared space, connected by WebRTC, where all of you can go inside and talk with each other. You can literally use it for meetings. With that, I'm going to try a demo: can you try to open this on your mobile right now?
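The "get JSON data from somewhere and load it inside" idea can be sketched with A-Frame's real `AFRAME.registerComponent` API; the component name, the data URL, and the `{x, y, z, size}` fields below are hypothetical, purely for illustration:

```html
<script src="https://aframe.io/releases/1.0.4/aframe.min.js"></script>
<script>
  // Hypothetical component: fetches a JSON array and spawns one box per point.
  AFRAME.registerComponent('json-bars', {
    schema: { src: { type: 'string' } },
    init: function () {
      const el = this.el;
      fetch(this.data.src)
        .then(res => res.json())
        .then(points => {
          points.forEach(p => {
            const box = document.createElement('a-box');
            box.setAttribute('position', `${p.x} ${p.y} ${p.z}`);
            box.setAttribute('height', p.size);
            el.appendChild(box);
          });
        });
    }
  });
</script>
<a-scene>
  <!-- Attach the component to an entity; the path is made up for the example -->
  <a-entity json-bars="src: /data/points.json"></a-entity>
</a-scene>
```

Because the chart is just DOM elements, the same data could equally be run through D3 first and the results written into the scene.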
When you open it, it should give you something like this, and it will load eventually; precisely for this reason, I have recorded demos. But why not, one last try. It is loading, hopefully. Well, the beauty of this is that you don't need me: if more than one of you has joined the room, you should literally be able to see each other's avatars moving around on your mobiles. I could have shown it better if it had loaded here; meanwhile, I'll go on and show what it should look like. This is a recorded demo of how it should look inside the scene. You can play around with the ducks, or you can look around and see each other. Essentially, you can have the same experience in XR: you can have one of these headsets on, move around, and see something like this. Now comes the other part of the talk: decentralization. Before the recorded one, I will try to actually do the demo. So, okay, where did that go? I have no idea why I cannot reach the top of the screen, which I need to. Let me try it once more. Yes. What I'm now trying to ask is: can we have this same experience on IPFS? How many of you have heard about IPFS? Awesome. IPFS is the InterPlanetary File System. Essentially, your data is not coming from any server: it is coming from my machine, and when I publish it on IPFS and others view it, it also comes from some of their machines, in a basic sense. So it is decentralized; it's not coming from any server. It works with any static file I publish. When I was showing you the Hubs demo, the client side was essentially HTML, the backend was a Node.js application, and clients connected to each other using WebRTC. Now I have a very simple demo using an A-Frame shared-scene component that uses the same WebRTC technology to connect two VR scenes. Can we use this WebRTC setup on IPFS? So this is a demo, and it is running inside IPFS. As you can see, I have the code.
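The browser-to-browser messaging described above can be sketched with js-ipfs and its pubsub API. This is a sketch only: exact option names and the CDN URL vary between js-ipfs releases, and the topic name is made up:

```html
<script src="https://cdn.jsdelivr.net/npm/ipfs/dist/index.min.js"></script>
<script>
  // Sketch: two browser tabs chatting over IPFS pubsub, no central server.
  async function startChat() {
    // Pubsub was experimental in older js-ipfs releases, hence the flag.
    const node = await Ipfs.create({ EXPERIMENTAL: { pubsub: true } });
    const topic = 'xr-chat-demo'; // any topic name shared by the peers

    // Receive messages published by other peers on the same topic.
    await node.pubsub.subscribe(topic, msg => {
      console.log(new TextDecoder().decode(msg.data));
    });

    // Send a message; every subscribed peer receives it directly.
    await node.pubsub.publish(topic, new TextEncoder().encode('hello world'));
  }
  startChat();
</script>
```

The same channel that carries chat text can carry WebRTC signalling, which is what lets two static pages served from IPFS negotiate a peer connection without a signalling server.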
You'll see the code at the end of the talk. So, can we use this to communicate? What I'm doing is that this page and this page are going to try to communicate with each other using WebRTC, from IPFS. If it fails, I have the recording. Yes, it did not fail. Can you see it? I sent something, and here I have the "hello world", and I can say something here too. As you can see, two static sites on IPFS can communicate using WebRTC. So can we integrate this with something in A-Frame? It has to be a static site. What I will essentially try to do is use an A-Frame component for shared scenes. What this component gives you, immediately out of the box, is something like this: when you just add it and initialize it, it gives you a chat room over WebRTC where, if you move to another room and somebody connects with the URL, you can communicate with each other using voice. This is running on Glitch, and it doesn't have any server component, and the code is essentially this. The only thing you need, the minimal setup, is this template and the script. So this is the room setup, and these parts build up the models inside the scene. If you publish it on IPFS and use what I showed, you can actually communicate over IPFS, and it is decentralized: it's not coming from any server (right now it is coming from Glitch). This was a recorded demo. Some other things you can do, since you don't have a server component and you're just creating static files: for example, like this. This is painting in the real world, kind of like A-Painter but as an XR painter; you can go to the Mozilla Hacks website, which has a very good write-up on how to build this. You can publish these on any IPFS site, and it will work.
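The "template and the script" minimal setup described above looks roughly like the basic example from the networked-aframe community component, which matches the shared-scene behavior described; script URLs, the room name, and the avatar template are illustrative and version-dependent, and the signalling setup depends on which adapter you configure:

```html
<html>
  <head>
    <script src="https://aframe.io/releases/1.0.4/aframe.min.js"></script>
    <script src="https://unpkg.com/networked-aframe/dist/networked-aframe.min.js"></script>
  </head>
  <body>
    <!-- One attribute turns the scene into a shared room (audio: voice chat) -->
    <a-scene networked-scene="room: demo; audio: true">
      <a-assets>
        <!-- Template instantiated once for every connected peer -->
        <template id="avatar-template">
          <a-entity><a-sphere color="#5985ff"></a-sphere></a-entity>
        </template>
      </a-assets>
      <!-- The local player: camera + controls, replicated to the other peers -->
      <a-entity id="player" networked="template: #avatar-template"
                camera wasd-controls look-controls></a-entity>
    </a-scene>
  </body>
</html>
```

Since this is all static HTML and client-side script, the same page can be pinned to IPFS and, with peer-to-peer signalling like the pubsub sketch, run with no server of its own.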
So the web is an ideal platform for making AR and VR components, and today I'm arguing that, beyond just the web, you can actually use these components in a decentralized way. If you want to learn more and find all the resources, this is the slide you should take a picture of. All the code I have shown is on GitHub, this slide deck is also on GitHub, and you can try everything at home. If you have any queries or questions, or anything related to the talk, you can ping me on Twitter or at my email address. The talk is also available right now at that link, which is essentially a GitHub page. Thanks to everyone on the Mozilla Mixed Reality team, and to a lot of other people whose demos I have used for this talk today. Thank you, and if you have any questions, reach out to me and do come to the demo; there is an awesome demo you will be able to see that I cannot cover in 25 minutes. Thank you.

So you will be able to chat with Rabimba right after, but if you already have questions (and I know you have questions, based on how many hands were raised and not raised about trying VR and Three.js, you do have questions), who is first? And a question ends with a question mark; an opinion, that's for after. We'll have silly questions if you don't have any.

Thank you for the presentation. How will the performance be down the line, especially with web browsers now taking more and more RAM? How do you see that scaling with AR and VR?

So the question is: what is the performance like with large components and everything running in a mobile browser, or maybe a desktop browser? What are the performance caveats? I think that's the question. Right now, if you're running it on a desktop, the only performance caveat is that you need a good machine to run it in the Oculus Browser, sorry, on the Oculus Quest. The requirement is from the graphics perspective, not so much from the browser perspective.
Right now we have reached a point where you can run it comfortably on mobile phones. The demo I showed, essentially this one, is running on a Pixel phone, a three-year-old phone. It's running in a browser, an experimental browser that crashes all the time, but even then you can see it's pretty fluid; it works pretty well. So you can definitely build extra performance on top of that. And as for VR performance, people are already building industry applications on it. Thank you. Next question.

A quick remark while you're thinking about the next question. A lot of the performance is actually not about what executes in the browser, but about how dedicated software like Unity and Unreal package the assets. A lot of it, in terms of optimizing textures, mipmaps, all of this, is done before it's actually shipped to the browser. So don't directly compare, say, a Unity app to a WebXR app and conclude the web is not as performant, because there is a lot of pre-processing happening upfront. So, next question. I don't see any hands; maybe I'm just tired.

Do you think the web will have a future where VR is a regular platform?

So the question is: do I think the web will be a good fit for this scenario, that it can be used as the regular platform? I think it is a great platform, and let me go back and show why. This is a real example: a web page where you can just load a 360-degree video inside the page. Imagine going to the BBC's page, or Amnesty International's, and you hear about a bombing or some other event; in your desktop browser, or maybe on mobile, you just click the link, and if you have a supporting browser, you get straight into an immersive experience.
That's exactly what Google has been experimenting with in Search: you search for the name of an animal, maybe a panda, and it gives you the option to place the panda in the room right now, from your mobile browser, and see, okay, this is the animal I'm searching for. So I think this is a great example.

Okay, last question, and after that we'll have to defragment, because you guys are not sitting together and we have a lot of people waiting. So, last question; otherwise it will be after, but it's probably a nice question that everybody else wants to hear.