Thank you. Can you hear me? Am I coming through on the live feed? Awesome, wonderful. This is going to be very quick, because I've got about 20 minutes to tell you about some ridiculously silly, cool stuff that we've been doing with WebVR and Matrix. First of all, who knows what Matrix is? Pretty much everybody — brilliant, I can skip the boring bit. So, for the two people who don't: we're an open, decentralised network for real-time communications, and critically we're going to talk about open comms in VR and AR. I sound like I'm clipping. Is there anything I can do to not clip? Sorry for the people online who can hear horribly distorted Matthew.

I'm going to skip the boring stuff: this is Matrix, no single party owns all the conversations, blah blah blah, Matrix architecture. Here's a quick new version of the ecosystem diagram, which shows a lot of third-party clients out there now. We've got the Matrix spec, we've got servers, we've got the clients from the matrix.org team in green, but we've also got lots and lots of community projects in purple now. We have Nheko and Quaternion on Qt, Fractal on GTK and Rust, and many, many other clients, and also third-party community servers in Rust and Java and all sorts of exciting languages. The Matrix ecosystem is growing, but the whole idea is to disrupt WhatsApp and Slack and all of these big siloed services by building an open network for communication.

Now, in AR and VR there is basically no interoperable communication layer today, and one of the things we'd love to do in Matrix is to provide a way for people to talk, whether it's by VoIP or video, or just a kind of decentralised interaction layer for AR and VR. We worked on this a little bit last year and built a demo that you can use right now, in fact, at matrix.org/vrdemo. Let me try to find the one I made earlier. That's not it. That's what I made earlier, too. Too many tabs — how about I open up a new tab?
Okay, so let's go to matrix.org/vrdemo, and... yeah, this is looking promising. What this lets you do is VR conferencing; it also does 360-degree video with a WebRTC call embedded in it. Probably the best way to show it is to hit the "go VR" button and hope that we have enough net access to re-download all of the WebVR stuff. This is totally normal Chrome, and it also works perfectly in Firefox.

What you find yourself in here is a Star Trek: The Next Generation season 2 holodeck. We can wander around it; it's a Collada model, and you can go outside and view the beautiful HTML background wallpaper. We can go in, and you can see this chap here welcoming us to the room — this is just a video clip. If I go and click on the red button here, though, I get a couple of options: we can do the 360-degree demo, or we can do the phone box, which will take us into the VR conferencing demo. I'm going to risk the VR one and jump into it, like so. It takes me back into the room here, and hopefully now we've got a table here with a Matrix room URL, which is "vr-vc female guanaco" on conf.matrix.org — but currently I'm the only person in here. Oh, except Amandine, who's down there in the audience and has just joined it. So what we're doing here is having a WebRTC call which has been negotiated over Matrix, and this room is basically a VR manifestation of a chat room in Matrix.

What else can I show you here? You can see the frame rate's pretty good. Something that could be a lot more fun, though, is that we can take an Android phone like this crappy old S6, which is probably low on battery. No — it's trying to upgrade itself. That's perfect timing, thank you, Android. Perhaps I'll install that overnight. Let me go to the same place here on Android — I'm a dirty iOS user, I'm afraid, so this is always a little bit risky at this point. Sorry, let me just get the thing up on it.
So I'm going to the same URL, matrix.org/vrdemo, hoping that the Wi-Fi is going to be up for it. Oh, hello — that's Michael down there. He's gone and joined the room on the Matrix side, and you can see him coming in there. Amandine, has your video cut out? Where did you go? Well, I'll see whether I can join it from here. Irritatingly, my Android phone is trying to take me into a different room, "silver condor", not to be confused with "female guanaco". Does anybody know what a guanaco is? I think it's a type of llama. Yes, there we go, thank you. It's the third largest member of the South American camelid family, I think, next to the vicuña and the llama and the other one. Anyway, I think I've got onto it now. Hopefully. Almost.

Okay, so I've got the same thing here. I'm afraid I'm not going to be able to easily broadcast this. Oh, there we go — we've even got images being replicated up onto the thing there. Obviously the brightness is now crashing on this to make my demo even more catastrophic; let me whack it up. Okay, so here we are: we've got the same thing going on here on my phone, and again this is stock Chrome on Android, using A-Frame and WebVR. Irritatingly, that's not helpful — from in VR, the room is over there, so some people, if I kind of do that, will be able to see the corner of it. Let me try to scrub it around. There we go, almost there. Oh God, the worst possible positioning, come on. Yeah, the scrubbing thing isn't quite working; what I'm going to do is actually press the goggle button here, which puts it into... there we go. That's now actually working, so you can see a stereoscopic view of the room. It's a pretty good frame rate even on the S6 — it's doing about 20 frames a second. And critically — oh no, I need to actually go into the room, so I'm going to hit the blue button so I can get the phone booth up, and I tap on the phone booth. Come on, work.
Damn you. Well, basically it actually works surprisingly well here, and the idea is that you can take a headset like this, go and plonk the phone in it, and get the full experience. Let me just try to get into the room. Yeah, the problem is that the hit target is for some reason not registering on there. I tested it a minute ago and it was great. It probably also doesn't help that I'm in the wrong room. Let me just try this quickly one more time, bear with me — it's going to be amazing. This isn't even the real demo, by the way; this is showing what we were up to last year, as opposed to the 2018 variation of this. Okay, so I'm back in VR on the phone — you'll have to believe me. Ah. It's possible that too many people have now tried to go into this conference. This is the problem of using a live demo which is being generally destroyed. So annoying — it's so cool when this works, and the hit targets were even working on this just now. Okay, we'll come back to that at the end if we have time. Let me show the much better demo, the one that we've been... oh God, yeah, it's all happening in here. Little-known fact: you can keep stacking up until you go through the ceiling, which is kind of fun.

Okay, so that's the carnage of the female guanaco room. Let's get out of here and talk about what I actually wanted to talk about. Basically, this was about showing, in A-Frame, Matrix doing end-to-end encrypted communications — and by the way, the conferencing there is a full mesh, so you actually do have end-to-end encryption going all the way through. The idea was to show that Matrix is a lot more than instant messaging, blah blah blah. Let's skip to the fun demo. The problem is these were just crappy old plain 2D video planes, and that's missing the whole point of VR and AR: putting a 2D video conference into a 3D world is pretty boring.
We want to see who we're talking to in 3D. It gives you a lot more presence in the emotional sense, and it allows you to fix gaze correction, which I still think is a pretty big deal: when you're looking at somebody on a 2D video call, they're looking at the camera while you're looking at them, not at your camera, so your gazes never meet, and it causes a horrible emotional gap. Whereas in 3D you can just position yourself until you're actually looking at them, and hopefully it should look a bit more real.

What we didn't have last year is a camera that gives good 3D depth capture and video, perfectly optimised for 3D video calling, with good API support. But then, happily, Apple came out with the iPhone X with its TrueDepth infrared dot projector, which you can see down here on the right-hand side. It splashes infrared dots all over the place and gives you 640 by 480 at 24 Hz of depth data, perfectly calibrated with the front-facing video camera. I've no idea why nobody has done 3D video calling before, because it's really fun, and this thing here is basically built as a 3D video-call device — as far as I know, nobody has done it until now. Hopefully.

So we decided to build the world's first 3D video calling using the iPhone X, Matrix, WebRTC and WebVR, in seven easy steps. First of all, we had to hack WebRTC to add support for depth capture. This is actually pretty easy: you take the AVFoundation video capturer and you say "please don't request the video stream, request the depth stream", and it kind of works, which was reassuring. Step 2: now you've actually got to bake it into a client. Predictably enough, we used Riot iOS, which is built on top of the Matrix iOS SDK.
Luckily, Google now ship a CocoaPod for WebRTC, and Riot and Matrix are all CocoaPods-based too, so it was very easy to glue in a local CocoaPod.

Step 3 — now this is where it gets a bit more interesting. The data that you get off this thing is 16-bit: half-precision floating-point depth data, measured in metres. So you literally get a stream of half-precision numbers, and somehow we've got to map this into WebRTC. Now, there's a WebRTC working group dedicated to encoding depth over video, and they're not going to like this talk, because we completely ignored all of it. Their first principle is: if you're streaming depth, don't treat it as video. Depth is totally different and compresses in completely different ways — it's floating point versus 8-bit colour, and all that sort of thing. I'm afraid that what we did instead was to look at a paper that some guys at UCL in London wrote back in 2011 about encoding Kinect depth data into video. They basically said: take your 16-bit depth data and put it in the blue channel as a linear interpolation — so you've got the coarse-granularity stuff in the blue channel — and then on the green and the red you have a periodic function, a triangle wave with a phase shift, that encodes the fine-level detail. In practice, this means that if you've got the camera there, it starts off looking very red and green and stripey, and as you move it forwards it slowly gets more blue, and the red and the green oscillate as you go through the triangle wave. In theory this could be quite a cute way of doing it. The main problem is that it comes out looking like this. I'm afraid this is my ugly head, with bright blue where it's in the foreground, darker in the background, and you get this lovely moiré pattern — which, let's face it, is kryptonite for video codecs. This is the least compressible thing in the world.
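To make the scheme concrete, here's a minimal Python sketch of that triangle-wave encoding. The depth range, the fine-wave period, and the phase shift here are illustrative guesses, not the constants from the UCL paper:

```python
def _triangle(t):
    """Periodic triangle wave with period 1, mapping t into [0, 1]."""
    t = t % 1.0
    return 2.0 * t if t < 0.5 else 2.0 * (1.0 - t)

def encode_depth(depth_m, d_max=4.0, fine_period=0.25):
    """Encode one depth sample (metres) as an 8-bit (r, g, b) triple.

    Blue carries coarse depth linearly; green and red carry the fine
    detail as phase-shifted triangle waves, roughly in the spirit of
    the UCL Kinect-over-video paper. All constants are illustrative.
    """
    x = min(max(depth_m / d_max, 0.0), 1.0)   # normalise to [0, 1]
    b = x                                     # coarse, linear ramp
    g = _triangle(x / fine_period)            # fine detail
    r = _triangle(x / fine_period + 0.25)     # same wave, phase-shifted
    return tuple(round(255 * c) for c in (r, g, b))
```

As depth increases, the blue channel ramps up smoothly while red and green oscillate — exactly the stripey-to-blue progression described above, and also exactly why the output is so hostile to a video codec.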
I'm not really sure what the authors were thinking in that paper, because it really doesn't work. So this is the point cloud that you get off it. It's possible I screwed it all up, but basically the problem is that you can't encode the twiddly bits around the edges, and when you convert from RGB to YUV and back again, it mangles it all up anyway, so all of the fine detail gets smeared across the RGB space, and it was a bit anticlimactic. I also screwed up by not converting the floating-point data into a linear integer domain before doing the calculation; instead I treated it like a bit field and did a bitwise cast, so the contrast of the data was basically wrong — which I didn't think was a problem, because that looks like a pretty contrasty image to me — but you end up with this fairly crappy result.

Step 5: give up on the fancy depth encoding and say 8 bits of depth is good enough for anybody — especially if it's between, let's face it, here and here. 256 depth values between there and there is good enough. And empirically, here is a beautiful point cloud of my face in green, showing that it's not looking too quantised.
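In code terms, step 5 is just a linear 8-bit quantisation over an assumed working range. The 0.3 m to 1.5 m figures below are my guess at "between here and here", not values from the talk:

```python
NEAR_M, FAR_M = 0.3, 1.5   # assumed working range in metres (arm's length)

def quantise(depth_m):
    """Clamp depth to [NEAR_M, FAR_M] and map it linearly to one byte."""
    x = min(max((depth_m - NEAR_M) / (FAR_M - NEAR_M), 0.0), 1.0)
    return round(255 * x)

def dequantise(byte):
    """Map a byte back to metres — the decoder side, done in the shader."""
    return NEAR_M + (byte / 255) * (FAR_M - NEAR_M)

# 256 steps over 1.2 m gives a depth resolution of roughly 4.7 mm,
# which is why the result doesn't look too quantised at call distance.
STEP_M = (FAR_M - NEAR_M) / 255
```

The trade-off is the one stated above: 8 bits is plenty if and only if you commit to a narrow working range.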
It looks like it might be all right.

Step 6: green is great, but we need colour, so we need to capture both the video and the depth off this thing at the same time. This is a little bit more fun, because WebRTC very much assumes that it controls the camera — thou shalt only have one camera per capture device — whereas we need a way to share the capture device across two totally different video tracks. So at this point we butchered the WebRTC API for iOS horrifically, to allow you to share an AVCaptureSession between two different capturers. Then it's only a couple of lines in the iOS SDK to set our new, not exactly spec-compliant constraint called matrixDepth — not to be confused with the standards-based depth constraint — and suddenly we're starting to get somewhere; it's starting to look a bit more promising.

Except the point cloud might have a few limitations. It's actually great if you back away from it in VR: all the dots merge together and it looks a bit like a 2D video a long way away. But then you get in close to see all the cool 3D stuff, and it tends to break up like that. So, step 7: replace the point cloud with a displacement-mapped mesh using a vertex shader. Ideally you'd use a geometry shader, but irritatingly WebGL doesn't give you geometry shaders. So we switched it out for an A-Frame plane with 640 by 480 vertices, displaced using the vertex shader, and tweaked it to try to reduce some of the smearing, and it looks like... sorry, trying to do the real demo now; what could possibly go wrong? So here we are, back on a localhost version of the VR demo, and this time I've told it to phone @matthew:matrix.org, which I'm logged into on Riot iOS on my beautiful iPhone X. Let's see what happens here. So we're back in the holodeck, and the good news is that it can now connect to Matrix — this is a really good starting point. Let me pray that this thing is on the right Wi-Fi.
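For reference, what that displacement-map vertex shader does can be sketched on the CPU: take a plane of W×H vertices and push each one back by its depth sample. This is a Python stand-in for the GLSL, with a made-up scale factor, not the actual shader from the demo:

```python
def displace_plane(depth_rows, scale=1.0):
    """CPU sketch of the displacement-map vertex shader: one vertex per
    depth sample on a unit plane, pushed along -z by its depth value."""
    h, w = len(depth_rows), len(depth_rows[0])
    verts = []
    for j, row in enumerate(depth_rows):
        for i, d in enumerate(row):
            u = i / (w - 1) - 0.5        # plane x in [-0.5, 0.5]
            v = 0.5 - j / (h - 1)        # plane y, top row first
            verts.append((u, v, -d * scale))
    return verts
```

On the GPU this runs once per vertex, 640 × 480 times per frame, with the depth sampled from the decoded video texture rather than passed in as an array.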
Oh, I've got a call coming in, so I'm going to answer the call using CallKit, and... oh. So here you go, ladies and gentlemen: the world's possibly first-ever iPhone X 3D video call. Okay, let's face it, this is not exactly refined. This is a two-day hack, done in the last couple of days in a kind of "oh my god, FOSDEM actually accepted my VR talk, we'd better show something in VR" panic. But hopefully it serves its purpose.

We've got some problems here. First of all, it turns out that some of it is quite a long way away, and you get this amazing tearing effect. What we're trying to do is reduce the smearing by making triangles transparent: we look at the partial derivative of the depth buffer and turn those planes invisible. The only problem is that my graphics card apparently has a really crap partial-derivative function, so only half of them disappear — and I'm sure it's the GPU's fault and in no way my code's to blame.

Then, what else have we got? We've got the fact that the depth signal and the video signal sometimes get a little bit out of sync. So if I move it around a bit — actually right now it's working perfectly, trust me — you can get like a hundred milliseconds of delay between the depth and the video buffer, which ends up being quite trippy, because everything sloshes around. Obviously you want to sync them together; incidentally, iOS gives you a nice API to keep them in sync, to properly genlock them together, which I'm not using at all, because we'd have to thread it through WebRTC. But it would be nice if it worked. Other problems? Well, basically the rest of the artefacts you're seeing are the 8-bit depth, and you can see it's a little... it's fine. Oh, we're in the middle of my head — welcome to my head, everybody; that's what it's like to be inside me. If you push through like this, you can see there's obviously a front clipping plane, which is actually the dot projector running out of juice. But that's relatively working.
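The smearing fix described above — hide any quad whose depth gradient is too steep — can be sketched like this. The threshold is an arbitrary illustrative value, and the real thing computes the derivatives with GLSL's dFdx/dFdy in the shader rather than forward differences on the CPU:

```python
def draw_mask(depth_rows, threshold=0.1):
    """For each quad of the mesh, keep it only if the forward-difference
    depth gradient at its top-left corner is below the threshold.
    Quads spanning a big depth jump are the 'smeared' ones we hide."""
    h, w = len(depth_rows), len(depth_rows[0])
    mask = []
    for j in range(h - 1):
        row = []
        for i in range(w - 1):
            dx = abs(depth_rows[j][i + 1] - depth_rows[j][i])
            dy = abs(depth_rows[j + 1][i] - depth_rows[j][i])
            row.append(dx < threshold and dy < threshold)
        mask.append(row)
    return mask
```

In the shader, "hiding" a quad means setting its alpha to zero; the tearing seen in the demo is what happens when the GPU's derivative estimate misses half of the steep quads.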
Okay, but if we look at it from the side again, we get this weird clipping effect, which is sort of cool — it's very "help me, Obi-Wan, you're my only hope" — although it's even worse quality than the hologram that R2-D2 managed, or C-3PO, whichever one it was.

So there you have it; basically, that's what I wanted to show you. The idea is to say: this doesn't have that much to do with Matrix, but it's using Matrix as the signalling protocol to do all of the signalling, and honestly the Matrix side of it was a pleasure to use. If you've spent too much time doing WebRTC with custom signalling stuff, just having a simple HTTPS protocol is lovely — and we have an SDK. I don't know whether I've got it open; if I opened up Xcode it would probably come straight up with the couple of lines of code that show it, especially if I hang up so I get some CPU back. Come on, Xcode. No? Xcode is probably upgrading itself too. Basically, it's about ten lines of code in the end, and you can check it all out — it's all sitting on GitHub.

Next up on VR is basically not to do any VR on Matrix, because we need to make Riot suck less, and that's actually the priority. But in future we want to do world geometry on Matrix, and avatars, and physics. Meanwhile, on Matrix itself, there's lots of new stuff on Riot. We got funded last week, thank God, so thank you to everybody who supported us. We're also hiring: designers and front-end devs and crypto and Go and Python people and VoIP people and VR people, if you want to get involved. And here are the URLs for the playback side and the capture side of 3D video calling. Thank you very much.