Hi there, my name is Alan Jeffery and I'm the technical chair of the Servo project. I'm here today to talk about one of the applications of Servo, which is streaming video from WebXR content using GStreamer.

So what is Servo? Servo is a browser engine: the core of a web browser that's responsible for downloading content off the internet and then displaying it to the user. Traditionally this has meant the 2D web, using standards such as HTML, CSS and JavaScript, our old friends. More recently, though, there have been efforts to allow 3D content to be displayed, using standards such as WebGL and WebXR.

Servo was originally developed by Mozilla and is now part of the Linux Foundation. The original motivation for Servo was twofold: one part was finding out whether web components could be developed in Rust and then deployed in Firefox, and the other was to experiment with newer web technologies such as WebXR. The experiment of developing web components in Rust was very successful, and technologies got transferred out of Servo and into Firefox, such as the Stylo CSS engine and the WebRender graphics backend. But Servo was also being used as an experimental platform for finding out what you could do with an embeddable web engine and what kinds of experiences that would enable, and I'm going to be talking about one of those today.

The next bit of the technology stack we need to look at today is GStreamer. GStreamer is a streaming-media plugin architecture, where plugins can be sources of video or audio, filters such as transcoders, or sinks such as your file system or streaming over the internet. The way content providers use GStreamer is by stringing these plugins together into a pipeline, where media is consumed at one end, transformed by various filters, and then produced at a sink. One of the things that's interesting about GStreamer, from our point of view, is that it has quite strong connections
into the Rust world: there are Rust bindings for all the GStreamer APIs, and it's also possible to develop GStreamer plugins in Rust. That's very useful for Servo, which is primarily implemented in Rust, and GStreamer is used inside Servo as the media subsystem. But that's not actually what we're looking at today. What we're looking at today is using Servo to provide a source.

Servo is designed to be an embeddable web engine, and one of the places we've embedded it is into GStreamer: there's a Servo source plugin that you can use to take web content and put it into a GStreamer pipeline. For example, you could lay out the title sequence for a video using HTML and CSS, and then use the Servo source plugin to produce that as part of your GStreamer pipeline. One of the nice features of this plugin is that it works entirely in GL memory, so if the rest of your pipeline supports GL memory, the video content never actually has to be in the main memory of your machine and can instead reside entirely on the GPU. If you're familiar with the WebKit embedding into GStreamer, the WPE plugin, this is doing a very similar job for Servo.

Like I said, you can use this to render traditional 2D content into a GStreamer pipeline, but you can also use it to render WebXR content. WebXR is the W3C API for developing virtual reality and augmented reality applications using web technologies. These can be both 2D experiences, such as holding a phone up and seeing a virtual object in your living room, and immersive 3D experiences for users wearing appropriate headsets. For Servo, we've got a WebXR implementation with backends for both the HoloLens and the Magic Leap augmented reality headsets, and as part of that the Servo development team was quite actively involved with the W3C standards effort. So, plugging these three technologies together, WebXR, Servo and GStreamer, means we can take WebXR experiences and put them into a GStreamer pipeline, and this
means you can do interesting things that I don't think the authors of the WebXR experiences intended. For example, we've got a backend for Servo that renders to red-cyan 3D, so if you put on your red-cyan glasses you can watch a 3D experience at home without a headset. We've also got another backend that, rather than rendering the two stereoscopic views of a 3D scene that you would render for a headset, renders six views of the scene to a cube map; we then take that cube map and project it into 360 video. This means we have the ability to record 360 video from an arbitrary WebXR scene, as long as that scene hasn't hardwired the idea that people only have two eyes. So we've been able to take WebXR content that was intended for viewing in a headset, and we've now managed to reuse it for scenarios that the original authors didn't intend. This is part of the philosophy of the Web: it should support this kind of accidental reuse.
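The red-cyan backend mentioned above boils down to a simple per-pixel composite: take the red channel from the left-eye view and the green and blue channels from the right-eye view. Here is a minimal sketch of that idea in Python, with pixels as plain `(r, g, b)` tuples; this is an illustration of the anaglyph technique, not Servo's actual rendering code.

```python
# Red-cyan anaglyph compositing (illustrative sketch, not Servo's code):
# the left eye contributes the red channel, the right eye contributes
# the green and blue (cyan) channels.
def anaglyph(left, right):
    """Combine two equal-sized lists of (r, g, b) pixels into one image."""
    return [(l[0], r[1], r[2]) for l, r in zip(left, right)]

# Two tiny two-pixel "views" standing in for the stereo renders.
left_eye = [(200, 10, 10), (100, 50, 50)]
right_eye = [(20, 180, 180), (60, 120, 120)]
print(anaglyph(left_eye, right_eye))
# [(200, 180, 180), (100, 120, 120)]
```

With real frames you would do the same channel selection on GPU textures rather than Python lists, but the mixing rule is the same.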
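The cube-map-to-360-video step also has a compact core: each pixel of the equirectangular output corresponds to a longitude and latitude, which give a 3D direction, and that direction tells you which of the six cube faces to sample. The sketch below shows just that face-selection math; it is standard projection geometry under my own naming, not Servo's implementation.

```python
import math

# Sketch of the cube map -> equirectangular ("360 video") projection:
# turn a longitude/latitude into a direction vector, then find which
# cube-map face that direction hits (the face whose axis dominates).
def direction(lon, lat):
    """Unit direction vector for a longitude/latitude given in radians."""
    return (math.cos(lat) * math.sin(lon),
            math.sin(lat),
            math.cos(lat) * math.cos(lon))

def cube_face(v):
    """Name of the cube-map face a direction vector points at."""
    x, y, z = v
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        return "right" if x > 0 else "left"
    if ay >= ax and ay >= az:
        return "top" if y > 0 else "bottom"
    return "front" if z > 0 else "back"

print(cube_face(direction(0.0, 0.0)))          # front (straight ahead)
print(cube_face(direction(math.pi / 2, 0.0)))  # right
print(cube_face(direction(0.0, math.pi / 2)))  # top
```

A full projection would also compute the texture coordinates within the chosen face and sample the rendered views there, once per output pixel.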