So, it's been a long time that I've been meaning to speak at FOSDEM. For those of you who don't know me, I'm Phil. I've been in the video industry for coming up on 10 years now. I've worked at the BBC in London, I've worked in San Francisco for Brightcove, and I now work for a small San Francisco start-up called Mux. I have a really weird job title: Streaming Specialist. What I get to do is play with things, experiment with things, come to conferences, come up with ideas, write blog posts, those sorts of things, which I really enjoy doing. Now, this is a topic that I've wanted to talk about for a really long time. The title of this talk is Reaching as Many Viewers as Possible Using Only Libre Video Technologies. Having actually written the talk, I would love to change that to a significantly wordier version: reaching as many web viewers as possible, with the best user experience possible, using only libre video technologies. Now, the first question everyone's going to have is: what do you mean by a libre video technology? To me, that means two things, in order of importance. I want to avoid all patent-encumbered technologies, but I also want to prefer technologies that are developed in the open: technologies where I can get involved in the community, make a difference and improve things for everyone. So, nobody really wants that, right? Well, Wikipedia wants that. Wikipedia is a really fascinating example, a little bit of a detour, but the experience of putting video on Wikipedia is really interesting. You should try it if you haven't. First, you need an account, which you don't need just to edit articles. Then you have to go into your settings, find the video uploader and enable it. Then you have to upload something in a WebM container.
Actually, I think technically it means VP8 or VP9, so it's inherently much more limited. This isn't to belittle what Wikipedia are doing, but it is an example of somewhere that only wants truly libre technologies. Two really interesting points here: one is that the tutorial for doing this is hosted on YouTube, not on Wikipedia itself, which I think is telling; the other is that it actually suggests uploading your video to Vimeo and then downloading it from Vimeo to put it up on Wikipedia. So, when I think about these systems, I think there are five decisions we need to make to build a video system. I'm not actually going to talk about the first of those, because being in this room, you probably already know what sort of encoders are available in the FOSS space. But we are going to talk about the codec, the container, the delivery technology and some player choices we can make to build a system. So, starting with codecs, where I'll spend most of the time. What I've done is gone through the major codecs and divided them up into what I would consider libre codecs and what I would consider patent-encumbered codecs. Among the libre codecs, we've got things like VP8, VP9 and the newer AV1, plus Vorbis and Opus for audio. On the patent-encumbered side, we've obviously got all the MPEG video codecs, as well as AAC, which is MPEG's audio codec, and then AC-3 and E-AC-3, which are Dolby audio codecs. It's pretty easy to see what sort of choices we've got, but we can narrow this down really quickly just by thinking about which codecs are actually sensibly deployable today. AV1 is getting some browser support now, but it's limited, and the cost, complexity and time of encoding anything with it are unrealistic right now. So, instantly, AV1 isn't feasible for big-time streaming today.
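As a side note, the browser support matrix I'm about to walk through boils down to asking each browser about a handful of MIME/codec strings via the standard `canPlayType` API. A minimal sketch (the codec parameter strings are standard RFC 6381 values; the commented-out part assumes a browser environment, and the helper name is mine, not any library's):

```typescript
// Codec/container combinations under test, as MIME type strings.
const combos: Record<string, string> = {
  "AVC/AAC in MP4": 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"',
  "VP8/Vorbis in WebM": 'video/webm; codecs="vp8, vorbis"',
  "VP9/Opus in WebM": 'video/webm; codecs="vp9, opus"',
};

// canPlayType answers "", "maybe" or "probably"; treat anything
// non-empty as worth attempting playback.
function likelyPlayable(answer: string): boolean {
  return answer === "probably" || answer === "maybe";
}

// In a browser, you would probe each combination like this:
// const video = document.createElement("video");
// for (const [name, mime] of Object.entries(combos)) {
//   console.log(name, likelyPlayable(video.canPlayType(mime)));
// }
```

Note that `canPlayType` is only a claim by the browser; the real test in the talk plays actual files, which is the only way to be sure.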
What's really interesting to me here is that we lose all but one combination of our patent-encumbered codecs. HEVC briefly made an appearance in Edge and then got removed, VVC obviously isn't finished yet, and the Dolby codecs are really only seen in living-room devices anyway. So, I wanted to test those codecs. I put together the most basic test possible: a video element in a web page with these three combinations, H.264, VP9 and VP8, plus their associated audio codecs, and I put them through a set of tests in the evergreen browsers. The evergreen browsers are those that are constantly and automatically updated, and actually follow some sort of release schedule. Here's what it looks like, with our little animated videos in there, and this is real data: I actually captured the video output from each of these browsers, trimmed it and put it in. It was far too time-consuming. We can see, as we'd expect, that AVC plus AAC plays absolutely everywhere, absolutely fine, no problem whatsoever. VP8 and VP9: we're good on Chrome, we're good on Firefox, we're good on Edge, but Safari starts to let us down a bit. Let's put that into numbers. I think that works out, with no extra work, to about 85% of desktop browsers as of today. The equivalent for AVC and AAC is, I think, over 95%. But we're within 10% as a starting point, and I actually think that's pretty good. The big problem areas by numbers are Safari, which represents about 5% of traffic, and Internet Explorer, which is itself another 5% and a whole different set of headaches. But we'll talk about that later. Is that a problem? It might not seem like one. But here's the big problem with all of this: 49% of traffic is mobile. Now, a little side story: I don't know why these lines diverge.
I couldn't find anyone who's written about this, but to me that sudden drop-off, a 10 to 20% traffic swing that then suddenly flips back, feels really suspicious. So, I actually suspect that a little more than this number is mobile traffic. I don't know what was going on there, but these are the stats I was working from. So, if we need to deal with that 49%, the majority of traffic, we need to deal with mobile browsers as well, which seems fine. On mobile there are really three browsers: Chrome on iOS, Safari on iOS, and Android, which is just Chrome. So, I ran the same set of tests again, and it's a bit more disappointing here, as you can see. AVC and AAC play fine across the board again, and then we get basically no playback whatsoever on anything iOS. So, regardless of whether this is an in-app experience when you click a link, or someone in a browser, you're not going to get any playback at all on an Apple device. Funnily enough, when you work that out as numbers, it makes for pretty concerning reading. We need to reach as many people as possible; that's our objective, right? Well, it's 41% coverage on mobile. That's compared to, I think, over 90% for AVC and AAC, so that's pretty bad. iOS alone represents 37% of the coverage we're missing. Pretty bad. So, how do we work around that? Polyfills. Polyfills are a way to render a codec that doesn't actually have native support in a browser, usually using the Canvas and Web Audio APIs combined to do the rendering in the browser engine. My favourite, personally, is ogv.js. It's written by Brion Vibber from the Wikimedia Foundation, and it's a great example. It supports VP8, VP9 and AV1, and it has support for things like WebAssembly to accelerate the process. There are, obviously, drawbacks to this approach: a non-native experience.
It's going to be CPU- and memory-heavy, and there's no Media Source Extensions support, because MSE is inherently built into the browser, and no SourceBuffer support. That causes us some problems a little further down, when we get to it. So, I took ogv.js, the polyfill I picked arbitrarily, and threw it at our three challenging situations, and we can see it working here. We now have VP8, VP9, Vorbis and Opus playing back in desktop Safari, in Chrome on iOS, and in Safari on iOS as well. That works absolutely fine. It will get challenging, obviously: the CPU and memory requirements go up as you go up through the resolutions. This is fairly low resolution, but it totally works. And there's a bit more evidence of it working: since 2015, ogv.js has done exactly this for Wikipedia. It gives Wikipedia the browser coverage that they wouldn't otherwise be able to achieve, given that they prefer, well, only deliver, VP8 and VP9. So, a really quick conversation on containers. Containers and codecs generally come as pairs, with some exceptions. For a libre container, I would tend to look at Matroska, or WebM, which is a subset of Matroska. Generally, you'll find VP8, VP9 or AV1 video and Vorbis or Opus audio in there. For patent-encumbered, we're talking about MPEG technologies again: MP4, the ISO Base Media File Format, or Transport Streams, and you'll find the MPEG codecs in there. There is also a spec for VP9 in ISO BMFF, which is actually something Netflix uses quite a lot, but outside of Netflix it's not used much. So, delivery technologies, one of the most important pieces. What's wrong with just the progressive WebM file we served? Well, it'll actually work; it'll get you there. But we want to give a great experience to users. We don't just want to give them a file and hope it works, right? We could pick a one-megabit file and serve it to every single user, but some users are going to get buffering.
Inevitably. I was on a train yesterday and my bandwidth was going up and down; hotel rooms have cheap Chinese routers that don't really have any quality of service built into them, and three kids next door playing video games. Suddenly, my bandwidth is changing all the time. So, we implement what's known as adaptive bitrate, or ABR, technologies: encode multiple bitrates at multiple resolutions, segment the output files into chunks, and then switch between those renditions based on what bandwidth we saw over the last few chunks. Now, there are two big adaptive bitrate technologies out there: HTTP Live Streaming, or HLS, and Dynamic Adaptive Streaming over HTTP, or DASH. All adaptive bitrate standards work on the basis of a manifest file, or playlist file. This just gives you metadata about what streams are available, where to get them, what the files are named, those sorts of things. In HLS, that's an M3U8 file, which is actually a variant of the old MP3 playlist file format, and it uses several manifests: a master manifest and rendition manifests. DASH uses an XML file, and it's a single manifest for everything, so there are fewer round trips to initialize a stream, which does have some benefits. Now, I didn't immediately put these on my libre to patent-encumbered scale, because it's really interesting. The way I look at it, DASH is an MPEG protocol, so it's pretty much a non-starter: it's an XML file format with a patent pool associated with it. Some MPEG people will tell you, ah, you don't need to worry about that, but that's not going to stop people coming and trying to charge you a fee, or suing you. HLS is an Apple protocol, developed by Roger Pantos at Apple, basically completely behind closed doors. With any extensions or variations of it, you're pretty much on your own.
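To make the manifest idea concrete, a minimal HLS master playlist looks roughly like this: a list of renditions with their bandwidths and resolutions, each pointing at its own rendition playlist (the URIs and bitrates here are placeholders, not from the talk's actual test streams):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
high/index.m3u8
```

The player fetches this first, then the rendition playlist for whichever bitrate it picks, and switching bitrates just means fetching segments listed in a different rendition playlist.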
You have very little control over that, so when I think about the second piece of what I like in a libre system, being in charge of my own destiny, I can't be when it comes to HLS. Now, there is, I must admit, a snapshot of HLS in an IETF RFC, but that's not actually going to change; it's just there to encourage manufacturers to build TVs against a set version of the standard. HLS is still evolving, and it's still evolving on Apple's terms. So, ah: we totally don't have a libre ABR technology. Option one here would be to stick our fingers in our ears, use DASH and hope nobody sues us, but here are my two suggestions. One would be creating an open ABR standard. We don't have one; maybe we should make one. I've obviously included the appropriate xkcd about creating more standards here. One I pitched a couple of years ago was Moving Pictures Amateurs Group Simple Adaptive Streaming over HTTP, MPAG-SASH, we called it. The idea was to build a simple JSON-based exchange format that's much more browser-API-friendly, pointing it at Media Source Extensions rather than building something around an XML file, which doesn't really play well in a browser. So, trying to build something that actually is an open standard, and working with a community to do it. The other option would be to use HLS and not try to introduce more standards. It's not an open system, though there is a snapshot, and if we wanted to stay libre, we'd need to do HLS with WebM and VP8 or VP9. That's not supported anywhere, and I don't see it being added to the specification, but I'll mention that a bit more later. But that still isn't the full story, because if we want to do ABR, we need player support. If we create a new ABR standard, we have to go back to all the players we want to be compatible with and add support for that protocol. And even better, if you want to do adaptive bitrate, you need source buffers. You need a SourceBuffer to be able to switch between bitrates.
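The switching logic itself is simple in principle: measure throughput over the last few segment downloads, then pick the highest rendition that safely fits. A rough sketch of that core decision (the names and the 0.8 safety factor are illustrative, not any real player's API; the SourceBuffer plumbing is commented out since it only exists in a browser):

```typescript
interface Rendition {
  bandwidth: number; // bits per second the rendition was encoded at
  url: string;
}

// Pick the highest-bitrate rendition that fits within a safety
// margin of the measured throughput; fall back to the lowest.
function pickRendition(
  renditions: Rendition[],
  measuredBps: number,
  safetyFactor = 0.8
): Rendition {
  const sorted = [...renditions].sort((a, b) => a.bandwidth - b.bandwidth);
  let choice = sorted[0];
  for (const r of sorted) {
    if (r.bandwidth <= measuredBps * safetyFactor) choice = r;
  }
  return choice;
}

// In a browser, the chosen rendition's next segment gets appended to
// a SourceBuffer obtained from Media Source Extensions:
// const mediaSource = new MediaSource();
// video.src = URL.createObjectURL(mediaSource);
// const sb = mediaSource.addSourceBuffer('video/webm; codecs="vp9, opus"');
// sb.appendBuffer(nextSegmentBytes); // switching = appending from another rendition
```

This is exactly the piece that needs a SourceBuffer underneath it, which is why the lack of one in the polyfills matters so much.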
And unfortunately, none of the polyfills actually have that capability right now, so we'd also have to go and work on all of our polyfills. So, I mentioned players in passing; I want to talk about a couple of player components that I think are the dominant ones in this space. If you want a very high-level player framework, Video.js is very comprehensive. It does everything, including styling your video elements, and it's got HLS and DASH support built in. We could totally add SASH support via a plugin, since it has a great plugin-based architecture, or add WebM-in-HLS support if we wanted to. It has a native ogv.js integration, which is great as a starting point: it means we can integrate those and get good coverage. It's Apache 2 licensed, so quite happily usable and developable. hls.js is a much lower-level piece of technology. It's really powerful, but all it does is add HLS playback to the video element; it doesn't give you anything more than that. You're still in complete control of your styling and everything like that, and it's much lighter weight. We could extend it really easily to do WebM HLS, that would be no problem whatsoever, and it has a great license as well. It depends on the level of abstraction you're interested in working at, really. So, where does that get us with our libre video chain? What we could do today is VP9 and Vorbis in WebM, using Video.js with an ogv.js polyfill, and that would get us to about 90% of desktop users and, I think, at least 80% of mobile users. We actually get Internet Explorer for free with this, because it totally works with ogv.js. So it gets us a chunk further. That, in my opinion, isn't bad. Being within 10% of the proprietary stuff is, for me, a win; I wasn't expecting to get anywhere near that close. But the next thing we have to do is talk about adaptive bitrate: decide which direction we're going to go in, and also put that adaptive bitrate technology into the polyfills for it to work properly.
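On the claim that extending a player to WebM HLS would be easy: the manifest side really is small. As a feel for how little machinery is involved, here's a rough sketch of pulling the rendition list out of an HLS master playlist (simplified: it only extracts BANDWIDTH, and assumes each #EXT-X-STREAM-INF tag is followed by its URI line; real players handle far more attributes):

```typescript
interface Variant {
  bandwidth: number;
  uri: string;
}

// Minimal master-playlist parse: pair each #EXT-X-STREAM-INF tag
// with the URI on the following non-blank, non-tag line.
function parseMasterPlaylist(text: string): Variant[] {
  const lines = text
    .split(/\r?\n/)
    .map((l) => l.trim())
    .filter((l) => l !== "");
  const variants: Variant[] = [];
  for (let i = 0; i < lines.length; i++) {
    if (lines[i].startsWith("#EXT-X-STREAM-INF:")) {
      const m = lines[i].match(/BANDWIDTH=(\d+)/);
      const uri = lines[i + 1];
      if (m && uri && !uri.startsWith("#")) {
        variants.push({ bandwidth: Number(m[1]), uri });
      }
    }
  }
  return variants;
}
```

The hard part of WebM HLS isn't this; it's the segment handling and the container-specific demuxing underneath, which is where the real extension work in something like hls.js would go.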
Now, I promised to mention AV1 in my talk. There really hasn't been much AV1 chatter this year; last year it was everything. So, does AV1 help with this? Maybe. Hopefully. I really hope so. Chrome and Firefox have AV1 support; Firefox's landed literally one or two weeks ago, it was very recent. And Apple and Microsoft have joined AOM, so technically we have everyone we need in AOM to hopefully do something. Microsoft actually have an AV1 decoder you can get from their app store, and it totally works in Edge, so that gets us another step further. But Apple have not made any announcement of their intentions in joining AOM. They've just joined it; that's all we really know. They also announced last week that they're going to remove VP8 and VP9 support from QuickTime, which doesn't seem to indicate an aligned direction. If Apple did want to do AV1, they'd obviously have to have an adaptive bitrate standard. Apple's platforms are one of the places where you can just hand an HLS manifest to a device and it will play, with adaptive bitrate and everything, but they would need an ABR solution, and Apple are very heavily invested in HLS. So that would be HLS. Now, my bet, and this is a complete gamble, is that if Apple do invest in AV1, what we'd see is AV1 in fragmented MP4, served from an HLS manifest. My reasoning is basically that Apple are happy to incorporate pieces that are not open into their ecosystem; they're big investors in HLS and fragmented MP4, and the container not being open is fine by them. They're not aiming for a libre system. So that's what I'm expecting to see as their approach, but hopefully I'm wrong. All the code, the codec tests and everything I demonstrated, you can run yourself. There's a hosted version, and the code is on GitHub. If you're interested in talking more about the MPAG-SASH proposal, it's on GitHub as well; you can read it there.
And tell me how much of a terrible idea it is and how I'm going to get sued by MPEG. There's also a player playground. It has the two players I mentioned, plus a bunch of other players that you can go and play with, try your manifests with, and see if you can get playback going for anything you're experimenting with. I'm good on time, actually, but two more things; I want to mention two community things. One: I know Slack isn't hugely popular in the open-source community sometimes, but there is a huge community of video developers on Slack that is well and truly worth joining if you haven't. If you want to chat with me about this, I'm in the libre channel, and also the MPAG-SASH channel if you want to debate more about adaptive bitrate stuff. And also Demuxed: Demuxed is building the community around video engineering. We also do a two-day conference in San Francisco; Jean-Baptiste did a fantastic talk there this year about his new AV1 decoder. And we have a podcast that I'm involved in as well, so it's totally worth checking out the Demuxed community for more video stuff. Cool, I actually finished on time, so if anyone has any questions, put your hand up. Is there any question? We have time for quite a few questions, yes. So, in your proposal you mentioned VP9 and Vorbis. Is there any reason to pick VP9 over VP8 while we wait for AV1 to be ready, I guess? Only from a compression-ratio standpoint. Both of them will play back in roughly the same footprint if we're looking at the more modern browsers. The flip side of that would be that VP9 is significantly more computationally expensive when we're using ogv.js as a polyfill, so you might cap out at around 720p on a modern laptop, whereas with VP8 you might get higher resolution, but obviously at a higher bitrate. So I don't really think it matters which we do in the interim while we wait for AV1.
It's about whether we want the trade-offs on the polyfill or anything else. Maybe somewhat on the edge of this, but as you mentioned, Apple is dropping VP8 and VP9 from QuickTime, which sounded a bit surprising; I didn't know that. But in a slightly different area, real-time video in Safari, so WebRTC: the next stable version, because it's already in the technology preview, has added VP8 support. So I think that may also solve a bit of these issues, because if you can play a WebRTC stream in a video tag, at least we should be able to play anything, I guess. I would hope so. I actually haven't tested the preview version, but yeah, I would hope so. On them removing VP8 and VP9 support: they put out this announcement last week with a massive list of codecs that are going to lose any ability to be played in QuickTime, from the next major macOS revision, I think. I don't think anyone has fully understood what that means yet, because for several of them a lot of people are saying, it doesn't play this anyway, so what exactly is disappearing? So I don't know; it's a very strange, mixed message coming from Apple at the moment. I don't really understand it. I guess they just don't want to maintain some old code, and there is some really old stuff in there as well. Gotcha, thank you. Other questions? I clearly forgot to plant some questions earlier. Maybe my question is out of scope, but did you take into account any specific platform constraints in the whole video experience of the end user? No, the deliberate aim here was to hit as much as possible, to be as wide as possible. I didn't take into account anything around limited CPU or bandwidth or anything like that. But yes, a very, very fair point. No more questions? Well then, thank you, Phil.