Hello again, folks. Now let's talk about media. A couple of episodes ago I explained DASH, MSE and HLS, and since then I've been making another prototype. To be honest with you, I'm quite a way on with the build, but I was away at BlinkOn in the States last week, so the diaries are lagging behind reality a little bit. We'll come on to that in a while. What I want to do is talk, as I say, about media. I talked about DASH, MSE and HLS, and I've been making a prototype using the Shaka Player. So let's talk about that first of all. Why would I use the Shaka Player? Because if you've got MSE and you're swapping between different quality versions of your video, you have to do things like track the bandwidth, figure out where you can switch, fetch the next segment of your video, and all those kinds of things. That's actually a tremendous amount of work, and I don't think it's worth doing when you can get the Shaka Player for 43 kilobytes gzipped. That feels really small for what it's actually doing on your behalf. You should have a look through the docs; we'll link to the Shaka Player in the show notes, so you can see where to get it, have a look at the demo and all those kinds of things. It does loads, it's really quite small for what it is, it's got loads of unit tests, and it's just properly done. So that's what I've been using. Right. With that in place, how would we actually use the Shaka Player? That's what has been on my mind and that's what I've been working on. So let's have a look at what's on my screen. I've got a bunch of FFmpeg commands. I should explain what I'm trying to do: I'm starting with one video that's running at HD, and I want to make different quality versions of it. So I'm using FFmpeg.
You can see on screen I've got these FFmpeg commands, which are sort of reverse-engineered from looking around at documentation and other ways of doing things. I just chose a 720, a 540 and a 360. But let me show you this: there's a page on the YouTube Help that covers live encoder settings, but it's a useful reference for encoder settings, bitrates and resolutions generally. So for example, it says if you're going to do a 1080p video, you probably want it 1920 by 1080. Big surprise. But the bitrate would be 3,000 to 6,000 kilobits per second. If I were doing this for real, and not just in placeholder mode, sort of "let's just do a few versions and see how this works", I'd probably follow something like this: right, that's what a 1080p looks like, that's what a 720p should have, 480 and so on, all the way down to the ones I care about the most. So I'm just running FFmpeg to make these three different variants of my video. The next thing I'm using is a thing called the Shaka Packager. And that is quite a mouthful to say, Shaka Packager. It has a nice little takatakata to it. I really like it. We can say it one more time for you: Shaka Packager. There you are. What it does is take the videos that you've got, all the different variations, the 720p, the 540, the 360, and you give it a profile: whether it's live streaming or video on demand. I'm doing on demand; I haven't even thought about live streaming. You tell it where you want it to output the DASH manifest, because that's what it's going to create for you: it takes in all the video and so on, and splits things out into video tracks, audio tracks, different qualities and all the rest. And you tell it how much you want it to buffer, which in my case is three seconds. Just get me three seconds up front.
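The actual commands aren't shown in full here, but the shape of them would be roughly this. File names and bitrates are illustrative (the bitrates loosely follow the kind of table that YouTube Help page gives you), and the keyint options are the ones discussed in a moment. The commands are echoed as a dry run; drop the `echo` to actually run them with FFmpeg installed.

```shell
# Build three renditions of one HD source. keyint=24 at 24fps means a
# keyframe every second, which matters for segment switching later.
cmds=""
for spec in "1280x720 2500k" "960x540 1200k" "640x360 700k"; do
  set -- $spec  # $1 = resolution, $2 = video bitrate (illustrative values)
  cmd="ffmpeg -i source.mp4 -c:v libx264 -b:v $2 -s $1 -x264-params keyint=24:min-keyint=24:no-scenecut=1 -c:a aac -b:a 128k out-${1#*x}p.mp4"
  echo "$cmd"   # dry run: print the command rather than executing it
  cmds="$cmds$cmd
"
done
```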
Further down the line, there'll probably be some dancing I need to do around pre-buffering video content so that, for example, we can give you a flying start with video. But I haven't really got my head around that space yet. Like, how do we get the first five seconds of a video primed by a service worker so that we can just play instantly? I haven't really thought about that. But in the DASH manifest you can say: hey, when you first load this video, just try and get the first three seconds off the network. That's one thing. And then the segment duration that you can see here is, as I understand it, how often you can switch qualities. So I'm telling the Shaka Player, through the DASH manifest, that it can switch qualities every three seconds if it needs to. What's important about that is, when you reverse back up and look at the FFmpeg calls, you see there's this keyint, min-keyint, no-scenecut. When we take our original video and turn it into the different quality versions, we actually have to put in these things called I-frames. Not iframes like the HTML element; they're video I-frames. They're keyframes, that's what they are. So we need a keyframe, and in my case I'm doing one every 24 frames, which, because it's a 24 frames per second video, means I'm getting a keyframe every second, right? Every time we hit one of these keyframes is a safe point at which we could switch to another quality. So the segment duration should, if I understand this correctly, be a multiple of the keyframe interval. So if I've got a keyframe every second, I could switch every two seconds, four seconds, five seconds, as long as it's a multiple of my keyframe interval. Does that make sense? I hope so. And I have a segment duration of three seconds.
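That "multiple of the keyframe interval" rule can be sanity-checked with a few lines. This is just my reading of the rule as described, with the numbers from this video: 24fps, a keyframe every 24 frames, 3-second segments.

```javascript
// Returns true when a segment boundary always lands on a keyframe.
function isValidSegmentDuration(segmentSeconds, keyintFrames, fps) {
  const keyframeIntervalSeconds = keyintFrames / fps; // 24 / 24 = 1 second
  // Valid when the segment length is an exact multiple of the interval.
  return (segmentSeconds / keyframeIntervalSeconds) % 1 === 0;
}

console.log(isValidSegmentDuration(3, 24, 24));   // 3s segments on 1s keyframes: fine
console.log(isValidSegmentDuration(2.5, 24, 24)); // 2.5s doesn't land on a keyframe
```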
That's what I'm saying. Right. What I also do, as you'll see here, is three more packages: one for the 720, one for the 540 and one for the 360, because I was interested in whether I might need a separate manifest for the offline stuff, which I'm gonna come on to shortly. Right. So we've taken a video and turned it into three different quality versions. As I say, in production you'd go through that YouTube Help page and be like, okay, I need five or ten different versions and so on, and you'd probably set up a pipeline for processing your videos. And then we package them up into a DASH manifest, or have a DASH manifest which points to all these things. Let me show you what one of those looks like. Sample manifest full, that's the one. So this is what the Shaka Packager outputs for us, and you can see there's loads in here; it's a bunch of XML. But here's the minimum buffer time, which is that three seconds I set, and how long the video is: the media presentation duration is 71 seconds in this case. And then inside here you'll see these adaptation sets and these representations. What's crucially important is that you've got a representation for the 640 by 360, the 960 by 540 and the 1280 by 720. So you can start to see that it's listing out all the different qualities of video you can actually use for playback, which tallies entirely with what I passed to the Packager in the first instance. And it has really interesting things like the amount of bandwidth that a particular video quality is gonna need, the codecs it's using, and a bunch of other stuff as well. To be honest, there's a bunch of stuff in here I don't fully understand, like the initialization range. Not quite sure what that is. It's quite exciting, I'm sure you'll agree. Maybe not, I don't know. And then further down, we also have audio configurations as well. So again, we've got 360, 540 and 720 audio.
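For reference, a packager invocation that would produce a manifest like that looks roughly as follows. The flag names come from the Shaka Packager docs, but the exact set and the file names here are a sketch, not the command from my script. Again echoed as a dry run.

```shell
# One stream descriptor per input track; --mpd_output is the DASH manifest,
# --min_buffer_time and --segment_duration are the 3-second values above.
pkg_cmd="packager \
  in=out-720p.mp4,stream=video,output=video-720p.mp4 \
  in=out-720p.mp4,stream=audio,output=audio-720p.mp4 \
  in=out-540p.mp4,stream=video,output=video-540p.mp4 \
  in=out-360p.mp4,stream=video,output=video-360p.mp4 \
  --min_buffer_time 3 \
  --segment_duration 3 \
  --mpd_output manifest.mpd"
echo "$pkg_cmd"
```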
I have a feeling that in reality you probably only want one audio track, something like an MP3 or an AAC; my guess is the video quality is gonna change a lot more and be the more bandwidth-sensitive of the two. So you might just take one audio track and have loads of different qualities of video, but I don't know, I'm still figuring a lot of this stuff out for myself. Right, so let's take a look at some of the code that actually takes that DASH manifest and puts it into use with the Shaka Player. So this is the code, which is not particularly neat, I'll give you the heads up. It is very much prototype code, but hopefully we can pick out the bits that make sense for this particular context. Here we go. We have this initPlayer function, and all we do is wrap a video element in shaka.Player. That wraps around it and starts giving us additional things we can do, but we can also talk to the video element directly if we need to: .play(), seeking, .currentTime, all those things you normally do with a video. So it's kind of nice, actually. It's a really nice way of working: you still predominantly listen to events from the video element, but when you're actually setting it up and doing playback, the Shaka Player stuff kicks in and switches all the different qualities for you. It's pretty cool. I actually really like it. So the thing we do is give it the manifest, the one we created with the Shaka Packager; we hand it to the Shaka Player, and a lot of the API is promise-based. So we call it with the URL for the manifest, and it loads the manifest and the first little bit of video across all the qualities listed.
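The initPlayer flow described here boils down to something like the sketch below. It uses the promise-based shaka.Player API as described; the manifest URL and the rest of the surrounding setup are illustrative, not the actual prototype code.

```javascript
// Wrap a <video> element in a shaka.Player and load a DASH manifest.
function initPlayer(videoElement, manifestUri) {
  const player = new shaka.Player(videoElement); // wraps the video element
  return player.load(manifestUri).then(() => {
    // Manifest parsed and the first bit of each quality buffered; from
    // here the plain video element API (.play, .currentTime) still works.
    videoElement.play();
    return player;
  });
}

// Usage, in a page with the Shaka library loaded (URL is illustrative):
// initPlayer(document.querySelector('video'), '/videos/manifest.mpd');
```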
So it gets the first few seconds of the 720 and the 540 and the 360, whatever versions you've got, and then it fires the callback for the .then of the promise that says, okay, you're good to go. When you get that, you can start calling .play and that will work. Now, you can see I actually have a bit more code going on here, which we'll talk about in a minute, because I've been trying to get it to work offline as well. I've been at least partially successful in that, so we'll talk about that as well. But let me just show you what playback looks like. So don't talk, Paul. As you can see on my screen, there is the video, and if I go to the network tab you can see it requesting, at various moments in time, the 360 video, the 540, the 720. It's decided that my bandwidth, because it's coming from localhost, can support this 720, and off it goes. And let me show you this as well, because it's quite interesting: it makes what we call range requests. The request has a Range header in it, which, rather than saying give me all the video at once, says I want from this byte range to that byte range. The Shaka Player figures out that range request itself based on, I guess, the bandwidth and how big the segments are. It goes, well, I need that number of bytes next to make the next chunk of the video. So it makes this range request. There are other ways to do it. Instead of having one big MP4 file, or WebM, or whatever video file you've got, you can, as I understand it, do it in chunks and actually have individual files. That's also a valid way to do it. I've gone for one big file because I think I'm gonna have to pass one big video file across for Chromecasting later on, so it just made sense for me. I was just like, I'm gonna have one big video and make little chunked range requests off that.
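To make the range request idea concrete: it's just a normal GET with a Range header. The helper below builds that header for the next chunk; the chunk size here is made up, since Shaka derives it from bandwidth and segment sizes.

```javascript
// Build the Range header for a chunk starting at `offset`.
function nextRangeHeader(offset, chunkBytes) {
  // HTTP byte ranges are inclusive at both ends, hence the -1.
  return `bytes=${offset}-${offset + chunkBytes - 1}`;
}

// e.g. fetch(videoUrl, { headers: { Range: nextRangeHeader(0, 477 * 1024) } })
console.log(nextRangeHeader(0, 1000)); // "bytes=0-999"
```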
And as I say, that's working. So we make one of these range requests and the response from the server is over here. It looks pretty tiny, but you can see the size of it; the original file is much bigger than that, but you just get, say, 477K of the video, or however much of the video it thinks it needs. So that all works, and you can see as well that the bar here in the controls only fills up with the amount that's buffered. It's all good. That's working just fine. Now let me show you what happens if we try it again, but this time I throttle the network connection back down to regular 3G. As a guess, it will probably start by asking for the 720. Yeah, you see, it started asking for the 720 video, but soon enough I reckon it's gonna drop down and ask for the 360 when it figures out that the bandwidth isn't so good. And again, this is what the Shaka Player is gonna do for you, so I don't have to, which is brilliant, because it's code I wouldn't want to write. There we are, you see. Now the requests are for the 360 video and the 360 audio. That's what it's decided I actually want. Cool. So that's the DASH side pretty much sorted. Let's have a quick chat about HLS as well. In my script, further down, I take those 720, 540 and 360 videos and I call this mediafilesegmenter tool, which you can download from the Apple developer portal. What that does is slice and dice the different videos into particular formats for you. This M3U8, you can see, is like a playlist file. If we actually go and have a look, let's have a look: 720, M3U8. I don't really know what I'm looking at, other than I can see this is like three seconds and it tells you the sequence of files. You can see it makes all these files in the process of running mediafilesegmenter. So that does that bit.
And then after that, I move some stuff around, and then you call variantplaylistcreator, another one of the tools that comes with the Apple downloads. You pass it all the playlists you've made, the three in my case, and it makes one master playlist, which looks like this. And that is essentially, I guess, roughly equivalent to the DASH manifest. On the technical side it won't be, but: for Android and desktop Chrome and desktop Safari and so on, you can just use MSE with the DASH manifest, and that will just work. But for iOS Safari today, you do need to create one of these M3U8 playlists. So it's the equivalent, I guess. Good. On the code side, if for whatever reason we get an error from loading the manifest, which should happen, I believe, on iOS, because it will go, I've tried to load the DASH manifest and do all that stuff and it's just not working, I have a fallback that just says: okay, switch across to HLS. That's where you take that M3U8 file and just say video.src equals the M3U8. So it's actually very straightforward to use, very convenient, and iOS Safari will just kind of go, oh, sure, and start adapting and getting the right things for you. So you don't actually have to do an awful lot with HLS other than point your video element at the M3U8 playlist file. I think the upside of DASH is that the Shaka Player can manage the bandwidth, and you get to have a little bit more of a say on what bandwidth you think it should assume and which track you think it should lock into, and all those kinds of things, if you want to. With HLS, it's much more like the user agent just goes, I've got this. And if it decides to download the entire video, there isn't much you as the developer can say other than, sure. So it's worth knowing that that's what's gonna happen, even though there's nothing you can do about it particularly.
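The DASH-first, HLS-fallback logic described here might be sketched like this. The function name and URLs are illustrative; the key move is exactly as described, catching the failed DASH load and pointing the video element at the M3U8 instead.

```javascript
// Try DASH via Shaka first; fall back to native HLS on failure (e.g. iOS).
function loadVideo(video, dashManifestUrl, hlsPlaylistUrl) {
  const player = new shaka.Player(video);
  return player.load(dashManifestUrl).catch(() => {
    // MSE/DASH unavailable: let the browser's native HLS support take over.
    video.src = hlsPlaylistUrl;
    return video.play();
  });
}

// Usage (illustrative paths):
// loadVideo(document.querySelector('video'), '/manifest.mpd', '/master.m3u8');
```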
On the HLS side, the only way I've found to at least tweak that downloading a little is to set preload equals metadata on the video element. If you don't set the preload attribute, it will go off and download all of the content, which may or may not be what you want. So there you go. Right, I'm aware that I've taken up quite a bit of your time, but I still wanna show you the offline stuff, so I hope you're sticking around and enjoying this. Let's have a quick look at what we require for the offline stuff to work. So at the moment everything's just going off to the network, and the idea was: what if we could pre-fetch everything into a cache, so that when the request goes out for that part of the MP4 file, the DASH version say, the service worker can be like, oh yeah, I got this, this is already in the cache. I'll handle that range request: get the item out of the cache, get that bit of the file and send it back. So it's transparent as far as Shaka is concerned. We just go, get that video, and it will go off and do it. Enough chatter, let's look at a little bit of, hang on, let me get my words right. Let's look at some code. There we are. You wouldn't have thought it was so difficult to say, but apparently for me, yes. There we are. Service worker: this is my service worker as it is today, and this is really the interesting bit. So imagine that this fetch handler is called every time a request is made from the page, okay? Cool. What we do is a match against the caches, and if there's a match, we'll handle it. If not, we'll just do a pass-through. So what we need to figure out is: in what situation would caches.match give us something back? And the answer is when there is something in the caches. So if we went off and got the video and audio and everything else that we'd made, and put those into the cache, then we'd get a cache hit.
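The match-or-pass-through shape of that fetch handler looks something like this. This is a sketch of the pattern as described, not the actual service worker from the prototype; note the cache hit here is returned as-is, which, as comes up next, isn't enough on its own for ranged requests.

```javascript
// Decide between a cache hit and the network. Factored out so the
// decision logic is testable outside a service worker context.
function handleFetch(request, cachesLike, fetchFn) {
  return cachesLike.match(request).then((cached) => cached || fetchFn(request));
}

// Register the handler only where a service worker global exists.
if (typeof self !== 'undefined' && typeof caches !== 'undefined') {
  self.addEventListener('fetch', (evt) => {
    evt.respondWith(handleFetch(evt.request, caches, fetch));
  });
}
```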
One thing I discovered, and I was very grateful to Jake Archibald for this (if you don't know him, you should follow him on Twitter and all that), is how to actually handle the ranged response; he was helping me figure it out. You remember I said that when we make the request, it's a ranged request: we only ask for a certain byte range. But when we get a cache match, we get a match for the whole thing. So if we just said, well, return whatever the cache has, rather than returning just those bytes, we'd actually return the entire video, which is not what was asked for. We were only asked for bytes from, say, zero to a thousand. So we have to do that slicing ourselves. What I do is take Jake's code here, and we create a ranged response. That takes in the request, finds the match in the cache, and calls .arrayBuffer() on it, which gets us the entire video; we'll come back to that in a moment. Then we have some code that looks through the Range header and goes, okay, which bytes did they actually want? And that range gets sliced out of the array buffer. We set the status to 206, which is Partial Content, the status that says you're not getting the whole thing, you're just getting the little bit you asked for with the range request. And we set a couple of other headers, Content-Length and Content-Range, and we return that instead. For me, this is a bit unpleasant. Not only because there's a bunch of code in here that could error, I guess, could go wrong, but mostly because calling .arrayBuffer() on the response gets me the whole video. So if it's a one gig video, calling .arrayBuffer() gets me a one gig array buffer in memory for every request.
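The shape of that ranged-response trick is roughly the following. To be clear, this is a sketch based on the description above, not Jake's actual code; the parsing is deliberately minimal.

```javascript
// Work out which bytes a Range header is asking for.
function parseRange(rangeHeader, totalBytes) {
  // Handles "bytes=0-999" and the open-ended "bytes=1000-" forms.
  const [, start, end] = /bytes=(\d+)-(\d*)/.exec(rangeHeader);
  return {
    start: Number(start),
    end: end === '' ? totalBytes - 1 : Number(end)
  };
}

// Build a 206 Partial Content response from a full cached response.
async function rangedResponse(request, cachedResponse) {
  // NOTE: .arrayBuffer() pulls the whole file into memory, which is
  // exactly the problem described above; fine for a prototype.
  const buffer = await cachedResponse.arrayBuffer();
  const { start, end } = parseRange(request.headers.get('Range'), buffer.byteLength);
  return new Response(buffer.slice(start, end + 1), {
    status: 206, // Partial Content
    headers: {
      'Content-Length': String(end - start + 1),
      'Content-Range': `bytes ${start}-${end}/${buffer.byteLength}`
    }
  });
}
```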
And the solution here, I think, is gonna be to slice and dice the video: instead of one big file in the cache, actually have lots of smaller chunks inside the cache, and just pull out the chunks we need to make the ranged response. We'll see how that goes when we actually get to production, how big the videos are and whether it's actually a thing, but I fully expect to have to do that. I should also mention, by the way, that there's an offline implementation already in Shaka, which uses IndexedDB to do its thing. So that takes care of all of this, but it doesn't use a service worker today; it's all in the foreground. It populates IndexedDB, and it also does other bits and pieces: if you've got EME and licenses, which again I don't fully understand, it will store those for you as well. Right. Let me just show you what it actually looks like when you do this. So when I hit cache video, hopefully... oh, it's set to regular 3G. Let's do no throttling. If only life was actually like that, where you just go, no throttling. You see, now we've got the cached video. So if I just clear that and start running it: now all the requests are coming from the service worker. Essentially, it's doing its job. The service worker is stepping in, and rather than going out to the network, it's actually creating the response from the cache. So that's a way to transparently provide the video to the Shaka Player. All this code, in fact the whole site's code at the moment, let me show you, is up on GitHub. You can get it at github.com/GoogleChrome/sample-media-pwa. We'll link to that in the notes below. If you want to have a play around with the prototype, I'll add a prototypes folder to it so you can at least see roughly what I've been doing and go from there. In the meanwhile, have fun. I'll be back very, very soon with an explanation of what I've been doing on the server side. Catch you later.
Hey, folks, thanks for watching. Don't forget that there is more content that you can find kind of over here-ish. And if you want to subscribe, there's probably a button. Oh, maybe there, maybe somewhere around there. Click that if you've not done that. Brilliant.