I am part of the Chrome Media team. Our job is to make video and audio on the web just fantastic, and we want as many sites delivering media to as many users as possible. To that end, there have been a lot of new web APIs over the last few years, especially lately for mobile media playback. And it's really exciting for us when we see sites using these APIs. For example, Forbes launched a progressive web app, and when developing their narrative for it, they made video an integral part of the experience. iHeartRadio launched a web streaming experience, and using the Media Session API and service workers, they were able to give their users the experience that they expected. And then at the beginning of this month, JioCinema launched a PWA for media playback. It's only been a few weeks, but they're already seeing 10% more session time in their PWA, on average, than in their native app. So there's a lot of activity going on in web media, and especially mobile web media, right now, and we've seen growth in mobile web media watch time. In fact, per-user media session time has increased by double digits over the last six months in Chrome on Android. So what are we going to talk about? This is the last talk of the day, and we're going to do it in three sections. First, I'm going to give you a few updates from the Chrome Media team. Next, Francois is going to join us, and he's going to talk about some practical applications of media and how to get things done. And finally, the team behind the Voot.com progressive web app will talk about their PWA experience, and they're also going to talk about some new work they're doing for offline media playback. The bus icon will make a little more sense once they start talking about what they're doing, but it's really interesting, because it's even providing media to users who have no internet access at all. So stick around for that part. It's a really neat application of the web. 
So let's start with a couple of updates. First, let's talk about autoplay. It's probably not a big secret to anybody in the room that one of the most common complaints about Chrome is when audio starts playing and the user didn't want it or didn't expect it. It is just a common complaint, and frankly, it's one of the reasons that users install ad blockers. It's also been difficult for developers, because the policies haven't always been consistent between desktop and mobile. Now, the fact is that autoplay has a lot of great applications when users are expecting it, and so the team has been working to unify everything and make it easier for everybody. Looking at what's best for the user and what's best for developers, the primary goal was to unify everything across desktop and mobile, so as we talk through these policies, keep in mind that they apply to both, to make everything as predictable as possible. On all platforms, muted media will always be allowed, including media in iframes: you can always autoplay muted media. Unmuted media will be allowed in three situations. First, if the user has already interacted with your site, meaning a tap or a click: if they tap or click on one page and then go to another page, media on that page can play with sound. Second, on desktop, if the user has interacted with the site a lot with audio, we're going to assume that audio is allowed for the site. And finally, on mobile, if the site has been added to the home screen, that's a pretty good indication that the user doesn't mind autoplay with sound. In terms of code, and how you detect whether or not autoplay is allowed, you can check the promise returned by play() after you try to play media with audio. If that promise is rejected, then you know that only muted autoplay is allowed. 
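That check can be sketched like this; the function name and the muted fallback behavior here are illustrative, not a prescribed pattern:

```javascript
// Try unmuted playback first; if the autoplay policy blocks it, fall back
// to muted autoplay so the user can self-select into the audio experience.
// Resolves to "unmuted" or "muted" depending on what was allowed.
function attemptAutoplay(video) {
  return video.play().then(
    () => 'unmuted',
    () => {
      // The play() promise was rejected: only muted autoplay is allowed.
      video.muted = true;
      return video.play().then(() => 'muted');
    }
  );
}
```

In a page you would call `attemptAutoplay(document.querySelector('video'))` and, in the muted case, surface an unmute control.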
One of the things that's important to know as well is that iframes can always do muted autoplay, but the embedding site can grant an iframe access to its own autoplay permission. That's going to be done using an attribute. The attribute name is not yet finalized, but it's going to be something like gesture="media". So, two things to know about autoplay before we move on. Number one, what is the best approach? What we recommend, based on our conversations with users and the work that we've been doing, is muted autoplay: allow users to self-select into the audio experience. That's not going to work for every site; in some cases, click-to-play may be more applicable, but that is our recommendation in general. The other thing we recommend, because there are a lot of technical details here, is to keep an eye on the autoplay policy page, the bottommost link there. That's where you'll see updates as the dates and details around things like the attribute change. Next, let's talk about Shaka Player. For those of you who aren't familiar with it, Shaka Player is a free, open-source player that lets you do adaptive bitrate streaming fairly easily and gives you smooth video playback. Instagram is going to go into a little more detail about how they use Shaka in their talk tomorrow, but for now I'm going to talk about what's new in the latest release. Shaka 2.2.4 was released within the last 48 hours, and it includes all of the features here. It includes support for Apple HLS for on-demand content, with fragmented MP4. It also allows sites to customize the rendering of WebVTT and TTML captions. And finally, it allows offline protected media playback; you can try that today in Chrome 62 for Android and later. Coming soon is Apple HLS support for live video, including support for MPEG-2 transport stream files, a fairly common HLS format. 
They're also working on adding Background Fetch for offline, and that's really going to round out the offline story by giving users easier download flows. And finally, Shaka has a demo player, and the demo today includes examples of how to do things like fullscreen and rotate, which Francois will cover; they're also working on adding progressive web app features to the demo to give you reference code. So those are the updates. Now let's switch over to what sites are doing today. The web is really great for media. You can write a great media experience once, and it works on desktop and it works on mobile. Typically what you want is fast playback, a great UI, and for people to be able to access your content anywhere, including offline. Not everybody knows how to do that, especially on mobile, so I'm going to hand it over to Francois. He has spent the last six months working on articles and examples on how to deliver a great media experience on mobile. Thank you, John. Hello. So here's a fact: after two seconds of buffering, users start dropping off at around 6% per second. In other words, a faster playback start means more people watching your videos. So let me walk you through two techniques you can use today on the web to accelerate your media playback on first load by actively preloading resources. First, if your video source is a single file hosted on a web server, you can use the video preload attribute to provide a hint to the browser as to how much content to preload. Resource fetching will start only when the initial HTML document has been completely loaded and parsed, while the window load event will fire once the resource has actually been fetched. Setting the preload attribute to auto indicates that the browser may cache enough data that complete playback is possible without stopping for further buffering. There are some caveats, though. 
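For reference, a sketch of that markup; the file name is illustrative:

```html
<!-- preload is a hint only: the browser may fetch enough data to play
     through ("auto"), just the metadata ("metadata"), or nothing ("none"). -->
<video src="movie.mp4" preload="auto" controls></video>
```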
As this is just a hint, the browser may completely ignore the preload attribute. When the user has enabled Data Saver, for instance, Chrome won't preload anything, and on a cellular connection Chrome will only fetch media metadata such as dimensions, track list, duration, and so on. So here's the second technique, my personal favorite: link preload. Link preload is a declarative fetch that allows you to force the browser to request a resource while the page is still downloading, without blocking the window load event. Resources loaded with link preload are stored locally in the browser and are effectively inert until they are explicitly referenced in the DOM, JavaScript, or CSS. This code shows how to preload a full video on your website so that when your JavaScript fetches the video content, it is read from the cache, as the resource may already have been cached by the browser. If the preload request hasn't finished yet, a regular network fetch will happen. Note that I'm using as="video" in the link preload, as I'm going to use that resource in a video element; if it were an audio element, I would use as="audio". So what about preloading just a chunk of a video, you may ask? In that case, I would use as="fetch". This is how to preload the first segment of a video with link preload and use it with Media Source Extensions, also known as MSE. If you are not familiar with MSE, that's okay; let's simply say that this API is what makes adaptive media streaming possible on the web today. Netflix, Twitch, and YouTube use it extensively. For the sake of simplicity, the entire video has been split into smaller files: file one, file two, file three, et cetera. The goal is to feed all these chunk files to a video element so that playback is smooth. So here's what happens. 
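In code, that flow looks roughly like this; the codec string, the segment URL, and the idea of passing fetch in as a parameter are illustrative choices to keep the sketch self-contained:

```javascript
// Feed the first segment of a video to a MediaSource so playback can start.
// If the segment was preloaded with <link rel="preload" as="fetch">, the
// fetch below is served from the browser cache.
function appendFirstSegment(mediaSource, segmentUrl, fetchFn) {
  return new Promise((resolve, reject) => {
    mediaSource.addEventListener('sourceopen', () => {
      // The MIME type and codecs string must match your actual files.
      const sourceBuffer =
          mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
      fetchFn(segmentUrl)
        .then(response => response.arrayBuffer())
        .then(data => {
          sourceBuffer.addEventListener('updateend', () => resolve(sourceBuffer));
          sourceBuffer.appendBuffer(data);
        })
        .catch(reject);
    });
  });
}
```

In a page you would create the MediaSource, attach it with `video.src = URL.createObjectURL(mediaSource)`, and call `appendFirstSegment(mediaSource, 'file_1.mp4', fetch)`.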
When the media source is created and ready, I fetch the first segment of the video, which may already be in the browser cache thanks to link preload, and append that data to a source buffer belonging to the media source attached to the video element. And that is pretty much it. Last thing: as link preload is not supported everywhere yet, you should detect its availability with a snippet like this one, in order to adjust your performance metrics. Now, just because you know how to preload content to accelerate media playback doesn't mean you should always do it, no matter what. There are many, many signals in the web platform you can use to provide a delightful media experience to all users, including those with limited or metered network connections. They include the device battery, memory, network connection type, whether or not a video has a poster image, the available storage left, and whether or not Data Saver is enabled. I have created a tiny demo page that takes advantage of all these signals at this URL, and if you try it now, you'll see that the video may not preload for all of you. That's because we are all different, and beautiful, by the way. So let's start with the first signal: the device battery. As usual now, this is a progressive enhancement. If the battery is too low and not charging, let's not blindly assume that preloading a 1080p video will make users happy. We could either lower the quality of the video or, even better in my opinion, not preload at all. If the user is on a low-end device, you may want to be kind and not preload any media; just wait for the user to ask for it, as the device may not be able to handle everything smoothly. I assume the user is on a low-end device if its device memory is less than one gigabyte, and as you can see, it is pretty easy to get that information. Note that a Device-Memory HTTP header is also available in Chrome. 
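As a sketch, the battery and device-memory checks just described can be combined into a single preload decision; the one-gigabyte cutoff is the one from the talk, while the 20% battery threshold, the function name, and its shape are illustrative:

```javascript
// Decide whether preloading media is a good idea on this device.
// batteryLevel is 0..1 (from navigator.getBattery()); deviceMemory is in
// gigabytes (navigator.deviceMemory, which also has an HTTP counterpart,
// the Device-Memory client hint, for server-side decisions).
function shouldPreloadMedia({ batteryLevel, charging, deviceMemory }) {
  // Low battery and not charging: don't burn power on speculative fetches.
  if (!charging && batteryLevel < 0.2) return false;
  // Low-end device (< 1 GB of memory): wait for the user to ask for media.
  if (deviceMemory !== undefined && deviceMemory < 1) return false;
  return true;
}
```

In a page, the wiring would look something like `navigator.getBattery().then(b => shouldPreloadMedia({ batteryLevel: b.level, charging: b.charging, deviceMemory: navigator.deviceMemory }))`.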
That means that if your web server knows how to handle this HTTP header, also known as a Client Hints header, you can serve an appropriate response based on the device memory. Estimating how much available space is left on the device is crucial, and if you know there's not enough to preload or store your video, you should stop right away. You may also use that information to gray out a "make available offline" button. That is also why I recommend you always provide a way for users to clean up all their stored data on your website. Now, if the user is on a cellular network connection, you should assume that network data is not infinite, and that they may even have to pay for it. So please be mindful: use the Network Information web API to detect the user's connection type and act accordingly. Hopefully a dedicated metered property will be added to this API in the future to make this even more reliable. And finally, here's one way to detect in JavaScript whether the user has turned on Data Saver in Chrome: all you need to do is create a video element, set its preload attribute to auto, and check whether the reflected value is none. I know it could be better, and it will be: a saveData boolean attribute is in the process of being added to the Network Information API, so fingers crossed. By the way, there is also a Save-Data HTTP header available in Chrome, so you might want to check for it, if your web server supports it, and serve an appropriate response. Now, a great media user experience relies on many progressive features, and I'm here today to tell you that the web platform is ready. And I'm really excited, because it's actually so simple. There are a lot of cases where a user may simply want to listen to your content, and if that is a primary use case for you, you should definitely use the Media Session web API. This API allows you to put some custom media metadata and an image in a notification accessible from the lock screen, as well as on wearable devices. 
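A minimal sketch of what that looks like; the titles, artwork URLs, and the ten-second seek offset are illustrative, and the API objects are passed in as parameters only so the snippet stays self-contained:

```javascript
// Sketch of wiring up the Media Session API. In a page, `mediaSession` is
// navigator.mediaSession and `Metadata` is the MediaMetadata constructor.
function setUpMediaSession(mediaSession, Metadata, audio) {
  if (!mediaSession) return false;  // progressive enhancement
  mediaSession.metadata = new Metadata({
    title: 'Episode 42',            // illustrative metadata
    artist: 'A Podcast',
    album: 'Season 2',
    artwork: [
      { src: 'artwork-256.png', sizes: '256x256', type: 'image/png' },
      { src: 'artwork-512.png', sizes: '512x512', type: 'image/png' },
    ],
  });
  // Respond to the controls exposed in the notification / lock screen.
  mediaSession.setActionHandler('play',  () => audio.play());
  mediaSession.setActionHandler('pause', () => audio.pause());
  // The one-liner seek handlers: just adjust currentTime.
  mediaSession.setActionHandler('seekbackward', () => { audio.currentTime -= 10; });
  mediaSession.setActionHandler('seekforward',  () => { audio.currentTime += 10; });
  return true;
}
```

In a page: `setUpMediaSession('mediaSession' in navigator && navigator.mediaSession, window.MediaMetadata, document.querySelector('audio'))`.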
It is also nice in general because users can tell what's going on on their device and can easily control media playback. One cool thing I've noticed recently is that it also works with the Google Assistant on Android: users can simply say, "OK Google, pause music" or "fast forward", and it just works. So let's have a look at how to use the Media Session web API, if you will. As I said before, this is a progressive feature, so it starts with an if statement. If the browser supports it, you can provide some media metadata such as the title, artist, album, and a list of artworks. You can specify many more artwork sizes than I did in this snippet; the browser will always pick the one that is best for the user's device. I would suggest, though, that you always provide 256×256 and 512×512 pixel images, as they are the most common sizes on Android. Once you've provided some metadata, you may also want to respond to the media controls exposed in the notification, such as seek backward and forward, play, pause, next track, and previous track. For that, all you have to do is set up some action handlers, so that when these actions happen, you can take care of them. Now, if you have some custom media controls in your web page, you can make sure the state of your UI is reflected in the notification by overriding the playback state to playing or paused. I personally really love this API because it is simple and powerful at the same time. Let me show you, for instance, some custom action handlers for seek backward and forward. This code is pretty simple to understand, right? These one-liner functions simply adjust the currentTime property of a media element, and that's all. I'm sure you would love this feature in your favorite podcast web app. Now, here's another key part of a perfect media user experience, and we call it rotate-to-fullscreen. As the user rotates their device into landscape, let's be smart and automatically request fullscreen on the video to create an immersive experience. 
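A sketch of that rotate-to-fullscreen behavior; in a page, `orientation` is `screen.orientation` (still prefixed in some browsers) and `doc` is `document`, passed in here only so the snippet stays self-contained:

```javascript
// Rotate-to-fullscreen: go immersive in landscape, leave in portrait.
function enableRotateToFullscreen(orientation, video, doc) {
  if (!orientation) return;  // progressive enhancement: API not available
  orientation.addEventListener('change', () => {
    if (orientation.type.startsWith('landscape')) {
      video.requestFullscreen();        // rotated to landscape: go immersive
    } else if (doc.fullscreenElement) {
      doc.exitFullscreen();             // back to portrait: leave fullscreen
    }
  });
}
```

In a page: `enableRotateToFullscreen(screen.orientation, document.querySelector('video'), document)`.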
Note that videos with no custom controls get this behavior for free. So how does that work? Let's use the screen orientation web API, which is sadly not yet supported everywhere and is still prefixed in some browsers at this time; thus, this will be another progressive enhancement. As soon as you detect that the screen orientation changes, request fullscreen on the video if the browser window is in landscape mode; if not, exit fullscreen. That's all. Eight lines of JavaScript, and all of a sudden you can make the media user experience significantly better on the mobile web. What about out-of-focus media playback? When you detect that your web page, or the video in your web page, is not visible anymore, you may want to update your analytics to reflect this, or pick a different video track of a lower quality, or pause the video, or even show some always-on-top custom controls to the user. To illustrate this, let me share with you two APIs you can use today. With the Page Visibility web API, you can detect the current visibility of a page and be notified of visibility changes. You could simply pause the video when the page is hidden; this happens when the lock screen is active, for instance, or when the user switches tabs. As mobile browsers now offer controls outside of the browser to resume a paused video, I recommend you set up this behavior only if the user is allowed to play media in the background. And for info, when the page is hidden, videos with no custom controls are automatically paused in Chrome for Android. With the newer Intersection Observer API, you can be even more granular, at no cost. This web API lets you know when an observed element enters or exits the browser's viewport. In this code, the visibility of a small mute button is toggled based on the video's visibility in the page: if the video is playing but not currently visible, the button is shown in the corner of the page to give the user control over the video. Here's a tip for you. 
If you have a lot of videos in a page and are using an Intersection Observer to pause or mute off-screen videos, you should consider resetting the video source instead, as this will release significant resources in an infinite-scroll case. Whew, we've covered a lot. A great example of all these features is Voot.com, who launched a really cool media PWA earlier this year. So please welcome on stage Rajnil, who will share with us how it's done and their future plans. Rajnil, it's all yours. Thank you, Francois. Hi, I'm Rajnil. I am the head of Product and Technology at Viacom18 in India. We've had a great journey. We launched our OTT product, which is called Voot, in May last year. Today we are at about 26 million monthly active users, and everything they spoke about, about how to use the various APIs released this year, we have been able to use, and I think we have seen some significant improvement in the progressive web app experience itself. When we launched Voot as a progressive web app, we actually called it Voot Lite, and that is a product we are now focusing on extensively. The uptake we have seen in business metrics has been phenomenal, but I want to share a little bit about why we thought it was important to make a progressive web app. India is a large country, and as we reached the 26 million scale we're at right now, what we realized was that there were a lot of devices that were underpowered and did not have enough memory, and their owners still wanted to access the videos they wanted to see. We realized that one of the best ways to do that was to offer something that did not require them to give up space on their phones for an app, and that is where the progressive web app came in. One metric I wanted to point out here is how our entire distribution across the web ecosystem has changed since we got the progressive web app on: it has gone from about 20% of daily actives to 50%. 
So that's a number that has helped us bring new traffic in, and obviously the marketing guys are really happy, because the cost of acquisition has gone down. I will be speaking today about two different offline examples, and the first is the one we're doing in the progressive web app. India has about one billion phones, but only about 260 million 4G and 3G connections. Offline is a way of life for users to be able to access content and play it later. Now, internet access might be available to these users at various parts of the day: maybe at the office, maybe at a friend's place or a coffee shop. So we have taken the offline experience from the apps and really brought it to the browser. A user can set something to download, and when the download is complete, they get a notification and can go off and watch it at their leisure. Now, to explain a little bit about how we have achieved this, I would like to call up Arik from our technology partner, Kaltura. Thanks, Arik. Hi, my name is Arik Geisler. I'm the VP of Technology at Kaltura. Kaltura is an end-to-end service provider for OTT experiences, powering large media companies such as Voot's OTT platform. At Kaltura Engineering, we have three pillars on which we measure our success. First is performance: providing top-notch performance across any device and any platform for all of our customers' users. Second is accessibility: giving our customers' end users the power to access content wherever they are, across any device, whether they're online or offline. Third is the user experience: providing a consistent, feature-rich user experience for our customers across any device and any platform. All of this while keeping in mind that most of our customers, such as Voot, require us to support premium-grade content, which usually requires some form of content protection. 
As such, PWA is a natural component for us to use to deliver all three pillars. For instance, up until now, providing offline support for content-protected, premium-grade content usually required a native application on top of ExoPlayer. By utilizing PWA capabilities, we can offer the same experiences using a simple mobile web application. So now let's look at a typical download experience in a mobile web app. First, of course, the user requests to download a specific piece of content from the web app. The web app, in turn, goes to the CDN or origin and downloads the respective manifest. If the content is protected, there will be another request to the license server, providing information on the user, the content, and the device, for the license server to decide what kind of license to provide for that specific piece of content. And now comes the interesting part, where the application passes the service worker a list of files to download in the background. The service worker, in turn, starts downloading these files from the CDN or origin without blocking the application at all. Once this fetch is completed, a notification goes out to the application, which in turn notifies the user, and now the content is available for offline playback. So now let's look at some debugging tips and some experiences we've had along the way while developing the PWA. First of all, keep in mind that if you check the "Bypass for network" checkbox in DevTools, the fetch event on the service worker will be skipped; it will not be invoked. Another thing to remember: "Update on reload" is very helpful in development mode; just make sure to uncheck it when testing, of course. Specifically regarding Background Fetch: today you still need to use a flag, the Experimental Web Platform features flag. 
Another thing to keep in mind while writing your code, and we'll see a snippet explaining this in the next slide, is that if a single fetch listener fails, the rest will not be invoked. You need to keep this in mind while developing. Developing a progress bar is very easy, very simple: the service worker can update the application on its progress, telling the application how much of the content has actually been downloaded already. Together with the total file size, you can create your own progress bar, telling your users how much of the content has been downloaded. Now let's look at some code. The first snippet is the invocation of the Background Fetch API. You can see a very simple wrapper function. The first thing we do is insert into IndexedDB an entry representing the asset that is going to be downloaded. This will be used later on in case the service worker gets torn down or fails for any reason: we still have a record of the downloaded asset, to re-invoke the download or to provide some information to the user. The next call is the very simple invocation of the Background Fetch API, providing an ID and, of course, the list of assets. The next snippet is the event that gets fired once the background fetch has finished. The first thing we do is retrieve the IndexedDB entry that we just saw being put in. Then we correlate that back to the background-fetched assets, to make sure we're talking about the same assets. And the next step is putting the downloaded assets into the cache, which means that the content is now available for offline playback for the user. 
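The shape of those snippets, roughly. Note that Background Fetch was behind a flag at the time and its API surface has since evolved, and `db` here stands in for a small IndexedDB wrapper, so treat this as a sketch of the flow rather than the exact Kaltura code:

```javascript
// Step 1 (in the page): record the asset in IndexedDB, then hand the URL
// list to Background Fetch. The record lets us recover if the service
// worker is torn down before the download completes.
async function startOfflineDownload(registration, db, assetId, urls) {
  await db.put('downloads', { id: assetId, urls });
  return registration.backgroundFetch.fetch(assetId, urls);
}

// Step 2 (in the service worker): once the background fetch finishes,
// correlate the fetched responses with the IndexedDB record and move them
// into the cache so the content is available for offline playback.
async function onBackgroundFetched(event, db, cache) {
  const record = await db.get('downloads', event.id);
  for (const response of event.fetches) {
    if (record.urls.includes(response.url)) {
      await cache.put(response.url, response);
    }
  }
  await db.delete('downloads', event.id);  // the record is no longer needed
}
```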
What happens after these functions is that the service worker notifies the application that the background fetch has finished, the IndexedDB entry that we no longer need is cleared, and the service worker is cleaned up. So you can see it's a very simple workflow for a very cool experience. Now I'd like to call Rajnil back to talk to us a little bit about another very cool PWA experience for Voot. Now I want to tell a different story about offline. India is a large country, and we know that there are about 1.3 billion people. One of the main challenges is that, because of the geography, a lot of places still don't have stable 4G networks that people can really stream content from. The other challenge, of course, is that data costs have been really high. Over the last year the costs have come down with the launch of a new 4G network, but data still costs money. We looked at solving the problem in a different way. We said, what if we did not use the internet to get the content to the people? Maharashtra is a state with about 112 million people, and we tied up with the state transport corporation there; they have about 17,000 buses. What we looked at doing was launching a product called Voot Go. So what you see here is essentially a bus stop, with a sticker that says log on to Voot Go. When you enter the bus, inside there are stickers telling users how to access it by typing in Voot Go. What happens is that they find a Wi-Fi network, served by proprietary boxes we created and placed inside the buses. These boxes can store about 200 hours of content and can be refreshed every time the bus goes back to the depot, and the users launch Voot Go. We customized the UI to make it extremely multilingual and easy to understand for people who might find English a challenge, which is often the case. So we launched with two languages. 
One is Hindi and the other is Marathi, and we created a dedicated space for kids' content as well. Today we are running this on about 10,000 buses, and we have seen some amazing results: about 40 minutes of content consumption per user, and about nine users per trip connecting to the platform and using it. What's cool about this is that when we reach the scale of about 17,000 buses, we'll be servicing about 6.3 lakh people every day, which essentially means about 2.3 billion passengers coming onto the buses. That's a large-scale deployment of a product to get content out, and of course it remains completely free for the users, because it is advertising driven. So what we're excited about is creating the best of the internet in a non-connected environment, delivering content at scale, and creating a completely new business model for media. That's what I had to share about what we're doing with Voot Go. With that, I'd like to call John back on stage. Thank you. Thanks, Rajnil. Well, as you can see, there are a lot of great opportunities out there for mobile web media, and we've listed some of the key APIs on the screen right now; for those of you taking pictures in the audience, and those of you at home, hopefully this is a good reference. A couple of notes on the APIs. Background Fetch is still in development: it is available in Canary right now, aiming to release early next year, and as mentioned, you need to turn on the Experimental Web Platform features flag if you want to try it. Offline persistent license support is in Chrome 62 for Android, and it's coming soon to Canary 64 for desktop as well; if you're doing protected content offline, you'll need that feature. A couple of things we didn't mention in this talk but that are also worth looking at: if you want to send media to a television, please do look at the Cast API. 
It's easy to add to your site. And for anybody who has not yet seen HDR video: tomorrow at the demo pod we will have VP9 10-bit playing on an HDR screen, so that you can actually see the future of some of the great video quality, both in terms of dynamic range and wide color gamut. So this is all coming; it's really exciting stuff. Ultimately, the web is more ready and more capable for media playback than it was even a year ago, and that's great for users, and it's great for sites, because it makes media just a single click or a URL away. We're excited to see continued growth in this area. So with that, thank you for your time. Myself, Francois, and the team will all be available to take questions afterwards, and we look forward to seeing you there. Thank you.