Hello, everyone, and welcome back to The State of the Web. My guest today is Addy Osmani. He's an engineering manager on the Chrome team focusing on web performance, and today's episode is all about JavaScript. It makes the web go round, but too much of it can actually be a bad thing. This is the state of JavaScript. Let's get started.

Addy, thanks so much for being here.

Thank you for having me.

So how long have you been doing web development?

Whoa, way too long. It's been about 17 years now.

And I understand that you used to build your own browser. Is that true?

Yeah. Instead of learning web development, I decided to build a browser to understand how browsers work. So I built something that supported HTML parsing and CSS, and my path to JavaScript was building a very small engine to understand how it works.

So back then, what was the state of JavaScript like? Was it just being used as a toy for DHTML?

JavaScript was being used for adding rainbows and sparkles to your web pages. It wasn't being used to build really complex apps. And that's kind of where we're at right now, in the post-Gmail era: we're building increasingly complex applications and sending more and more JavaScript down to users.

So how do developers know if they have a JavaScript problem?

Building interactive experiences on the web today usually involves sending JavaScript down to users, often too much of it. If you're sending too much JavaScript down, the CPU has to spend a lot of time processing it, and that can take a very long time on mobile devices. If you're on a slower network, it can also mean the JavaScript takes a while to download. The way I like to think about measuring whether JavaScript is a problem for your site is a workflow of measure, optimize, and monitor. So, for measuring whether JavaScript is a problem,
I recommend folks check out Lighthouse. We've got a number of performance audits in place there. You can run Lighthouse from the Chrome developer tools; it's also supported on WebPageTest. That will give you some insight into whether you're sending way too much code down to your users. If you are, it'll tell you whether JavaScript boot-up time is an issue, and it'll give you a few ideas for things you can do to reduce that amount of script.

Now, the reason JavaScript has such a cost is that there are a number of different steps a browser has to take to process it. If you're loading up a web page, that JavaScript first needs to be fetched over the network. Then it needs to be parsed by the browser, and parsing can take a very long time on a slower CPU. Then it has to be compiled, and then executed. For many sites shipping megs and megs and mountains' worth of JavaScript down to users, that can take seconds and seconds of time, and that's not so uncommon: in the HTTP Archive, we've seen that at the 90th percentile, sites are actually shipping more than a megabyte of JavaScript, and that's gzipped and minified.

Yeah, so that's a lot of code.

That's a lot of code. And as you said, that's minified and gzipped; uncompressed, it's probably two or three times that. So a lot of people are having the browser deal with megs of JavaScript that it literally has to process.

Agreed.

Yeah. But the good news is that the median is only about 400 kilobytes. That's still a huge amount of code, but luckily it's not in the megabytes.

So what guidance would you give to developers to use JavaScript responsibly? Throw it all out?

No, you can't throw it all out. What I recommend is to only ship the code that your user needs for the current page or the current route. Very often, sites will bundle all of their JavaScript into one big monolithic blob and ship that down. Instead of doing that:
What if you broke it up into smaller pieces? Just ship the code needed for the current page. Once that's fetched, you can use techniques like prefetching and precaching with a service worker to start lazily loading in the rest of the code. The benefit is that the user can start interacting with the experience very early on, and if they then go and interact with the rest of the pages on the site, you've had the opportunity to get that code into the cache, and maybe it's a little bit quicker.

So in general, try to do that. The technique you'd use here is called code splitting, and it's supported by lots of tools: webpack and Parcel support it, and it's possible to code-split using React, Vue, Angular, Preact, and so on. Most popular frameworks support this. I would love to get to a point where more frameworks are code splitting by default, or at least encouraging you to shape your code so that you're not bundling everything into a single file. But code splitting is the first big one.

I think another thing you can do is take advantage of techniques like tree shaking. Tree shaking, although it sounds a little violent, is the idea of removing unused imports from your code. If I'm importing a library, some piece of UI, say (maybe I'm importing Bootstrap, maybe I'm importing something else), I probably don't need everything it provides. So tree shaking is the idea of only including the imports you're actually using and stripping away everything else, to reduce the overall amount of script in the experience.

There are lots of other things we notice. Most of the sites that I trace include polyfills that are probably not needed in modern browsers. A common one is that people still polyfill JavaScript promises, and promises have been well supported for a number of years.
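A cheap way to avoid that cost is to feature-detect before fetching a polyfill, so browsers that already have the API never download, parse, or compile it. A minimal sketch; the loadScript helper and the polyfill URL are hypothetical, not from the episode:

```javascript
// Only load the Promise polyfill when the environment actually lacks it.
function needsPromisePolyfill(globalObj) {
  return typeof globalObj.Promise !== 'function';
}

if (needsPromisePolyfill(globalThis)) {
  // loadScript is a hypothetical helper that appends a <script> tag;
  // only older browsers ever reach this branch and pay the cost.
  // loadScript('/polyfills/promise.js');
}
```

The same pattern applies to other commonly polyfilled APIs (fetch, IntersectionObserver, and so on): probe first, and ship the fallback only to the browsers that need it.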
I promise. But audit the polyfills you're shipping down, and check the weight of the libraries you're importing. One thing folks can do is use tools like source-map-explorer or webpack-bundle-analyzer. These will give you a really pretty graph showing: OK, here's your application logic, and here are all the libraries you're importing. Maybe it'll highlight opportunities to remove duplicate code, because sometimes we import things multiple times by accident, or opportunities to swap the libraries we're using for much smaller ones. A popular example: if you're building an app that works with dates in any way, people will reach for Moment.js, and Moment.js with its locales has a heavy weight, sometimes a couple of hundred kilobytes. You could switch that out for something like date-fns, which is much smaller and, for the most part, has the same capabilities. So keep an eye on the cost of your script, because there are usually a few things you can do to bring that size down.

So I have to ask: what is the state of third parties in JavaScript? Ads and the like.

Third parties are an interesting one. They've got this reputation, and a well-earned one, of quite often tanking web experiences, and that's particularly a problem with publisher sites and news sites. That said, we often see a lot of first-party code still being a big issue. As we increasingly build experiences using JavaScript, there's going to be a balance of first-party code being a big issue and third-party code being a big issue.

For third-party code, it's really important, first of all, to again audit the impact of those experiences. There's a chance your site is using an ad network of some sort, and maybe you don't have as much control over the type of content
they're going to pull in. What I would do there is take a look at whether there are options to lazy load some of those frames and embeds, so that users aren't paying their cost while the site is just loading up. We're exploring a few ideas in Chrome at the moment to try to help with this, things like adding platform support for lazy loading of images and iframes. I'm hopeful that will help. I'm hopeful that future ideas like feature policy, where you can control what a third party is allowed to do, will also help a little bit here. And in general, just try to audit those third parties as much as possible and check whether they're actually providing value to the site.

When you're trying to nail down which specific third party is responsible for a lot of script, the Chrome DevTools have support for a feature called third-party badging. If you record a new profile using the Performance panel, open up the page, and take a look at your trace, it will show you all of the different brands or companies that have loaded third-party script into your site. You can then organize it bottom-up, or by other ways of visualizing the data, and see who has actually been spending a lot of time in script, then figure out whether there are ways to send feedback to those ad providers, or options to switch to a provider that won't tank performance quite as much.

So, assuming a developer can improve their JavaScript and their web performance, how can they ensure that they stay fast? Because regressions happen; requirements change; the more they touch their code, the less likely it is that it's going to stay fast.
Yeah, absolutely. I think as an industry we need to shift towards adopting performance budgets. Something I commonly see is that an engineering team will get really psyched about improving their performance. They'll do a sprint where they try to make things better, they'll celebrate those wins, and then they'll come back a quarter later and suddenly everything is on fire. The reason for that is often that we're not enforcing budgets for how fast we've agreed, as an organization, the experience has to be. We need to get to a place where sites have agreed on, say, not shipping more than 170 kilobytes of minified and gzipped JavaScript for their experiences, or whatever number makes sense to keep your Time to Interactive in a good place.

Once you have those budgets in place, they can be enforced at the continuous-integration level. So if I'm an engineer working on a new feature and I submit it to my team, then hopefully, in Travis or something else, they'll be able to see in the pull request:
OK, the cost of this feature was so-and-so, and it has or hasn't broken the budget. And if it has broken the budget, I think that's an opportunity to empower project managers to make a call on whether the feature we're adding is actually worth the impact it's going to have on the performance of the site and the user experience. I feel like today we maybe aren't giving PMs as much data as they need to make those kinds of calls, so there's work to do on the engineering side and on the PM side. I also think we need to do some work to get businesses to understand the value of adopting performance cultures and keeping those budgets in place, because, as you mentioned, regression analysis is something that's kind of missing right now; you could do a lot more there.

For folks who want to adopt performance budgets for their sites, I'm happy to recommend checking out Lighthouse CI, and also SpeedCurve or Calibre. Those are third-party solutions, but they support setting performance budgets at a metrics level or a size level, and they'll give you alerting capabilities as well. So if your team happens to work on new features and breaks your JavaScript budget, you can start getting alerts in the middle of the night saying, hey, fix it. Which is slightly better than the situation we've got today.

Are there any changes browser vendors could make on the web dev tooling side?

Yes, absolutely. It's a really tricky problem to solve, right?
Because a browser isn't just going to completely disable JavaScript for everybody; that would break a lot of the web. What we can do is take different heuristics into account. So if you're on, say, an effectively slow 2G network, and we don't think you would otherwise have been able to view any content, one of the interventions we're exploring is the idea of disabling JavaScript, at least for a preview of the experience. Maybe we initially show you the site without JavaScript, let you see some content, and let you make a call on whether this is actually the thing you were looking for; and if you want, we'll show you a little info bar at the very bottom of the screen so you can view the original site and the original content. We're hoping to give people a slightly better user experience at the end of the day. Now, these types of interventions are something we're very careful about. We're trying to make sure that, as much as possible, this is only explored for sites that we know will probably be able to function without that much JavaScript in place. But we're exploring some ideas, and I think there's definitely a lot more that browsers can do in this space.

Well, Addy, thank you so much for being here. I'm going to go check out your browser and run some JavaScript benchmarks to see how it compares to Chrome today.

The answer is: it's going to be awful. Measure the cost of your JavaScript and try to figure out if it's a problem for your users.

We're going to have links to everything we talked about in the description: Lighthouse, WebPageTest. Click on those, measure your JavaScript, and stay fast. Thanks for watching. See you next time.