Thanks very much everyone. I don't know about you, but I've thoroughly enjoyed today. So what I want to do is start by very quickly putting your hands together for all the organisers, the volunteers, basically everyone who's come together for WordCamp Europe.

At around 9:30 on the night of October 29, 1969, a group of researchers from UCLA sent the first message over the ARPANET to the Stanford Research Institute, just south of San Francisco. And the message they sent, about half the length of California away, was simple: "login". With the sending of a five-letter message, the path to the internet, and eventually the World Wide Web, had begun. It wasn't an auspicious beginning, though, because the message that actually made it to Stanford was "lo". Performance issues have been with the net from the start, and it's performance that I want to talk to you about today.

The direction the web is going is something that concerns me a great deal. I started developing websites back in the 1990s, and back then we had a maximum download speed of 56 kilobits per second, which was very much theoretical at best. So back then we cared about every byte on the page. We had to; we had no choice. And I still think about bytes on the page in my role as a WordPress engineer at Human Made. If anything, working on high-end sites using WordPress increases the need to think about performance. Like it or not, WordPress has a reputation for being slow, and in some respects this is actually fair enough; in others it's an exaggeration. You could say the same about any CMS, about any other code on the net. You'll hear that the front-end code is slow, and this too is an exaggeration. The code itself isn't slow.
It's the quantity of code on the typical website that is slow. Simply put, there are too many bytes on the page.

Each month the HTTP Archive records the state of front-end web development, including how we as web designers and developers are changing the web, by collecting statistics on the Alexa top 1 million sites. In April last year we passed an average of 2 megabytes per web page, and we have not looked back. We're currently sitting at a little under 2.3 megabytes per page, and we will pass two and a half megabytes per page by the end of the year. That's for each and every page on the internet.

There are any number of statistics that I could show you to demonstrate what we as web designers and developers have done to damage the web. And yes, I said damage the web. I could show you the slowly but steadily increasing number of assets we're using, currently about a hundred per page, but I won't. I could show you the fifteen-fold increase in the use of web fonts, or any number of the 44 data points made available by the HTTP Archive, but I won't show you those either. I won't show you those because, while they're factual, they're all quite abstract.

So instead, together, we're going to do an experiment. What we're going to do is wait out the typical web page loading, in awkward, awkward silence. That's 4.2 seconds before render begins, 12.7 before visual completeness, and 15.2 for the page to finish loading. And it's worth remembering that this is the average load time on a desktop using a fast, wired broadband connection. And while it takes 15 seconds for the average web page to load, I think it's noble to have a site readable by the visitor within two seconds. And that's not some random number.
It's chosen for a reason: Walmart have found that their conversion rate declines exponentially the longer the user waits for the web page to load, with the first four seconds being the real killer. We're undermining our own return on investment.

Once you decide to improve performance, you need to know where your site stands so you can form some goals. You'll hear people talking about PageSpeed score as a convenient shorthand, and I include myself, but it's a single number that provides minimal insight. For more meaningful data, WebPageTest is the most convenient tool for measuring the effect of a change. It's pretty much just enter the URL and go. When you first run WebPageTest you're presented with some key metrics, and I've highlighted the three that I consider the most important: the time for rendering to start, for rendering to complete, and for the document to finish loading.

On WebPageTest I tend to spend more time looking at the timeline view. My biggest concern is how visitors experience the site loading, often referred to as perceived performance. As long as the text is readable and the calls to action are working within the user's viewport, it doesn't really matter what happens elsewhere on the page. I recently added a blocking HTTP request to my HTML header, switching between inline CSS and an externally referenced CSS file as it happens, and I compared the results. Adding a blocking request to my HTML header roughly doubled the time before the content was readable.

I view PageSpeed scores as a convenient clickbait headline, to be used as a tool when selling the idea of improving a site's performance, but it's the full data set from tools like WebPageTest that will help you determine the quick wins and the larger pain points, to form some longer and some shorter term goals. The various goals come from asking yourself, from design right through to launch: what will get my visitors reading quickest? What will get them acting on my calls to action quickly?
Which is where perceived performance comes in again, because sometimes the quickest path to your users reading will be counter-intuitive. It may even make the overall website larger. It may require you to break some rules. In WordPress parlance, it may require wilfully doing it wrong.

So let's write some code and break some rules. The first rule I'm going to break is such a common practice that it may not occur to you that there's even a rule to break, and that's to put JavaScript in the HTML footer. If you have a particularly large web page, this does mean that the JavaScript will load last, but your site needs to work without JavaScript anyway; JavaScript is really fragile, so this is just good practice.

Anyone who codes WordPress things regularly will have seen something like this: it loads a JavaScript file the WordPress way, and that last line there is particularly important, because it puts the script element in the footer of the HTML. This here is one of the most common WordPress code examples around. Welcome to WordCamp Europe, by the way. But then there's jQuery. Just name it as a dependency, and jQuery loads the WordPress way, so in theory we're doing everything correctly. And this is what you end up with in code.
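The slide itself isn't in the transcript, so here is a sketch of what that common "WordPress way" example typically looks like; the handle, file path and version are my own hypothetical choices, not the code from the talk:

```php
<?php
// Sketch: enqueue a script the WordPress way, with jQuery as a
// dependency. The handle, path and version are hypothetical.
function pwcc_enqueue_scripts() {
	wp_enqueue_script(
		'pwcc-scripts',                                    // handle
		get_stylesheet_directory_uri() . '/js/scripts.js', // source
		array( 'jquery' ),                                 // depend on jQuery
		'1.0.0',                                           // version
		true // the important last line: put the script element in the footer
	);
}

// Only hook when running inside WordPress.
if ( function_exists( 'add_action' ) ) {
	add_action( 'wp_enqueue_scripts', 'pwcc_enqueue_scripts' );
}
```

The final `true` argument is the footer flag; leaving it out puts the script element in the HTML header.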
Right up in the HTML header, jQuery sits proudly alongside its little friend jQuery Migrate, there to maintain backward compatibility. But it's not until the theme's functions file, right down in the HTML footer, that we finally put it to use. Loading jQuery and some other libraries the WordPress way blocks rendering, and it slows down your site. There are quite sound reasons that WordPress does this; it does it for backward compatibility. Which brings us to the first rule that I'm going to break: to start messing around with the core JavaScript libraries.

Here you'll notice that I'm setting the group on the jQuery family of scripts. This consists of jQuery core, jQuery Migrate, and jQuery, which is essentially an alias for both of them, and I'm setting these to load in the HTML footer by setting them to group one. So our HTML from earlier becomes much more pleasing, as the jQuery family of scripts loads in the HTML footer. And if you're handing over a site to a client, that's also relatively risk-free. In most cases the worst that can happen is that they'll end up enabling a plugin that puts it right back up in the HTML header, which is kind of where we started in the first place.

Loading JavaScript asynchronously tells the browser to get on with rendering, to deal with the JavaScript when the file is downloaded, and not to worry about file order.
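Backing up a step, the group change just described might be sketched like this; the hook and function name are my assumptions, since the code from the slides isn't in the transcript:

```php
<?php
// Sketch: move the jQuery family of scripts into the HTML footer by
// assigning them to group 1 (the footer group). Group 0 is the header.
function pwcc_jquery_to_footer() {
	$scripts = wp_scripts();

	// 'jquery' is essentially an alias for the other two.
	foreach ( array( 'jquery', 'jquery-core', 'jquery-migrate' ) as $handle ) {
		$scripts->add_data( $handle, 'group', 1 );
	}
}

// Only hook when running inside WordPress.
if ( function_exists( 'add_action' ) ) {
	add_action( 'wp_enqueue_scripts', 'pwcc_jquery_to_footer' );
}
```

As noted above, a plugin can undo this by re-registering jQuery in the header, which is why the change is relatively risk-free to hand over.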
The first script can run last, or the last script can run first. Asynchronous JavaScript simply relies on adding the async attribute to the script tag; with that attribute, the browser just deals with making it asynchronous. And it's possible to use a filter to add the attribute.

A script loaded asynchronously is easier to maintain if it's entirely self-contained, so if you don't need to support older browsers, you may want to move away from using a library. jQuery and some of the other libraries were so successful and so good, and continue to be, that many of their features have become a core part of the spec. So in this example behind me, the JavaScript file with the handle pwcc-scripts is entirely self-contained, so in my filter I add the async attribute to its script tag. Picturefill can be, and is designed to be, run asynchronously. And the final step is to apply it as a filter, as you can see on the last line there.

As asynchronous JavaScript doesn't block rendering, it can be loaded in the HTML header. We're basically doing the reverse of the change we made to jQuery: each of the scripts that load asynchronously can be assigned to group zero, the header. I've made some notes about asynchronous JavaScript at the URL on screen.

Before continuing, it will help a little to understand what happens when a browser requests a web page. A big problem with HTTP version 1 is that the browser initiates every single connection, and requires a separate connection to download every asset, which can make it very easy for the web browser to block itself from proceeding while it waits to collect the next asset. The HTML needs to download before the browser knows to download the CSS. The CSS needs to download before the browser starts rendering the web page. Once the CSS downloads, the browser discovers that there are fonts and images to load, and these block the loading of the JavaScript. The JavaScript triggers the loading of an iframe, with its own HTML, CSS, and images.
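Stepping back to the filter described a moment ago: a sketch of it, using the script_loader_tag filter, might look like this. The list of handles is hypothetical, and real WordPress output may format the tag slightly differently:

```php
<?php
// Sketch: add the async attribute to self-contained scripts via the
// script_loader_tag filter. The handles listed are hypothetical.
function pwcc_async_script_tag( $tag, $handle, $src ) {
	$async_handles = array( 'pwcc-scripts', 'picturefill' );

	if ( in_array( $handle, $async_handles, true ) ) {
		// Insert the async attribute into the opening script tag.
		$tag = str_replace( '<script ', '<script async ', $tag );
	}

	return $tag;
}

// The final step: apply it as a filter (when running inside WordPress).
if ( function_exists( 'add_filter' ) ) {
	add_filter( 'script_loader_tag', 'pwcc_async_script_tag', 10, 3 );
}
```

With the attribute in place the browser handles the asynchronous loading itself, which is what lets these scripts move back to group zero, the header.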
Repeat, repeat, repeat. And all of this conveniently ignores the JavaScript, the fonts, and the other items in the HTML header that can block rendering. This is that same waterfall view as displayed when you run a site through WebPageTest, and here's an example of where the browser has to stop and think before it can proceed, which is, quite frankly, a waste of 200 milliseconds.

But going back to our simplified view: wouldn't it be great if, instead of waiting for the browser, everything came down all at once over a single connection? In very simplified terms, that is how HTTP version 2 with server push works. Very simplified terms.

By now, if you've been reading the blogs and listening to the podcasts, you will have heard that it changes everything. As you research HTTP version 2 you'll start to come across articles with titles like this: everything you know about performance is now wrong, former best practices are now an anti-pattern and considered harmful. I think titles like this are a little bit misleading, because I've taken a look at the statistics, and I don't think we're quite ready to forget everything we know just yet.

The biggest hindrance is web server support. At the moment a little under 7% of websites are running HTTP version 2. So if you work in client services, or on a plugin that gets installed on client sites, or really, quite frankly, if you haven't upgraded the server yourself, there is a good chance that your work will be running over a server using the old protocol. Which can be a little bit disheartening. However, if your server is running HTTP version 2, there is some good news: a large number of your users will be getting the new, improved protocol. Unfortunately, during the transitional period over the next couple of years, a large number of your users will not be. Which means that you need to account for both HTTP version 1 and version 2 traffic. And that's okay.
Most of us deal with these kinds of issues every day. It could be creating a PNG as a fallback for an SVG, or providing a download link as a fallback for the audio element.

As we saw earlier, CSS blocks rendering. We can speed up the loading of our CSS by making use of the preload spec. It's oversimplifying matters a great deal to say that it acts as an async attribute for CSS, but it can be made to work that way, and it covers both HTTP version 1 and version 2 with server push. Over HTTP version 2 with server push, the preload spec can be used to trigger the CSS to download in parallel with the HTML; it all happens over one connection. Over HTTP version 1, on the first load of a website the CSS is never cached in the user's browser, so we can make use of inline CSS to speed up the render of the page. We can then make use of the preload specification to asynchronously load the CSS file, so we can use it on subsequent page loads.

We saw the positive effect of this over HTTP version 1 earlier, as this was the change I made to my site when demonstrating WebPageTest. Look at that. It's beautiful: inline CSS. Without the inline CSS, the render actually happens later, slowing down the ability of the user to act on your calls to action, to convert.

Improving the performance of our CSS is going to take a little bit of work, although it starts with enqueuing our CSS as we normally would, using the standard built-in WordPress function. The follow-up step is also simple enough.
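That first step, enqueuing the style sheet as normal, might be sketched like this; the handle and path are hypothetical stand-ins for whatever your theme uses:

```php
<?php
// Sketch: enqueue the style sheet using the standard built-in
// WordPress function. The handle and file path are hypothetical.
function pwcc_enqueue_styles() {
	wp_enqueue_style(
		'pwcc-style',                                        // handle
		get_stylesheet_directory_uri() . '/css/style.css',   // source
		array(),                                             // no dependencies
		'1.0.0'                                              // version
	);
}

// Only hook when running inside WordPress.
if ( function_exists( 'add_action' ) ) {
	add_action( 'wp_enqueue_scripts', 'pwcc_enqueue_styles' );
}
```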
We add an HTTP header. Over HTTP version 1 it instructs the browser to start downloading the asset as early as possible. Over HTTP version 2 with server push it instructs the web server to push the asset to the browser. If we don't want the server to push the asset, we can tell it not to with the nopush suffix.

But I prefer my WordPress to be, well, WordPressy. So I like to wrap the HTTP headers in a WordPress helper function that works alongside WordPress's asset enqueuing; we can then preload the CSS by referring to its handle. Implementing server push becomes one human-readable line of code.

However, there's a problem with that code: it instructs the server to push the asset on every load. If the file is already in the browser's cache, our attempt to speed up our website has actually resulted in extra data being sent down the line. A better solution is to check whether the file is in the browser's cache, and only attempt to push if it is not. Regrettably, that's not something that browsers report, and for security reasons, nor will they. To fake the cache detection, we set a cookie indicating that the file is likely to be cached, and the cache check simply becomes a check for the existence of the cookie.

Having improved loading time for HTTP version 2 visitors, it's now time to switch to HTTP version 1, where we want to inline the CSS and load the file asynchronously using the HTTP version 1 portion of the preload spec. Unfortunately, the preload spec isn't widely supported by browsers, so we need to polyfill it.
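Before moving on, the preload header and the cookie-based cache check described above might be sketched like this. The function names, cookie name, and cookie lifetime are my own assumptions, not the code from the slides:

```php
<?php
// Sketch: a preload Link header with an optional nopush suffix, plus
// cookie-based fake cache detection. All names are hypothetical.

// Build the Link header value for an asset.
function pwcc_preload_header_value( $url, $as = 'style', $push = true ) {
	$value = sprintf( '<%s>; rel=preload; as=%s', $url, $as );

	if ( ! $push ) {
		$value .= '; nopush'; // preload hint only; tell the server not to push
	}

	return $value;
}

// Fake cache detection: simply check for the existence of our cookie.
function pwcc_is_asset_cached( $handle ) {
	return isset( $_COOKIE[ 'pwcc_cached_' . $handle ] );
}

// Send the header, pushing only when the asset is unlikely to be cached.
function pwcc_maybe_push_asset( $handle, $url ) {
	if ( headers_sent() ) {
		return;
	}

	$push = ! pwcc_is_asset_cached( $handle );
	header( 'Link: ' . pwcc_preload_header_value( $url, 'style', $push ), false );

	if ( $push ) {
		// Mark the file as likely cached for subsequent requests.
		setcookie( 'pwcc_cached_' . $handle, '1', time() + 7 * 24 * 60 * 60 );
	}
}
```

On an HTTP/2 server that supports it, the pushed header triggers server push; over HTTP/1 the same header is just an early-download hint to the browser.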
So we do need Polyfill it and those clever bunnies at the filament group have a tool for this creatively called load CSS As we would any other JavaScript will register load CSS to use and it's worth noting that I'm registering this I'm not in queuing it the difference being that registering a file makes it known to WordPress late ready for later In queuing in terms of the HTML if the CSS is uncached we want to deliver this over HTTP version one the inline CSS followed by preloading the external file Asynchronously and to allow for the JavaScript in ped we then load the CSS as we normally would inside a no script element because we're not animals and To achieve that in our functions file We then print our CSS file in line again using a function Based on the assets handled and now that we are preloading the CSS We absolutely need to in queue the the polyfill The full source is a little bit complicated. So I've put it in full at the URL on screen on screen This gives us our inline CSS on the first load on The second and subsequent page loads the visitor just gets the external reference as they normally would and the CSS loads directly from their cage This of course allows us to get back to a using the HTML header for its intended purpose Which is specifying fav icons Wouldn't it be lovely if that was an exaggeration and this all brings us back to our headline from earlier To keep our sites performant We will always need to consider the impact of every single byte that we put on the page Now over the next two or three years Not only we will have will we have to think about how they visitors Impact visitors to our site on a fast or a slow connection We'll also have to think about the impact over both HTTP version one and version two And at times it will be annoying at times. It will be frustrating, but it will keep our jobs interesting And I don't know about you, but that's exactly why I love working in the industry in which we do Thank you very much