Hello. If you came here expecting to hear about Indian stocks, you'll be disappointed. That speaker bailed out, so it was left to me. And when I asked Seb yesterday, he said it would be a small group, so thank you for turning up in large numbers. And I just found out five minutes ago that it's being recorded. Yeah, sure. Thanks for the encouragement. Since this is a last-minute presentation and I don't know what else to talk about, we'll talk about practical tips on front-end optimization. If you do not believe websites should be fast, my talk will be pretty uninteresting. My aim is to show you that it's sufficiently complicated that you should leave it to companies like Dexecure or Cloudflare. Or, if you're not impressed by the presentation, at least subscribe to Cloudflare, since they're hosting the talk. So if you believe in this slide, you can just walk out of the meetup, go to Dexecure.com, subscribe, and never worry about performance again. However, if your infrastructure is built on a relational database and you do a lot of joins on the fly, we can't help you. Just rewrite your backend. Don't talk to me about that. Okay, so this is a very non-exhaustive list, but if you want to optimize a website, apart from the backend stuff, front-end optimization means you can either reduce the network-related times, like time to first byte, or make the rendering of the page content itself faster. The Google developer blog has a lot of good content on how different content types are render-blocking, and you want to make sure the first render already shows something: layout, colors, and so on. And of course, you can make the loading of assets faster, so your web page isn't sitting without images for a long period of time. So the first obvious thing, which you must already be doing, is to use a CDN. And I found out today that Seb is presenting his hybrid CDN.
Since he's hosting the event, subscribe to Seb's awesome service, or you can subscribe to Cloudflare. Dexecure works on top of both CloudFront and Cloudflare. I would also like to mention Akamai and Fastly. Or, if you are really adventurous and don't think any of these is good, you can run your own DIY Varnish setup. There are many people who do do this. Okay, I didn't have much to talk about on the network side, so let's move on: you should be removing the render-blocking parts of your HTML, CSS, and JS. Whenever a web page loads, the CSS is render-blocking because it determines how the content will look, and you want as little render-blocking stuff as possible. You want to get all your CSS and HTML to the client as soon as possible to optimize the time to first render. Okay, so do you think this CSS import is blocking or non-blocking? Raise your hands. There's nothing in between; you have to choose one or the other. Of course, this one is blocking, because there's no media type on it. How about this? Is this blocking or non-blocking? Anyone on the side of non-blocking? Yeah, this is also blocking, because it is actually indistinguishable from the previous one. Okay, let's try this one. Is it blocking or non-blocking? How about if I open it in a browser whose width is greater than its height? This one will wait until the page is loaded, and it will be blocking only if your orientation is portrait. So you always want to put media queries on your stylesheets when you import them, so you don't end up waiting for a lot of CSS you won't use. Again, this is a very good example. I doubt anybody uses it, but if you have print styles, use something like this, because it will not be blocking if you are just viewing the page. Though if you are using stylesheets to print, I don't know.
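The slides with the actual tags aren't in the transcript, but `<link>` elements of this shape illustrate the point (the paths are placeholders):

```html
<!-- Blocking everywhere: no media attribute -->
<link rel="stylesheet" href="/css/main.css">

<!-- Still blocking: media="all" is equivalent to no media attribute -->
<link rel="stylesheet" href="/css/main.css" media="all">

<!-- Render-blocking only when the viewport is portrait -->
<link rel="stylesheet" href="/css/portrait.css" media="(orientation: portrait)">

<!-- Non-blocking while viewing; only applied when printing -->
<link rel="stylesheet" href="/css/print.css" media="print">
```

The browser still downloads a stylesheet whose media query doesn't match, but at low priority and without blocking the first render.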
So what Dexecure does is, one, we only enable this for very brave and forgiving websites, but we do rewrite the HTML and CSS so that we can remove the blocking parts, potentially remove some unnecessary CSS, reorganize the HTML and JS, et cetera. I don't really recommend doing this yourself, because rewriting your entire HTML takes significant time, but just being careful with media types and media queries should get rid of the basic problems. Now we come to asset optimization. There are many types of assets: images, CSS, JS, videos, and all the other stuff we will not discuss. Let's start with images. When you are serving images, you want to choose the right format, the right quality, and the right size. Quality is a general term I'm using for the right compression parameters, and by size I mean the dimensions; maybe I should have said dimensions. Okay, so at least I hope you are doing this already. As an example, if you are serving images to an Android tablet, it should be WebP, because the user is most likely in Chrome, and the tablet has bigger dimensions; the desktop gets the biggest. Wow, this slide is slightly dated, but the iPhone should get the smallest. You also vary the format: Firefox doesn't support WebP. If you use a CDN, doing this is pretty straightforward: most CDNs let you cache different content by device-type headers. For example, CloudFront has headers for mobile, tablet, and desktop, and from reading the Cloudflare documentation, they have similar capabilities. Okay, let's see another example. Just varying images by device is usually not enough, because you also have to care about bandwidth. If it's a small Android phone on low bandwidth, or in areas where connectivity is poor, you still want to serve smaller images; if it's an Android phone with a lot of bandwidth, you can serve higher quality. Also, keep in mind that the decoding abilities of every device are not the same.
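The device-type decision can be sketched like this. The `CloudFront-Is-*-Viewer` headers are the real CloudFront device headers mentioned in the talk; the function name, variant widths, and quality numbers are made up for illustration:

```python
def pick_image_variant(headers):
    """Pick an image variant from CloudFront's device-type headers and the
    browser's Accept header. Widths and qualities are illustrative only."""
    # Browsers that can decode WebP advertise it in the Accept header.
    fmt = "webp" if "image/webp" in headers.get("Accept", "") else "jpeg"
    if headers.get("CloudFront-Is-Mobile-Viewer") == "true":
        width, quality = 480, 70
    elif headers.get("CloudFront-Is-Tablet-Viewer") == "true":
        width, quality = 1024, 75
    else:  # desktop gets the biggest dimensions
        width, quality = 1920, 80
    return {"format": fmt, "width": width, "quality": quality}

print(pick_image_variant({"Accept": "image/webp,*/*",
                          "CloudFront-Is-Mobile-Viewer": "true"}))
```

A real setup would also fold in bandwidth signals (e.g. Save-Data or network hints), which is exactly the point the talk makes next.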
And in some rare cases, increasingly rare now, decoding a smaller image can take longer than decoding a bigger one. And of course, Microsoft always needs to come up with a new format, so if your visitor is on a Windows phone, you should be serving JPEG XR. I have tried filing bug reports to make them not do that, but let's see; there is some movement now. Okay, so if you're still not convinced that you should be using a front-end optimization service, you might be brave and say, I can do this myself. WordPress does this: they re-encode your JPEGs at quality 82. Again, this is around two years old, because of course I don't use WordPress. But life is not so simple, because there are lots of images which look beautiful at quality 31, at maybe a tenth of the size, while other images have visible compression artifacts even at quality 85. Context is important: the viewing device, the network conditions, the viewport width, et cetera. And the best format depends on the image itself, which is a bit of a problem, because you do not want to have monkeys looking at every image and deciding what quality to send it at, for any website of significant size. And of course the receiver's decoding capabilities matter: the hardware, the browser, et cetera. Okay, so I'll cover the most common formats. I put JPEG XR here for completeness' sake; we do support it, but I won't discuss it in depth. JPEG is supported in all browsers; it doesn't have an alpha channel, so you won't get transparent images. PNG, again, is supported in all browsers, and despite what they say, it's not useful for opaque photographs. WebP is awesome, with some conditions, and it's supported on Chrome, Android, et cetera. If you are encoding JPEG, I would say use MozJPEG; don't reach for plain libjpeg or libjpeg-turbo. You have to be careful about the quality, since this is the main parameter you will be optimizing. Let's not go into DCT quantization.
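As a rough illustration of how much the quality parameter alone moves the file size, here is a sketch using Pillow on a synthetic image (Pillow does not use MozJPEG, but the quality knob behaves similarly):

```python
import io
from PIL import Image

# Build a synthetic 256x256 gradient "photo" so the example is self-contained.
img = Image.new("RGB", (256, 256))
px = img.load()
for y in range(256):
    for x in range(256):
        px[x, y] = (x, y, (x * y) % 256)

def jpeg_bytes(image, quality, progressive=True):
    """Encode to JPEG in memory and return the encoded bytes."""
    buf = io.BytesIO()
    image.save(buf, "JPEG", quality=quality, progressive=progressive)
    return buf.getvalue()

low, high = jpeg_bytes(img, 31), jpeg_bytes(img, 85)
print(len(low), len(high))  # the quality-31 encode is much smaller
```

Whether quality 31 is "good enough" depends on the specific image, which is why a fixed setting like WordPress's 82 can't be right for everything.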
Chroma subsampling does affect the perceptual quality of the image, and of course there's progressive versus non-progressive. If you are hard-coding this, you should be using progressive images, unless the target device doesn't have a good decoder; usually what works best is progressive with custom progressive scan scripts. Next, PNG. PNG is supposed to be a lossless format, but since we are trying to make things faster, we of course don't follow the rules and make it lossy. You can quantize to PNG8, which means one byte per pixel and a maximum of 256 colors. I have tried quantizing further, but that only works if you don't mind not recognizing the image, and you can transform it back to PNG24 by applying a diagonal blur, but... oh, sorry, I did not know this would be recorded. The only problem with WebP, in my experience, is that lossy WebP always uses chroma subsampling. It is usually better than JPEG, so if you are serving to Chrome and you do not want to use a third-party service, which you should, you can just serve WebP. It supports alpha channels, and lossless WebP is kind of niche, but it's similar to PNG and offers better compression. So this is more fun. The image on the left is the original image, and this is the compressed JPEG. Wait. I forgot. Okay, what was this supposed to be? This was supposed to be a 201K PNG. So which one would you choose? Someone say the third one. Otherwise, okay. Fine, you would choose the second one. Okay, I was told not to be political. So assume this is a 201K PNG, and you would choose the JPEG. As a general rule, if an image is a photograph, don't bother converting it to PNG; just use JPEG. And assume this is a 200K PNG. I blame Haroon; he didn't fact check this. Okay.
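The PNG8 quantization step can be sketched with Pillow; a dedicated tool like pngquant does a better job in practice, but the idea is the same. Note that capping the image at 256 colors only sometimes shrinks the file, which is exactly why the right choice depends on the image:

```python
import io
import random
from PIL import Image

random.seed(0)
# A noisy synthetic "photo": a worst case for PNG's lossless filters,
# so the 3-bytes-per-pixel vs 1-byte-per-pixel difference shows clearly.
img = Image.new("RGB", (128, 128))
img.putdata([(random.randrange(256), random.randrange(256),
              random.randrange(256)) for _ in range(128 * 128)])

def png_size(image):
    buf = io.BytesIO()
    image.save(buf, "PNG")
    return len(buf.getvalue())

png24 = png_size(img)                      # truecolor: 3 bytes per pixel raw
png8 = png_size(img.quantize(colors=256))  # palette (PNG8): 1 byte per pixel raw
print(png24, png8)
```

Quantization here is lossy, as the talk says: the 256-color palette image is an approximation of the original.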
As you can see... well, I'm not sure you can see, but you're supposed to see different compression artifacts in both images. Actually, I've found that in JPEG images, compression artifacts sometimes make the image look sharper than it really is. Anyway, the point is: for photographic images, use JPEG. Oh yes, this is correct. Okay, so image on the left, which one would you choose? No, somebody say the middle one. Anyone for the middle one? This is not fun. Awesome. So okay, why were you wrong? Because, one: any image which is artificial, which is not a photograph, which means it won't have a lot of colors, should generally be PNG. Here you can see the compression artifacts. Of course you'd use the right one; the middle guy was wrong. There are compression artifacts in both, but the PNG looks better overall and it's smaller, 50K. Okay, now we come to chroma subsampling. Chroma subsampling is a step that happens when you compress JPEG images. So usually, when you are trying to find the right format and quality to send, you compress the image to different qualities and formats, a bit like an A* search, and see what works best. If you see a lot of artifacts when you compress it, then do not use lossy WebP, and do not do chroma subsampling, since lossy WebP always subsamples chroma. If it has some artifacts, you don't subsample chroma as aggressively. If it has no artifacts, awesome, you can compress it more. Okay, now the fun part. Which one would you choose? Actually, I also forgot what the answer is. But which one would you choose? Anyone for the left? Wait, let me check the answer first. Oh yeah. Say it's the right one, because the compression artifacts in both are similar.
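The trial-compression search the talk describes can be sketched like this with Pillow. A real optimizer would also score artifacts with a perceptual metric rather than just comparing sizes, and WebP support depends on how Pillow was built, hence the try/except; the function name is made up:

```python
import io
from PIL import Image

def smallest_encoding(image, quality=80):
    """Try a few formats at the same quality setting and return the smallest.
    A real service would also check perceptual quality, not just bytes."""
    candidates = {}
    for fmt, kwargs in [("JPEG", {"quality": quality, "progressive": True}),
                        ("PNG", {}),
                        ("WEBP", {"quality": quality})]:
        try:
            buf = io.BytesIO()
            image.convert("RGB").save(buf, fmt, **kwargs)
            candidates[fmt] = len(buf.getvalue())
        except Exception:  # e.g. Pillow built without WebP support
            continue
    return min(candidates, key=candidates.get), candidates

# Flat artificial image: PNG should easily beat JPEG here.
img = Image.new("RGB", (128, 128), (255, 0, 0))
best, sizes = smallest_encoding(img)
print(best, sizes)
```

Running the same function on a photograph would typically flip the answer toward JPEG or WebP, which is the A*-search-like behavior the talk alludes to.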
Even though the right image does look a bit sharper, we'll forget about that; it is smaller, because again you subsampled the chroma more aggressively. Okay, how about this one? No, that is not a compressed image, that is the actual image; I resized it so it doesn't look so... Okay, so what would you choose? Again, it depends on the context. If you are doing a presentation and you don't want it recorded, then you would choose this one. Oh wait. Actually this should be PNG, as you caught; this was a trick question and I should not have had this slide. Anyway, you would use this one, because here chroma subsampling introduces a lot of artifacts. Even at Q100 you see a lot of artifacts, because this image is very dependent on color and color contrast, so chroma subsampling matters. Set that parameter; don't just set the quality parameter. We're almost done. Again, if you are like me, or like most of us, and you are lazy, use some kind of perceptual metric. Multi-scale structural similarity (MS-SSIM) is something developed by someone in Canada, and it gives you a good measurement of how different two images are. If you are doing this yourself, you compress an image, compute the MS-SSIM, SSIM, or DSSIM value, and see how the two images compare without having to actually look at every image. For example, if I say an error of 10% is acceptable to me, I set that as the threshold. The problem is that, again, the error that is acceptable depends on the image, so you are back to step one, which means this slide is kind of a waste. Okay. I am running out of time, so I compressed everything into one slide. At least minify and compress your JS and CSS. This sounds simple, but when you handle a lot of websites, you realize things break; CSS minifiers have actually failed on my CSS.
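As a sketch of that workflow, here is a toy single-window SSIM (real SSIM is computed over local windows, and MS-SSIM adds multiple scales on top, so treat this as illustrative only) used to walk the JPEG quality down until similarity falls below a threshold; both function names are made up:

```python
import io
from PIL import Image

def global_ssim(a, b):
    """Toy SSIM over whole grayscale images: one global window."""
    xs = list(a.convert("L").getdata())
    ys = list(b.convert("L").getdata())
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    vx = sum((v - mx) ** 2 for v in xs) / n
    vy = sum((v - my) ** 2 for v in ys) / n
    cov = sum((u - mx) * (v - my) for u, v in zip(xs, ys)) / n
    c1, c2 = (0.01 * 255) ** 2, (0.03 * 255) ** 2
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx * mx + my * my + c1) * (vx + vy + c2))

def lowest_acceptable_quality(image, threshold=0.95):
    """Walk the quality down until similarity drops below the threshold."""
    for q in range(95, 10, -10):
        buf = io.BytesIO()
        image.save(buf, "JPEG", quality=q)
        buf.seek(0)
        if global_ssim(image, Image.open(buf)) < threshold:
            return min(q + 10, 95)  # last quality that was still acceptable
    return 15
```

The catch the talk mentions applies here too: the right `threshold` itself varies per image, so the metric automates the comparison but not the judgment.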
So with CSS minifiers, you run them at the safest optimization level just to be safe. And there's a good comparison table someone put together to show that his JS minification algorithm is the best, even though the numbers in the table actually say it is not, which is kind of silly. Okay, and somebody launched this newer compression format called Brotli. It is awesome and it's supported in Chrome. But if you are using AWS CloudFront, they mysteriously strip the br value from the Accept-Encoding header, so you have to write a Lambda@Edge function to re-insert it. But then Lambda@Edge incurs cold-start time, so even though the compressed asset is smaller, the request can end up taking longer. The conclusion: you can use Cloudflare Workers. But anyway, use Brotli. So, if you want to test how awesome your website is, you can use the tool Chase here built. It runs a headless browser: as soon as you pass in the URL, it loads your website on different kinds of devices and different kinds of browsers, and it tells you how optimized your images, your JS, and your CSS are. And if we end up making it worse, all your complaints redirect to him. Thank you. So guys, since patience is running low: for example, Google, Amazon, and Akamai actually ran studies around 2006-2007 on the ideal loading time. Back then it was 8 seconds, because after 8 seconds, 33% of users would abandon the page. Google did new studies, and based on 2018 stats it's now 4 seconds. So for desktop, 4 seconds is the ideal loading time. For mobile, it's actually 3 seconds, because above 3 seconds most of your users are going to abandon your page. So hence, sorry, I think we have to wait a bit, no worries. Basically, the goal and the vision of Dexecure is to ensure that every website loads within... That's the wrong demo, by the way. No, no, no. Take the demo I sent you on your Facebook.
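The Accept-Encoding negotiation that CloudFront's header stripping breaks boils down to something like this (a simplified sketch: real negotiation also honors q-values and server-side availability of the precompressed asset):

```python
def choose_encoding(accept_encoding):
    """Pick the best content encoding the client advertises.
    Simplified: ignores q-values, prefers Brotli over gzip."""
    offered = {token.split(";")[0].strip()
               for token in accept_encoding.split(",")}
    for encoding in ("br", "gzip"):  # preference order
        if encoding in offered:
            return encoding
    return "identity"

# If the CDN strips "br" from the header, the origin falls back to gzip
# even though the browser could have decoded the smaller Brotli asset.
print(choose_encoding("gzip, deflate, br"))
print(choose_encoding("gzip, deflate"))
```

This is why re-inserting the br token at the edge (Lambda@Edge or a Worker) is enough to get Brotli flowing again.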
Basically, the goal is that every website loads within 4 seconds regardless of where the user comes from. Whether it's an old Samsung phone or a poor-bandwidth connection, the website has to load on mobile within at most 3 seconds, and on desktop within at most 4 seconds. In the last few months we've gotten about 83 clients, websites that we're serving, and we're managing to get each of them to load in about 4 seconds on desktop and 3 seconds on mobile. This is an example we actually ran today, so it's still computing. It shows a very optimized website; it's a company that does a lot of optimization itself, and yet the website loads on a desktop in 5.19 seconds. The goal for us was to make sure it loads in 4 seconds. How's it going, buddy? Yes. It says undefined; we'll come back to this later. Do you have internet? I sent it to you on Messenger. Just wait for it. So this is the last thing we're going to show. Are you serious? Nitish? We've done about 80 case studies. Even Google launched a tool, I'm not sure if you've seen it: Think with Google. So there we go. This was a very, very well optimized website, and they were still losing a lot of money. When they came to us they said, we have a high abandonment rate, and this connects exactly to the recent Google study showing that above 4 seconds people are simply impatient; they're not willing to wait. Our tool got it down to 4.19 seconds. On average we saved 1.18 seconds, which seems little, but reducing load time by one second cuts visitor loss by 20%. This is a company that has 15.21 million views a month, so you can imagine what 20% additional traffic means for them. So, that's about it. I'm passing on the mic. Thank you Shobhan and Chase for handling the demo.
We'll take a couple of minutes' break while the next people set up.