If you are building a website today, chances are you are using some sort of build tool. Alright, so one question. When you are starting a new project, or you are given a brand new repo and told, like, hey, Surma, please start this project — where do you start? What tools do you choose? Like, tell me about your setup.

This has evolved so much over time. I take a long time to decide, and I usually start out without a bundler. My preferred starting place is: I just want to write the latest JS, the latest CSS, and the rest of it is out of my way until I'm going to solve it.

It really depends on what the project is. I've been building a lot of static sites lately, and Eleventy is a relatively new tool for static site generation. I've been converting some Jekyll sites into that, and it's just felt so nice and not super robust, which is what you sort of need for a static site. But if I am building a more dynamic project, I'll either go with Next.js, or Gatsby if I want React with a static site, because I like how it compiles out to just HTML, CSS, and JavaScript. But there are a thousand tools and it really just depends. That's the answer: it depends.

So I was a Webpack user, and I used Webpack because at the time it was the only one that supported code splitting. But like, I knew I just needed to add this tag in the head, and the HTML plugin went, no, this is mine now. You may not touch. And then other things: in the earlier days of Service Worker, I just wanted a list of the output files. Like, you know, let me know what the hash is going to be for these files. But reading the Webpack plugin docs, I was getting so frustrated I couldn't figure it out. I used to use Gulp or Grunt a lot, because at least I understood what was happening, and with Webpack I really didn't understand what was happening. Now, over time, I have kind of fallen in love with Rollup.
So I have my own build system that I've been managing. It's probably the sixth build system I've made. If you go look at my GitHub history, I have a Grunt one, I have a Make one, I have Gulp ones. I've been following all these build tools for a long time. Here's how I feel about web projects: complexity is inevitable. There's no way to get around it, and you're either injecting complexity from the beginning and making it worse, or you're waiting until your complexity confronts you. Yes. Even if you start simple, complexity is inevitable.

The idea of making a website is, in principle, straightforward. You make an HTML document, add style to it, and add some functionality to it too. But in practice, web development gets a lot more complex. Your application code may depend on outside libraries or different modules. You might be importing web fonts, or you might be pre-rendering a portion of a page as a static site so that it can get delivered faster to users. And chances are you're probably using build tools to manage all of these complexities. But because tools expect a certain setup, sometimes seemingly simple tasks, like adding inline scripts to HTML, get harder to accomplish. There are many challenges like this in web development.

So let's look at how we manage JavaScript. In the past, we wrote everything in independent files or different script tags and carefully combined them or added them to HTML by ourselves. The way we used to write JavaScript, we'd have this massive file, and humans needed to know which one goes first and which one goes after.

Yeah, and you're making me remember where I started with bundling, which was a PHP script that concatenated all my JavaScript files together, which now feels wrong, but at the time felt very powerful.

But now we have modules, which means the dependencies of each module are specified in the code, which means a build tool can analyze the files and create bundles for us.
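The core of what a bundler does with those declared dependencies can be sketched in a few lines: walk the import graph depth-first and emit each file after the files it depends on. The module map below is entirely hypothetical, and real bundlers also handle cycles, path resolution, and much more — this is just the shape of the idea.

```javascript
// Hypothetical module map: each file lists the files it imports.
const modules = {
  'app.js': ['api.js', 'utils.js'],
  'api.js': ['utils.js'],
  'utils.js': [],
};

// Depth-first walk: dependencies are emitted before their importers,
// so nobody has to maintain the script order by hand anymore.
function bundleOrder(entry, graph, seen = new Set(), order = []) {
  if (seen.has(entry)) return order;
  seen.add(entry);
  for (const dep of graph[entry]) bundleOrder(dep, graph, seen, order);
  order.push(entry);
  return order;
}

console.log(bundleOrder('app.js', modules));
// [ 'utils.js', 'api.js', 'app.js' ]
```

A concatenating PHP script, Grunt, and a modern bundler all ultimately answer this same ordering question — the difference is that with modules, the tool derives the answer instead of the human.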
Even better, some tools like Webpack analyze which parts of the code are actually being used and extract them to make smaller bundles. When we started writing JavaScript this way, I feel like Webpack came onto the scene as the tool of choice, with a lot of bells and whistles, doing things like tree shaking and scope hoisting. So could you explain how Webpack handles this modular JavaScript world?

Yeah, so Webpack supports a huge number of module formats. Some of the common ones, obviously ES modules and CommonJS, everybody knows about, but it actually supports parsing and understanding the structure of SystemJS modules and AMD modules and even Wasm imports. And so it takes all that information, attaches it to its in-memory graph representation, and can use it so that if you only imported one thing from a module, it can essentially delete the exports from that module that you didn't use. And that way, that code path won't end up in your bundle. And when you take that export out, maybe it was using another import from another module, and now that's unused too. You can see how, flowing that information through the graph, eventually you could end up removing a fair bit of code. So Webpack doesn't actually convert modules to an internal source format. It is more focused on understanding them and their structure as they exist on disk.

Our JavaScript does not have to be a single file. Tools like Rollup will split it up into smaller chunks. Can you explain why Rollup is really good at it?

Yeah. So Webpack has its own loader, and so does Parcel, whereas Rollup by default is ECMAScript modules. That's where it lives. It lives in that world. So the output it generates is way simpler than the other tools'. In terms of code splitting, Rollup's implementation is very pure, I would say. It will create the smallest number of chunks that it can. But it will create a small chunk.
Like maybe a chunk just containing one function, if that's the only bit shared by two entry points. And that's something that Webpack and Parcel don't do. They will duplicate that module in both of their bundles, whereas Rollup will always just create a separate chunk. It's very pure. It will never duplicate code.

And different scripts run in different threads. Ideally, common dependencies are exported as one chunk, but some tools don't understand that, so they create duplicates. What was interesting to me was that Parcel supports code splitting between workers and the main thread out of the box. Is there any backstory for that?

There is. And I think I can take a little tiny bit of credit for Parcel supporting that, because it was, I think, February 2018: you, me and some others were working on Squoosh. And Squoosh made heavy use of WebAssembly, put the WebAssembly in a web worker, and then used Comlink to use those web workers. And we built Squoosh using Webpack. And we figured out over time that actually, the way Webpack built that project, it put a copy of Comlink into the worker and into the main thread. So the user ended up loading that code twice. Now, Comlink isn't that big, so it's not that big of a deal. But with bigger dependencies, that could actually become a significant problem. And so I filed a fairly long bug on Webpack, like with a graph and everything, explaining why that should change. And it hasn't been fixed yet, but it's been discussed a lot, and I think Sean Larkin told me that Webpack 5 will finally be able to address that problem. But shortly after I opened that bug on Webpack, Devon Govett, who is the maintainer of Parcel, actually opened an issue himself on Parcel saying, I think we can do this. I think Parcel can fix this problem. And I don't know how long it took him, but I guess a couple of months later the bug was closed, and suddenly Parcel supported this code splitting across worker boundaries.
And that was just really, really cool to see.

Beyond making JavaScript bundles, build tools help us manage assets too, sometimes separately and sometimes through JavaScript.

So I think that as CSS has evolved, the tooling around it has grown as well. And I think the first really big instance of CSS tooling came about with Sass and Less and Stylus and all of those preprocessor tools. And that was when Ruby was really big, and they were written in Ruby, and it was sort of clunky and slow, because you had to wait for your CSS to process before it was spit out from Sass to CSS, for example. And then Node came around and that became a lot faster, and Sass was rewritten, and people were still using a lot of the benefits from that language, the Sass language. And then PostCSS sort of replaced a lot of that need, because it allowed you to do some of these same things. But instead of pre-processing and waiting for the developer to see their CSS file be exported, you could run through the CSS file after you have already written it and then apply transformations and changes with PostCSS. And that also enabled you to have pluggable, very small bits of code that were way less robust and large in terms of developer footprint in your dev files and your architecture, by enabling these small plugins, like an Autoprefixer plugin, or something like size, where if you wanted to use a size keyword, you could use that, and you could even write your own. So that was really cool. And I think that has continued to this day. People use PostCSS all the time. And now you're seeing a lot of additional CSS tools that allow for things like tree shaking and some of these optimizations that are beyond just minification. And that has really coincided with the switch to framework-based JavaScript being so prevalent in the way we architect our projects now.
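The pluggable-transform idea described above can be sketched without the real PostCSS API. To be clear, this is not how PostCSS works internally — it parses CSS into an AST rather than using regexes — but it shows the shape: small single-purpose plugins composed into one pipeline. The plugin name and the property it rewrites are invented for illustration.

```javascript
// Toy illustration only: a "plugin" here is just a function from CSS
// text to CSS text. This hypothetical prefixer adds a -webkit-
// fallback declaration ahead of the standard one.
const addWebkitUserSelect = (css) =>
  css.replace(/user-select:\s*([^;]+);/g,
    '-webkit-user-select: $1; user-select: $1;');

// Compose any number of small plugins into one processing pipeline.
const pipeline = (...plugins) => (css) =>
  plugins.reduce((out, plugin) => plugin(out), css);

const process = pipeline(addWebkitUserSelect);
console.log(process('.btn { user-select: none; }'));
// .btn { -webkit-user-select: none; user-select: none; }
```

The appeal is exactly what was described: each plugin stays tiny and focused, and you opt into only the transforms your project needs.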
You know, naturally, I think CSS-in-JS happens, or maybe it's JS in your CSS, but sort of inevitably what will happen is you'll find a moment where you need something so richly dynamic that declarative static styles might not work for it. So I've been putting CSS in my JS for a long time. But if we're talking about the new CSS-in-JS libraries that take your object notation or let you do extending and abstracting in different ways on the client side, those are great too. All options for writing styles still have footguns, so you just need to be careful, and you'll learn your tool the more you use it.

Okay. So what are the quirks when dealing with assets?

Yeah. So I mean, Webpack has a super long history with various techniques for doing this. At its simplest, you can take a loader, which is a transform you apply to a module, and you can apply that to something that isn't JavaScript to turn it into a string in a JavaScript file. And so historically, the way that assets have worked in Webpack has tended to center around turning them into a JavaScript thing. Even for assets where the asset itself might have dependencies, either on JavaScript or on other assets, it will turn that into a JavaScript module with a string and turn the dependencies, you know, the CSS import statements, into JavaScript require calls. And so it will sort of inject them into the graph by converting them to the equivalent JavaScript code. Now, this is changing in Webpack 5. But for other tools, like Parcel, assets have been at the center of it from the beginning.

One thing that sets Parcel apart from many others is that it actually doesn't use JavaScript as its entry point, but HTML. So what is considered an asset in many other build tools is the main entry point in Parcel. And that makes sense, because on the web, that is the thing we go to. We go to HTML pages, and from there on, we reference assets.
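The loader idea just described — turning a non-JavaScript asset into a JavaScript module so it fits into the module graph — can be sketched as a tiny transform. This is a heavy simplification: real Webpack loaders receive the source through a richer loader API and emit proper module code, and the function below is purely illustrative.

```javascript
// A toy css-to-js "loader": wrap the CSS source in a JavaScript module
// that exports it as a string, and rewrite CSS @import statements into
// JS imports so the bundler's graph can follow those dependencies.
function cssLoader(cssSource) {
  const deps = [];
  const body = cssSource.replace(/@import\s+"([^"]+)";/g, (_, path) => {
    deps.push(`import "${path}";`);
    return '';
  });
  return `${deps.join('\n')}\nexport default ${JSON.stringify(body.trim())};`;
}

console.log(cssLoader('@import "reset.css"; .title { color: tomato; }'));
```

The output is a perfectly ordinary JavaScript module, which is exactly why this trick works: the bundler never has to know the content started life as CSS.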
So whether you reference an image from JavaScript or from within HTML, Parcel will understand that. And that's actually really cool. And it does this through many, many layers. So if you reference a CSS file from HTML, and that CSS file references images, all of these things are tracked by Parcel, and they will be hashed and they will get a version number and whatnot. So it actually builds an entire asset graph.

What Surma just explained is called asset hash cascading. If one file in a graph is updated, then that file's hash changes. And because of that, the hash of the file that used the updated asset should also change. This is important so that we can control caching for better performance. Let's see how Rollup handles it.

Yeah. So right now, hashing is Rollup's weakness. And it's something they know about, and it's something they are going to be working on fixing very, very soon. So when you generate the hash for a file, that little bit of the file name at the end, the hex letters and numbers, you need to do that as late in the process as possible. And you want it to just be based on the contents of the file — not the directory it's in, not your config settings, anything like that. If you've got JavaScript which imports JavaScript which imports JavaScript, it does the right thing with the hashing. You change a leaf one, and it changes all of the other ones in the chain, because all of the URLs have updated. Whereas with assets, you use a magic string in Rollup, and it will change that to be the asset's URL. But it does that after hashing. So you update your asset, and its hash changes. It updates the JavaScript file fine, but it doesn't update the hash of that JavaScript. So it's a weakness, but they are fixing it.

These differences and gotchas in our build tools are a constant source of frustration. As Una said, the best tool for the job really depends on your project. And so we wanted to make it easier for you to navigate this landscape.
tooling.report is a new website that gives developers like you an overview of various features supported by different build tools. We built this website so you can evaluate and choose the right tools for your next project. Or maybe you are in the middle of migrating your infrastructure and hitting a roadblock. tooling.report should help you answer your questions. We've essentially written a test suite for different tools based on common web development practices. You can read why those tests are relevant and see how each tool handles them. And when you are ready to implement yourself, you can look at our test code to see how you might integrate certain features into your build setup. We also welcome your contributions, so if you think certain features should get tested, please raise an issue on the repo. Thank you, and please reach out if you have any questions.