Hi, welcome to this Webpack 5 mini-series. This is an unusual talk for me: I usually cover one topic in depth, but here I want to cover multiple topics and give you a broader overview, so this is a combination of multiple smaller talks, each covering one aspect of Webpack 5. It will cover how to get started with Webpack 5, what changes you can expect, bigger features like persistent caching or Module Federation, but also smaller improvements like optimizations and other features. I hope you enjoy watching this. So, how to get started with Webpack 5. If you currently install webpack from npm, you will get the stable version, and the stable version is Webpack 4. If you want the latest Webpack 5 beta version, you want to use the `next` tag on npm and install it that way. To get started with Webpack 5, you may want to read the migration guide. It contains a lot of useful information: what to do, how to prepare the upgrade, what you need to install, the major pain points like breaking changes, which configuration options need to be changed, and also some help in general. If you want to know more, there is a changelog repo which lists a lot of useful details: which features have been added, which features have changed, all the breaking changes, changes to the configuration defaults, changes to the configuration APIs, and internal changes for plugins and loaders. Later we want to move this changelog into the official documentation, but for now it lives in this temporary repo to allow us to iterate faster on it. One major breaking change in Webpack 5 is that we removed all deprecated things, so if you are getting deprecation messages in Webpack 4, you may want to get rid of them before upgrading to Webpack 5.
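The install commands mentioned above look roughly like this (`webpack-cli` is assumed as the usual companion package):

```shell
# Stable release (Webpack 4 at the time of this talk):
npm install --save-dev webpack webpack-cli

# Latest Webpack 5 beta, via the `next` dist-tag on npm:
npm install --save-dev webpack@next webpack-cli
```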
We also removed the automatic polyfilling of Node.js native modules. If you are using modules like crypto, util or vm — Node internals which are usually not available in the frontend, or on the web in general — we no longer polyfill them by default. You can opt into polyfilling manually, but we recommend using frontend-first modules which focus on frontend technology and web standards and don't rely on polyfills of Node.js modules. We want to get rid of this dependency of the frontend ecosystem on the Node ecosystem; frontend and Node should be more separate, and we want to push towards this future with Webpack 5. We also upgraded the syntax of the generated code in Webpack 5 to a higher ECMAScript standard, which means IE11 is no longer supported by default. If you still want to support IE11, you can use a configuration option to opt into the older standard of code generation, which supports these older browsers. The idea is that Webpack 5 should live longer than these browsers, so we chose defaults that are prepared for a future where a higher code standard is supported by all relevant browsers. There are a few behavior changes in Webpack 5 regarding newer web specs that have been released in the meantime. For example JSON modules: there is now a spec for the ECMAScript integration of JSON modules, and if you are used to using named exports from JSON modules, this is no longer supported by the new spec, which only supports the default export. Webpack 5 will now emit a warning if you are relying on the old behavior, to make you aware of that. The WebAssembly ECMAScript integration spec has also been updated, which changes some behavior regarding the integration of WebAssembly into the ECMAScript module system. In Webpack 5 you are able to opt into either the new or the old spec, and we recommend aligning your code with the new spec.
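Opting out of (or back into) a Node polyfill can be sketched with Webpack 5's `resolve.fallback` option; the module names here are illustrative:

```javascript
// webpack.config.js (sketch): Webpack 5 no longer polyfills Node core
// modules automatically, so each one must be handled explicitly.
const config = {
  resolve: {
    fallback: {
      crypto: false, // fail the import instead of polyfilling it
      // crypto: require.resolve('crypto-browserify'), // or opt back in
    },
  },
};
module.exports = config;
```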
For plugins: if you are using plugins, you want to make sure that all of them are Webpack 5 compatible, and you may want to upgrade all plugins to the latest version to get that support. The first feature I want to show off is persistent caching. It's also one of the biggest features in Webpack 5; it required a lot of internal changes and refactorings to make it work the way it works. Persistent caching is like normal caching, but stored on disk instead of in memory. Currently Webpack 4 only uses an in-memory cache to make incremental builds and watch mode very fast; persistent caching brings this caching to disk and allows you to restart webpack without a large hit to your compilation time. To show it I've prepared this little example. It just uses a lot of libraries to make the build a little slower, and as preparation I already did a development and a production build: the development build took about 40 seconds and the production build took about 80 seconds. Now, when I do the same again without any change — a production build — it takes only a fraction of that time: it restores everything from cache and builds in about four seconds. It still re-checks all the files and all the build dependencies to be sure that nothing changed, and it compares the output files to see if they were modified or deleted and re-emits them, so it should be safe and everything will be fine. The development build is much the same — with bigger files, because it's not minimized, and a little faster. You may also notice that we now automatically generate very nice names for your chunks, based on the modules they contain; that's one part of the development experience improvements in Webpack 5. In the production build we generate IDs that are a kind of hash; this is an optimization for long-term caching, to make chunk file names change less often.
Anyway, back to persistent caching. I can also now start a development watch mode within a few seconds, compared to the full build which would take 40 seconds. After starting watch mode, we are in in-memory cache mode, so every change from now on is handled in memory and is as fast as usual. When restoring from the persistent cache we don't initially load any loaders — none of the preprocessors you need for your build — so when you make a change for the first time, we need to boot up babel-loader in this example. Give it a few tries: it also needs a few iterations to get to full speed, once it's optimized by the JavaScript engine. So you see a few seconds for the first try, and then it usually gets as fast as usual — about 500 milliseconds in this example. You also see that we don't store the persistent cache immediately in watch mode; we wait for about a minute of idle time before storing it, because we don't want to interfere with your usual workflow and want to keep everything fast. If you then do a production build, it will be a production build with the changes made in the meantime: it restores the cache, updates the compilation with the changes, and takes a little longer than a build without changes, because we now have to run the loaders and minify the changed files. But in the end a build after persistent caching — even if files changed or are missing — will be very stable, very safe, and will always generate a valid result, like usual. Here it takes a few more seconds, about six seconds compared to the three to four seconds of the unchanged build, because this file has to be re-minimized since I changed it. Persistent caching is an opt-in feature, so you don't get it by default; you need to enable it via the configuration you see here.
So basically it's a cache option with a filesystem cache type, and you have to give it some build dependencies. Build dependencies are things that define how your modules, or your complete compilation, are built — like the webpack version and the webpack configuration. Webpack adds itself to the build dependencies automatically, but you have to add the config file manually, so here we pass the webpack configuration. When build dependencies change, we do a fresh build and clear the cache first. Here we also enabled some additional logging; usually you won't get any of these messages with timing information about persistent caching — it would be silent and transparent, and you wouldn't notice it except through the timing. Persistent caching is also usable by plugins: plugins can use the persistent cache API to store their own data and caching information. The minimizer, for example, is a separate plugin — terser-webpack-plugin — and it uses the Webpack 5 cache API to store its minimized results and caching information in general. Webpack 5 also adds a few optimizations for tree shaking and other things. The first thing is that runtime logic is only injected when it's really needed: if you bundle an empty file, you actually get an empty file as output. This is really useful if you don't use all the runtime logic. So here is an example with basic tree shaking: on the top right you see the production version of the bundle, and next to it the development version. The first thing you may notice is that we now use arrow functions — yes, we generate more modern runtime code by default, but this is controllable via the `ecmaVersion` option. If you disable it or set it to ecmaVersion 5, it will use `function` instead of arrow functions, but by default we leave it in the more optimized form.
So basic tree shaking works as usual. With the little `pathinfo` option you can opt into more information about tree shaking; you see this in the development bundle, where all the exports are listed: whether they are provided, whether they are used, and how they are renamed. The renaming has changed a little for long-term caching: we no longer generate a, b, c, d by default — instead we hash the name and generate a short identifier from it. This is better for long-term caching, because the names don't change as often. So basic tree shaking still works: I import `apple`, and only `apple` is included in the exports; all the other things would be dropped by the minimizer, and when modules are concatenated the bundle gets really optimized. We also enabled deeper nesting of export analysis. Take an example where we re-export a module — let me show you the module first: it exports apple as green, banana as yellow and strawberry as red. If you re-export this module as a namespace object, re-export that namespace object again, and then access the `banana` export through this deep path, webpack is still able to handle it and generates exactly the same output code in production and development: it can track deep information through exports and all the manual layers on the way, so that's supported. The next example is the inner-graph optimization. In this case we have some functions and some imports, and these imports are only used in some of the functions: we export a `get` function which uses an `f` function, and the `f` function uses the `swap` import, but the `test` export doesn't use it at all. So if you only use the `test` export, webpack is able to analyze this and drop the `swap` import from the bundle — in this case nothing from that module is used. If I use `get` instead, it will include `swap`. The next thing we optimized is a kind of secret feature: tree shaking of CommonJS modules.
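The inner-graph example can be sketched like this (the names `swap`, `f`, `get` and `test` follow the talk; shown as plain functions, with the ESM exports indicated in comments):

```javascript
// math.js (sketch): Webpack 5's inner-graph analysis tracks which
// top-level declarations each export actually uses.
const swap = ([a, b]) => [b, a]; // only referenced by f
const f = (pair) => swap(pair);  // only referenced by get

// In the real ESM module these two would be `export function …`:
function get(pair) { return f(pair); }  // pulls in f, and via f also swap
function test(pair) { return pair; }    // uses neither
// If a consumer imports only `test`, webpack can drop f and swap entirely.
```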
I have the same module as CommonJS here, with assigned exports — `module.exports` assignments, or also defining properties. We support the more complex cases, but not everything is supported, so CommonJS is basically only supported in a best-effort way: only if it's statically analyzable do you get CommonJS tree shaking for a module. Here you see that webpack drops all unused exports in CommonJS too, and mangles exports the same way as in ECMAScript modules. It's also possible to require CommonJS modules and use exports from them; in this case we can also detect which exports are used. The same is available for the interop logic: if you require an ECMAScript module and only use some of its exports, this works as usual. We also added tree shaking for JSON modules, including deep tree shaking: if you only use some properties like `version`, or some deep properties like `dependencies.webpack` from a package.json, we optimize the JSON and drop all unused properties — and mangle properties if they are only used in a statically analyzable way — so basically only the values you are really using are injected into the bundle. We also use a JSON.parse optimization for faster runtime execution of JSON data. Recently we added another optimization, a magic comment for dynamic imports: you can specify which exports are used from the dynamically imported module. If you are only using one export, or the default export, you can specify that, and at build time webpack will generate a chunk which only contains these exports. It's not able to mangle these exports, because their names must remain as specified.
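The magic comment for dynamic imports can be sketched like this (the module path and export name are hypothetical):

```javascript
// Sketch: the webpackExports magic comment tells webpack which exports of
// a dynamically imported module are actually used, so the generated chunk
// can omit (but not mangle) the unused ones.
import(/* webpackExports: ["apple"] */ "./fruits.js").then((mod) => {
  console.log(mod.apple);
});
```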
In Webpack 5 the progress feature, and its profiling mode, also got an update: we now show progress per plugin. If you do a build with progress and profile enabled, it will show you live which plugin it's currently working on, and it will also give you timing information about each step and each plugin. This is very useful to investigate your builds and see what's taking a long time, in a very basic way, without attaching a real profiler. Here in this case we see that the terser plugin takes a lot of time — that's expected — but you can also see details for some internal plugins and so on. Another way to get more insight into the webpack compilation process is the logging system. We have a nice logging system whose output can be enabled with a stats option like `logging: 'verbose'`, which prints all the verbose information. With Webpack 5 we added a lot of internal timing information to the log output, so if you enable this you get timing information for steps within the compilation and for different plugins, and you can even enable the debug mode of logging to get even more information. In Webpack 5 we now also expose typings from webpack: we generate our own typings from our source code and expose them as TypeScript declaration files. So whether you are using TypeScript, or just JavaScript with VS Code or any editor that supports TypeScript typings, you can add an annotation to your webpack configuration — in JSDoc for JavaScript, or in TypeScript — and it really gives you code completion, information and descriptions for your whole configuration.
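A JSDoc annotation like the one described might look like this (the `logging` value is one of webpack's stats logging levels; `@ts-check` assumes an editor or tsc run with JS checking enabled):

```javascript
// webpack.config.js (sketch): the annotation pulls in webpack's own
// shipped typings, giving completion and validation even in plain JS.
// @ts-check
/** @type {import('webpack').Configuration} */
const config = {
  mode: 'development',
  stats: {
    logging: 'verbose', // print log output from internals and plugins
  },
};
module.exports = config;
```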
For example, this allows me to write `experiments` and get information about all experiments — enable mjs, whatever. It also allows you to type-check your configuration, and the whole system works for plugins too: if you apply a plugin, you get the whole webpack API with code completion and type validation. Webpack 5 supports advanced configuration for entry points. Here we see an example with two entry points, core and admin. The normal import — the entry point's content — is specified via the `import` property, and this object form allows advanced configuration: you can configure a separate filename template for this file, like core with a content hash or whatever, and you can also specify a separate `library` option and say this entry point is exposed as a UMD library while other entry points are exported as another library type. Maybe it's useful to have two entry points which export separate libraries. Another bigger feature is `dependOn`: it allows you to specify entry points which are loaded on the same page before this entry point. Here we have an example with a core entry point which is loaded on every page and contains react and react-dom — something basic — and then an admin entry point which is loaded in addition to the core entry point on the admin page. It shares some libraries: it also uses react and react-dom, and in this case you don't want the admin output file to contain react and the other libraries which are usually already on the page via the core file. So you can use the `dependOn` feature, and webpack will create a parent-child dependency between these entry points, or these chunks. In the end the core entry point will contain all the libraries, and admin will only contain libraries or modules which are not already in core — so in this case it will be really small and only contain the console.log statement, because the libraries are already on the page via the core file.
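The entry-point setup described above can be sketched like this (file paths and the filename template are illustrative):

```javascript
// webpack.config.js (sketch): `admin` declares dependOn: 'core', so
// modules already in the core chunk (react, react-dom, …) are not
// duplicated in the admin output file.
const config = {
  entry: {
    core: {
      import: './src/core.js',
      filename: 'core.[contenthash].js',       // per-entry filename template
      library: { name: 'core', type: 'umd' },  // per-entry library option
    },
    admin: {
      import: './src/admin.js',
      dependOn: 'core', // loaded on pages where core is already present
    },
  },
};
module.exports = config;
```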
For this next feature, assume we have a large-scale application and we want to develop this application — or parts of it — with separate teams, and each team should be able to deploy their work separately from the other teams. The idea is that at runtime all the work of the teams comes together and is linked into a single monolithic application. For this, Module Federation, a new feature in Webpack 5, is good to have. Here we have three repos, or three containers, one per team, and these three parts should come together into a full single-page application. This could be a monorepo or separate repos — it doesn't matter. To use this feature, we specify in the webpack configuration that we would like to use Module Federation. In the application we want to consume component-lib from team B and another-component-lib from team C, so we specify these dependencies in the `remotes` property, and webpack makes sure that every time we request a module from component-lib or another-component-lib, it is looked up at runtime in a container that is loaded at runtime. On the other hand, the other teams use the same feature and also use the ModuleFederationPlugin, but they use the `exposes` option to expose modules to other teams. Here we expose `Component` — which is the public name — pointing at our source file, a basic React component in this case. The other team does it the same way; you could also expose more modules. The tricky part is that we want to share libraries: we don't want each of these separate builds to load react, react-dom and date-fns on its own. For this we use the `shared` feature of Module Federation: in each of these compilations we can specify which libraries of the build should be shared. In this case we want to share react,
and component-lib wants to share react and date-fns, and another-component-lib in this case also wants to share all of its vendor modules. Now we compile each of these applications — I did that as preparation, compiling each of them separately — and each generates a dist folder with all the generated files in it. At runtime we can now start loading these applications, and it could look like this: everything comes together at runtime. Here is a component from component-lib and one from another-component-lib, and if I use this toggle button — the toggle button loads a component via React.lazy, on demand — so you can also load other components on demand, like you usually would. Basically everything behaves as if these were normal npm packages, except that they are linked together at runtime. Here you see the application running; it feels like a single-page application. There's no special logic like iframes or anything to make it special in any way; it's just modules coming together in a single application at runtime, and on the framework level it feels like all the code is one application. It's also possible to load modules on demand — here I loaded some modules on a button click. The usual constraint with webpack is that any on-demand loading should only take a single round trip to the server, and this is still true with Module Federation: it only takes a single round trip, and files from separate builds are loaded in parallel. For Module Federation the initial page load needs one additional round trip to the server compared to a normal single-build application, because it has to get the information from the other separately built application parts — it loads the container files, which are only manifests describing where to load modules from
this container, and then it is able to load modules from it. This example also shows a few edge cases you may run into. Here the application uses react 16 and component-lib uses react 15 — different major versions — so they are technically incompatible, and by default Module Federation would load both versions: it would download react 16 and react 15, and each separately built application part would be provided with the version it's compatible with. But for framework components like react or angular, it's not possible to load multiple versions of the framework: for technical reasons you can't build a react app from different react versions — you would get weird errors. In this case I can use the advanced configuration for shared modules and specify that react is a singleton. A singleton shared module means only a single version of it is ever loaded — in this case the highest one — and instead of getting their own version, the incompatible parts get a warning that they are technically incompatible and that you should look into upgrading your component-lib. Another edge case here is that the two component libs use different versions of date-fns: one uses version 2.6 and another-component-lib uses 2.14. In this case all builds, or application parts, agree on the highest compatible version, so date-fns is loaded from another-component-lib because it provides the highest version. Thanks for watching, have a good day and enjoy the conference.
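Putting the Module Federation pieces together, the configs described in this section might look roughly like this (names, URLs and ports are illustrative, and the exact plugin options may differ between Webpack 5 betas):

```javascript
// webpack.config.js of the consuming application (sketch)
const { ModuleFederationPlugin } = require('webpack').container;

module.exports = {
  plugins: [
    new ModuleFederationPlugin({
      name: 'app',
      // modules looked up at runtime in separately built containers
      remotes: {
        'component-lib': 'componentLib@http://localhost:8081/remoteEntry.js',
        'another-component-lib': 'anotherLib@http://localhost:8082/remoteEntry.js',
      },
      shared: {
        // singleton: only ever load one react, even if majors differ
        react: { singleton: true },
        'react-dom': { singleton: true },
        'date-fns': {}, // all parts agree on the highest compatible version
      },
    }),
  ],
};

// webpack.config.js of component-lib (sketch): expose a module instead:
// new ModuleFederationPlugin({
//   name: 'componentLib',
//   exposes: { './Component': './src/Component.js' },
//   shared: { react: { singleton: true }, 'date-fns': {} },
// });
```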