 for the introduction and hosting and having me today. And thank you everybody else for joining. I'm really happy to be here again with the Singapore JUG, and hopefully in the future we'll be able to see each other face to face again. Now let me — oh, I think, Michael, you are sharing. Yes, okay. I was looking for the button, sorry about that. It's okay. All right. Okay, so where are the slides? Here we go. Yeah, let's play this. All right. So we have about 60 minutes or so. I know it's a little bit late where you are. I'm streaming directly from Switzerland and it's a little past my midday lunchtime, but for you it must be very close to dinner time. So I encourage you to ask questions in the chat anytime you feel like it. And Michael, if you could please do me a favor and let me know whenever a question is raised, I will do my best to answer as quickly as possible. If not, we'll still have time at the end in case you have any questions. So the topic of today: what I wish I knew about Maven years ago. So what's going on here? As Michael said, my name is Andres, I work for Oracle, I'm a Java Champion, and I created an event called Hackergarten, which is a series of meetings where people come after hours and work on open source projects, just because we love open source. And that's pretty much it. So when I was in college — this is back in, not even 2000, this is 1995 — Java came out, and I was one of the first students to ask my professor at the time, we were learning C++, if I could do my final project in Java instead of C++, and since then I never looked back. Of course, Java was designed with a syntax very close to C and C++ to allow the vast majority of developers to jump into the platform. Now, because the platform was just emerging, there was no build tool per se in Java; we only had the compiler.
As time progressed and projects started to grow in complexity, it became very apparent that a build tool was actually needed, and we only had access to Make at the time. If you have ever worked with Make and C and C++, you know that there is some sort of portability, but it's still a bit difficult, and it's not the same as in Java, where you have one language that can work on different platforms. Because of this, a project emerged called Apache Ant, and it was created in order to build Tomcat, which was one of the first major open source projects in the Java space. And because everybody was happy with XML at the time, well, you have to use an XML descriptor for Ant. The advantage of Ant is that it gave us portability; it gave us a descriptive DSL based on XML to define anything that you need. But that's also a little bit of the downside: you can do anything you want or need to, which means that your project may end up very, very different from another project from another team, another colleague, another organization, or a completely different company. So moving from one project to the next was a little bit difficult. Despite this, projects continued to grow, and they grew so much that the need for another build tool arose, and that's how we got Maven 1. Maven 1 took the lessons learned from Ant and gave us dependency resolution. No longer did you have to commit your JAR files or dependencies into source control; you can rely on some metadata files and an external service, Maven Central, to download and configure the dependencies. But it also gave us a standard project structure, convention over configuration, and a few other things. And that's how the Java ecosystem exploded, because it allowed us to build projects at a much bigger scale.
Sometime later, I think it was around 2007, a couple of people figured out that maybe Maven wasn't what they really needed, because jumping from Maven 1 to Maven 2, the structure became more constrained. Things were, or appeared to be, fixed or restricted. So these people said: we want more flexibility. It's not enough for my project to bend to the tool's will; I want it the other way around. I want the build tool to bend to what my project requires. And that's how we got Gradle. For this reason, some people believe that Gradle is just like Ant, where you can do whatever you want, which is true. But you don't have to use an XML DSL; you use a programmatic DSL, which originally was based on the Groovy programming language. Now there are two: if you are familiar with Kotlin, you can also use Kotlin. And that's about the time, in 2008, when I jumped from Maven to Gradle. Why? Because I didn't want to deal with XML. I thought Maven was just a simple tool wrapped around XML, and that it was quite dumb. Actually, that was not true. In 2010, Maven 3 came out. By that time I had already jumped to Gradle and I didn't look back at Maven. And that was my mistake. Because as it turns out, there are really neat features in Maven that are only similarly provided in Gradle, and there are even features that are not available in Gradle at all that are really good to have. And for this reason I decided to come back to Maven and have a hard look at what this build tool actually is and what you can do with it. So I'm sure you may have seen some of the features I'm going to discuss today, and there may be others that are completely new to you. We'll see how it goes. And again, let me remind you: if you have any questions at any time, please let me know. English is not my native language, and when I get very excited about a topic I start to speak faster and faster and faster.
So please also let me know if I should slow down a little bit. The first thing is overriding project properties. You may define project properties in a couple of ways. You may have seen that you can define properties using a properties block in the POM file, or properties may be defined by plugins. Actually, every parameter of a plugin — in this case of a mojo, a mojo being the thing that actually runs and reacts to the Maven lifecycle — every parameter that is annotated accordingly automatically exposes some sort of project property that can be overridden if needed. You can also define and override properties on the command line using -Dkey=value, and this is pretty cool because it doesn't matter how the property was defined; you can still set its value. Now let me show you a quick demo of this. Let's jump into the terminal. I have, I think, let me see, a couple of projects. Let's jump into properties; this one is fine. Now let's see which version of Maven I have, I think it's --version. This is the latest version of Apache Maven, 3.6.3, with Java 1.8 update 181. So not the latest Java; it's still Java 8. Great. Now let me show you the simple POM file that I have here. It has the group ID, artifact ID, and version coordinates — these are known as GAV coordinates — and just a single dependency. And the source code — I'll show this just to prove there's nothing up my sleeve, everything is clean. I do a tree here and I only have a single sample class like this. If I display the source, it's a simple application that can be run. So the files are good. Okay. Now remember, I'm using Java 8. So if I compile this code with something like mvn compile, it tells me a few things here, and it builds. Perfect. Now let me inspect the compiled code — nope, not that one — target, classes, the sample class, and let's look at the header. The major version happens to be 49.
Does anyone want to take a guess which Java version that major version maps to? That's right, that's Java 5. So what's going on here? Maven has some defaults, and the base compilation value for target and source is set to Java 5. There are ways we can override this or set the defaults. You can define an entry in the POM file for the Maven compiler plugin and set the source and the target; you may use the properties block to define just maven.compiler.target and maven.compiler.source; or you can do it on the command line by just doing this: -Dmaven.compiler.target. Let's make sure that it's clean and then do compile. Oh, sorry, my mistake, I didn't define a value, and Maven told me so — pretty good. So there we go: -Dmaven.compiler.target=1.8 clean compile. And then we look again at the compiled bytecode, and it happens to be major version 52. So this is Java 8 now. Every other property that is exposed by plugins or by your POM files can be overridden on the command line like this. Okay, let's go back to the slides. The next one, and this is a pretty neat feature, is that you may invoke any plugin on a project by passing the GAV coordinates plus the goal and any additional parameters that the goal may need for its execution. For example, say there's the echo plugin and you want to execute it. The echo plugin invocation requires the group ID, which is com.github.ekryd.echo-maven-plugin, the artifact ID, which is echo-maven-plugin, and the version number, 1.2.0. The goal that we want to invoke is echo, and we pass our parameter. Now, because we're doing an inline invocation, it's great that we can override parameters using the command line. Otherwise, we would have to modify the POM file to provide said parameter — and if we already modified the POM file, then it might make sense to just apply the plugin, so that would defeat the purpose. I can show you a quick demo of this.
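The three ways of setting the compiler level just described might be sketched like this in a POM (the compiler plugin version shown is illustrative, not from the talk):

```xml
<!-- Option 1: configure the compiler plugin explicitly -->
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.8.1</version>
      <configuration>
        <source>1.8</source>
        <target>1.8</target>
      </configuration>
    </plugin>
  </plugins>
</build>

<!-- Option 2: set the properties the plugin exposes -->
<properties>
  <maven.compiler.source>1.8</maven.compiler.source>
  <maven.compiler.target>1.8</maven.compiler.target>
</properties>
```

Option 3 is the command line: `mvn -Dmaven.compiler.source=1.8 -Dmaven.compiler.target=1.8 clean compile`.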
And I believe in this particular project I have a couple of commands. Let's see, command number one looks like this: I can execute the exec-maven-plugin to run the application. This will compile and exec, and notice that it passes an argument for the main class. If we didn't do this, it would be a failure. So I pass the command to the shell and let's see what happens. It compiles, it builds, and we get "hello world" as a result, right there, we can see it. Good. Let's see, I have another command. Nope, not that one — like this. Notice that this is the same thing, executing the code, but in particular, exec: there's a subset of plugins that Maven is aware of, especially those under very specific group IDs, namely the org.apache.maven.plugins and org.codehaus.mojo group IDs. If plugins are found under those group IDs, then you can simply use the short name of the plugin — not the artifact ID, but the prefix, which is part of the plugin descriptor — plus the goal, in this case exec:java. So if I invoke this command in the shell, I should get the "hello world" output again, which is true. So again, I'm executing an inline plugin, passing parameters. Now let's look at the last one. I think this is the actual invocation of the echo plugin. And just to be sure that there is nothing fishy here, I'm going to copy this right there. And let's do mvn, the GAV coordinates plus the goal, which is echo, then define the echo message with a value, "hello Singapore from Switzerland". And then we get "hello Singapore from Switzerland". And that's pretty much it. This works for pretty much any project and any plugin you find out there. All right. Next one: dependency resolution, which is one of the reasons why many people jumped from Ant or Make or something else to Maven.
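The two invocation styles from this demo look roughly like the following. The exact parameter names follow the plugins' documented properties; the main class name is a placeholder from the demo:

```shell
# Full GAV:goal invocation — no changes to the POM required
mvn com.github.ekryd.echo-maven-plugin:echo-maven-plugin:1.2.0:echo \
    -Decho.message="Hello Singapore from Switzerland"

# Plugins under org.apache.maven.plugins or org.codehaus.mojo can be
# invoked by their plugin prefix instead of the full coordinates:
mvn compile exec:java -Dexec.mainClass=com.acme.Main
```

Because the parameters arrive on the command line, nothing needs to be added to the POM for a one-off run.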
Now, dependency resolution is pretty great, but it has some rules that many of us, including myself, may not be aware of, and that can lead to some interesting surprises, as we'll see in just a moment. Basically, you define dependencies in a dependencies block. That's the first case. And if you have a hierarchy of POMs — your POM, the parent, the grandparent, all the way up to the super POM — you can have many dependencies blocks. The dependencies block in your POM, the one you are currently building, is the one that wins over everything else found in the hierarchy. Now, when you define an explicit dependency in this block, that dependency takes precedence over everything else. These are known as direct dependencies. So if you have, let's say, Guava explicitly defined in your POM or in a parent POM, and then you find Guava somewhere else in the dependency graph, brought in by another dependency, that one is known as a transitive dependency. It doesn't matter what version that transitive Guava is; the direct one will always win. And if it's a transitive dependency, then the rules are different, because Maven looks at the location of the dependency in the graph: how many hops it takes to go from your current POM to reach that dependency. The one that requires the fewest hops — not the highest version number, but the one whose location is closest — is the one that is going to win. And this is something that stumps a lot of people. So let me show you something very quickly. Back in March, I ran a quiz on dependency resolution with Maven. It was just 14 questions that seemed quite simple. Let me show you the first one — I'll only show you two of them. The first question says: say that you have a POM file like this one that defines two dependencies, and those dependencies both happen to be Guava — same group ID and artifact ID, but different version numbers.
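The duplicated-dependency setup from that first quiz question might look like this — a hypothetical reconstruction; the exact versions on the quiz slide may differ:

```xml
<dependencies>
  <!-- Same groupId/artifactId declared twice, different versions -->
  <dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>27.0-jre</version>
  </dependency>
  <dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>28.0-jre</version>
  </dependency>
</dependencies>
```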
So if I were to resolve dependencies with dependency:tree, which version will be the selected one? Is it going to be 27, because it's the first? Is it going to be 28, because it's the latest, or because it's the last defined? Or is this going to be a build error, because, well, duplicate dependencies, right? That should be a problem. Well, if you run dependency:tree on a project defined like this, it turns out that Maven does output a warning, right there at the top, that says: I'm not so sure about this one, you have a duplicate dependency here, 27 and 28, but I'm going to allow it. Why? Because Maven resolves the last one, version 28. Now, surprisingly, 36% of the people that answered the questionnaire — which was a bit more than 500 people — said this would be a build error. Actually, it's not. So here's question five. You have two direct dependencies, Guice and Truth. Both of them have Guava as a transitive dependency, one hop away. Guice depends on Guava 25, and Truth depends on Guava 27. So which one is going to be the selected version when I resolve this? Is it going to be version 25, because Guice is first? Or version 27, because Truth is second? Or is it going to be a build error? Well, it turns out the answer is version 25. Why? Because Guice is the first one. This is a transitive dependency; Guava 25 is found first in the graph, and that is the chosen version. So let me show you a quick demo of this, and let's go back to the terminal. I think I have this example, transitive one. So here's the example: Guice and Truth, in this order. If I were to resolve dependencies — which, by the way, the dependency plugin is not applied by default; it's not even coming from the Maven super POM — by just invoking dependency:tree, I'm invoking an inline plugin. And because dependency is one of the core plugins, I don't have to specify the GAV coordinates. Here we go.
mvn dependency:tree results in Guava 25, just as we saw in the slides. So what happens if we invert the order and define Truth first and Guice second? Let's go into this one. There we go. Now Truth is first and Guice is second. If I do a dependency:tree, what's going to happen? The selected version is now Guava 27. Again, because it was found first, and it's the nearest to the POM you're currently executing. Okay. So this debunks one of the myths that people have when encountering Maven, which is: oh, my project uses semantic versioning, my dependencies also use semantic versioning, so if I declare everything correctly with semantic versioning, then Maven will resolve the latest versions of all dependencies, right? No matter where they are in the graph. Well, it's not true. And this is what Robert Scholte, the Apache Maven PMC chair, had to say about the current state: Maven never looks at the version number; it only looks at the location in the graph. Now, this decision was made many years ago, and it may be the case that it has to be re-evaluated in current times. If you really need, or really want, semantic versioning rules for resolving dependencies, then you should have a look at the Maven Enforcer plugin, because it provides a couple of rules that allow you to do this: to detect when there is no convergence between dependency versions, so that this results in a build failure, or to make sure that you always select what is seen as the latest version, with the upper-bound rules. Now, if we want to fix the problem with this POM, here's one way to do it. You can use the dependencyManagement block and say: it doesn't matter which version is coming in the transitive closure from Truth or Guice, and it doesn't matter in which order you define either of those. The dependencyManagement block says: if you encounter Guava, with that group ID and artifact ID, then the chosen version should be 28.
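That fix can be sketched like this — the Guice and Truth versions here are illustrative, only the pinned Guava version matters:

```xml
<dependencyManagement>
  <dependencies>
    <!-- Pin Guava no matter where it appears in the graph -->
    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <version>28.0-jre</version>
    </dependency>
  </dependencies>
</dependencyManagement>

<dependencies>
  <!-- Declaration order no longer influences the resolved Guava -->
  <dependency>
    <groupId>com.google.inject</groupId>
    <artifactId>guice</artifactId>
    <version>4.2.2</version>
  </dependency>
  <dependency>
    <groupId>com.google.truth</groupId>
    <artifactId>truth</artifactId>
    <version>0.44</version>
  </dependency>
</dependencies>
```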
And I can show you that actually working. I have another project here, I think it's called transitive three. If we look at the POM file, the difference from the other one is that it has a dependencyManagement block. This one has the latest version of Guava, I think it's 29. Remember, Truth depends on 27 and Guice depends on 25. If I do dependency:tree here, then the chosen version for Guava is going to be version 29, and this is because of the dependencyManagement block. Now, you may define this block in your current POM, or in the parent, or in the grandparent, or anywhere else in your hierarchy. As long as it's somewhere in your hierarchy, that value is going to be chosen. All right. So I mentioned the Maven Enforcer plugin, which provides a series of rules that allow you to verify that your build is behaving in a certain way, and to fail the build when something is not working properly. I found it funny at first, and I really like the remark: the Enforcer plugin is "the loving iron fist of Maven". Some of the behavior provided by the Enforcer, it is my personal belief, should be included in core. And it may be the case — I'm just looking into my big crystal ball — that some of that behavior will be included in Maven 4. Now, I asked a question on Twitter a couple of months ago — again in March: who uses the Maven Enforcer plugin? It turns out not that many people use it, and that's a bit of a shame, because it actually solves a lot of the problems with dependency resolution, though not all of them. If you're interested in reading more about the quiz, the URL is right there on my personal blog, and these are the results for how people did when answering everything. The three bars in green represent the percentage of questions answered correctly. Only seven people out of 509 got all 14 answers correct.
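The two Enforcer rules just mentioned — failing on non-converging versions, and requiring the upper bound to win — are configured roughly like this (the plugin version is illustrative):

```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-enforcer-plugin</artifactId>
      <version>3.0.0-M3</version>
      <executions>
        <execution>
          <id>enforce-dependency-rules</id>
          <goals>
            <goal>enforce</goal>
          </goals>
          <configuration>
            <rules>
              <!-- Fail when two versions of the same artifact
                   meet anywhere in the graph -->
              <dependencyConvergence/>
              <!-- Fail when the resolved version is lower than the
                   highest version requested in the graph -->
              <requireUpperBoundDeps/>
            </rules>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```

In practice you would typically pick one of the two rules, since dependencyConvergence is strict enough to subsume the other in most builds.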
If we split the questionnaire into sections, the first section has six questions on just a regular POM file, the next section had four questions on a POM with a parent POM, and the last section had questions on a POM that consumed a BOM file, a bill of materials. In the first question you can see that the number of correct answers was pretty low, because many people thought it would be a build error, and the percentage of correct answers in the later sections is a bit lower, which tells me that we have some gaps in understanding how parent POM files work, and BOM files as well. So this is another view: a build error was always an option in all the answers, but it was never a valid option. Those answers in gray show you the percentage of people who thought, well, this looks weird, this probably results in a build error — but actually none of them turn out to be a build error. There are a few other issues that you may encounter with dependency resolution. I highly encourage you to have a look at the session by Ray Tsang and Robert Scholte, both of them Java Champions, on how they solve different problems. The fact that Maven does not resolve versions using semantic versioning is one of the topics they covered, but there are a few others that they exemplify and demystify in that session. There's a link to the video, and that link also has the link to the slides, if I'm not mistaken. I will also make my slides available to everybody later. So, next one: mvn clean install. This is a fun one. How many of you — and that included myself many years ago — whenever you build a Maven project, the first thing you do is mvn clean install, right? Or when we see a document or a README file for a new project that relies on Maven, the first instruction that we see is mvn clean install, right? Well, let me tell you: the first rule of Maven Club is, you do not invoke mvn clean install.
The second rule is: you definitely do not invoke mvn clean install. The third rule is: you should use mvn verify. And if you're not happy with Tyler Durden saying that, well, what about this one: mvn verify gives rainbows to your project. So what's going on here? Okay. This works for single-project builds and multi-project builds, but the impact is more apparent in a multi-project build, because there is more stuff that needs to be built. Back in the days of Maven 2, we only built single-project builds at first. But because projects grew in complexity, we needed to build more than one project at the same time, and that's how Maven multi-project builds came to be. Because it was not part of core, there was a project called the Maven Reactor plugin which gave us this capability. I don't know why they picked the name Reactor, but that's how it stuck. And it was so good that when the Maven team decided to push out Maven 3, they said: we should make multi-project builds part of core. So let's bring in the lessons learned from Reactor and put them into core, and that's what we have right there. So the install goal — well, actually, install is not a goal, it's a lifecycle phase, which plugins can hook into — and the install lifecycle phase happens to be the one that pushes the built artifacts to a repository, namely the local repository. But it's also true that the install lifecycle phase executes after the verify phase. Now, if you do not need to push those artifacts to the local repository, so much the better, because you do not have to pay the I/O price of copying files from one place to another. Another thing that arrived with Maven 3, which was not the case in Maven 2, is support for incremental builds — and plugins must opt in to rely on this feature.
If all the plugins that your build consumes are incremental-build aware, then invoking clean will completely undo anything related to incremental building, so you will have to pay the price of computing everything again. So what we say now is: more often than not, instead of using mvn clean install, just give mvn verify a try. This should make your builds a bit faster, depending on their size, and most of the time it will do the right thing. I believe there are a few small corner cases, not very common, where you will still need to do clean, and where you will still need to do install. So remember: if you really need those artifacts in the local Maven repository, then you definitely want to use install. But if you do not, then verify is enough. And then you may be asking: Andres, in a multi-project build I consume other projects, my siblings, I need those JARs, so I need to install them into the local repository. No, that is not true. That's exactly one of the things that mvn verify handles: as long as those projects are part of the reactor, the built artifacts will also be exposed to every other project that requires them. Robert had another session at Devoxx Belgium 2019 explaining some of the history behind Maven 2 and some of the things that changed from 2 to 3, and clean install versus verify is one of those. So invoking mvn clean install is actually a little bit of cargo cult, because back in the days of Maven 2 we definitely needed to clean and install all the time, but since Maven 3 we can do verify most of the time. Here's another neat trick, and this is something that I was looking for from my time spent with Gradle: invoking Maven with -am and -pl. This is pretty neat in a multi-project build. We know now that Maven runs a multi-project build inside a reactor.
Every goal that you set on the command line will be executed for every single project in the reactor, which is good. But what happens if you have a big multi-project build — say 50, 100; I have seen Maven builds with more than 600 modules — and you only want to invoke a subset of the reactor? How do you do this? Well, you pass the flag -pl, which stands for "project list". You can pass a path, or a series of paths separated by commas, and then Maven will only build those projects. But here's the trick: that works as long as those projects do not have any dependencies between them. If they do depend on other siblings, you need to build a few more projects. If you want to build everything in a single session — again, using verify instead of clean install — then you also need to specify the -am flag, which stands for "also make". So let me show you a quick sample of this. This will be the partial project number one, and let's do a clean, just to be sure that I don't have anything hidden, okay? The root project looks like this: GAV coordinates and four modules, one to four. Good. The four modules look like this: there's project one, project two, project three, and project four. Let's look at the contents of project one: a trivial project that only defines its parent and its artifact ID, inherits everything from its parent, no dependencies. Let's look at two: a reference to its parent, the name, and a dependency on project one. So it depends on a sibling; so far so good. Project three is similar to project two: it defines its parent and depends on a sibling, project two. And let's see, project four — I think this one does not depend on anything else, yes. So this is similar to project one. Now, when I do mvn compile here at the top — yeah, let's do compile, because we don't have any tests, this will be faster —
we'll see at the beginning the information of the reactor: the build, which is the root project, and the four sub-projects. Maven now is building everything. It goes through every project — there's project one, there's project two, somewhere is project three, there's project four — and then the summary of the reactor; everything is fine. Okay, now I want to build project two only: mvn -pl project2, and then let's do clean compile. Actually, let's start with project one first. Let's build project one: it's built, perfect. Now let's build project two, and: failure. It says, I cannot find the JAR file for project one. And this is probably the reason why many of us keep doing mvn clean install, because, well, I need to find it, and the only way to find it is in the local repository, right? Well, no. In this case, if I just add the -am flag here, this will run a subset of the reactor, which we can see at the beginning: the reactor builds the root build, project one, and project two. And then the summary — project two is working now. Isn't that great? Excellent. Now, let me show you — I think it's project three — the sources. This is an executable class, okay? We know that project three depends on two, which depends on one, all right. And being an executable class means that we should be able to do something like this: specify project three, we want to compile, but we also want to execute that class. We know that we can pass an inline plugin invocation, and we also know that we can define the parameter, in this case the main class, which should be com.acme.Main, if I'm not mistaken. And we should be running project three, right? Let's see what happens. Oh, oh no. The reactor order is fine, but the exec-maven-plugin says it cannot find com.acme.Main in the root build project. What's going on here?
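The commands from this walkthrough look roughly like the following, with the module names as in the demo:

```shell
# Build the whole reactor
mvn compile

# Build only project2 — this fails if project1's JAR is available
# in neither the reactor nor the local repository
mvn -pl project2 clean compile

# Build project2 plus everything it depends on ("also make")
mvn -pl project2 -am clean compile

# Several modules can be listed, separated by commas
mvn -pl project2,project4 -am verify
```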
Well, remember I said that every goal you invoke in a reactor is invoked for every single project that is part of the reactor. That means exec:java was executed for the root build, project one, project two, and project three, when we only needed it in project three. Is there a way around this? Yes, there is. Now that we know that this plugin — an inline plugin, or any that is explicitly defined in the POM — will be executed for everybody in the reactor, we can change the POM files in such a way that the exec-maven-plugin is defined for everybody but disabled by default, and only enabled on the project that actually needs it. Sometimes this requires defining dummy values for some properties or parameters in those projects where the plugin will be disabled. What do I mean by this? Let me change to another version of this project; I think it's called partial two. Let's do a clean, so nothing strange here. If we look at the POM file, it now looks different. I have two properties here: exec.skip and the main class. These are properties exposed by the exec plugin. Then I have a pluginManagement block where I say: I'm going to use the exec-maven-plugin, so if any of my projects — the root build project and any sub-module — requires the exec-maven-plugin, then these are the coordinates, plus any additional or default configuration; in this case I don't need any. And then it applies the exec-maven-plugin, which means it's going to be available to every single project in my reactor; but because the exec.skip property is set to true, the execution of that plugin will be skipped by default. And notice that the main class value is some gibberish, some dummy value — the "undefined" class does not exist at all; it's just something that is not empty. If I look at project one, it should look exactly the same.
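The skip-by-default pattern just described can be sketched as follows — my reconstruction, not the exact demo POMs; the exec-maven-plugin version is illustrative:

```xml
<!-- Parent POM: exec-maven-plugin applied everywhere, but skipped -->
<properties>
  <exec.skip>true</exec.skip>
  <!-- dummy value; never executed while exec.skip is true -->
  <exec.mainClass>undefined</exec.mainClass>
</properties>

<build>
  <pluginManagement>
    <plugins>
      <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>exec-maven-plugin</artifactId>
        <version>1.6.0</version>
      </plugin>
    </plugins>
  </pluginManagement>
  <plugins>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>exec-maven-plugin</artifactId>
    </plugin>
  </plugins>
</build>
```

And in the one module that should actually run, the switches are flipped:

```xml
<!-- project3/pom.xml -->
<properties>
  <exec.skip>false</exec.skip>
  <exec.mainClass>com.acme.Main</exec.mainClass>
</properties>
```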
Similarly, project two and project four are identical. Project three looks different: it redefines the value for the main class, but notice that it also redefines the value of the skip parameter, explicitly saying false. This means: I really do want to execute exec here. So we invoke the command line as we saw before — there's no need to pass the main class now, because we already have a value. Let's see what happens. One, two, three is the output. Notice that the reactor says: I'm going to build four projects out of five. Project four was skipped because it's not part of this sub-reactor. The exec-maven-plugin is skipped in the root build, skipped in project one, skipped in project two, but executed in project three. So that's how we make it happen. Okay, next one. We may have seen this already: aggregating POMs. An aggregating POM is a POM file that contains a modules section. That's the only thing, the only requirement. They are typically conflated with a parent POM, but an aggregating POM can be something that is not a parent POM. Notice also that the entries in modules are actually path names. They are not artifact IDs; they are not project names; they are paths that map to the file system. Because it is very common to find projects in directories that are immediately adjacent to the POM that aggregates them, we usually see names and paths that are identical. But if you happen to have a very deep structure of POM files, where the projects are not immediately adjacent, you will see different paths in the modules section. Now, here's a structure with an aggregating POM, which I call build, where I have a couple of projects, bom and base. And these two projects, bom and base, are completely independent from one another; they don't even share the same parent. But putting this aggregating POM with those two modules together allows me to build both of them at the same time.
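An aggregating POM of that shape might look like this — the group ID is a placeholder; the module names follow the example in the talk:

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.acme</groupId>
  <artifactId>build</artifactId>
  <version>1.0.0-SNAPSHOT</version>
  <packaging>pom</packaging>

  <!-- Entries are file-system paths, not artifact IDs -->
  <modules>
    <module>bom</module>
    <module>base</module>
  </modules>
</project>
```

Note that neither module needs to declare this POM as its parent; aggregation and inheritance are independent mechanisms.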
Now think of the following example, for which I think I do have a demo. Yeah. So let's switch into this multi-project demo. Let's look quickly here. There we go. So here's an example of those paths: subprojects/project1, project2, and project3. Why? Because the subprojects directory doesn't have a pom file of its own. Anyway, let's go one level up. This is the structure I want to show. So let's do a clean here, just to be sure that nothing is there. Then let's look at this structure. I have two projects, base and lang. Think for a moment: base is your project, and lang is commons-lang, or some other dependency. You found that there is a bug in one of your dependencies, in this case commons-lang. What can you do? First, you file a ticket, if there isn't already a ticket for this particular bug, and hope for the team to fix it so you can consume the fixed dependency. Or you can try to fix it yourself. Now, this will typically require you to build the dependency, publish it to your local repository or an internal artifact repository, check with your consumer project if it's working, and if it's not, continue the loop, right? But it would be much faster if that dependency were part of the same multi-project build as your project. That's exactly what the aggregating pom allows me to do. Because if I go in here, into lang, as I showed you, this is commons-lang again. There's just a regular pom file, and I can simply do mvn verify here and it builds just this project, nothing else. Then I go into base, and let's look at this one. This one consumes the lang artifact, which would need to be available in a repository, and I believe I don't have it installed anywhere. So if I were to build this, it's going to fail; if I just do mvn verify here, it's likely to fail. But if I go one level up, notice that there is no parent definition here, nor here, right? They are completely independent from one another.
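The aggregator in this demo could look like the sketch below; the module names base and lang follow the description above, the coordinates are hypothetical, and note that neither module declares this pom as its parent:

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.acme</groupId>
  <artifactId>aggregator</artifactId>
  <version>1.0.0-SNAPSHOT</version>
  <packaging>pom</packaging>
  <modules>
    <module>lang</module> <!-- the checked-out dependency you are patching -->
    <module>base</module> <!-- your project, which consumes lang -->
  </modules>
</project>
```

Running `mvn verify` at this level builds both projects in one reactor, with base resolving lang directly from the reactor instead of from a repository.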
So if I'm at the root where I have the aggregating pom and I do verify, now I have a reactor and I have built the two projects. So this will allow you to iterate much faster instead of pushing intermediate results into an artifact repository. All right. Next we have BOM files. Now, BOM files are being used more and more in open source projects. If you consume Spring Boot, Quarkus, Micronaut, Jackson, gRPC and so many other dependencies, all of them provide BOM files. So what is a BOM file? Basically, it's a pom that follows a certain convention. It has coordinates. It may have a license and SCM information; it may have a parent if you want. But the most important thing that a BOM file should have is the dependencyManagement block, because this says: I'm going to group a set of dependencies that go together. In this case, this BOM file is quite simple, trivial even; it only has a single dependency, but if you were to pop open the Spring Boot BOM, or Hibernate's, or the others I mentioned, they have dozens and dozens of dependencies. Now, by virtue of the dependencyManagement block, if you were to choose Guava, it doesn't matter where in your dependency graph it appears, the chosen version is going to be that one. Now, once you have defined a BOM file, again with as many dependencies as needed, you consume it this way: you define a dependencyManagement block in the consumer and put in the coordinates of the BOM file, so GAV coordinates, but also two additional elements that are very important. First, the type of the dependency must be pom, because otherwise Maven will think that there is a jar file attached to this dependency, which of course there is not; there is no jar for this BOM file. And the second thing is the scope. We have six scopes in Maven, and the sixth one, import, is the one that we must use here.
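A minimal sketch of both sides of this arrangement; the BOM coordinates (acme-bom) and the Guava version are illustrative, not taken from the demo:

```xml
<!-- The BOM itself: a pom whose heart is the dependencyManagement block. -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.acme</groupId>
  <artifactId>acme-bom</artifactId>
  <version>1.0.0</version>
  <packaging>pom</packaging>
  <dependencyManagement>
    <dependencies>
      <dependency>
        <groupId>com.google.guava</groupId>
        <artifactId>guava</artifactId>
        <version>33.0.0-jre</version>
      </dependency>
    </dependencies>
  </dependencyManagement>
</project>

<!-- The consumer: import the BOM, then omit versions on managed entries. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.acme</groupId>
      <artifactId>acme-bom</artifactId>
      <version>1.0.0</version>
      <type>pom</type>      <!-- a BOM has no jar attached -->
      <scope>import</scope> <!-- pull in its dependencyManagement -->
    </dependency>
  </dependencies>
</dependencyManagement>
<dependencies>
  <dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId> <!-- version comes from the BOM -->
  </dependency>
</dependencies>
```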
So what happens when Maven sees these instructions is: okay, I'm going to locate the BOM file based on those coordinates, download it, look into the dependencyManagement block of that BOM file, grab all those definitions and put them in the consumer as if they were defined in the consumer itself. So now that we have this, I can consume Guava by defining just the group ID and the artifact ID, and there's nothing else that I need to do, because it just works as if it were defined on the project itself. Okay, I think this is everything that I wanted to show today. If you have any questions, please let me know. I must confess that I was one of those people who thought that Maven was just a simple, dumb tool wrapped around XML. It turns out that it really has a lot of features underneath, and there are a few things that work really well. For example, invoking plugins inline, as we did with exec, is not possible in Gradle out of the box, but there is an option to make it happen. So I have written a series of blog entries on Maven, which is the first link. The second link is from a friend and fellow Java Champion, Chandra Guntur, who also has in-depth knowledge of some of the inner workings of Maven. And if you're still working with Gradle, or you are happy with Maven but for some reason have to work with Gradle and you're missing some of the nice features from Maven, then the third link will help you. It will provide you those features that Gradle does not have, but Maven does. I work for Oracle, so if I said anything about Oracle, or if you make any purchases based on what I said, don't believe anything that I said; you're safe. This is how you can tell an actual Oracle employee: the safe harbor statement. So with this, thank you very much for your attention, and now let's open the floor for any questions that you may have. Michael? Yeah, we didn't have any questions in the chat, but I do have one question for you. Okay.
I know there used to be a company behind Maven and I forgot the name, but I think they stopped. I'm just wondering, who is backing Maven? Is it just personal contributions from people, or is there a company sponsoring people to work on Maven? Yes, a really, really good question. The company is called Sonatype. Now, Maven is an Apache project, and the Apache way is to build a healthy community where no single entity dominates the project. Sonatype did dominate for a while. There are still people from Sonatype contributing to Maven, if I'm not mistaken, but the overall guidance of the project is performed by volunteers. So Robert, Karl Heinz, Hervé and many others that contribute to Apache Maven do it in their free time. As far as I can tell, and I may be mistaken, no one besides Sonatype members is paid to work explicitly on Maven all day long. So if you find any issues with Maven, I would certainly encourage you to reach out on the mailing list or in the issue tracker to let the Maven team know: hey, this is not working, or maybe I misunderstood this feature, or maybe there's some missing documentation. There is a series of issues in the issue tracker that have a label called "up for grabs". These are issues that the Maven team has deemed easy to get started with. So if you want to contribute to an open source project that has a big impact on our industry, this would be the right place to do it. Now, Maven continues to be the dominant build tool in the market, and it's surprising that so many people use it while there is no company behind it these days, no continuous support other than that provided by volunteers. So if you can help in any way, we will certainly appreciate it. You can help by doing reviews, producing code, creating awareness of the tool, or by convincing your company to do some sort of sponsorship. There are many ways to do this. Okay, thank you.
Do we have any other questions? Would anybody like to ask a question, either on the chat or by voice? I think we're good then. Before leaving, if you guys are okay with it, it's a tradition in the Java user group to take a virtual group picture. So everybody who is okay with it, if you can just turn on your camera once again, we'll take a picture, just as a memory of this event. Okay.