And I will get straight to it. I will talk about how you can add new modules to the nf-core/modules repository. There are a couple of steps involved here. Mainly we will talk about what files you actually need to add, what data you use for testing, what you need to do to get the tests running, and then finally what kind of little tools we have to help you along the way. All right. So you've already heard about some of the files that make up a module. If you want to add a new module to the nf-core/modules repository, there are mainly five files. Ricky already talked about the functions.nf file, the main.nf file, and the meta.yml file, which are stored in the software/tool/subtool directory, depending on whether you have a tool with a subtool or only a tool, as Ricky already mentioned. For the nf-core/modules repository, you also need files for running the tests. These are stored in the tests/software/tool/subtool directory. There is a main.nf file that defines the Nextflow code to run your tests, and a test.yml file that specifies the specifics of the test. We will talk about this a bit later. And then finally, you will have to add a few lines to the filters.yml file, which is stored in the .github directory of the modules repository. This is just for getting GitHub Actions to run the tests on these new modules. We'll talk about each of these during this talk. All right. So as Ricky already showed, there is a template available, which you can access with the nf-core modules create command, which will be available in the near future, we hope. But you can also just copy the files for now. This will give you templates for all of the necessary files. So here you see the template for the main.nf file. You'll see it has lots of TODOs, and you can just follow along all of these TODO strings, and this will guide you through making a proper module. And here is another example.
You can see here the meta.yml file, which is really important, especially if you put stuff on nf-core/modules, because these modules are made for a lot of people to reuse. So you really want to make it as easy as possible to reuse these modules. It contains info about the input and output and what software this module contains. All right, so instead of copying the files, as Ricky already showed you, you can also use the nf-core modules create command for this task. You just type nf-core modules create, point it to the modules directory, and then type in the name of your module. This will create the directory and all the files you need for your module. In this case here, we have the NGS tool with the subtool align, and it creates all the main files. It also creates the test files, with the module name already filled in. It will also append a few lines to the filters.yml file so that your tests are run once you have written them. So this is just a small helper tool that will be available fairly soon, we hope. All right, another important step when adding a module is, of course, where you actually get your software from. These modules can currently run with three different technologies: Bioconda, and the container technologies Singularity and Docker. The best way is to first find the Bioconda version of your tool. Here we have the FastQC module and its Bioconda version. You go to the Bioconda page, go down to where your tool is, and there you will see the link to the tags for the FastQC tool, where you will find a list of tags. So basically you get the Bioconda tool, and once you have a Bioconda version, you can look for Singularity and Docker containers on BioContainers and on the Galaxy Project Singularity page. All right. Now, next slide: multi-tool containers.
So sometimes you need several tools in one container, because your module needs several tools. For example, for Bowtie2, you usually also need samtools. In this case, there are options available. You can look at the BioContainers multi-package-containers repository, where there are several containers that contain multiple tools. You can also use the galaxy-tool-util utility, which allows you to search for containers that have several tools, like in this case, where we're looking for containers that contain Bowtie2 and samtools. Then you can just use those containers. But sometimes you will need a multi-tool container that doesn't exist yet, so you just have to create a new one. This is actually not really tricky. It's fairly easy, but you have to know how to do it. To do this, you can just fork the BioContainers multi-package-containers repository and then test your tool combination locally. So basically just make sure that your Bioconda tools work together in one environment. Then you add the names and versions of these tools to the hash.tsv file, which is contained in the multi-package-containers repository, and make a PR against this repository. Once this PR is merged, the repository will create a new mulled container that contains your tools. As a small tip, you can look into the CI tests, the continuous integration tests, in your PR for the ID of this container, because it can take quite a while until the container itself is online. All right, so you have written all your files and set up your versions. The next steps are to finish your module files. So write the Nextflow code (I won't show you how to do this now because you probably already know it), write all the information into the meta.yml, and then write the tests. Here you see the test main.nf file for the FastQC tool. And here you can see that first, like Ricky explained, you just include your module like this: you include it from your software directory.
And then in this particular case, we have two tests, one for single-end data and one for paired-end data. You see that both of these are workflows that define some input and then just run FastQC on it. To run these modules, you need some data, and for this we have test data in the nf-core/modules repository. In this case, it uses the genomics SARS-CoV-2 FastQ data. The test data in the repository looks like this. You see it here on the left: here is the data, and then we have a generic folder and a genomics folder. The genomics folder contains different organisms that can all be used for any kind of genomics tool. Right now it just contains the SARS-CoV-2 organism, because it's a really small organism and we try to do everything with it for now. It contains BAM files, BED files, FASTA files and all kinds of other files, so it should cover most of what you need, and you should always try to use data from there. But sometimes you will have to use other data sets. In this case, try to add them to the SARS-CoV-2 directory. For this, look at the README and the naming guidelines, and probably just ask in the Slack channels how to do it and what data to add. It's important that it all goes together, that it's all made with the same genome, so that it all works together. For any other cases, for example if you cannot use viral data for some reason and need data that's not available here, or you don't have a genomics tool but, say, a proteomics tool, which is not implemented yet, or any other issue that you have with the test data, the best option is to start a discussion in the modules channel on Slack and ask around what to do, because we really want to minimize the test data set and reuse the data as much as possible.
All right, once you have the right test data and have written your tests, the next thing is to actually run your tests and see that they run through. From the output, you can then generate MD5 sums. When you run your tests, this will create an output directory inside your modules directory, and then you can generate MD5 sums, which are used to verify that your module always generates the same output. This is pretty easy to do, and I'll show you in a minute how. You then write the MD5 sums into the test.yml file. Once you have them there, you can run pytest and see if everything validates, and then you're basically good to go with your module. All right, now comes another video, so let's see how this works. I hope it plays this time. Here we'll just try to run a module. You first have to define a profile, in this case Docker, to run the module. Then you just use the nextflow run command as usual, point it to your module test code, and use an entry; here we use the test_fastqc_single_end entry, which is the workflow that we already saw earlier. And you always use the tests/config/nextflow.config configuration file for this. This will then create the output directory with the outputs of this particular module, as we will see in a second. Now we have the output directory with all the outputs in there, and we can just run md5sum to generate MD5 sums for those outputs. Then we have to put those MD5 sums into the test.yml file. There will also be tools available soon, which I'll show here, that will make it easier to create those MD5 sums. That's particularly nice if you have not just two output files, but maybe twenty. Once you have run your tests and finished your test.yml file, you have to check that everything works with pytest. You can do this like here: you define the profile again, in this case Docker.
Here we run the test for the Minimap2 align single-end module. You can just run it like this, and then you can verify that all your tests work. In this case, it will not work because the MD5 sum is not correct: you see here that the observed sum is not the same as the expected one. We can fix this by copying the observed MD5 sum and correcting the error in the test.yml file of this module, and afterwards it should work. It can also happen that every run produces a new MD5 sum, and then you might not be able to check some files with MD5 sums. For example, log files sometimes contain information that changes every time you run them, and then you just don't check the MD5 sum for those files. So now we run it again with the new MD5 sum, and this time it will work, and then the module is ready to go. So that's it about running pytest locally. Once you have confirmed that the tests are working, as I already mentioned, you have to add a few lines to the filters.yml file, which is in the .github directory. It's just three lines: the name of your tool, and then you point to the directories where your tool sits. This is used online to run pytest again every time your module is changed in some way. And finally, one more thing: you can also use the nf-core modules lint command, once it is available very soon, to check while you're creating your module whether everything is according to the guidelines. This lint command runs a couple of checks to see if everything is as it should be. Here we test it on the FastQC module, and you'll see that it passes nicely; the FastQC module is a nice module which has already passed all the tests. We can also run it, for example, on the STAR align module, and we'll see that here, for example, the BioContainers container version is outdated, so you should probably update your container. All right.
So this is more or less all there is to adding new modules to nf-core/modules. If you have any questions, please go ahead and ask them now, or any time in the Slack channel; there's always someone there to answer them. Thanks.