Welcome to this quick demo of how to upload your data from Galaxy to a number of different external data stores. An issue people often face once they have finished their first couple of real-world data analyses with Galaxy is how to retrieve key results files efficiently, for further use or storage outside of Galaxy. If you have been using collections in your analysis, then even the download of a batch of results files is not too complicated. You simply go inside your collection of interest, as I'm showing here for a collection of viral consensus FASTA sequences. You click on "Download Collection" and you get all elements of the collection in the form of a zip file. Easy. There are two issues with this method, though. First of all, getting the data out of Galaxy may only be the first step: you may want to share it or upload it onto some other platform anyway. In that case, having to download an entire collection of data to your own computer just to re-upload it from there to somewhere else may not be your preferred solution, especially if, like many of us, you are working from home with a relatively slow internet connection. In that case, you might much rather upload to the other platform from Galaxy directly. The second issue arises if the data you would like to download is not contained in just one collection. Let's imagine you have set up a SARS-CoV-2 genome analysis bot, like the one running on usegalaxy.eu. This bot may produce many, many similar histories, one for each batch of sequencing data it analyzes. Here's an example of the situation on usegalaxy.eu, which demonstrates the issue for consensus-building histories. Each of these histories contains one collection of viral FASTA sequences, but how can we get at them conveniently? In both of these situations, connecting your Galaxy user account to a data store is the answer. 
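As a side note, the "Download Collection" button in the UI is backed by the Galaxy API, so the same zip archive can also be fetched from a script. Here is a minimal sketch in Python, assuming the `/api/dataset_collections/<id>/download` endpoint of recent Galaxy releases; the server URL and collection ID below are hypothetical placeholders:

```python
def collection_download_url(galaxy_url: str, hdca_id: str) -> str:
    """Build the archive download URL for a history dataset collection."""
    return f"{galaxy_url.rstrip('/')}/api/dataset_collections/{hdca_id}/download"


# Hypothetical usage -- fetch the zip, authenticating with a Galaxy API key:
url = collection_download_url("https://usegalaxy.eu", "0123456789abcdef")
# import requests
# response = requests.get(url, headers={"x-api-key": MY_API_KEY}, stream=True)
```

The collection ID can be read from the collection's page in the Galaxy UI or listed via the history contents API.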
To set this up, go to User Preferences, then Manage Information, and on that page scroll down about halfway. You will find options to connect Galaxy to either your Dropbox account or, alternatively, to your personal NextCloud or ownCloud account. Of course, you will have to provide credentials of some sort that Galaxy can then use to authenticate itself against the other platform. In my case, my department at the University of Freiburg operates its own NextCloud server, on which every member of the department gets an account, so I'm going to use this in this demo. I'm copy-pasting my NextCloud server's domain in here. Then I would complete this with the server path, the username and the password, which I will do outside of this video. Then I scroll down, press Save, and that information gets stored with my account. Once configured, I can use this connection in two directions. I can upload data from the configured data store, which is done via the regular upload manager. So I open the upload manager here and go to "Choose remote files". Now, in this dialogue, I'm seeing NextCloud/ownCloud offered, and if I click that, and if I entered my credentials correctly, I get access to all my files and folders on that NextCloud server. For example, I'm seeing here a folder called "Galaxy Workshop demo", which I created on this NextCloud instance before the workshop. I can go inside there and then, disappointingly, I find there's nothing in here. But if there was data, I could now upload it into Galaxy. What I can also do, however, is use a dedicated tool to export data to any folder on the configured data store. That tool can be found in the "Send Data" section of the tool panel, and it's called "Export datasets to different data storages". Selecting this tool brings up a tool interface like for any regular tool. You can now select the datasets you'd like to export, which can be either regular datasets or collections. 
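Under the hood, NextCloud and ownCloud expose files over WebDAV, which is what Galaxy talks to with the credentials you just saved. As a rough illustration of what the "server path" setting corresponds to (the server name and folder below are hypothetical), NextCloud's documented WebDAV layout puts a user's files under `remote.php/dav/files/<username>/`:

```python
def webdav_file_url(server: str, username: str, path: str) -> str:
    """NextCloud/ownCloud WebDAV URL for a path inside a user's files."""
    return f"{server.rstrip('/')}/remote.php/dav/files/{username}/{path.lstrip('/')}"


# Hypothetical server and folder:
url = webdav_file_url("https://cloud.example.org", "alice", "Galaxy Workshop demo")
# A PROPFIND request against this URL, sent with the account credentials,
# would list the folder's contents, e.g.:
# curl -u alice:PASSWORD -X PROPFIND "<url>"
```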
So I will again choose here my collection of FASTA files from before, this one. Then, under the "Directory URI" option, you can reach a dialogue that again lets you navigate any of the data stores configured for your account, like the NextCloud/ownCloud account in my case. I can now navigate, using these small arrows, to my demo folder on this NextCloud instance and select it as the place to export to. So let's test this; I press Execute now. While this job is running, let's head over to my NextCloud web interface, where I should see those datasets come in. So I'm inside my "Galaxy Workshop demo" folder and I can just refresh the page. And, in fact, all these FASTA files are coming in. A few of them have been transferred already; more will probably come in soon. Here we go: we have all 18 files in my folder now, and the corresponding Galaxy job has turned green, marked as completed. I can also read in the logs that, in total, 18 files have been exported to my NextCloud server. Now, one particularly powerful feature of this way of exporting data is that it's tool-based and, as such, can be used in workflows. Going back to that other example, with the many, many consensus histories produced in the SARS-CoV-2 bot account on usegalaxy.eu: we solved the issue of getting access to all key results files with an additional export workflow. That workflow gets launched, again through our bot scripts, whenever a consensus workflow run finishes, and it uses this "Export datasets to different data storages" tool that I've shown you before to send not only the collection of FASTA files, but also the collection of BAM files of aligned reads and the collection of variant calls in VCF format that both served as input to the consensus workflow. 
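Because the export is a regular tool run, it can also be triggered from a script, which is essentially what the bot does. A minimal sketch using BioBlend's `run_tool`; note that the tool ID, the parameter names and the IDs below are hypothetical placeholders, so inspect the real tool's form (or its API representation) for the actual names before using this:

```python
def export_tool_inputs(hdca_id: str, directory_uri: str) -> dict:
    """Assemble inputs for an export-to-remote tool run (names hypothetical)."""
    return {
        "directory_uri": directory_uri,                    # assumed parameter name
        "datasets_0|dataset": {"src": "hdca", "id": hdca_id},  # export a collection
    }


inputs = export_tool_inputs("0123abcd", "gxfiles://nextcloud/Galaxy Workshop demo")
# from bioblend.galaxy import GalaxyInstance
# gi = GalaxyInstance("https://usegalaxy.eu", key=MY_API_KEY)
# gi.tools.run_tool(history_id, "hypothetical_export_tool_id", inputs)
```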
So it exports all three collections of key results files to an FTP server hosted at the Barcelona Supercomputing Center, and from there all these key results files are made available publicly, via any FTP client. I'm going to show you these data export histories now. Here you see one of these data export histories, created by the Viral Beacon export workflow, and when we go inside it, you can see that it actually has three export logs in it: it exported the FASTA consensus sequences, it exported the BAM files, and it exported the VCFs. You can see that in this case it exported 339 FASTA files, 339 VCF files and, apparently, twice that number of BAM files. This is because of the "export Galaxy-generated indices" option of the tool. Let me just bring up the tool run again here. So here, in that workflow, I set the option to also export Galaxy-generated index files for supported data types to yes. That means that not only the BAM files in that collection of aligned reads, but also their BAM indices, get exported to the Spanish FTP server. This saves compute time there, because the indices do not have to be recalculated on the Spanish side. So this is how powerful this tool is. It's a very recent, but really great addition to the Galaxy toolbox.