Guys, welcome back to Daniel's tech world on Medium and YouTube. We're here today for another one of our backup videos, wrapping up what has been a week of backups for me. Generally, once a year I go through all my backup approaches to make sure everything is as good as it can be, and ideally to make some improvements. Part of those improvements for me has been moving from S3 to Backblaze B2. I didn't do that only because B2 is a cheaper source of object storage (which it certainly is; their pricing is really, really good), but mostly because it's an object storage platform intended for backup. An S3 bucket obviously isn't intended specifically for backup; you can move things into Glacier, but getting them out of Glacier again is a little more difficult and has to be done programmatically. So, as I'm in the habit of backing up cloud to cloud (backing up my cloud services to another cloud), I thought putting things into Backblaze made more sense. I had taken a look at it before, but it was only after I discovered that FileZilla Pro, which I have on my other screen here, actually has a built-in B2 connector (they don't advertise it for some reason) that it worked perfectly for me. Once I had an easy GUI for uploading continuously to B2, I could, as I go along, pull things out of my pCloud and my Google Drive and move them up as soon as I'm finished with them, archiving folders to keep things running lean. So once I did that, I decided to move everything over, I pushed everything over, and now my difficulty is finding an integration between pCloud and B2.
Now, pCloud is, I think, a really good cloud storage provider; one of the smaller ones, but with a good set of features. The problem I encountered, if I can just find the GitHub page for rclone here... I opened an issue a few hours ago; it should be here. This is my issue: basically, when you try to authenticate with pCloud with two-factor authentication enabled (which, if you're putting sensitive documents in the cloud, you definitely should do), the integration doesn't let you through. You can see it says "two-factor authentication required" without any option to work around that. I didn't want to mess with disabling it. I know Google handles this problem by allowing users to create app passwords, app-specific passwords that other programs can use while the user stays protected by 2FA. pCloud doesn't have that, so I decided to go looking for another solution, and this is just a new pCloud account that I've set up here.
It's got 74 MB in it, just using my demo Gmail, and I was trying to find a solution; I think this should do the trick. These are the automatic folders that pCloud gives you: a PDF here, a few demonstration audio files, demonstration pictures, and so on. First, select everything; you can do this on your own pCloud with however much data you have in there. Now, if you click on "Download selected", this is what I was hoping would work the first time. If I pause this and go into my downloads manager in Chrome, you can see they've somehow masked the actual unique link: it just shows the API host, ends in "getzip", and there's really nothing more to see. So that's not it; I was hoping it would work, but it didn't. What does work, however, is selecting all your folders and clicking "Share download link". Call it, for example, "backup-080520", click "Generate", and this creates a regular sharing link; one of pCloud's features is that you can share a link and then manage it. I'm going to copy the link to the clipboard momentarily. And just to point out: if you click on "Shares" here in pCloud, as soon as you're finished with this process, even if you never share the link with anybody, then for the purpose of security
you should stop the share so that the link isn't sitting out there in the API. So I have this on my clipboard now; I'm going to take the public share link that was generated, and it brings me to a page that's completely unprotected: I can log out of pCloud and still see the contents of all the folders. Now, notice the two buttons at the top. If I click the download button here and go for "Download directly as zip" (you don't need to do anything with the prompt, just click "No thanks") and start the download, watch what happens. Another download has started, and this time it has a name that matches the archive. There's one more important difference: going into the download manager, instead of the hidden API link ending in "getzip" (I'm just going to clear the other ones), we actually have something a bit longer. Taking a closer look, it's the same API call as before for the public zip, but at the end of the call there is a unique code for the download. We can use this to download the archive with a simple wget command directly onto a remote machine, and then, using the faster connection of that remote machine, push it up to B2 using rclone. I'll now demonstrate how to do that. I've gone into Amazon Web Services (AWS) and fired up my EC2 instance. The beauty of EC2, as opposed to renting a VPS or any kind of cloud server that isn't strictly on-demand, is that with those you'd be incurring monthly fees; this use case fits perfectly with the EC2 model of on-demand compute. So I've fired up EC2, it's going to take a second or two for the instance to come up, and in another terminal over here
I'm initiating an SSH connection to the instance, and I'll start the video again when that connection has gone through. OK, so I've connected over SSH to my EC2 instance, and I can take a look around and see where I am. I have a 75 GB SSD on this instance, so it should be enough for us to comfortably complete the operation. I'm going to go to my clipboard, run a very basic wget, and paste in that link. It's obviously only 74 MB, so it's fairly light, and we can see that it's getting something of the right size; note, though, that it's going to save under the generic name "download" rather than as a .zip file. OK, the next step is to copy that file. You could also use mv, which actually makes more sense, but just in case something doesn't work, let's go for cp: cp the download file, and we'll call the copy backup-080520.zip. We can see we've successfully copied it. Now let's extract it and see if it is indeed the archive. Running the unzip command, you can see that all the demonstration files are contained, so it really is the archive. I'm going to go ahead and delete the extracted files, because we don't need them; I'm only going to push the zip up to B2.
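If you'd rather script the grab-rename-verify sequence than type it out each time, here's a minimal Python sketch of those steps. The URL shape and the "code" query parameter are assumptions inferred from what the browser's download manager shows, not documented pCloud API behaviour, and the filenames simply mirror the demo:

```python
import shutil
import zipfile
from urllib.parse import parse_qs, urlparse

def extract_download_code(share_zip_url: str) -> str:
    """Pull the unique download code out of a copied share-zip URL.

    Assumes the code travels as a 'code' query parameter, which is
    how it appears in the download manager; adjust if yours differs.
    """
    return parse_qs(urlparse(share_zip_url).query)["code"][0]

def rename_and_verify(downloaded: str, archive_name: str) -> list[str]:
    """Copy wget's generically named output file to a proper .zip
    name, confirm it really is a zip archive, CRC-check every
    member (the equivalent of `unzip -t`), and return the names."""
    shutil.copy(downloaded, archive_name)  # the `cp download backup-080520.zip` step
    if not zipfile.is_zipfile(archive_name):
        raise ValueError(f"{archive_name} is not a zip archive")
    with zipfile.ZipFile(archive_name) as zf:
        bad = zf.testzip()
        if bad is not None:
            raise ValueError(f"corrupt member: {bad}")
        return zf.namelist()

# Hypothetical URL; the host, path and code are placeholders.
url = "https://api.pcloud.com/getpubzip?code=AbC123xYz"
print(extract_download_code(url))  # -> AbC123xYz
```

To fetch the archive itself you would pass the full URL to wget as in the demo (or to urllib.request.urlretrieve from Python); shutil.move would match the mv variant mentioned above.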
So now it's time to use rclone. I've navigated into Backblaze in the web UI, which is something I always do just to make sure everything works as expected, and so that I don't need to remember the bucket and folder names: demo-bucket-dr is what I created for these demo videos, and I've just created a folder called pcloud there. Working in the terminal on the EC2 instance as before: rclone sync, and we're going to move backup-080520.zip up to b2:demo-bucket-dr/pcloud. I forgot to add my verbosity flag (-v), which I like to include as well. Let's see if this works. This looks like an error message, but it's actually fine; it just means it's about to kick in. And there we go. This was cloud to cloud; the upload speed on this EC2 instance is in the region of 800 megabits per second, so it's an incredibly fast wire-to-wire transfer. It may not show up immediately; it sometimes takes about three minutes for the web UI to refresh, so unfortunately I can't prove it on screen, but you can trust me: that was, what, 16.7 seconds to move a 74 MB archive up to a bucket in B2. My actual pCloud is somewhere in the region of three to four gigabytes, so I expect moving that up will take no more than a couple of minutes as well.
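For repeat runs, the rclone step can be scripted too. Here's a minimal sketch that builds (and can optionally run) the same command, assuming a B2 remote already configured in rclone under the name b2 and the demo bucket and folder names from above:

```python
import subprocess

def rclone_sync_cmd(local_path: str, bucket: str, folder: str) -> list[str]:
    """Build the rclone invocation used in the demo, with the -v
    verbosity flag included from the start this time. For a single
    file, `rclone copy` works the same way."""
    return ["rclone", "sync", "-v", local_path, f"b2:{bucket}/{folder}"]

cmd = rclone_sync_cmd("backup-080520.zip", "demo-bucket-dr", "pcloud")
print(" ".join(cmd))
# On a host with rclone installed and the b2 remote configured:
# subprocess.run(cmd, check=True)
```

This just composes the command; nothing is uploaded until you uncomment the subprocess.run line on a machine where the b2 remote exists.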
So that's the methodology for manually running an extract of pCloud. Just to recap for a second: you put your files together into a shareable download link and give it a name; then you initiate the download (this is a bit hacky, but it works) and grab the download URL; you can then use wget or curl to download it over the wire onto a remote server; and then, again wire-to-wire, cloud to cloud, you use the rclone command-line interface on your remote host (I used an EC2 instance, but you could use a VPS or something of that nature) to push it quickly, at far superior upload speeds to a home connection, up to a B2 bucket. Your pCloud is then safely archived in another cloud storage location. I hope this was useful. If you'd like to get in touch with feedback, or to discuss backups or anything of that nature, the URL is danielrosehill.co.il (or danielrosehill.com), and there should be a contact form there with a PGP key for the privacy-conscious. Thank you for watching, and until next time.