Hi, guys. Welcome back to Daniel's Tech. We're on YouTube, Medium, and DanielRosal.tech. My name is Daniel Rosal, and this is another video about backup. What I want to show today is a methodology, and "methodology" is maybe being a little kind here, but it's basically just a way of downloading Google Takeouts a little more reliably.

I've discussed Google Takeout in a previous video, the one about using an AWS EC2 instance. That was a little complicated, but what I showed there was how to fire up an EC2 instance in order to download your Google Takeout and then push it up with rclone, a command-line tool, something like rsync for cloud storage, that lets you sync to remote sources. What I did was log into Google on the EC2 instance through a virtual desktop, and once I was in the desktop I ran the takeout. As you could see in that video, when I tested the network metrics on the EC2 instance, the transfer speed was a lot quicker than what I could get on my own internet connection. So I took advantage of that fact: I downloaded the Google Takeout on EC2, set up rclone, and synced the download up to Backblaze B2 cloud storage, which gave me my whole 3-2-1 compliant backup. That was an interesting methodology, but as something you'd want to use all the time it's kind of a lot of hassle: you incur costs for the EC2 instance, you have to remember to terminate it, and it's a lot of work.

So here's what I'm doing now for backing up my Google data. I run through this process approximately once every few months, let's just say, because it is a bit of work, but I think it's work worth doing. Besides my desktop, the data I have floating around the internet lives in just a few places: my web hosting, my Google data (the major one), and my cloud storage. The cloud storage I can back up pretty easily and automatically, into Backblaze and down to my computer. The hosting I back up automatically with sync scripts. Google is the odd one out, because while there are tools for downloading your Gmail, your Drive and your contacts, there is nothing, even among the packages you can run in Synology DSM, that will incrementally pull in everything you get in a takeout. I've been looking for this, I've scoured the internet, and I have not found it. You can see here there are 51 products in my takeout. So there is a bit of a gap, and until that gap is filled, I keep doing this manually to make sure I'm getting everything out of Google.

Now, there's something important to say about takeouts, and it is this: when you run a Google Takeout, one of the products you'll typically tick is YouTube. What you get in a takeout for YouTube is the default brand account associated with the Gmail or G Suite address you're running the takeout from. In other words, if you have multiple YouTube brand accounts associated with that Gmail or G Suite address, you're not actually going to get them, and that applies to channels like my DSR Ghostwriting YouTube page.
This channel, for example, is a brand account associated with my Google account over here, and it will not be included in the takeout because it's not the default brand account for this G Suite address. So what I do in practice is, whenever I make a YouTube video for this account, I upload it straight away to the NAS. The NAS then syncs that to cloud storage, which gives me my 3-2-1 backup, and that's that. But obviously it would be preferable from my perspective if I didn't have to do any of that manual stuff and could just put a calendar entry in my diary to do a Google Takeout.

So what I do now is run the takeout. This will end up on my NAS, and I could actually do the download directly on the NAS, but I'm going to do it on my desktop. Then I use Hyper Backup, which is one of the DSM tools, and with Hyper Backup I just copy the whole NAS onto a flash drive. It could also be a hard drive in an enclosure, but I use a flash drive. And what I do then is put that in my car. You heard that right: in the boot of the car, because I can't currently think of a better offsite. It's technically offsite in the sense that if there were a flood, or something happened to the apartment, the car would hopefully be okay. It's not an ideal offsite, but part of my interest in backups is just experimenting; these aren't necessarily the perfect or ideal ways of doing things. If I had a better internet connection available, I would probably just mirror everything up to the cloud every time, even though that would involve uploading 50 gigs, but on my current internet connection of two megabits per second that's just not practicable. So this is the way I'm doing it now.

Okay, so you're going to get this email when your Google Takeout is ready, and you have your nice little download links. Now, here comes the core of the video, and yes, I really did make a video for this. Right-click a link, click "Copy link address", and then open a multi-threaded download manager. It doesn't matter if you're not on Linux; these tools exist for Windows, Mac and Linux, so you'll find one. I'm just more familiar with Linux, which is what I'm using here. Paste the URL from your clipboard and click OK. And that did not succeed, so I'm going to try again over here and copy from the clipboard once more. Okay, it looks like my download links expired, so I'm going to go and generate them again. Now you can see it's got two or three hours left. Let me explain the rationale for this: if you just download from your browser, you don't get the resume feature you'd have with, say, wget on the command line, and that's really what you want, because it's very possible that while you're downloading this 48-gigabyte archive your internet connection will drop at some point, and there's a high chance that will interrupt the download. That's why I do it this way. I'm going to grab this, and then what I actually do is put it on pause. I had signed out of my Gmail, which is why the links broke, so let's try this again, and hopefully this time we'll have better success.
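For anyone who would rather stay on the command line entirely, here is a rough sketch of the same idea using wget or aria2c, both of which can resume an interrupted download. The URL and output filename are placeholders rather than my actual links, and bear in mind that Takeout links are tied to your Google session and expire, so in practice you may need to stay signed in (or export your browser cookies) for this to work.

```bash
# Rough sketch only: TAKEOUT_URL and the output filename are placeholders.
# Takeout links are session-bound and expire, so this may only work while
# you remain signed in to the Google account that generated them.
TAKEOUT_URL="https://takeout.google.com/example-download-link"

# Option 1: wget with -c, so an interrupted download can be resumed
wget -c -O takeout-001.tgz "$TAKEOUT_URL"

# Option 2: aria2c, which downloads over several connections (-x/-s)
# and can also resume a partial file (-c)
aria2c -c -x 8 -s 8 -o takeout-001.tgz "$TAKEOUT_URL"
```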
So I've copied the link again, and there we go. That's it; do this with each of your downloads, and you can see the time left. Now what I do, having initiated that, is pause it, because we don't need to download it twice, and just to confirm, that archive is still downloading, still building, even though I paused the download in the browser itself. So that's my methodology, essentially, for a good way of downloading your heavy Google data, and this is a 48-gigabyte archive, so relatively heavy. After this completes, I will move it onto the NAS. In an ideal world I would then replicate that to S3 or B2 or some other cloud storage; in the not-ideal world in which I live, what I will do is, the next time I back up the NAS using Hyper Backup, get this onto the NAS and copy it to my wonderful offsite backup location, otherwise known as the boot of a car with a hard drive in a box. Hopefully you have a better situation: it could be a friend's house, it could be the cloud, and ideally it would be the cloud. For anyone who does have the bandwidth to go the cloud route, I'll leave a rough rclone sketch below. But that is how it's done. So I hope this video was useful. If you have any feedback or want to get in touch for any other reason whatsoever, please feel free to reach out to me at Daniel Rosel, 2000rosel.co.io. Wishing everybody a wonderful day.
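A quick addendum on that cloud route: this is a minimal sketch, not the exact commands from the earlier EC2 video, of what replicating the takeout archives to Backblaze B2 with rclone could look like. The remote name "b2-backup", the bucket name and the local path are all placeholders you would swap for your own.

```bash
# Minimal sketch; remote name, bucket and local path are placeholders.

# One-time setup: walk through rclone's interactive config and create a
# Backblaze B2 remote (you'll be asked for your account ID and app key).
rclone config

# Push the downloaded takeout archives up to the B2 bucket, with a
# progress display and a few parallel transfers.
rclone copy /volume1/backups/takeout-2021 b2-backup:my-takeout-bucket/takeout-2021 \
    --progress --transfers 4
```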