Hey guys, welcome back to Daniel's Tech World on YouTube. In this video I want to demonstrate my current Ubuntu desktop backup approach. It's really intended for anyone using Ubuntu Linux who wants to take backups; I'm just demonstrating my current approach. I'm going to go through the various tools and the rationales, and I'll try to do so relatively quickly, because this is not the most engrossing subject; I'll be the first to agree with that. Although, actually, the more you learn about backups, the more interesting things there are to learn about them.

This documentation is on GitHub. My username is danielrosehill (I always think I might change that, so just search for Daniel Rosehill on GitHub; there are only, I think, two of us in the world, and I believe the other person is not involved in technology in any way). On my GitHub account, look up the Master Backup Strategy repository that I currently have pinned to my home page, which has been forked once, which is a nice milestone for me. Basically, I've documented the strategy in a markdown file. Also in the repo I have a couple of scripts that I use in my own backup operations, taking local files. You can see what I'm doing here: I have a B2 bucket, my Linux backups, and I'm just copying up the AutoKey config, the Openbox config, LibreOffice stuff; things I don't want encased within a whole system backup, because I might want to look at them on their own. That's the rationale there. And I have these little graphics here, plus the PDF if anyone wants it, of v1.3. I called it v1.2 initially, then I did some updates while I was making this infographic.
Hence it's now v1.3. Basically, this is what I currently use for backups. I called it the master backup strategy because it describes both my local backup approach and my cloud backup approach. I want to back up as many different things as possible: everything on my desktop, everything in the cloud, including what I call minor cloud backups. That's my own terminology; I've never heard anyone else describe major and minor clouds. By major cloud I mean stuff like Google Drive, Dropbox, pCloud and SaaS; minor cloud is stuff like Reddit, Quora, Twitter. Now, I'm really vigilant about cloud backups, and I want to make sure everything gets captured, including Reddit and Quora, ideally every three months. Unfortunately, Reddit and Quora aren't covered here, because it's a manual process: you need to literally write in to their team and say, can you please give me my data export. That's actually only available as a result of, I believe, the GDPR, which grants data subjects the right to a copy of their own data. They didn't offer it before then, and I think it's fair to say that as a result a lot of services have started offering it that never did before.

There's a cool tool called Skyvia, which is just about the closest I've come to seeing what I'm looking for: something that would take these services and back them up automatically. You can see they cover SugarCRM, Zoho, Dynamics, your major CRMs, and they basically just give you a bunch of connectors. You can see backup is a major use case; under Solutions, let's go for data replication. This schematic explains it quite nicely.
So you have all your various things here in the cloud, your Salesforce, your HubSpot, and then there's this connector, basically, mirroring that to wherever you want it mirrored, such as a data warehouse, for example. This is really a mature enterprise product intended for big scale. Something a bit more user-friendly would be MultCloud. Again, their library is not brilliant: it has stuff like Evernote and then doesn't have other stuff. It's inconsistent, but I use it because the stuff I use happens to be covered, although I couldn't get B2 to work, unfortunately. So that's what I do in the cloud. As I say, you can use MultCloud, and you can use this nice tool called Flexify.IO, which really moves stuff at lightning speed across the wire, direct cloud to cloud.

Another approach, which I'll just show quickly, is spinning up your own EC2 instance on AWS for a pCloud-to-B2 backup. I should be able to find my own stuff here, because not many people are probably using both pCloud and B2 and looking for a way to back one up to the other. That was a previous video I did, and it basically involves firing up EC2 and installing rclone.
That's a CLI tool for cloud-to-cloud backups, based on rsync I believe, and then on the EC2 instance you directly run that backup. The rationale there is that you're using the upload speed available directly in the cloud, as opposed to your own internet connection. So that's my cloud approach.

The overall strategy I have here is my graphic from my article on, I believe it was, linuxhint.com. My overall goal is to be compliant with 3-2-1, so let me explain what the 3-2-1 approach to backups is. If you don't want to just trust me on this, a lot of really good stuff has been written about 3-2-1; Carbonite, for example, have this resource. Let me read it, and then I'll show you my own thing. This is why I call 3-2-1 a misnomer: you can see here that the three includes the original copy and at least two backups. So on my schematic here, let's say for the sake of argument that we're using an SSD as the primary Ubuntu desktop drive we're trying to protect. It could be an HDD, a hard drive, or it could be an NVMe, but in any event, to get to the first number, three, we need to create two backups. It's actually the original source plus two backups, as Carbonite make clear, not three backup copies.

The next part of the 3-2-1 rule is that the backup data should be on two different storage types. So we're going to have two backups, and those backups cannot be on the same storage type. Let's say, for example, we were going to back up our SSD here onto the SSD itself. We could: if I bring over Timeshift, there's nothing, I believe, preventing us if we go into the settings. What I'm using here, sda, is my original.
Yep. So I could change it over to sda2, which is the partition my Ubuntu desktop is on. There's nothing actually stopping me from creating backups onto the very device I'm protecting. The problem in that situation is that they're on the same storage medium: if that drive fails, for instance, the backup is going to go with it. Therefore the backups should be on two different storage media.

Generally you can achieve this, and I'm achieving it in my Ubuntu backups, by having, and let me just bring over this settings page, because it does a nice job of showing what I have in place: sda is a 480 gigabyte drive, an SSD, and that's where my operating system lives. You can see I've actually gone through a relatively small amount of it; I just recently swapped out a 240 gig drive for a 480 gig one, just to give myself more room.

Tangentially, I would say the thing with Linux backups is that the longer your system is stable, the more you stand to lose. Your modifications tend to accrue, you tend to add more packages, you might add unattended upgrades, for example, which I always do. This is actually how I got into backups. I've been using Ubuntu for about 10 years, and every second or third year something would break the system. The culprit, as I said at the start of the video, would typically be a non-LTS release; first piece of advice, stick with LTS-to-LTS upgrades, they're more stable. Typically a non-LTS release would break the system, ruin the package manager, ruin NVIDIA, and at that point it was really just not worth spending the two or three days fixing it. You can do that once or twice.
You can reinstall your system two or three times, but if you do, you will get to a point where you say: never is this ever happening to me again. That's why I decided to get really serious about making sure good backups are taken. It's really paid dividends, because I now have a stable system; but in order to keep it stable, I need to keep on top of this backup approach. That's why I'm going to spend the next 20 to 30 minutes explaining it in this video, and I've certainly already spent a lot more than that putting together the documentation on GitHub for anybody interested. But let's get back to what I have.

So this is my primary system here. Then sdb is where I keep my Clonezilla images. That's 240 gigs, and I don't need to change anything there, because I take Clonezilla images less regularly: it's not automated, and therefore it's more work. sdc is another drive that I have formatted to ext4, and this is where I put my Timeshift backups. Timeshift creates snapshots, basically restore points, something like what you have in Windows with Windows restore points. Similar to that; I'm not familiar in depth with the Windows tool, just because I don't use Windows that frequently, but it's a similar idea.
So you can see this is 480 gigs, and this is 480 gigs, and I'm actually not using a whole bunch of that space, just as I'm not using a whole bunch of space on the primary. Actually, I'm using roughly the same amount, it seems; I think that's a coincidence, because these are snapshots, there are three of them, and this is one operating system.

So for the sake of my backups I'm basically using two dedicated disks, one for Clonezilla and one for Timeshift. Why do I use two? Essentially because Timeshift is a great tool, but I don't trust it 100%. If you're restoring from Timeshift, you can do it directly from the GUI (let me just exit out of this): you click on Restore and you can roll back to a snapshot. But obviously, if I were to go through that process now, it would be restoring a booted, live system, right? I've booted into the very system I'd be looking to restore. I don't really trust that 100%. You can also use the CLI: boot your Ubuntu system, get past the GRUB menu, and run the same operation from the command-line interface before you've fully booted into the system, so it's not live. I trust that a bit more.
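For reference, that CLI path looks roughly like this. The snapshot name below is a made-up example, and since a real restore needs root and an installed Timeshift, this sketch only assembles and prints the commands rather than running them:

```shell
#!/bin/sh
# Sketch of the command-line restore path. The snapshot name is a
# hypothetical example; list your own snapshots first, then pass the
# one you want to roll back to.
LIST_CMD="sudo timeshift --list"
RESTORE_CMD="sudo timeshift --restore --snapshot 2021-06-05_00-00-00"

# Printed rather than executed, since restoring requires root and a
# real Timeshift setup:
printf '%s\n%s\n' "$LIST_CMD" "$RESTORE_CMD"
```

`man timeshift` documents the remaining flags, such as `--target` for restoring onto a different device.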
However, I trust Clonezilla more than I trust Timeshift; Clonezilla is a very low-level backup method, and that's why I use two. But something else you could do: you don't need two different backup programs, each with its own drive, the way I back up my main drive, sda, onto sdb and sdc. That actually goes beyond 3-2-1, because it creates two on-site backups plus the one off-site backup, so three copies of my data. You could instead do RAID 1 (RAID is a redundant array of independent disks): just have another drive, an SSD, NVMe or, as I said, HDD, it doesn't matter, of the same size, and use RAID to constantly replicate locally.

I find my approach advantageous over a RAID configuration because I actually get to keep three snapshots here; with RAID you only have the latest. And there's a point that's sometimes made that you shouldn't mistake redundancy for backups. RAID creates terrific redundancy. The whole world of backups splits into backups and disaster recovery; if you look up the difference between them: backups refers to creating the backup copies ahead of an anticipated disaster, like a hard drive failing, while disaster recovery encompasses the full strategy for responding to a disaster event and putting the backups into action. So that's concerning the restore side. Another interesting allied field, I would say, is business continuity management.

So if you add redundancy, if you're creating another RAID disk, you're giving yourself terrific redundancy. In other words, let's go back to our schematic here, sort of schematic. We could put in another drive: let's say we created an sde, or say sdd wasn't Windows.
It was another component in our backup arsenal. We could be running RAID between sda and sde, and that would give us redundancy: if sda were to fail without warning, completely fail, we could immediately (and that's business continuity) swap over to sde. But sde would essentially be our latest copy from just before the failure occurred on sda, so we might find there was stuff on sde that wasn't to our liking. We wouldn't really have an option there: in order to actually go back in time, we need to restore from a backup point. So in that scenario our sdb would still be useful for us, because we could restore from sdb to sde, or from sdc to sde. That, in a nutshell, is why redundancy and backups are not the same thing, and why, optimally, both are required.

One thing to mention here: you can see these are all disks sitting within my computer. Let's go back to my diagram and just understand 3-2-1 for a second. We've covered this part: we need the original plus two. Different storage media, actually, we haven't covered, so let me explain it. Basically, as I said, if we had our primary disk backed up onto itself and that disk failed, we'd have no backup. So what else could we do? We could have a NAS, network-attached storage, like a Synology (and Synology have just sent me a NAS to play around with, so that's kind of cool). If we had our primary backed up to the NAS, that would be one on-site backup on a different storage medium.
So that's 3-2-1 compliant, and if our primary failed, we would be able to restore onto new storage media from our Synology. Or it could be a hard drive that we connect in an enclosure, like a Barracuda, for example, or one of these Western Digital external USB SSDs; that achieves about the same thing as keeping a hard drive in an enclosure.

Now, what's the difference between this and a NAS? The difference is that, let's say, between uses we put this in our cabinet, which means it's not connected to electricity. When we look at the reasons your on-site backups might be vulnerable, we're looking at a few catastrophic events. Theft would be one: everything in your apartment or workplace is stolen. We could be looking at fire: the office goes up in flames, or your home, God forbid, and everything is destroyed. In those cases, having our backup on something like this won't actually be of any use. But if it's a power-related event, for instance a lightning strike or a massive power surge that blows out everything in our home office, and our NAS and computer are both connected on the same circuit, there's a good chance the NAS will be fried alongside the SSD. So there it helps to take our backup onto something not connected to the power.

Or we could do another on-site backup. If it's a NAS we're backing up to on-site, we have the advantage that the NAS is constantly online, and we could therefore automate the on-site backup. We could just do a simple cron job that uses rsync to incrementally back up our SSD onto the NAS at midnight every night, say.
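The cron job just described could look like this. The mount point and source path are assumptions for illustration, with the Synology share taken to be mounted at `/mnt/nas`:

```shell
# Crontab entry (edit with `crontab -e`): at midnight every night, rsync
# the home directory incrementally to a NAS share mounted at /mnt/nas.
# Paths are illustrative. --delete mirrors deletions to the NAS copy;
# omit it if you want the NAS to keep files removed locally.
0 0 * * * rsync -a --delete /home/ /mnt/nas/backups/home/
```

Because rsync only transfers changed files, the nightly run after the first one is usually quick.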
I mean, let's say we keep the desktop running 24/7. Obviously, if we're doing something like this offline drive and we're not keeping it connected to a network device, we're not going to be able to avail of that automation. So that's a use case for doing both: an automatic backup onto network-attached storage, and also this offline drive. That would again go beyond 3-2-1: we'd have two on-site backups and one off-site backup (because we're going to do that next), which gives us our primary data source plus three copies.

And don't forget as well, jumping back to the cloud for a second: look at the final part of my diagram, where I describe doing a pull from B2 onto our local server. If you back up from one cloud, Google Drive, to another cloud, B2, that by itself is not 3-2-1 compliant: you've only created one copy of your primary data source, not two. Now, they are on different storage media, so that's great; this is in Google Drive's data center, this is in B2's data center, and realistically that's fine. But you might want to pull it down and keep another copy on on-site storage, like the NAS we mentioned. The reason is that it's typically easier to restore local from local, and typically easier to restore cloud from cloud.

Why is that? In the scenario where we suffer SSD failure here and we need to restore from the NAS, it's going to be a lot quicker if we just put in a new SSD. By the way, there's a use case for always having a spare SSD on file, unused, formatted however you want, ext4 say, for this very reason: our disk fails, we stick in the new SSD, attach it to the motherboard, run Clonezilla, quickly restore, and bingo.
You're back in action. Pretty good in terms of business continuity, almost as good as using RAID. It would be slower if we had to restore onto that new SSD from an off-site backup, namely from CloudBerry, which does have a restore capability (I'm going to talk about CloudBerry shortly). So it's quicker to restore local from local and, for the same reason, quicker to restore cloud from cloud. If we lost our data in Dropbox, it would be quicker to upload it to another Dropbox account over the wire from another cloud than to do so from our desktop, or from a NAS pushing it up. Why? It's the same reason I did my pCloud-to-B2 backup on an EC2 instance: it's much, much quicker to move data within the cloud. I don't want to run a speed test here, because it would show my IP address, but I have an uplink of about three megabits per second, so it would take an eternity.

All right, I don't want to make this video a marathon; we're already at 22 minutes, so I'm going to try to keep this within the 30-minute mark. So, what do I do? One: Timeshift. If I can get Timeshift back up here: this is for local backups. I back up onto a disk the same size as my main one, and I keep one monthly, two weeklies and one daily. (I don't know why I don't have four snapshots showing here.) Timeshift is good. It's my first protocol if something goes wrong.
It means that I can quickly (let me just jump out of this again) restore from any of my restore points. Today is the 12th of June, and I have one from a few hours ago; that's the one I would use if I installed some package that broke the package manager, or did something that just created a mess, and I didn't want to spend the next two days getting out of that mess and wanted to get on with my life. If I had installed a package a few days ago that I could trace back to some series of events that resulted in my system being bricked, I would go for that one. And further back in time, I could go to this backup from the 5th of June, about a week ago. As I said, I think I'm missing a snapshot here. They're pretty light. I would recommend having a backup drive the same size as the drive you're backing up; storage is pretty cheap. You could also do this onto network-attached storage; you could have a NAS, a Synology, as I said.

Timeshift is really just a front end over rsync, which is an incremental backup command-line tool, creating a few snapshots so that you don't have to bother writing bash scripts or running rsync with the correct flags. It does all that messy work for you.
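To make that "messy work" concrete: the snapshot trick underneath is hard links, where each snapshot directory shares unchanged files with the original. A toy sketch in plain shell, not Timeshift itself:

```shell
#!/bin/sh
# Toy version of the rsync/hard-link snapshot idea: a "snapshot" is a
# directory of hard links, so unchanged files take almost no extra space.
set -e
workdir=$(mktemp -d)
cd "$workdir"
mkdir -p data snapshot-1
echo "version 1" > data/notes.txt

# Take the "snapshot": a hard link to the same inode, not a copy.
ln data/notes.txt snapshot-1/notes.txt

# Replace the live file the way rsync does (write new, rename over);
# this unlinks the old name but leaves the snapshot's link intact.
echo "version 2" > data/notes.txt.new
mv data/notes.txt.new data/notes.txt

cat snapshot-1/notes.txt   # prints "version 1": the snapshot survived
```

This also hints at why a Linux-formatted destination matters: the scheme relies on Unix-style hard links and permissions being preserved on the backup drive.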
One thing to point out, as it says here: devices with Windows file systems, NTFS and FAT, are not supported, so you need ext4 or another Linux-formatted file system to run it. You can also exclude and include things. I've actually forced it to include the contents of my home user directory, which it excludes by default; I've added that folder, so it's basically taking a full disk backup. One caveat: within my user directory I actually keep some network-attached mounts, so if you have something like Google Drive mounted, you might want to exclude that from within the include of your home user directory.

So that's Timeshift in a nutshell, and it's used to create my first on-site backup. Now, the limitation, as I said: you can restore from this interface, or you can restore from the CLI. You boot your computer, get a shell, a bash prompt, and type timeshift --restore (that's one of the commands), or man timeshift to call up the manual page. It will tell you which snapshots you have, you tell it which snapshot you want to restore to, you hit yes, and it starts the restore process. So that's my first protocol, and it's an incremental backup.

The second: I mentioned I use Clonezilla. Clonezilla is a very, very low-level tool; clonezilla.org, I believe. Yes. You can just download Clonezilla and put it onto a live USB. It's a very low-level tool and, unlike our friend Timeshift, it doesn't have that limitation: you can see many file systems are supported, ext2/3/4 as well as the FAT file systems and NTFS.
So it's actually cross-platform. You don't need to worry there; you could actually use this to back up a Windows system as well (let me just get rid of this advertising here). It also supports LVM2, and they've got some screenshots here. Basically, this is what I really expect to work in the event that, for some reason, my incremental Timeshift restores don't do the job. I have tested this. If you really, really want to knock yourself out (and remember we looked at backups versus disaster recovery versus business continuity; this is really part of disaster recovery) you should ideally be testing restorability. Clonezilla, when you run it, has an option to test the restorability of the backup it has just created. But if you want to be extra sure, I would actually go as far as recommending that you stick in a test drive, try to use Clonezilla to restore onto the test drive, boot into it, and if that replicates the functionality of the primary disk you've just backed up, then you know the backup image is restorable, that it's good. I just trust it, because it's my second tier and I don't have the patience; you could say that's really a lot of work, but it is the optimal thing.

So Clonezilla is something I do manually, and as I said, I do it quarterly, and it's a full disk image, not incremental, which is another important distinction. I literally go in; I have another video about the process on my YouTube, and it takes probably about 20 minutes. In my diagram I've got it going to an external hard drive; once I get my new NAS hooked up, I'm probably going to start doing them onto the NAS. You can also encrypt those afterwards.
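On encrypting the images afterwards, one option (my own sketch, separate from Clonezilla's built-in encryption) is to tar the finished image directory and encrypt it symmetrically with gpg. The directory name, its contents, and the passphrase below are all placeholders so the example can run anywhere:

```shell
#!/bin/sh
# Sketch: bundle a finished Clonezilla image directory and encrypt it
# with a passphrase. The image directory is faked here for illustration;
# in reality it would be the directory Clonezilla wrote.
set -e
workdir=$(mktemp -d)
cd "$workdir"
mkdir 2021-06-12-img
echo "pretend partition image data" > 2021-06-12-img/sda2.img

# Bundle, then encrypt with symmetric AES256.
tar -czf 2021-06-12-img.tar.gz 2021-06-12-img
gpg --batch --yes --pinentry-mode loopback \
    --passphrase "change-me" --symmetric --cipher-algo AES256 \
    2021-06-12-img.tar.gz

ls 2021-06-12-img.tar.gz.gpg   # the encrypted bundle to store off the box
```

Decrypting later is the reverse: `gpg -d 2021-06-12-img.tar.gz.gpg | tar -xz`.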
I think you can also encrypt them as the process runs. And obviously, if you're backing up on site, ideally for security you should be encrypting those backups; so, for example, an encrypted file system under Timeshift, and then encrypted Clonezilla backups. This process I do quarterly because it's far less convenient; I don't need to worry about Timeshift, it works by itself, whereas Clonezilla is manual. I mean, you could be extra diligent and do it weekly. It just creates an image, and I use partition-to-image mode, so it doesn't back up the whole disk: it takes specifically the partition my Ubuntu Linux desktop lives on and creates a nice image of that on this drive, for the event that Timeshift was not able to successfully restore.

There's actually another important distinction here, and that's that with Timeshift, in order to restore in the event that my SSD failed, I would firstly need to stick in a new SSD, install a Linux desktop onto it, install the Timeshift program, and then tell it where my Timeshift snapshots are and pull them in. You could say that, because you need to go through a whole manual install, it's a little bit slower, whereas Clonezilla is just disk to disk.
So I could just put in a new SSD in the event that Timeshift did not work; and that route makes sense when the disk has completely failed, as opposed to a package having gone wrong and the package manager not working. If it were a simple software glitch, I would probably restore from Timeshift; if it were an actual hardware failure, I would use Clonezilla, because I could just directly run a restore with Clonezilla from, whatever it is, the NAS or the external drive, onto the primary, and I'd be back in action without needing to reinstall. That's my process for local backups, and as I said, this actually goes beyond 3-2-1, because I'm creating two backups locally and one in the cloud.

The final thing I'll demonstrate is what I'm using for cloud backup. This is a tool called CloudBerry Backup, their community edition I think, and it's cross-platform. I'm not going to say too much about this: you just create a new backup plan and tell it basically what to do. You can see all my backup storage here; I have all my B2 buckets mapped out, and really you just give it a job. You say: please back up my local system, the whole file system, or my user directory.

In practice, and this is maybe just the final thing I'll say in this video about Linux backups: what directories do you not want to back up? Here we go, we have a nice little resource. When you're backing up a Linux system, you do not want to include everything. There are things in a Linux system that are not going to be useful, and actually, I believe, they're going to make it impossible to restore. So they've given this general resource for Linux/Unix backups, and you can see some of them; you should not include lost+found, for example.
Uh, you should not include lost and found You should definitely not include this you might want to look for a more specific List of things you can exclude for your the specific Linux distribution you're running Uh, so I would in my Clydeberry plan. I would uh, if I'm doing a full disk one You can see my full disk backup plan here if I've basically told Clydeberry What's uh skip? I told it where where to put the data in b2 And that's pushing it up to the cloud now. That's a really really slow process It takes days and it's painful. It is incremental. So once you run it for the first time, um You can put it on a schedule as well or you can just run it manually each time Um Sorry, I'm going to distract them actually Let me just finish this video before I'm I'm going to actually run it again now Um, so that's the final thing the final chink in my armor If you like so to speak realistically in terms of restoring a brick Linux system, uh, there's really no need to go beyond timeshifting clone zilla But because of the three two one rule and just general best practice in which Credibly, there could be something like a uh, as I said again God forbid a fire in this place and both the nas And the uh, and this uh, the computer itself were, uh, you know destroyed And I really really want you know and even in that case to be honest because all my data is on the cloud It would just be quicker easier to install linux and uh, I think if anyone's apartment burned down Uh getting back their operating system just as it was and getting the programs the way they were and all the configurations is not going to be anybody's Most pressing objective And there's another approach you can take which I'm trying to do and get up as well And that's just basically uh, if I go to my a bunch of modifications, I've tried to just map out This is really documentation. I've just done for myself um, you know, I've kind of Listed out the packages. 
I did that a few days ago, just listing what packages I have, so that I can refer back to my own GitHub documents and say: this is the stuff I want to get back on my system. But to comply with best practice, I'm keeping a secure off-site backup up in B2.

In the catastrophic scenario I outlined, if I didn't do CloudBerry, I could do the following. Let's just say, for the sake of argument, I had a fantastic home internet connection with a one-gigabit-per-second upload speed. I could take a full local backup with Clonezilla, put that onto network-attached storage, a Synology device as I said, and then upload from the Synology to B2 in a reasonable amount of time, realistically a few hours, and push that up as an off-site backup. Then, in the event that this machine fails, and we weren't using something like RAID, I would install a new SSD. I'd thought of installing a basic Linux first, the same distro I wanted to restore, but actually, with Clonezilla, I wouldn't even need an operating system to exist yet, so I can skip installing Linux. I would download my Clonezilla backup from B2, from the cloud, onto a USB drive or back to my network-attached storage. I would then stick a Clonezilla live USB into this system, run Clonezilla, and tell it to restore from, let's say a NAS was in the picture, my full system backup on the NAS onto the new SSD in the Linux desktop. It would run its process in about 10 minutes, and then I would simply boot back into my machine and have a system replicating it from whenever I took that full system backup and pushed it off to the cloud. And in the event that I
was doing this process from a different location entirely, my apartment having been destroyed and me restoring from the cloud, I would follow this and be able to get my old desktop back in action just as it was when I was instituting the backup.

So, 36 minutes later, that is essentially my current approach, and as I said, it's a work in progress, v1.3. When I have a v1.4, or maybe I'll wait for the v2.0, I will make another video. Thank you for watching. Any questions or comments from anybody can go onto my website at danielrosehill.com, and I'm always happy to discuss backups or whatever. So thanks for watching, and check out new videos soon on Daniel's Tech World.