So I'll explain again in English, for the English audience, because this talk is going to be in English. Hi, my name is Vinit Talwar and I'm the co-founder of an online radio station called Firewood FM. Today I'm talking about WordPress backups. Regarding the other things you mentioned: we were the media partner of WordCamp Europe and covered some interviews there, and we're the media partner here as well and recorded interviews with some of the speakers yesterday. We'll hopefully be recording a couple more today. So let's begin the talk. My talk is about backups, disaster management and recovery. Now you might be thinking: backups? That's a very common thing, I know about backups, what's special about it? Well, you'll find out today. First of all, make sure you've tweeted with the event hashtag, because the organizers have put together a really amazing WordCamp. If you haven't done it yet, tweet anything, that you like the event or whatever it is, just tweet about them. Now imagine this. Let's say you're a company owner. You have your website running and you earn thousands of euros every day. One fine morning you wake up and find your website is not there: just a white screen, or a page full of errors, or a message from some hacker saying, okay, your website is hacked, pay this many bitcoins and you get it back. That's spooky, right? You don't want that. That's the importance of backups that I'd like to talk about here. These are the kinds of possibilities you're facing: if you lose your website, you lose your data. And if you run a mission-critical site, say an e-commerce site, you're going to lose revenue on a daily basis, or for the big sites on an hourly or even per-minute basis, sites like Amazon and so on.
Obviously, companies like that take care of all this stuff, and we're not talking about them. Your hard drive crashed, you got hacked, a malware attack, or even nature's attack: what if a disaster hits the data center itself? And then there's the heart attack. You don't want a heart attack, right? So let's proceed. Backups. Okay, I know about them, what's so special? You need a backup, that's what I'm trying to say, because any of these things could leave your website gone. So what should your backup strategy be? Those are the points I'll be covering in this talk, because you might be running a small blog, just writing about your food and posting it, or you might be running an enterprise or a small-business-level company. You all need a backup. Everyone needs a backup strategy: where do you need to back up, how do you need to back up, what do you need to back up, which data has priority, and how often do you need to back up. But again the question comes: what do I back up, then? In a WordPress site, what's most important for me? Honestly, everything. But to be precise, I'd say your document root, which is where all your code lives, and your database dump. Those are the two main things. But you may not want to keep the complete document root, because WordPress core is freely available and you can always merge it back in. All you really need is your config file, the wp-config.php, and your wp-content folder, because your themes, plugins and upload data all live there. And what is this WXR file? Some of you might be running multisite, with, let's say, 200 sites.
In that case you may want a solution where this WXR file is generated for each site. It comes from the WordPress exporter in your backend: if you go to Tools and then Export, there's a button for it. That XML file is your backup file, that's what I mean. And your server config, obviously: your nginx configuration, or Apache, or whatever you're running. But where do I back up? Well, you can simply download it and put it on your computer, or if you're running a local agency, put it on your NAS server. Google Drive, Dropbox, obviously, or your own server. In this talk I'll mostly be focusing on Amazon-based services, so let's say you're hosting on an EC2 server, and at the end we'll talk about automating your backups to S3 buckets. And AWS Glacier, you might be wondering what that is. Consider this: you're a big agency, you produce tons of data every day, your assets keep coming in, your database size keeps growing. But you don't need the data that's two years old; you need to archive it. For safe, cost-effective archival you can use services like AWS Glacier. It's nothing complicated: you connect your S3 bucket to the service and upload to it. But that's not the focus of this talk. The point is, what are the tools we're going to discuss here? If you're a small business agency, you can simply use the old-school way, which is FTP and phpMyAdmin. You'll be like, okay, that's what I know, what's next? You must start using Git. If you don't, start doing version control, that's very important, and if you don't know Git yet there are tons of tutorials on YouTube you should watch. And how many of you know about this cool thing called WP-CLI? Yeah, it's quite cool. They're doing really great work.
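For anyone who hasn't seen WP-CLI before, here is roughly what the backup-related commands look like; this is a sketch that assumes WP-CLI is installed and that you run the commands from inside a site's directory:

```shell
# Dump the whole database to a .sql file -- one command and it's done:
wp db export backup.sql

# Generate the WXR/XML export of your content (the Tools > Export flow):
wp export

# Migration works in the reverse direction too:
#   wp db import backup.sql
#   wp import export-file.xml --authors=create   # needs the Importer plugin
```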
You should use it on your system, because you can automate everything using WP-CLI, and that's what we're going to see with one example at the end. Then there are cron jobs: a cron job is, you can say, a script that runs at a certain point in time; you set it up on your system and configure it with a shell script. At the end I'll be discussing one example with a shell script, where an S3 bucket is the place your data goes automatically, using the S3 CLI in this setup. Okay, manual backups. You can take an FTP backup and a database dump. That's the basic thing, so what's next? With WP-CLI there are commands like `wp db export`: just one command and your database dump is out there. And `wp export` generates an XML file for you, so if you need to migrate your site from one server to another, you can do it this way. Okay, now you're like: nah, I'm a lazy person, I need something automated. Dude, give me something automated, I don't want all this effort. So there are some ways; the basic one is the plugin way. And yes, Git, definitely Git, you must use Git. But you don't want to host everything on Git. You don't want your uploads going directly into Git and inflating the repository; some providers, like Bitbucket or GitHub, have a limit of two GB if I'm not wrong. So you must not commit all the stuff, like the uploads folder, directly. Instead you can use some plugins to automate your backup: BackWPup, Duplicator, WP-DB-Backup, these are good. Plus the WordPress importer and exporter, so you can export the XML file. And Better Search Replace, which is useful when you're migrating your site from one place to another.
That plugin is for changing domain names and protocols: say you change xyz.com to example.com, and it will update all of that in your database. So that's the plugins way, the most basic way. Let's proceed further. Okay, now this is the thing: you want to take a backup, and what I'm discussing now is scripts. How many of you are confident with shell scripts? Okay, quite a few. And how many actually use them for real? Okay, rather fewer. And how many of you use WP-CLI on a real basis, like daily? One, two, three, five, six... quite few. Very few; that's bad. No problem, I'll try my best to keep you entertained so that you actually start using it. Okay. You have a folder in your document root, let's say under /var/www/html, where your websites live. I'm assuming you have only one website right now, say example.com. You're going to configure your script something like this. You cannot simply run `wp db export`: the binary lives at one location, and if you run this via a shell script or a cron job it won't find it, because cron doesn't get your $PATH. You have to give the full address of the binary. So you give the binary's address and run `wp db export`. This is just one script; I've compiled it all in a GitHub project that you'll see later. Then you make a simple zip out of the result, naming it with the date and month and wrapping whatever SQL file came out, because whenever you run `wp db export` it takes a simple SQL dump and generates a .sql file. You're doing nothing but wrapping that in a zip and packing it away. Next, you make a folder where you store your databases, named by year and month.
So you create the year-and-month folder and store your monthly backups inside it, with a `mkdir -p` so the directory is created at that location if it doesn't already exist. In the next line you move the file you generated into that folder. And at the end you remove the .sql file: you don't want it sitting in your document root, where it would be publicly accessible if the file permissions are wrong. That way you have a single-site backup. All you need to do is put the script there and run it, and make sure your scripts are executable, that's important. Let's proceed further. Okay, that's one example, but what if I have 100 websites? I don't want to run it 100 times, that's pretty bad. So what do I do? You simply wrap it in a for loop and your script is done. What happens is the complete script body goes inside the loop, except the first part, and it generates all the files and folders and makes the zip files for you. Okay, that's fine. But I still have to go to the server and run it, and that's not my thing. What's the next step? Synchronize to S3. How many of you know about S3 buckets, and how many actually use them? Very few. Okay: an S3 bucket is a storage solution from Amazon AWS where you can host your data and distribute it to your website. For example, if your site has a lot of media assets and they all come from your own server, that adds load time. Instead you can configure an S3 bucket for your site so the assets come from the bucket, which saves load on your own server. That's quite good for performance, and normally people pair it with a content delivery network, like CloudFront.
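The per-site script just described, together with the for-loop variant, might be sketched like this; the document root, WP-CLI path, and backup location are assumptions you'd adjust, and the WP-CLI steps are guarded so the sketch doesn't fail outright on a machine without it:

```shell
#!/bin/bash
# Nightly database backup for one site -- a sketch of the steps above.

WP=/usr/local/bin/wp                      # full path: cron won't know your $PATH
DOCROOT=/var/www/html/example.com         # assumed document root
STAMP=$(date +%Y-%m-%d)
MONTH_DIR="$HOME/backups/example.com/$(date +%Y-%m)"

mkdir -p "$MONTH_DIR"                     # year-month folder, created if missing

if [ -x "$WP" ]; then
  "$WP" db export "db-$STAMP.sql" --path="$DOCROOT"   # plain SQL dump
  zip "db-$STAMP.zip" "db-$STAMP.sql"                 # wrap the dump in a zip
  mv "db-$STAMP.zip" "$MONTH_DIR/"                    # file it under year-month
  rm "db-$STAMP.sql"                                  # never leave the raw dump around
fi

# For many sites, wrap the same body in a loop over each document root:
#   for DOCROOT in /var/www/html/*/; do
#     ...same steps, deriving the names from "$DOCROOT"...
#   done
```

Remember to `chmod +x` the file before running it or pointing cron at it.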
But that's another use; here, we're using S3 buckets to automate your backups. To do that, you first obviously need an Amazon Web Services account. There's a thing called the free tier in case you don't want to pay: you can try it out for one year for free, and I think you get five GB of storage, but I'm not sure about the numbers, you might need to check. They have a CLI called the AWS CLI. All you need to do is install it in whatever command line you use, maybe MinGW or Bash, and configure it; the commands are quite simple, like `aws configure`, and that's it. It will ask you for an access key ID and a secret access key, which you can generate in the AWS backend: there's an Identity and Access Management (IAM) system where you generate all of these. So how do we get S3 buckets into this flow? Once you've installed the AWS CLI there will be a binary at /usr/bin/aws, and the command is simply `aws s3 cp <current location> <remote location>`. Every S3 bucket has a unique URL, just like you have static.example.com or example.com on the web. In this script I assume your bucket is named "yourbackups", so your URL is s3://yourbackups; that's what I assumed. What I'm doing here is copying your backups directory and tossing it into the folder I've created called example.com on S3. So the command is simply s3 cp, your current folder, your destination folder, and --recursive. There are a few other commands too, like `aws s3 sync`, but that's another thing; for us, copy is fine. Okay, now you're like: what? What is that? It's too much. No, no need to worry about this.
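A sketch of that copy step; the bucket name "yourbackups" and both paths are assumptions, and the call is guarded so the sketch runs cleanly on a machine without the AWS CLI. It presumes `aws configure` has been run once with an access key ID and secret key from IAM:

```shell
#!/bin/bash
# Push the local backup folder to the S3 bucket (sketch; names are assumed).

AWS=/usr/bin/aws
SRC="$HOME/backups/example.com"
DEST="s3://yourbackups/example.com"

if [ -x "$AWS" ]; then
  "$AWS" s3 cp "$SRC" "$DEST" --recursive   # copy the whole tree, keeping layout
fi
```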
This looks terrible, but it's not. You'll get to know it, and the GitHub link is on the next slide so you can also review it; it's quite a handy tool, nothing special. All I did here is compile all three into one slide: a for loop, the line to export the database, and the lines generating and automating it. And what next? Before closing the for loop, I put in the S3 synchronize line. So what happens is: it exports your database file, makes the folder and the zip, moves the zip to a specific folder, removes the SQL file, and at the end synchronizes it to your S3 bucket. That's it. You can then test it by running it on your server: put it in a shell file, say example.sh, execute it, and check whether it's working, because you may need to change the document root address (it might be public_html or whatever yours is), and your S3 bucket name could be different, so you may need to make those minor changes. You also need to take care of file permissions: the script needs to be executable, with `chmod +x`, so that it at least runs. Okay, how about automating it? There's a thing called crontab. Do you use it, guys? One, two, three, four, okay, quite a few of you. So what I did is put the script inside a folder and simply run it from crontab: this script executes at three o'clock every night, makes the zip, puts it in S3 and synchronizes it. Completely automated. And there's a nice tool for this called crontab.guru.
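The crontab entry for that three-o'clock schedule looks like this (the script path is an assumption; `crontab -e` opens the table for editing):

```shell
# m  h   dom mon dow   command
  0  3   *   *   *     /home/you/scripts/backup.sh
# i.e. at minute 0 of hour 3, every day -- "0 3 * * *" on crontab.guru
```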
Many people have problems with what this "0 3 * * *" means, and you can find out on that website: you just paste the expression and it gives you an explanation. Pretty easy, pretty awesome, you can try it. And this is the GitHub link I was talking about; I've posted the information there, so have a look and try it out, and I hope it helps you. Now, the thing I mentioned about data archival. Let's say you're running 100 websites on your server and you want a backup strategy with daily backups; some people even want hourly backups. That creates tons of data, and your server's hard drive may be limited, say 50 GB or 100 GB. You don't want it to fill up. That's why the S3 backups are immediately useful: once you're synchronizing to S3, you can also set up a script to remove the old folders still sitting on your server. Okay, that covers the database. What about my entire data? For that, you can write a similar script to synchronize your document root to the S3 buckets as well. All you need to do is change the path. You don't even need to zip anything: just use `aws s3 sync` with your document root address and the path of your S3 bucket, and it syncs. You can wrap that in a shell script too and automate it with a cron job. Pretty easy, pretty awesome. Just have a look, and I'll put the slides at this link so you can review them; any questions, just let me know.
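That document-root sync can be sketched the same way; the paths and bucket name are assumptions, and the call is guarded for machines without the AWS CLI. `aws s3 sync` only uploads what changed, which is why no zipping is needed:

```shell
#!/bin/bash
# Sync the entire document root to S3 (sketch; names are assumed).

AWS=/usr/bin/aws
DOCROOT=/var/www/html/example.com
DEST="s3://yourbackups/example.com/docroot"

if [ -x "$AWS" ]; then
  "$AWS" s3 sync "$DOCROOT" "$DEST"   # incremental: only changed files go up
fi
```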
You can just write me an email, and we're also constantly looking for people to do interviews for us: for this project called WP Shoutout we do community interviews, which we did at earlier WordCamps, at WordCamp Europe last time, and now at this WordCamp too. So if you want to join in, just shoot me a mail and we'll be happy to have you. And let me know your backup strategy by tweeting at me, what's your plan, because backups are important. You don't want to die of a heart attack, right? So just let me know. All right, any questions? Yes, please.