Welcome to my talk. I'm going to talk about taking backups with Laravel. My name is Freek Van der Herten. I'm a partner and developer at a company called Spatie. Like probably many of you, I'm active on Twitter; my handle is freekmurze. And I have my own blog where I talk about Laravel development and modern PHP development in general. Together with these two awesome guys, I organize the PHP Antwerp user group. If you're ever in the vicinity of Antwerp and want to speak at our user group, contact us. We're happy with every speaker and with every level and length of talk. Now, my company has been around since 2003. We create websites, applications and web shops. And we're quite small: only four developers and one manager who guides us. And we specialize in Laravel development.

Now, before I give my talk on backups, I want to talk a little bit about open source software first. In my company we use a lot of open source software. We use Nginx, we use Laravel, we use Ubuntu: basically everything listed on the slide, and everything we see when we open up our composer.json or package.json. All those things are free to use. And without open source software, our company couldn't really exist. And I bet that many of you have your job thanks to the fact that open source software exists too. So because we use a lot of it, we try to give back, and we also create a lot of open source software. We currently have 90 packages listed on Packagist. Most of them are Laravel packages, but we've got quite a few framework-agnostic PHP ones as well. And recently we broke the two million download mark, and currently our packages are being downloaded at a rate of 300,000 a month, which is great: our work is being used. A little humble brag: this is the GitHub Awards site, which just sums up the number of stars that the GitHub repos of an organization have. And you see there's Laravel at the top with 50,000 stars, but we're at number six worldwide. So this is really great.
And we like the fact that people star our repos. Creating open source software has a lot of benefits for us. First and foremost, we learn a lot, both by creating the software and from the issues that our users post, because we see every issue as a chance to learn. Same with every PR that gets sent in: the benefit for us is that we can see another programmer's perspective and how they would approach the problem, and the benefit for the person that posts the PR is that they get a free code review. So it's a win for everybody; everybody is learning. Another benefit is that we're also forced to write documentation and tests. Sometimes in a client project we don't really have time for that, but with open source software you really must write documentation, because if you don't have documentation, nobody will use your software. And if you don't write tests, then it's a little bit scary to change the software. There's a commercial aspect to it as well: it shows the quality of our work. If you take a look at the source code, then I hope you'll conclude that we know our way around PHP and Laravel. And of course, we use those packages in our own projects as well. So if we open up our composer.json, a lot of our own things are there.

I'd like to take some time to very quickly introduce a few of those packages to you. Maybe they could be helpful in your next project. The first one is a Laravel-specific one, Laravel Media Library. In short, this one can associate files with Eloquent models. So if you have a news item model and you have a file, you can put it in a collection of images. On the second line of code, if you have an image in that collection, you can generate a URL to it so you can display it on your site. Now, the package can also convert images. It can generate thumbnails and such, and it can also give you URLs to those thumbnails.
And if you're working with big files, then you'll be happy to know that the package has support for external file systems. So it's very easy to handle a big file and just copy it over to S3. So that's Media Library. We've also created a dashboard. This is displayed on a TV hanging in our office. It uses the latest versions of Laravel and Vue, and it's easily extensible with your own tiles. I'm not going to explain it in depth here; that's a talk on its own. Next, this is something we use ourselves as well: an uptime monitor. We use this to check if the sites of our clients are up. Now, there are a lot of alternatives that you can use, like Uptime Robot and such. The benefit of our uptime monitor is that it's free and you can add as many hosts as you want. It can notify you via Slack when something goes wrong; that's the first screenshot there from our Slack channel, Laravel is down. It can notify you when a site comes back up; there's the second screenshot. And as a bonus, it can also warn you when an SSL certificate is going to expire. A few days before it expires, you'll get a notification, so your site won't go down and you still have time to renew your certificate. So that's the uptime monitor.

Fractalistic. I love the Fractal package from The League. For those who don't know it, that's a package that can transform data to a format that can easily be used in your APIs. I use the Fractal package basically on every project, but it's a little bit unwieldy to work with. You have an array with data. Then you have to instantiate a manager. Then you have to create a resource, which is a collection of books plus a transformer. Then you have to ask the manager to parse some includes, to include the characters in the data. And then you have to ask the manager to create data for the resource and turn it into an array. I can't remember all of this. So I made a wrapper around it that makes it very simple.
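To make that ceremony concrete, here is a rough sketch of the verbose league/fractal flow just described. The $books array and the BookTransformer class are hypothetical stand-ins for illustration; only the Manager calls are the library's real API.

```php
<?php

use League\Fractal\Manager;
use League\Fractal\Resource\Collection;

// Hypothetical data and transformer, just for illustration.
$books = [
    ['title' => 'Hogfather', 'characters' => ['Death', 'Susan']],
];

// Instantiate a manager...
$manager = new Manager();

// ...wrap the collection of books together with its transformer in a resource...
$resource = new Collection($books, new BookTransformer());

// ...ask the manager to parse the includes...
$manager->parseIncludes('characters');

// ...and finally have the manager create the data and turn it into an array.
$array = $manager->createData($resource)->toArray();
```

That is exactly the ceremony the fluent wrapper described next is meant to hide.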
So you create an instance of Fractal. You give it a collection. You transform it with a certain class. You include the characters, and you transform that to an array, which in my mind is much easier to use. The last one that I'm going to touch on very quickly is our response cache package. This one can be used to speed up your Laravel application immensely, and it does that just by caching your response. So whenever a request comes into your application, Laravel will handle it and build a response, and we are going to save that response. The next time the same request comes in, we're not going to start up Laravel entirely; we're just going to send the saved response. And this makes your application much faster. Now, if you've ever heard of Varnish: Varnish does a little bit of the same, but Varnish goes a step further. Varnish doesn't even start up PHP when it has cached something, so Varnish is a lot faster than this package. But the advantage of Laravel Response Cache is that, unlike Varnish, the whole package can be configured in your Laravel application. With Varnish you have to install extra software on your server and you have to learn how to configure Varnish; with this package you can basically use the knowledge you already have.

So that's a few of our packages. You'll find a big list of the Laravel-specific ones on our company website. We also have a page for the framework-agnostic ones, so check that out. Now, I should maybe have told you this earlier: those packages are not free. They have a special license called postcardware. So if any of our packages makes it into your production environment, you're required to send us a postcard. We have a wall in our office with all the postcards from people, and we're going to put them on our website soon as well. So that's the postcardware bit. Okay, now that's out of the way. Let's talk about backups.
Now, the first thing that I'm going to say about this is that there is really no one-size-fits-all for backups. Everybody does it a little bit differently, and it depends on a lot of factors. The most important one is how your team is organized and how big your company is. Facebook is probably going to manage backups in an entirely different way than a small startup. Everything that I'm going to say during this talk is really targeted at a company like Spatie, where there are only a few developers, a lot of projects and no dedicated DevOps team.

So how did we do hosting in our company in the past? Up until a few years ago, we made relatively smallish sites and we used shared hosting. It worked perfectly for us. The backups were done by the hosting provider; we didn't really have to do anything regarding backups ourselves. When we accidentally deleted a file or dropped a database, we phoned up our hosting provider and we just hoped that they could restore the database or the file. So that was our backup process. But as a company we grew, and now we get larger projects and we make bigger applications, and we've learned quite a bit about server management. There are excellent resources nowadays to get your feet wet with server management. One of my favorite sites for that is Servers for Hackers, which provides very good tutorials targeted at people with no earlier DevOps experience. And instead of using shared hosting, all our sites are now hosted on DigitalOcean. So it's unmanaged, and we use Laravel Forge and Ansible to provision those servers. So that's nice. But it's not always that nice. Let's tell a few horror stories. This happened to me. Last year, one fine morning, I got this mail from support at DigitalOcean: earlier today, our cloud operations team was alerted to some performance issues affecting the physical server that hosts your droplet. The damage was serious enough that this droplet was lost and not able to be restored.
So to summarize that sentence, they're just saying: your server is gone, sorry. And five minutes later I got a mail, I kid you not, with the subject: you have received $5 of credit. So that wasn't that good. Our server was just gone from one minute to the next, poof. Now, I don't really want to bash DigitalOcean for this, because this can basically happen at any provider. If you Google around a bit, you'll find similar stories for every major cloud hosting provider. This is a mail sent to a user of Rackspace that experienced the same thing we did. So you might think: okay, those fancy cloud service providers provide backups as well. So I'm just going to tick that checkbox, let them back up my sites, and I'm good, right? Well, here's a third horror story. It's one by Taylor Otwell, who made the Laravel framework. In the beginning of last year, there was a big denial-of-service attack at Linode, so all the servers were down. But yeah, he had ticked the backup box, so he thought: I have backups, I'm just going to restore them and then I'm good again. But what did he find out? The entire data center of Linode was down. And sure, he could see in the interface that he had made backups, but the backups could only be restored to the same data center. So literally, he couldn't restore his backups, and all his sites were down for a few days. This is not good.

So what can we learn from this? Relying on DigitalOcean backups, or on the backups of your cloud service provider in general, really isn't enough. In the case of DigitalOcean, they only take weekly snapshots, so potentially a lot of data could get lost. If your server crashes on Friday and the weekly snapshot was taken on Monday, then congratulations, you've just lost data for an entire week. And also, as Taylor learned, if the data center is down, backups can't be accessed or restored. Relying on the backups of your cloud service provider certainly is not enough. How can we solve this?
Well, it's very simple actually: just don't put all your eggs in one basket. Take care of your own backups as well. And there are a lot of options to do this. First, a bash script. Second, you can use a hosted service. Or you can use open source packages that can help you. Let's dive into each of these options.

So the first one, a bash script. Now, I've mentioned Servers for Hackers before. Chris Fidao, who runs the site, made a script to back up a server to S3. What it does is dump the database with mysqldump and upload the dump, plus the files that you want to back up, to S3. And he explains how to schedule the script with cron so it runs frequently. Now, the script works really well. The only thing that bothers me a bit is that there are no notifications when something goes wrong. If cron has problems running the script for any reason, you won't get notified of that. And then you think you have backups, but you don't have backups, which is not that good.

The next option you have is a hosted service. I picked Ottomatik.io here because it's a little bit known in the Laravel community. It's basically backups as a service. It's not free; it'll cost you. But they bring a lot of nice things to the table. They have an easy-to-use interface where you can configure what should be backed up and what not. You can use their own storage, which will cost you a bit, or you can attach S3 to their service so everything that you want to back up ends up in S3. But this will cost you a little bit. There's also a great backup solution called BackupPC that I use myself. It's open source software and it's free. It can be installed on a control server, and then it'll just SSH into every server that needs to be backed up and use rsync to copy the files over to the local hard drive of the control server. Now, a very cool feature of this package is that it can use hard links to save disk space.
It works a little bit like Apple's Time Machine. If the software detects that two files are the same, it will only store them once on disk. But if you list the files, you will actually see separate entries; they just point to the same physical file on disk. This can result in an impressive gain in disk space, certainly for PHP projects: across projects you mostly have the same files in your vendor folders, and it will only store each unique file once. The only downside of BackupPC is that some system administration knowledge is required to set it up. It really isn't easy software. You need to know your way around a server and how to configure mail servers a little bit. And if I'm honest, the interface is really ugly. But it does its job, so I'll give it that.

So last year I thought: wouldn't it be very nice if we had a very easy-to-use backup solution? Because how hard can the problem really be? Laravel already provides a lot of the things you need to make a great backup solution. So I put in some work and created my own package called Laravel Backup, which I'm going to introduce to you now. In short, what can this package do for you? It can back up files and databases to one or more file systems. It can clean up older backups. It can send notifications when something goes wrong. And it can easily be installed into any Laravel application. The package is gaining some popularity. It has now been downloaded over 250,000 times, and it's being downloaded 1,000 times a day now. That little drop at the end of the graph, that's Christmas and New Year. You see that on every package on Packagist: PHP developers massively take time off then, which is good. So let's dive a little more in detail into what the package can do. The package is called Laravel Backup, so it's good that it can back up files and databases. How does it do that?
Well, in a configuration file you can select the files that need to be backed up, and the package can also dump your database to a SQL file. All those files will get zipped, and that zip file will be copied over to an external file system or to the local file system, whatever you want. You can even copy it to multiple file systems at once.

It can also clean up old backups, because backups can certainly take a lot of storage, and if you're using something like S3, you're going to pay for every byte you use. And it isn't really necessary to keep a backup from every single day three years ago; that's just insane. So the package has an artisan task to clean up the older backups in a sane way, and this task is fully configurable. What is also built into that artisan task: no matter how you configure it, or misconfigure it, it will never delete the youngest backup. So whatever you do with that clean command, you'll always keep one backup. That's the fail-safe. To delete backups, it uses the grandfather-father-son rotation scheme. I programmed this cleanup class and only later found out: hey, there's a name for this, it's called grandfather-father-son. Let me explain that scheme. Imagine that you have every backup that you made for, like, three years on a disk somewhere. If you run the cleanup, what does it do? For a specified amount of time, say the first seven days, it will just keep every backup. After those first seven days, it will delete every backup except one per day, so you have daily backups. After that period, it will only keep weekly backups, then monthly backups, and so on. So you have a lot of recent backups, but fewer backups the further you go into the past. That's how that rotation scheme works. What can the package also do? It can monitor your backups. It can detect when no backups were made in a certain number of days.
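The grandfather-father-son rotation just described can be sketched in code. This is an illustrative re-implementation, not the package's actual cleanup class, and all period lengths here are invented: backups are grouped into buckets whose width grows with age, and only the newest backup in each bucket survives.

```php
<?php

// Bucket backups by age: young backups each get their own bucket (so all of
// them are kept), older backups share a bucket per day, then per week, then
// per month. Period lengths (7/30/90 days) are purely illustrative.
function bucketFor(DateTimeImmutable $backup, DateTimeImmutable $now): string
{
    $ageInDays = $now->diff($backup)->days;

    if ($ageInDays <= 7)  return 'all-'   . $backup->format('YmdHis');
    if ($ageInDays <= 30) return 'day-'   . $backup->format('Ymd');
    if ($ageInDays <= 90) return 'week-'  . $backup->format('oW');
    return 'month-' . $backup->format('Ym');
}

// Keep only the newest backup in each bucket; everything else may be deleted.
function backupsToKeep(array $backups, DateTimeImmutable $now): array
{
    usort($backups, fn ($a, $b) => $b <=> $a);   // newest first
    $kept = [];
    foreach ($backups as $backup) {
        $kept[bucketFor($backup, $now)] ??= $backup;   // first (newest) wins
    }
    return array_values($kept);
}
```

The real cleanup command adds the extras mentioned in the talk on top of a scheme like this: the youngest backup is never deleted, and a total size cap removes the oldest backups last.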
And it can also detect if too much storage is being used. Like I've said, on S3 that will cost you money. If you're keeping, say, 15 terabytes of backups for your little site, then there's probably something wrong. The monitoring part of the package is also fully configurable. When certain events take place, like a backup failed, or the youngest backup is too old, or the backups use too much storage, the package can notify you of this fact. And the notification part is also fully configurable. Out of the box it has support for mail, Slack, Telegram and Pushover, but we leverage Laravel's native notification capability. So if you want your notifications delivered another way, it's very easy to add your own driver. There's even a community effort where people are creating drivers for all kinds of notification channels; I think there are already 50 different packages there. If you want to take a look, it's on GitHub. You'll find all those notification channel packages. So basically the package can send you a notification via every platform that you want.

I can talk a lot about this, but I think it's better if I demonstrate it so you get a feel for it. Let's do it. I'm just going to let this guy stop bouncing. Let's go into a Laravel project that I prepared. This is basically a Laravel application; those who use Laravel will recognize the structure, and I've just installed the backup package into it. Now, the package can be configured via a config file called laravel-backup. Let's review what's in that file. Is it big enough? Can everybody read this? The configuration is split up into four different sections. First you have backup. Then you have notifications, where you can configure everything regarding notifications. Then we have the monitoring part that you can configure, and then we have the cleanup part that can be configured. Let's take a look at the backup part first.
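In outline, the backup section of that config file looks something like this. This is an abbreviated sketch; exact key names vary a bit between versions of the package, so treat it as an illustration rather than copy-paste config.

```php
<?php
// config/laravel-backup.php (abbreviated sketch, key names approximate)
return [
    'backup' => [
        'source' => [
            'files' => [
                // Directories to include in the backup...
                'include' => [base_path()],
                // ...and directories to skip because they can be rebuilt.
                'exclude' => [
                    base_path('vendor'),
                    base_path('node_modules'),
                ],
            ],
            // Names of connections from config/database.php to dump.
            'databases' => ['mysql'],
        ],
        'destination' => [
            // One or more disks from config/filesystems.php.
            'disks' => ['backups'],
        ],
    ],
];
```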
So that one is split up into the source, what are we going to back up, and the destination, to where are we going to send those backups. If I look at the source, we have files that we can back up and databases we can back up. If I take a look at files, we are going to include our entire application here, and we can exclude some directories like the vendor and node_modules folders, because you can rebuild those, so it isn't necessary to back them up. Now, in a real-life case you're not going to back up the entire application; you're going to back up a directory that has your user-generated content, like the uploads or something like that. Here you can see that we are going to back up the MySQL database, and that mysql key corresponds to the name of the connection that is configured in the database config of Laravel. Let me make it a little bit smaller. Here you can see that Laravel has some connections; we have the mysql connection here, and you can see that the credentials are set there. So it isn't necessary to specify credentials in the config file of Laravel Backup itself. So we're going to back up some files and the database. To where are we going to back them up? Well, that can be configured in the destination part. Laravel has built-in support for using multiple file systems. It calls those file systems disks, and you can put the names of disks here. I have a disk called backups here. Let me show you where those disks can be configured. Laravel comes out of the box with a filesystems configuration file. If I open this one up, you can see that disks are configured here. We have a disk called backups with a local driver; it's a local directory, and in the root we can see which directory that is. So the backups disk is a local directory. You can see that there are a few other disks configured, like S3, which we are going to use a little bit later in the presentation. Let's open this again. Let's go over to the command line.
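For reference, the disks he points at live in Laravel's config/filesystems.php. A minimal sketch of what's on screen might look like this; the env variable names follow the usual Laravel conventions and aren't taken from the demo.

```php
<?php
// config/filesystems.php (only the disks relevant here)
return [
    'disks' => [
        // A plain directory on the local machine, used as the 'backups' disk.
        'backups' => [
            'driver' => 'local',
            'root' => storage_path('backups'),
        ],
        // An S3 bucket, used later in the demo.
        's3' => [
            'driver' => 's3',
            'key' => env('AWS_KEY'),
            'secret' => env('AWS_SECRET'),
            'region' => env('AWS_REGION'),
            'bucket' => env('AWS_BUCKET'),
        ],
    ],
];
```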
I'm here in the directory of our demo application. For those who don't know Laravel: Laravel has a task runner called Artisan, which can be used to perform command line tasks. You can see that by installing our package you gain a few commands: a clean command, list, monitor, and run. The run command takes care of making backups, the clean command will clean up your older backups, and the monitor command will send out notifications. Now, you can execute those commands on the command line, which I'm going to do later, but Laravel also has a built-in scheduler which can execute commands for you. You can add commands to the scheduler in the console kernel of Laravel. What you see here is just a demo thing; you can put in any value you want. I've scheduled the backup run command to run daily at two o'clock at night, the clean command ten minutes later, and the monitor command five minutes after that. But you can schedule it any way you want. For this demonstration, we're going to run those commands manually. I should probably let you see this: I have an alias a, which is aliased to php artisan, so I just have to type a to execute Artisan.

If I run backup:list now, you can see there are no backups present, which is not good; it's not healthy to have no backups at all. You can see here the name of my application and the disks where the backups are going to be stored. So if I run backup:run here, you can see, and it was very fast, that it has dumped the database, it has determined the files that need to be backed up, it has zipped those files, and it has copied that zip over to the disk named backups, successfully. So the backup is completed. Okay, let's take a look at the storage directory where the backups are stored. And sure enough, I'm going to open it up, ooh, that's big, reveal in Finder, and we have a zip file here.
Let's open that up. You can see that we have our dumped MySQL file and that we have our entire application inside this backup. So our backup was successful. Cool. Okay, so that's backing up. Let's go back to the configuration file and talk a little bit about the monitoring part. The monitoring part is the part that will notify you when your backups are unhealthy, and here you can configure how the package determines whether your backups are healthy. So: the newest backup should not be older than one day. If your latest backup is seven days old, you will get warned, because then there's something wrong. And the storage used may not be higher than a certain number of megabytes. For this demonstration I've set it to 30K; in production you're probably going to set this to a much higher value. Okay, so the backups cannot take more than 30K. Let's make it a little bit bigger. If I run backup:list, you can see the backups that have been made, and you can see that the newest backup was two minutes ago and we're now using 11K of storage. Let's make another backup, backup:run, so we use a little bit more storage. Okay, good. And if I run backup:list now, you can see that the backups are not healthy, while before they were healthy. And the reason for that is that we use too much space. So I'm running these commands locally now. Wait, I'm going to stop the bouncing guy; I really can't handle that. Oh, sorry. If I run the backup monitor now, well, let's dive into the configuration first. Then it will notify me. Here you can set up the notifications. The formatting is slightly off here because the resolution is a little bit low, but you can see that if an unhealthy backup is found, I want a notification via Slack. And that's what we are going to do now. So if I run backup:monitor now, you can see that the backups are considered unhealthy. Good.
But if I go to my Slack channel, I can see that we have a notification now: the backups on my site are unhealthy. And you get some statistics on why they are considered unhealthy. So that's how that works. Now I can close Slack so it won't bounce again. Okay. How can we fix this situation? We have used too much storage. Well, for that we are going to run the clean command. But before I do that, let's take a look at how that clean command can be configured. In the configuration file we have a key for cleanup, and we have a cleanup strategy here, which contains that grandfather-father-son rotation scheme I was talking about. If you don't like that scheme, you can just implement your own cleaning logic here, but I think for most cases this is a good strategy. And here you can see the periods of that grandfather-father-son rotation scheme; those periods can be configured here. Keep all backups for seven days. After that, keep daily backups for 16 days. Then keep weekly backups for eight weeks, and so on. And after it has done all that, it will apply this last rule: delete the oldest backups when using more megabytes than the configured maximum, here 30K. So it will delete backups, starting from the oldest one, until it gets under that number. So let's run that now. Again, if I run backup:list here, you can see we use too much storage. If I run backup:clean now, it has deleted one backup. So we're under that 30K and the backups are considered healthy again. So that's how that works.

Lastly, let's demonstrate that it can also copy the backups to S3. So if I uncomment this here, and uncomment it in the monitor section as well, and I run backup:list now, you can see we have another disk here now, s3, and those backups are considered unhealthy because there are no backups present there. And if I open up my bucket, this is the view of my S3 bucket, you can see that it's completely empty.
If I run backup:run now, you can see that it has not only copied the zip to the disk named backups, it has also copied it to the disk named s3. Let's refresh our S3 bucket. And sure enough, we have a new directory here with a backup of the application from today. If I run backup:list again, and yeah, this is a little bit slower because it goes over an internet connection now, you can see that those backups on S3 are healthy, but the local backups aren't again. So let's fix that by running backup:clean. And if I run backup:list again, all backups should be healthy, and I've backed up my application to two separate file systems. So that's how that works. And that concludes our little tour of the package and what it can do for you.

Some best practices for monitoring your applications. I've demonstrated here that I run the monitor inside the application that needs to be backed up. But you should be aware that your Laravel application can break. If you have, for instance, a syntax error in your index.php file, your Laravel application won't run and your backups won't be made, but the monitoring part, which sends the notification, won't run either, so you won't be notified. That's bad. Or your server may be down; then you don't get notifications or backups either. So what I encourage you to do is install the package inside a Laravel application on a separate server and monitor the backups of all your applications from there. In a little scheme, it's this: you have your applications at the top. They all use the backup package and they back up to a large external disk. And at the bottom we have our monitoring server, which will just take a look at the backups on that backup disk. To make it a little more clear, let's head over to the configuration file again. You have your monitor backups, and you can see that it's an array, so you can monitor multiple backups as well.
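On the dedicated monitoring server, that array might look roughly like this. Application names, disk names, thresholds and even the exact key spellings here are invented for illustration; check the package documentation for the real keys in your version.

```php
<?php
// config/laravel-backup.php on the monitoring server (sketch)
return [
    'monitor_backups' => [
        [
            'name' => 'first-app',
            'disks' => ['backup-disk'],
            'newest_backups_should_not_be_older_than_days' => 1,
            'storage_used_may_not_be_higher_than_megabytes' => 5000,
        ],
        [
            'name' => 'second-app',
            'disks' => ['backup-disk'],
            'newest_backups_should_not_be_older_than_days' => 1,
            'storage_used_may_not_be_higher_than_megabytes' => 5000,
        ],
    ],
];
```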
So you can put the name of a second application here, and you can monitor as many backups as you want. So that's how that works. You can also use the package to back up non-Laravel applications. How can we do this? Well, it's very easy. You have your server, which runs your Symfony application or your WordPress application or whatever. You just install a Laravel application on that same server, point all the configuration at the directories of your Symfony or WordPress application, and use the backup package. It's very simple.

So, in conclusion, what are the benefits of our backup package? It can back up things to multiple file systems. You will get notified when something goes wrong. It can clean up old backups. It's very easy to install; you don't need any system administration knowledge for that. And it will only cost you one postcard. Now, there are a few drawbacks as well. I'd only use this package for small to medium-sized apps; if you have a larger app, you probably have a dedicated DevOps team, so let them handle your backups. Another drawback is that this package can consume quite a bit of disk space while backing up, on the server that needs to be backed up, because it has to create the zip on that same disk. So you should have enough free space on your disk for that zip to be created. And your application also has credentials to access the backups, which you should be aware of: if one of your servers gets hacked, the attacker might gain the passwords to your backups as well. Currently there are no restore options built into the package, and I don't think there will ever be automated restore options, because I think every incident needs to be solved a little differently. Sometimes you restore just one file. Sometimes you restore an entire directory. Sometimes you restore just a table, sometimes the whole database.
And when something goes wrong, that's what I like: I like to investigate and fix it myself, so I know what's going on. I also think automated restore options can be a little bit dangerous. So that's why I haven't included them in the package. Everything that I've said during this talk is also noted in the documentation of the package. We have a documentation site where you can find everything you need to get started with this package. So what are the requirements to use it? We maintain two major versions. Version 4 is the most modern one. You're required to have PHP 7 running; it works in Laravel 5.3 and 5.4, and it uses Laravel's native notification capability, so you can use any driver you want to send those notifications. The older version works fine with PHP 5 and can be used in Laravel 5.1 and 5.2, and it has a custom notification system, because Laravel didn't have any notification capabilities back then. It has built-in support for mail, log, Slack, Pushover and Telegram.

So, this talk summarized in one slide: do not rely solely on the backups of your service provider. Take care of your own backups as well. There are many options available: you have the DIY script, you have open source software, and maybe Laravel Backup can help you out as well, and it will only cost you one postcard. That's everything I wanted to say about this. Are there any questions?

What if you don't have MySQL as your database? Can you skip the database and just back up the files? Then you just leave the database part empty and it won't dump anything at all, so you're just backing up the files. Okay, cool. That was a good question.

You said this is for small and medium-sized applications, right? Yeah. So I would say for medium-sized applications we pretty much use AWS. I think most of them have started moving to AWS now, and AWS has its own backup system, right?
So I feel like it's good for a small application, for people who just manage one or two small websites. But I'm trying to understand how we can utilize this tool if we're already hosted on AWS and they have their own backup system. So your question is if you're always... I feel like it's little... I wouldn't say it's not useful, but I feel like, for example, I have a small website but I'm hosting on AWS and it has its own backup system, right? In that case, I possibly wouldn't use this one, because I have backups on AWS and whenever I'd like to restore, I can go ahead and restore. Yeah. I've tried to emphasize during the talk that everybody has their own strategy of doing things, and you should do what feels best to you. What I want to underline is: don't rely on only one party; rely on something else as well, and I really don't care which other thing that is, it doesn't have to be our package. You should do what feels good to you and what's appropriate for your application and for the people in your team. If you have somebody in your team that knows Amazon services and RDS really well, just use that. So it's really different for everybody. So in that case... To download the S3 backups again, well, it isn't automated, but if it's on S3, you can do whatever... Sorry? Yeah, that's a manual process. Let's discuss this after the presentation; if you have any more questions, I'll be happy to show you around a little bit. I was wondering: as you said, if someone gains access to the website, he also has access to your backup service, because you have to store your login credentials. What about the backups themselves? Is there a way to encrypt them? Yeah, so the package itself doesn't have any options to encrypt them, but you can configure an S3 bucket such that all content written to it will be encrypted by S3. So that's how you can do that. 
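One way to get the S3-side encryption mentioned above is to turn on default server-side encryption for the whole bucket, so every object written there, backups included, is encrypted at rest. A sketch using the AWS CLI; the bucket name is a placeholder, and this is one of several ways to configure it (it can also be done in the S3 console).

```shell
# Enable default server-side encryption (SSE-S3) on the bucket.
# 'my-backup-bucket' is a placeholder name.
aws s3api put-bucket-encryption \
  --bucket my-backup-bucket \
  --server-side-encryption-configuration '{
    "Rules": [
      {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
    ]
  }'
```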
And one way to mitigate the problem that, if one of your sites gets hacked, an attacker can grab every backup, is to add users to S3 that only have write capabilities but no read capabilities. So that's one way to protect your backups on S3 as well. So you can encrypt them there, and you can use credentials that can only write them. So that's how you can solve that problem. I've got another question. It's a little bit like the one before, I think. You said it would be a good thing to set up a separate server as a monitoring server. But will there be a feature in the future where you can set up a separate backup server that doesn't connect to another server to do an upload, but does a download and saves it locally? So the question is how the backups can also be copied over to another server, right? Yeah. So that you don't have to install your backup package on the server you're backing up, but on the backup server. Yeah. That isn't supported by this package. If you want that, then I suggest you use something like BackupPC, which does allow that: it logs into a server over an SSH connection and pulls everything down. But that's an altogether different strategy, and it makes things a little bit more difficult to configure. But it's a good idea to have it sometimes. Yeah. Because I like the package itself, so it would be nice. If something goes wrong, like your S3 credentials have changed or the permissions on the file system, do you get notifications for that? Or will it just fail silently? Sorry? Will it fail silently or will Slack notifications come through? You'll get Slack notifications that the backup has failed. So whenever that copying process fails, an exception will get thrown. The package will catch that exception and send you a notification. Okay. Hi. Oh, sorry. When you back up files, do you back up all files every time? Is there no option for an incremental backup, where you only back up the new files? 
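The write-only credentials idea above can be expressed as an IAM policy that allows putting objects into the bucket but not reading them back. A sketch with a placeholder bucket name; note that the package's own cleanup of old backups needs list and delete rights, so a strictly write-only user means handling cleanup some other way (e.g. an S3 lifecycle rule).

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "WriteOnlyBackupUploads",
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::my-backup-bucket/*"
    }
  ]
}
```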
Well, I haven't touched upon it in the presentation because I consider it a somewhat more advanced feature. But whenever a backup is being made, right before the zip file is created, the application will fire an event containing all the files that will be backed up. You can catch that event, and there's a way of letting the package know: not this file, not this file, not this file, or add this file. So you can add your own logic to it if you want specialized behavior. You could, for instance, catch the event and say: it's a weekday, so I'm not going to back up these files; I'm only going to do that in the weekend. That's how you can do that. It's covered in the documentation. More questions? Yeah, one in the back. It's about the credentials, when somebody hacks your site. What if you create, or maybe plan to create, a manager, like a wrapper, for these backups? So the backup software itself is installed on each server, but the manager sits on a dedicated server, which starts the backups and downloads the backup files from each server, and you only need to take care of the backups there. That's exactly what the BackupPC package does. It will just go into every server and back everything up. But for the sake of simplicity, this package does it a little differently. So yeah, if you only have one application, you can just use your own application for managing the backups. The emphasis here is on simplicity. But I do recognize that there is also value in doing it the other way around. And in my company, we actually do both: we use our package to create backups, but we also use BackupPC to copy everything over. So we have, like, three backups for every server, plus what the cloud hosting provider already does. So yeah, it depends on your personal preference, I think. I think the main line is: always keep multiple backups. Okay? 
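The event hook described above might look something like the listener below. The event class and manifest method names here are assumptions rather than quotes from the talk, so check the package documentation for your installed version before relying on them; the media path is also just an example.

```php
<?php

namespace App\Listeners;

use Carbon\Carbon;
use Spatie\Backup\Events\BackupManifestWasCreated;

// Sketch: only add a heavy media directory to the backup manifest
// in the weekend, as in the weekday example from the talk.
// Register this listener in your EventServiceProvider.
class AddMediaOnWeekends
{
    public function handle(BackupManifestWasCreated $event)
    {
        if (Carbon::now()->isWeekend()) {
            // addFiles() is assumed here; consult the Manifest class
            // shipped with your package version for the exact API.
            $event->manifest->addFiles(glob(storage_path('app/media/*')));
        }
    }
}
```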
Just a really quick question. You talked about being able to install this in its own Laravel app so that you could then back up non-Laravel projects. Yeah. So what does the package actually rely on, Laravel-wise, that you couldn't install it by itself without a full Laravel application? Well, Laravel already gives you a lot of components to easily work with. Like, for instance, the external filesystem part, where disks can be easily configured so files can be copied over to external filesystems. Laravel gives you that. Laravel also has the whole system of notification drivers, so anybody can just add a little driver and notifications via that channel will work. And we also use Laravel's scheduler, with which you can easily schedule those commands. So there's a lot of infrastructure in Laravel that we can simply use. Now, I do recognize it could also be built with custom components; everything can be built with custom components. But for me this was the easiest way to take care of it, because I use Laravel quite a lot and I'm very familiar with those components, so it was the easiest way to implement such a system quickly. And I think for people inside the Laravel community it's really easy to work with. And I think for people outside of Laravel it really isn't that hard to get started with either, because every single component can be configured very easily. So yeah, I've built it on Laravel because I just know it well, okay? Okay, any more questions? Okay, then I think we're done. I've uploaded the slides of this presentation to Speaker Deck if you want to look at them again. I want to grow a little bit as a speaker, so I welcome any feedback on Joind.in. Take a look at the open source page on our company website; maybe we have made something that you can use. And if you're interested in Laravel and modern PHP development, take a look at my blog or subscribe to my newsletter. So that's it. 
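The scheduler integration mentioned above is how the package's console commands are typically wired up. The command names match the package's documented commands; the times below are just example values.

```php
// app/Console/Kernel.php (excerpt)
protected function schedule(Schedule $schedule)
{
    // Clean up old backups first, then create a fresh one, and
    // finally check that all monitored backups are healthy.
    $schedule->command('backup:clean')->daily()->at('01:00');
    $schedule->command('backup:run')->daily()->at('02:00');
    $schedule->command('backup:monitor')->daily()->at('03:00');
}
```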
Thank you.