All right, I think I am live. This is Jim Groom and I'm here for an episode of Reclaim TV. This is just an impromptu episode where I wanted to talk a little bit about some of the work I've been doing, and am trying to continue doing, around offloading large WordPress sites, so bigger WordPress sites with a ton of media. I'm talking hundreds of gigs of media that are right now part of the server space. When we're dealing with something like Reclaim Cloud, or even DigitalOcean or wherever it is, having a server with hundreds of gigabytes of storage makes that server very hard to migrate, or at least a long process, and it also makes it a bit less agile if you wanna move it, and slower if you wanna deliver that media quickly. A lot of times it makes more sense to offload that media to something like Amazon S3, or some S3-compatible service like DigitalOcean Spaces; you can even run an open source one like MinIO. There are a lot of S3-compatible options, basically object storage, where you can take the media and the files and separate them from the server infrastructure. By doing that you decouple the server and the storage, which, at least in the hosting business we're in, often makes moving that server easier. So anyway, I've been doing this for a couple of sites. I did this for Bava Tuesdays, which I'm not gonna pretend is very big; it's not like I have hundreds of gigs of media there. In fact, Bava Tuesdays was just a way for me to test it. So I'll share a little bit about what I did and what that looks like, and hopefully this makes sense. So let me get in here and share, and here we go. Let me put this there. Okay, so this is obviously the blog I was just talking about. This is my blog, the great Bava Tuesdays, for life. Anyway, I have about 20 or 30 gigs of storage in Bava Tuesdays, not a ton.
But I used it as a way to test out the plugin I'm using to do this. This plugin has a free version and a premium, or pro, version; I'm using the pro version. That plugin is called, and let me kind of remove myself here, WP Offload Media. It's developed by the folks at Delicious Brains, which was essentially bought by WP Engine. So this is the plugin I'm using. I have played with the Lite version, that's the free version, which gives you a certain amount of options. But if we're gonna do this at Reclaim, we're gonna do it to get WordPress instances that have hundreds of gigabytes of storage off the server so that we can free that up for other instances, so it's a pretty practical use case. So anyway, let's take a look at the settings. It's a nice tool; I've really enjoyed it. You'll notice it has storage settings for Amazon S3, which is where I am putting files for Bava Tuesdays. And you'll notice the URL I'm using for offloading files is files.bavatuesdays.com. This is what might be considered a domain alias, and I've always enjoyed playing around with mapping domains. So rather than it being an S3 URL, it actually looks like that media is coming directly from Bava Tuesdays. There's that sense of consistency, even though it's all being served from S3. And we've been asked about this by people who will potentially be doing it. One of our clients has a terabyte of data that they want to serve, but they understand the value of getting it outside of the server and onto something like S3 or some other object storage service. So that's why I'm effectively testing this. Anyway, this is working, this is offloading the media. If I was brave, and I'm not that brave yet, I would pull the rip cord here.
I can turn this on and it will essentially get rid of all of the media on my server. Because Bava Tuesdays isn't very big, it doesn't have hundreds of gigs, I haven't done this, but for those sites where I will, that will happen. And then the bucket is essentially a path. I can log into S3 and show you what a bucket looks like and how that works, but the bucket is basically files.bavatuesdays.com. That's the name of my S3 bucket. And then this is the prefix that will go after that: /wp-content/uploads, right? This is pretty standard in WordPress instances; the uploads are stored in wp-content in their own directory, and then there's date, year, et cetera, and that happens here, right? So basically all of this information is recreating the paths and the structure for the uploads, but on S3. And they do a very good job of taking you through that. I don't use object versioning, but that might be a way to preserve things should you have multiple versions of objects you're saving. I'm not entirely clear on the value of versioning, but I'm sure it's there; someone may know more, feel free to talk about that, I'm not that guy. The other thing is this gives you a preview of the URL. So here it is: https://files.bavatuesdays.com/wp-content/uploads/ and then year, month, and the file name. This is telling you how that domain will be rewritten and still look as if it's coming from my site. In order to do this, there are a couple of things I've had to do. Let's look at those. One of them is I had to work with Cloudflare. In Cloudflare, I have created a subdomain, files.bavatuesdays.com, and I created a CNAME for that subdomain pointing to files.bavatuesdays.com and, I think, .s3.amazonaws.com after that, actually let's see. Rather than me guessing, there's a very specific structure for that, so I'm gonna bring up Cloudflare and show you what that looks like.
I'm gonna take you through the process of what I did, if that's at all useful. We will see, oh, there it is. Okay, so here we go. This is Bava Tuesdays. I'm managing my domain for Bava Tuesdays in Cloudflare. That has made things easy for me as I'm doing more in the cloud, and I can redirect things to different IP addresses or CNAMEs, as you'll see here. So let me find it, yeah, here it is. Let me pull this over and show you what that looks like. Give me a second. I'm sorry if I'm not looking at the stream right now in case anyone's saying anything intelligent; I apologize, I'm not always the best at multitasking, but I just wanted to do a stream and share this. So anyway, here's a CNAME for files, which is the subdomain for Bava Tuesdays, and its target is files.bavatuesdays.com.s3.amazonaws.com. I have named the bucket files.bavatuesdays.com, and the rest of that is the Amazon URL that allows it to resolve. I'm essentially hiding that and just making it look like a Bava Tuesdays URL using this CNAME. That's how I point to the bucket, that's how I create the alias. Let's go back to Bava Tuesdays now and look at how we're delivering those assets, right? In order for the domain alias to work, you have to name the bucket the same as your URL, like files.bavatuesdays.com; that's an important point. And then I am also using Cloudflare to deliver the media via its CDN, which is quite nice. This is allowing me to map that CNAME, files.bavatuesdays.com, and it's also giving me HTTPS through Cloudflare, which I really enjoy. So I'm doing this particular instance with Cloudflare, but I'm trying another one right now using just S3 and Amazon's own CDN, their Cloudflare substitute called CloudFront, and we'll talk about that in a second. So anyway, this is all set up.
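To make that naming rule concrete, here's a quick sketch in Python. The helper names are mine, not anything from Cloudflare or the plugin; it just shows why the bucket has to share the custom domain's name, what the CNAME target ends up being, and what a rewritten upload URL looks like.

```python
# Sketch of the S3 domain-alias rule described above: the bucket is
# named after the custom domain, so the CNAME target is simply that
# name with the S3 endpoint appended. Helper names are illustrative.

def s3_cname_target(custom_domain: str, endpoint: str = "s3.amazonaws.com") -> str:
    """The bucket name must equal the custom domain; the CNAME points here."""
    bucket = custom_domain  # required: bucket name == custom domain
    return f"{bucket}.{endpoint}"

def rewritten_url(custom_domain: str, upload_path: str) -> str:
    """How an offloaded media URL reads once the alias is in place."""
    return f"https://{custom_domain}/wp-content/uploads/{upload_path}"

print(s3_cname_target("files.bavatuesdays.com"))
# files.bavatuesdays.com.s3.amazonaws.com
print(rewritten_url("files.bavatuesdays.com", "2023/06/poster.jpg"))
```

So the CNAME record reads `files` → `files.bavatuesdays.com.s3.amazonaws.com`, and visitors only ever see the bavatuesdays.com address.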
I have a custom domain, I'm delivering offloaded media, so it's not coming from my blog, it's coming from S3, and I'm forcing HTTPS. All of that's fine. And this is kind of nice: you can go in there and check for assets, right? There's a, hmm, "assets cannot be delivered from the CDN, an error occurred while testing the domain," right? I don't know what happened there; I might have to look at that and see. I do think it's working. We'll go and test, but I'm pretty sure they're working. That's weird, that's the first time I'm seeing that. And then this is where you go: this is a tool which, once you've set up your S3 and put those details here in the WP Offload Media plugin, will copy the files in your media library over to the S3 bucket, which is quite nice, and it will tell you how long it takes. What I do, and I wanna see if it makes a difference, is I sync those files to the bucket beforehand, and then I go through this, and I think it goes through the motions, knows they're there, and so is much quicker, especially when you have hundreds of gigs of media. But that's something I have to confirm. And then the license: this is not free, you pay for up to what you use. I'm paying for like 10,000 objects right now, and if I wanna go unlimited, it's not cheap, it's like a hundred bucks a month or something. But in our situation at Reclaim Hosting it makes sense, because we save a ton of space on servers that we pay for. So it really is a matter of what you need it for. They also have a free version should you go that route. But let me just check something. I'm gonna look at a file that I uploaded. A lot of these are from Flickr, because my video game stuff is in Flickr, but I think I uploaded this one. Let me see what the URL is on this one. So let's copy the image address, right? That's a little Bryan Mathers Visual Thinkery, and you'll notice, look at that.
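The reason pre-syncing should speed up that copy pass is simple: anything already in the bucket with the same key and size can be skipped, so only the stragglers get uploaded. This little Python sketch is my own illustration of that idea, not the plugin's actual logic.

```python
# Why pre-syncing the uploads makes the plugin's copy pass faster:
# files already in the bucket (same key, same size) can be skipped,
# leaving only the missing ones to upload. Illustrative sketch only.

def files_still_to_copy(local_files: dict[str, int], bucket_keys: dict[str, int]) -> list[str]:
    """Return local upload paths missing from (or sized differently in) the bucket."""
    return [
        path for path, size in local_files.items()
        if bucket_keys.get(path) != size
    ]

local = {
    "wp-content/uploads/2023/01/a.jpg": 120_000,
    "wp-content/uploads/2023/01/b.jpg": 95_000,
    "wp-content/uploads/2023/02/c.png": 40_000,
}
already_synced = {
    "wp-content/uploads/2023/01/a.jpg": 120_000,
    "wp-content/uploads/2023/01/b.jpg": 95_000,
}
print(files_still_to_copy(local, already_synced))
# only c.png remains to be copied
```

The pre-sync itself can be done with something like `aws s3 sync wp-content/uploads s3://files.bavatuesdays.com/wp-content/uploads` (bucket name here is just the example from above).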
It's files.bavatuesdays.com/wp-content, et cetera. And you'll notice it is delivering from that URL; it's not broken. So I don't know why the plugin is suggesting there's a problem there. I'll have to go in and confirm as much. If I go to WP Offload Media, it says the assets aren't being delivered, or maybe that they're not being delivered by a rewrite; maybe because I defined that, "assets cannot be delivered from the CDN." So I'm gonna have to look at why that's happening at the endpoint, because I'm still getting the files. Maybe it's serving them directly, but I have to believe it's going through S3 and the CDN. I don't know, I'm gonna have to check that. So anyway, that's the WP Offload Media plugin. Now, one of the things that's interesting is, if I go here to edit, I'm gonna be using Amazon S3 for this other site that I'm offloading, but then using CloudFront. You can use other services too, right? Right now I'm using Cloudflare for their CDN, but I also could use something called Amazon CloudFront, or just S3 alone, which is slower; that's why you wanna use a CDN, because it's much faster. There's also something called StackPath, I don't know. So there are two pieces: you're offloading and storing the media somewhere in S3, and then there's how you deliver that media, and that's where the CDN comes in and is useful. Cloudflare offers that, but so does Amazon CloudFront. So for this other instance we're gonna be using Amazon CloudFront, and we're gonna be playing with that. Now, I'm not using it for this site, so I don't wanna change anything here, but let me give you a quick peek at what I am playing with. So let's see, boom, boom, boom. No, yeah. Let's see if I can get into this. Yeah, let's get in here. I'm gonna get into Amazon right now. This is the Amazon AWS console, right?
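One way to settle that "is it actually going through the CDN?" question is to look at the response headers on a media file: Cloudflare-fronted responses carry a `cf-cache-status` header (and `server: cloudflare`), while CloudFront responses carry an `x-cache` header mentioning cloudfront. Here's a small sketch of that check; the header dicts are made-up samples, not captured responses from my site.

```python
# Detect which CDN, if any, served a response by inspecting headers.
# cf-cache-status / server: cloudflare => Cloudflare in the path;
# x-cache or via mentioning cloudfront => CloudFront in the path.

def cdn_in_path(headers: dict[str, str]) -> str:
    h = {k.lower(): v.lower() for k, v in headers.items()}
    if "cf-cache-status" in h or h.get("server") == "cloudflare":
        return "cloudflare"
    if "cloudfront" in h.get("x-cache", "") or "cloudfront" in h.get("via", ""):
        return "cloudfront"
    return "none detected"

sample = {"Server": "cloudflare", "CF-Cache-Status": "HIT", "Content-Type": "image/jpeg"}
print(cdn_in_path(sample))
# cloudflare
```

In practice a quick `curl -I https://files.bavatuesdays.com/...` on any offloaded image will show you these headers directly.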
And you'll notice I'm doing this for the ePortfolios site from Macaulay, which is a big WordPress multisite. It's been around for many a year, and they have about 600 gigs of storage. That's not great when you have a four or five terabyte storage server, a big server that's running some of this Reclaim Cloud or ReclaimPress stuff. So what we're doing is we're gonna try to offload that 600 gigs of media to an S3 bucket, which is titled files.eportfolios.macaulay.cuny.edu. This "files" is kind of like the files.bavatuesdays.com one: it's basically a space where I am creating a CNAME and then pointing that at another service. In this case I wouldn't be pointing it to Cloudflare, I'd be pointing it to Amazon CloudFront. So I have to create a CNAME for this, or CUNY would create a CNAME for this, point it at the CloudFront URL, and then we'd be able to deliver all of the files that we have here in the Academic Commons blogs.dir directory. So here we are, and there were a ton of files, and I copied them over, again, 600 gigs. I am close at this point, once the CNAME gets pointed at the CloudFront URL, to enabling CloudFront, to integrating CloudFront and S3 for all of these files. Then hopefully I can get an HTTPS cert through CloudFront, and essentially 600 gigs of files that would have lived on the server and taken up space will now be living on Amazon S3, saving space and being delivered very fast through CloudFront. Now, I like to do this because I also want to get a sense of the costs. How much will it cost? What will the bottom line be? And what is the value, how much does it add? Does it need to be super fast, or can it be just-S3 slow, if a lot of this stuff is not really accessed? It will allow me to get some access stats and then say, you know, this is great to have as an archive and as storage, but maybe it doesn't need to be lightning fast, depending on the service.
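For a rough sense of what that 600 gigs might run, here's a back-of-envelope sketch. The rates are AWS's published us-east-1 list prices as I understand them (roughly $0.023 per GB-month for S3 Standard storage and about $0.085 per GB for the first tier of CloudFront transfer out); they change, so treat them as placeholders, and the 50 GB/month of traffic is entirely made up for the example.

```python
# Back-of-envelope monthly cost for offloaded media.
# Rates below are assumptions based on published us-east-1 list
# pricing; check current AWS pricing before relying on them.

S3_STORAGE_PER_GB_MONTH = 0.023   # assumed: S3 Standard, first tier
CLOUDFRONT_PER_GB_OUT = 0.085     # assumed: first transfer-out tier, US/EU

def monthly_estimate(storage_gb: float, transfer_gb: float) -> float:
    storage = storage_gb * S3_STORAGE_PER_GB_MONTH
    delivery = transfer_gb * CLOUDFRONT_PER_GB_OUT
    return round(storage + delivery, 2)

# 600 GB stored, assuming (hypothetically) 50 GB of it viewed per month
print(monthly_estimate(600, 50))
```

The point of the exercise: storage for an archive like this is cheap; it's the delivery side, and whether it needs a CDN at all, that the access stats will decide.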
And then this other client, who might want to do a terabyte, has some very specific permissions needs based on the files they're serving through their WordPress multisite, which is why I'm leaning towards Amazon's S3 and CloudFront: they have a lot more granular control. I love DigitalOcean Spaces because it's so simple. It's like, list the files or not. It's dead simple, perfect. If you're setting up something very simple, like Mastodon, it's great. But if you have granular stuff you have to do with permissions, S3 is the one. It's complex, overly so, you know, but at the same time you can do pretty much anything with it if you know how, which is a big if. So anyway, that's kind of where I am. I've been working with Chris Blankenship on this a bit and he's been great helping me figure it out, but this is something I've been playing with. I did it for Bava Tuesdays, I did it for DS106, which had about 100 gigs of files, and now I'm trying to do it for some of these bigger schools. There's a couple of schools that have upwards of six, seven, 800 gigabytes, and that would save us a ton of server space and allow us to maximize the servers for CPU and RAM rather than taking up a lot of room with what is, in some ways, dead space, which is the beauty of object storage. Anyway, that's an overview of some of what I've been playing with in trying to offload media from big WordPress multisites into S3. And even if it's not a gigantic site, I liked the idea of having my Bava Tuesdays stuff separate and cleanly offloaded, whether or not it's running through a CDN like Cloudflare. I don't know, it's definitely serving from S3, and I thought it was serving faster, but now I have questions given that error. But anyway, I don't know. Let me stop sharing my screen, and then I'll go over to Reclaim TV; I didn't even check if this stream was running or if I went to the wrong URL. I love Reclaim TV.
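To show what I mean by granular control: an S3 bucket policy can, for example, allow public reads on just the uploads prefix while leaving everything else in the bucket private, something Spaces' simple public/private toggle can't express. This is a minimal sketch of such a policy, reusing the bucket name from my examples above; it's an illustration, not a production-ready policy.

```python
# Sketch of S3's granular permissions: a bucket policy that allows
# anonymous GETs only under one prefix, leaving the rest private.
# Bucket name and prefix mirror the earlier examples; minimal, not
# a hardened production policy.
import json

def public_read_policy(bucket: str, public_prefix: str) -> str:
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "PublicReadForUploads",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{bucket}/{public_prefix}/*",
        }],
    }
    return json.dumps(policy, indent=2)

print(public_read_policy("files.bavatuesdays.com", "wp-content/uploads"))
```

You can keep layering statements like that, per prefix, per principal, per action, which is exactly the kind of thing this client's setup needs.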
Love the idea of it. Love the idea. I spelt Reclaim wrong twice; how often does that happen? But yeah, I think it's, yeah, it's going, right? And then Eric, of course: "CDN: content, yet another computer storing it?" Yes, exactly. Eric, how are you? I saw that you got the Reclaim texture, that's great. But part of what I still love about what I do is some of these setups. And I remember, Tim, when we both first started exploring AWS and cloud computing, it was to get UMW Blogs into the cloud, running on EC2 instances in AWS, offloading the media to S3, and then understanding how you compartmentalize the server, the media, and the serving of the media, the kind of power that gives you, and then obviously load balancing, so you can have many different servers. A lot of that stuff is slowly working back into my workflow with Reclaim Cloud and now ReclaimPress. I think the work since AWS has been things like DigitalOcean, and there are many others, simplifying it further; Cloudflare, too, simplifying some of that. And the simpler, more accessible, and more affordable we can make it, the more it will change how we think about computing and where we store our stuff. I have found that Cloudflare has been a really great place, not to store media, but to manage DNS and serve media, and it has worked for us in particular when we're doing multi-region setups, meaning Cloudflare has load balancing built in. So if you have your DNS entries pointing at two different servers, Cloudflare, kind of like AWS or some other providers, will do a lot of the work just through DNS, directing where traffic goes based on the resources of your server.
And that's wild, because it's abstracted out a whole other layer of server computing at the level of the DNS, and that's where you see the fewest issues; like, when your DNS goes wrong there are different kinds of issues, but yeah, it's interesting. So Eric, I know, Friday, TGIF. I just wanted to get on because I know we have the loyal band of one and I didn't want to let you down. Today I am talking just about kind of bava stuff, some of the work I'm playing with right now, and that's kind of a recap of my week. I've been migrating the Macaulay ePortfolios, which is the honors college at CUNY, and it really started to strike me how important it is to get this stuff separated, because this migration has become a beast: the database is several gigs, the files are almost more than half a terabyte, it is just unwieldy. So trying to get things in their place and then balance some of that makes for a lot better experience, I think, for everybody. Yeah, exactly, I do have the back catalog. So anyway, I think that's it. I just wanted to give a quick share on the work I'm doing and let people know, I guess, what I've been up to. I know folks have been up to other things; it's been super busy at Reclaim, as it always is, and this month we set records in terms of tickets and support, and we've hired someone new, and we're still kind of mourning the leaving of Lauren, and life goes on at Reclaim. I'm proud of the whole team for keeping it together and keeping things going, and now, back to work.
Yeah, anyway, thanks for watching, Eric, big fan, and thanks to anyone else out there playing with this stuff. I'll hopefully write this up. The other thing I've been doing is playing with Reclaim Cloud, I mean ReclaimPress, and I've moved a lot of my own stuff into ReclaimPress. Frankly, having all of the DS106 stuff and my Bava Tuesdays stuff with all the storage offloaded has made my life and these transitions so seamless. It's just good protocol, almost, if you're in this business and managing stuff for a university, or for whatever service, to do that. Anyway, I'll shut up now. Thanks again for watching, big fan, Jim Groom signing out.