Today we're going to be looking at a program called Axel to speed up your downloads. When you're downloading from a server to your computer, even though your internet service provider allows you a certain amount of bandwidth, and the server probably has more bandwidth than you do, the server may say: for each person downloading, we're only going to give so much bandwidth for this file. So when you connect, it sends the file at a capped speed. With Axel you have two options. First, you can open multiple connections to the same server, which lets you claim several of those per-connection allotments at once and pull more bandwidth from the server. Although you can do this, it could be frowned upon, because now you're eating up that server's bandwidth by making multiple connections. There's still, of course, the limit your internet service provider puts on your own bandwidth, and you can't get past that without paying for more. But if your pipe is large and the server is only giving you a fraction of it per connection, four connections can speed up your download.

The second option: sometimes servers have mirrors, so besides the original server the same file may be on another server here and here that you can download from, and Axel allows you to use those too. With the same file on multiple servers, you're able to split up the download and pull portions of the file from each server. So let's go ahead and see how this works.

Axel should be in your repositories by default; it's spelled A-X-E-L. On a Debian-based system, for example, you can use aptitude or apt-get: just aptitude install axel, either as root or with sudo. I already have it installed. Once it's installed, the basic command is similar to that of wget or curl.
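For reference, the install and the basic invocation look something like this. The ISO URL here is made up for illustration, and the download line is wrapped in echo so nothing is actually fetched:

```shell
# On a Debian-based system, axel installs from the default repos:
#   sudo apt-get install axel    (or: sudo aptitude install axel)

# Basic usage is just like wget or curl: pass the URL.
# Hypothetical URL; echo shows the command without downloading.
echo axel "http://example.com/file.iso"
```

With no extra options, axel behaves much like a plain wget of the same URL.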
In fact, we're going to use wget as a baseline first. I'll run wget and paste in the URL for a Tiny Core ISO, their larger CorePlus image, which I think is about 75 megabytes. This is definitely a site that throttles users: even though the distro is really small, it tends to take a long time to download. As you can see right here, we're getting four to five hundred kilobytes per second, slowing down now to around four hundred, and it's going to take approximately two minutes and fifteen more seconds to finish. I'm going to hit Ctrl-C to cancel, and remove the partial file we just started downloading.

Now I'm going to use Axel with -n 10, which says: make 10 connections to the server. That might be overkill, depending on the server, but 10 is the number I tend to use. I'll paste in the same exact URL. Axel downloads just like wget did, only it makes 10 connections, basically pretending to be 10 different computers downloading. As you can see, we went from 400 kilobytes a second up to 1600, 1700 kilobytes a second, roughly four times the speed right off the bat.

You do want to be careful, though. As I said, I tend to use 10; I've tried 20 before with different servers, and some servers will see one IP making that many connections and block you for a while. Plus, it's just kind of rude to make too many connections to a server at once, especially somewhere like Tiny Core. Since they throttle back so much, I'm assuming it's budget constraints: they can only afford so much bandwidth. It's an open-source project, not the biggest, and it probably doesn't have a lot of backing. So you might want to think about who you're downloading from, and whether it's worth eating up their bandwidth.
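The multi-connection run looks like this. The URL is a stand-in (check the Tiny Core download page for the real link), and echo keeps it a dry run so the server isn't actually hit:

```shell
# Stand-in URL for the CorePlus ISO -- substitute the real mirror link.
URL="http://tinycorelinux.example.com/CorePlus-current.iso"

# -n 10: open 10 simultaneous connections to the same server.
# Wrapped in echo here so the command is shown, not executed.
echo axel -n 10 "$URL"
```

Dropping the echo runs the real download; lowering -n to 4 or so is a friendlier choice for small projects.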
If you're downloading something large from, say, Google, just as an example, they can afford the bandwidth; they're limiting everybody for efficiency, so making extra connections is an option there. Another option is to pass multiple URLs pointing at different mirrors. A lot of Linux distributions list all these different mirrors on their websites, and if it's the same file, you can hand Axel several of them. I'll make something up here as an example, something a little shorter: say linux.com/core.iso, and maybe another mirror like linux.net/core.iso, and another one that's at a .uk address or something like that. It's the same file on different servers in different places. Since I just made these up, it's going to give me a "not found" error, but with real mirrors Axel would make three connections, one to each of those servers, so you get the benefit of multiple connections without hogging down any single server. It's kind of like torrenting, except from actual servers rather than from other people downloading the file: you're getting pieces of the file from each source.

Also, one more thing: Axel automatically continues where you left off. If you were to Ctrl-C like we did up above and then run the same command to download the same file, it would continue right where it left off. You can do the same thing with wget using the -c option; Axel just does it by default.

So that's a quick look at Axel. I don't use it all the time, but when I'm downloading larger files, maybe a 4 GB ISO, I might hit a few different mirrors using Axel, and might make more than one connection to each of those mirrors, depending on the case; again, I'm limited by my own internet speed, but it can definitely increase throughput. Also, if you do make a lot of connections, you might hog down your own network,
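Putting the mirror trick and the resume behavior together, here's a sketch. All the mirror hostnames are invented for illustration, and echo keeps it a dry run:

```shell
# Same file hosted on three invented mirrors -- given several URLs,
# axel splits the download across them instead of hammering one host.
echo axel \
  "http://linux.example.com/core.iso" \
  "http://mirror1.example.net/core.iso" \
  "http://mirror2.example.co.uk/core.iso"

# Interrupted downloads: rerunning the same axel command resumes
# automatically; wget needs -c (continue) to do the same.
echo wget -c "http://linux.example.com/core.iso"
```

You can also combine the two ideas, adding -n to open more than one connection per mirror.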
so if there are other people on your network, you might bog down their connections too; it can affect everyone sharing it. But if you're home alone downloading a large file, there you go: Axel will help. It's a great little program. I hope you enjoyed this tutorial, and I hope you continue watching. Consider going to my Patreon page; there should be a link in the description if you want to support the channel and also be a little more interactive with me. I thank you for watching, and as always, I hope that you have a great day.