How to Find Expired Domains with Free Tools - Houtsgraphics

Published on Sep 24, 2015

This video shows how I find expired domains by scraping with free tools, plus a free process for filtering the domains down to usable URLs. See http://bit.ly/seohouts for more tips.

The first thing I do is find a niche I want to pull domains from. I ran a couple of searches for "wordpress" and "seo", pairing each with a second term, "links", which is commonly used in articles about SEO or WordPress. I should have used some of the other variations, like "forum" or "resources", for a better result. I ended up with some good old link pages to jump-start from. I like to use these because it's a technique that was widely used long ago, and since I'm looking for domains that are old and powerful, why not target an old technique/footprint?
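If you want to generate those footprint searches systematically, here's a minimal sketch in Python. The niche and footprint terms are illustrative, not an exact list from the video.

```python
# Combine niche terms with link-page footprints that old sites commonly
# used; paste the resulting queries into a search engine to find link pages.
niches = ["wordpress", "seo"]
footprints = ['"links"', '"resources"', '"forum"', 'intitle:"link partners"']

for niche in niches:
    for footprint in footprints:
        print(f"{niche} {footprint}")
```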

I go ahead and search the URL itself in Xenu, and then also take the URL into Majestic. I use a paid version, but you could find another source, like Open Site Explorer, and just work from the top five links for your jumping-off strategy. Another option is siteexplorer.info; they also do a good job with the links.
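If your backlink tool lets you export a CSV, picking those top five links can be scripted. This is a hypothetical sketch: the file name and the "SourceURL"/"TrustFlow" column names are my assumptions, so rename them to match whatever your tool (Majestic, Open Site Explorer, etc.) actually exports.

```python
# Sort a backlink export by an authority metric and keep the top 5
# source pages to feed into the next crawl round.
import csv

with open("backlinks_export.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# "TrustFlow" / "SourceURL" are placeholder column names -- adjust them.
rows.sort(key=lambda r: int(r.get("TrustFlow", 0) or 0), reverse=True)
for row in rows[:5]:
    print(row["SourceURL"])
```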

So once I can see the links and the power of each link coming in, I pick the niche I want to continue in and load that URL, along with other related URLs, into Xenu as well. I just go down the list and feed in the URLs pointing to the old domains, many at a time. Once I have about 15 URLs running, I'll wait for some to finish, export them, and replace them with another URL.

I like to end up with 15 or 20 good-sized lists to filter down. Most lists contain only a few domains that return 'no such host', and those are your potential expired domains.
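If you'd rather script that step, here's a rough Python stand-in for what Xenu is doing: fetch a link page, pull out the outbound URLs, and flag any host that no longer resolves in DNS. Those 'no such host' hits are the candidates. The example URL is hypothetical.

```python
# Fetch a link page, extract outbound links, and flag dead hostnames.
import re
import socket
import urllib.request
from urllib.parse import urlparse

def find_dead_hosts(page_url):
    html = urllib.request.urlopen(page_url, timeout=10).read().decode("utf-8", "ignore")
    hosts = {urlparse(u).hostname for u in re.findall(r'href="(https?://[^"]+)"', html)}
    dead = []
    for host in sorted(h for h in hosts if h):
        try:
            socket.gethostbyname(host)
        except socket.gaierror:          # DNS failure = "no such host"
            dead.append(host)
    return dead

print(find_dead_hosts("https://example.com/links.html"))
```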

I use Excel (actually OpenOffice) to filter column B down to just the domains that return a code of 12007, or a status of 'no such host'. I can open all these sheets at once and copy the filtered results into Notepad++. From there I remove the http:// and www. with find-and-replace commands, then select all and copy the list.
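The same filter-and-strip step can be done in one pass in Python, assuming a tab-separated Xenu export with the URL in the first column and the status in the second; check the layout of your own export before relying on this.

```python
# Keep rows whose status is 12007 / "no such host", then strip the
# scheme and "www." prefix from each URL.
import csv
import re

keep = []
with open("xenu_export.txt", newline="", encoding="utf-8") as f:
    for row in csv.reader(f, delimiter="\t"):
        if len(row) > 1 and ("12007" in row[1] or "no such host" in row[1].lower()):
            keep.append(re.sub(r"^https?://(www\.)?", "", row[0]))

with open("filtered.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(keep))
```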

The list still has inner pages, so I paste it into Excel, and in the import dialog I tick the additional-separator box and enter / to split the inner pages off from that column. Then I copy the result and paste it into Notepad again. For some reason, if you just copy and paste straight back into Excel, it doesn't give you another chance to set an additional separator, but by routing it through Notepad I get that choice again.
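The "/" separator trick boils down to keeping everything before the first slash. Here's that step as a short sketch, reading the file produced above:

```python
# Drop inner pages: everything after the first "/" is a path, so keep
# only the bare host portion of each line.
with open("filtered.txt", encoding="utf-8") as f:
    hosts = [line.strip().split("/")[0] for line in f if line.strip()]

with open("hosts.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(hosts))
```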

This time I paste the list into the third column and choose a period (.) as the separator. I can then go to the sixth column and put in =LEN(A5), which returns 0 for all of the domains that aren't subdomains or .co.uk domains.
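The LEN() trick amounts to counting the dot-separated parts: a plain root domain like example.com splits into exactly two pieces, while subdomains and .co.uk domains split into three or more. A sketch of the same filter:

```python
# Keep only plain root domains (name + TLD); anything with more parts
# is a subdomain or a two-level TLD like .co.uk, which gets filtered out.
with open("hosts.txt", encoding="utf-8") as f:
    domains = [line.strip() for line in f if line.strip()]

roots = [d for d in domains if len(d.split(".")) == 2]

with open("root_domains.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(roots))
```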

OK, we're almost ready to check for availability; keep steady, the end is in sight. I like to remove any duplicates from the list with an online tool, then I use the Ninja Tools bulk domain checker to check the availability of 500 URLs at once. It works well with .com, .net, .info, and .biz, not so much with the country TLDs. Once you have that list, it's back to Majestic for a bulk check to see what we have. You could use any tool to judge whether the domains are worthwhile (Moz, Ahrefs), but I prefer to start with Majestic, and when I'm satisfied with those results I consult Moz as well.
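As an alternative to the bulk checker, here's a hedged sketch that dedupes the list and checks availability over RDAP: rdap.org returns a 404 for domains with no registration record, which usually (not always) means the name is available. Treat it as a first-pass heuristic and confirm at a registrar before buying.

```python
# Dedupe the list and flag domains whose RDAP lookup returns 404.
import urllib.request
from urllib.error import HTTPError

def looks_available(domain):
    try:
        with urllib.request.urlopen(f"https://rdap.org/domain/{domain}", timeout=10):
            return False            # a registration record exists
    except HTTPError as e:
        return e.code == 404        # no record found -> likely available

with open("root_domains.txt", encoding="utf-8") as f:
    domains = sorted({line.strip() for line in f if line.strip()})

for d in domains:
    print(d, "available?" if looks_available(d) else "taken")
```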

Happy hunting

https://youtu.be/ZJTout9-VUA

See more of my domain scraping tutorials on the playlist:
https://www.youtube.com/watch?v=qHhqR...
