Hey everybody, Cyrus here, ready to introduce our next speaker, Rob Ousbey. Now, I first got to know Rob in the Philippines of all places. We were at an SEO event there, got to hang out with him for a few days, and my biggest impression of him was, man, this guy is smart. It turns out he was educated at Cambridge, which I hear is a very good university. So today, Rob works at the cutting edge of making SEO tools for marketers around the world. I mean, next-generation SEO tools and insights that are actually actionable: not just the data that a lot of SEO tools give you, but actions you can take to increase your traffic and rankings. I'm excited to hear what Rob has to say. His talk is titled Beyond the Basics: Five SEO Tricks for Uncovering Advanced Insights From Your SEO Data. Please welcome Mr. Rob Ousbey.

Howdy, Moz fans. I'm Rob Ousbey, the VP of Strategy at Moz, and I'll tell you right now, I had more content that I wanted to share than we can fit in this presentation. So if you follow me on Twitter at @RobOusbey, I'll have bonus content and some special follow-ups for you over the coming weeks. So when I think about the activities that we do every day as SEOs, it's obvious that some are easy and some are very time consuming. Some are valuable and some are less so. But these things, time taken and value delivered, don't necessarily correlate. Because I was a consultant for over a decade, I like to plot things on two-by-two graphs, and I'd say we have activities in every quadrant. The easy, valuable activities are clearly the place to start, and I hope you're spending as much of your time as possible on these items, delivering value to your company or your client as quickly as possible. Then there are a few directions you can go: tackle the time-consuming things that are valuable, or do more of the smaller things that are faster but not so valuable. You can pick from both of these quadrants.
And then there's an obvious corner to avoid unless you've really run out of other things to do. But what if you could take things that are difficult or time consuming and make them easier? These might be activities that hold a lot of value if you can get them done quickly enough. So let me give you a metaphor. What if you didn't know that chickpeas existed and I told you about them? I told you there was a food that was delicious. It was really cheap, high in fiber, protein, and vitamins. It sounds too good to be true. And then what if I let you taste hummus, falafel, chickpea curry? You would be sold on this incredible legume. And then I tell you how to prepare these things. First you have to sort through the beans to check them for stones or debris. Then you have to put them in a bowl of water and soak them for 12 hours. Then you drain them. You rinse them. You get them in a pan with fresh water and boil them for another two hours. Then you let them cool. What? You tell me, well, that hummus thing I tried was great, but I'm not going to spend a day to make it. If that was what I had to go through just to throw half a cup of chickpeas into a salad, I would find something else. But of course you never have to do any of that, because canned chickpeas exist. They cost 69 cents and you can ignore all that other work. We took something that was high value but so time consuming that it wasn't worth it. But as soon as you cut down on that time, if you can make it high value and quick and easy, then you change the dynamics of the activity. It doesn't just become something that you do more often; it goes from something that you would never do to something that you would do as part of your regular week. And of course, we're not just talking about your meal planning here. We're talking about your day-to-day life as digital marketers. And there are activities.
Activities I'm going to show you today that are time consuming enough that you would never consider them worth doing. But if we can change the dynamics, if we can make them easier to do, then they could become something that you include in your toolkit, that you use week in, week out, to add high value to your business or to your clients. I want to help you take the data you get from your SEO tools, whether that's Moz or Google or any of the other tools out there, and make these formerly time-consuming tasks much faster. And we're going to touch on three major areas of SEO: keyword research, technical SEO, and, where we're going to begin, link acquisition. If you think that link building is dead or it doesn't work anymore, then I'm going to point you to the data my friend Paddy Moogan published on the Moz blog last month. He found that 97% of in-house SEOs say the budget for link acquisition will stay the same or increase over the next year. So whatever you think about the effectiveness of link acquisition for SEO, people, smart people, are still investing in it. They're still putting money there. That survey, importantly, also asked about the most challenging parts of the link building process. One of them was the difficulty of finding domains to get links from. This work is not easy. It can be kind of soul-crushing at times. What if we made it 10 times more efficient? What I'm going to do here is show you a relatively simple workflow that can be very powerful for finding new link opportunities. I'm going to let you in on a technique that I used to use, and it's the first time I've ever shared it publicly. So, of course, we all know you can go to Moz's Link Explorer or any of your favorite link tools, put in a competitor site, find places they've earned links, and see if you could go after any of those. A slightly more advanced technique is to pick a couple of competitor sites.
Let's say there are four different domains, drawn as a Venn diagram, with intersections where there are links that point to more than one of the domains. This green circle represents your site. The blue ones are all competitors. The overlapping section in the middle represents the linking domains that point to all four of you. They're the places you've all gotten links from, but that's not really very interesting. You don't need to get links from there; you've already got them. This is the section to focus on: places that link to all three of those competitors, but not to you. Those are links you definitely want to look into getting. And then this wider region shows the places that link to two or more of your competitors, but not to you. And again, these might be places you want to look as well. This is actually a really old technique. It's called link intersect analysis, and Moz has a tool for that. Other tools may have the same thing. But the Link Intersect tool will let you put in your domain, put in some competitors, and get those results out really fast. I used the example of PC Mag here to see what link opportunities I could find if I compare CNET and Mashable and PC World. And actually the first thing that the Link Intersect tool found was the CDC. They link to all three of those competitors, but not to PC Mag. So if I worked for them, I would dive in to find out more about that and see what the opportunity is. But this technique has a couple of drawbacks. You have to select which competitors you want to compare against, which might mean you overlook some interesting sites. It's also not the fastest way to find topically relevant links, which might be what you need to discover if you're trying to promote a particular piece of content or you're trying to rank for a particular keyword. So I found that you can improve this workflow by turning it on its head. Instead of starting with competitors, let's start with the page you want to build links to and the keyword you're trying to rank for.
Let's say I own a brewery and I've opened a location in Seattle, so I'm trying to compete for the keyword "brewery in Seattle." Here's the SERP that I want to compete in. Now, maybe I've heard of some of these sites and I know they're my competitors. Maybe there are some sites here that I've not heard of before. And maybe there are some sites that I've never considered to be a competitor of mine, but since they have a page that's going after this term as well, then I should definitely consider at least that page to be competitive with me. So we ask: what links do these pages have, and specifically, which sites are linking to these pages? If there's a domain that links to five or six of these pages but not to me, then I'm very interested in hearing about it. Do I care about pages that link to a number of these results? No, I care about domains that link to them. Because, for example, if there's an industry blog that has three different posts, each linking to one of these pages, then we might want to look into whether we could get them to write a similar post that links to us. You can do this yourself using Link Explorer, Excel, and some patience. Here's the recipe. But it takes ages to analyze any given keyword. You spend a lot of time waiting for your link research tools to run and then copying and pasting and trying not to mess up the formulas that you're using in Excel. The first time I did this for a client, they were a small site in the holiday home rental space. I found dozens of amazing domains that they needed to get links from, just from analyzing their top couple of keywords. But it firmly falls into the quadrant of high value, extreme effort, which means that most of the time you just don't do it. It's too much effort to put in without a guaranteed return, since not every keyword generates great results. Fortunately, I work for the greatest SEO software company in the world, with the smartest team and the most generous culture. So can you guess what we did?
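The core of that recipe is simple to express in code. Here's a minimal Python sketch of the idea, not a Moz product: get_linking_domains is a hypothetical stand-in for whatever link index you query (a Link Explorer export, an API, and so on).

```python
from collections import defaultdict

def link_opportunities(serp_urls, get_linking_domains, my_site):
    """Find domains that link to pages ranking for a keyword, but not to us.

    serp_urls: the URLs ranking for your target keyword.
    get_linking_domains: hypothetical lookup returning the set of root
    domains linking to a URL (stand-in for a link index or API).
    """
    counts = defaultdict(int)
    for url in serp_urls:
        for domain in get_linking_domains(url):
            counts[domain] += 1

    # Drop domains that already link to our own site.
    ours = get_linking_domains(my_site)
    opportunities = {d: n for d, n in counts.items() if d not in ours}

    # Domains linking to the most competing pages come first.
    return sorted(opportunities.items(), key=lambda item: -item[1])
```

A domain at the top of this list links to several of the ranking pages but not to you, which is exactly the kind of opportunity described above.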
We built a tool that does all of this work for you. Now, this isn't yet a fully fledged Moz product. It's just a prototype that sits in our lab, but it works. It works really well and it works really fast. The lab is where we usually put these little prototypes when we're testing things out and we want to give a few people access to them to get some feedback. But on this occasion, I'm going to make an exception. I want you all to be able to play with this tool. I bet that any keyword you're optimizing for will turn up some link opportunities through this process, so I encourage you to use this tool and find out. Here's how it works. You put in the keyword you're trying to rank for. I typed in Brewery in Seattle and if you already have a page ranking for it, you put in that URL as well. And then within a few seconds, Moz does all the calculations we talked about before and spits out some link suggestions for you. This runs so fast that I think it's easy to overlook all the work that's going on behind the scenes there. But if a computer can do this a hundred times faster than a human, that's a game changer. Now, there is a slight flaw in our current prototype and that is that it's very thorough. It will often find a bunch of these domains that are legacy directories or search engine scrapers or SEO tools and they link to just about every site on the web. So particularly if you're analyzing a keyword where half the results are homepages, then you're going to find some of these at the top of the list. We've got CMO link and Forbes and Yahoo and a bunch that I've never heard of. But if they link to the pages you're competing with but not to you, then use your judgment about whether you think they're worth going after. But I find the real value in this tool often comes from scrolling down the results just a little bit. There are domains here for running events like Eventbrite and Airbnb, for example. 
So maybe I should do a brewery tour experience and then list it on Airbnb. I see travel guides, review sites, the local university's listings pages, all linking to a bunch of people who rank for this keyword, and I could totally get included in all of these sites, in these lists. There are local news sites here that make me think I should get in touch with them and see what angle I can pitch to get them covering my grand opening. And if I scroll down to find the sites that link to only one of those competing pages, it's still pretty interesting. Here are six bloggers who've written about breweries in my city, and it makes me start thinking of other angles to earn links, not only from these six bloggers but from others like them. Like I said, I want to give you the chance to play with this tool while it's still in testing in the lab. But here are the caveats. This is an alpha environment, so bugs or delays could still exist. While it's in the lab, support will mostly be provided through documentation and in the app itself, but you can still always contact our help team through the platform. These alphas are meant to help us along the road of our product development work, so they will be available for a limited duration, and then you might see functionality change, or the tool may move into Moz Pro or disappear altogether. And finally, your feedback and honesty are highly encouraged. You can send feedback through the platform, and we might tap you for some more questions as well during your exploration. But with that said, I would love for you to accelerate your discovery of topical links. If you're interested, go to mz.cm/mozcon-lab. You will need to be logged into Moz to access it: you'll need to be a Moz Pro user or have a community account. If you don't have one, then please use the second link on here. This will get you set up with a forever-free Moz community account. That will grant you access to the lab and to this tool.
So there's my trick to start with: find a keyword, do a massive link intersect across all the results to find opportunities. Again, there's the link to play with the alpha app that we built if you don't want to have to do it yourself. Okay, we talked about link building, and now let's turn our attention to keyword research. For this, I'm going to show you how you might have some fantastic content and keyword ideas lurking on your site that you might be able to exploit to get more traffic, if you use this technique to identify them. As SEOs, we know how to target a keyword on a page. We go through the usual motions of getting the keyword in the title, the URL, and headings and so on, and then we make sure that the content reinforces the relevance of that page to the keyword. But of course, any page will rank for more than one keyword. In fact, usually dozens or even hundreds of longer-tail terms. And in the best case, the page is well optimized for some of them, but the reality is it will be poorly optimized for others. And in the worst case, and I've seen this a few times, the page is simply poorly optimized for everything. It's tried to go after too many queries and does a poor job on all of them. So at some point, you should ask yourself the question: do I want to target all these keywords with one page, or should I split this into two pages to target them separately? Which raises a pretty fundamental question. When can a page hope to rank for two different keywords, and when should we split it? Well, we know some things intuitively. Here's an example from the fashion space. I took a page from a site, and the page was obviously about men's jeans. And here are all the keywords that the page already ranked for somewhere in the top 50: some head terms and a lot of long-tail terms. We'd probably feel pretty confident that if we optimize for "men's jeans," then we're also automatically optimizing for "men jeans" as well, which has roughly the same amount of searches.
And you'd be pretty sure that you're also going to optimize for "jeans for men" as well. We might feel comfortable saying "skinny jeans for men" doesn't deserve to be on that page. This feels unique enough that we should try to target it on its own page. Our first guess would be that there are so many people creating targeted content for that keyword, whether it's blog posts or category pages, that we ought to make sure we have a dedicated page about it as well. So the approach I'd recommend is to take all the terms that a page ranks for and say: this group of terms are all very close to each other; if we optimize for one, we optimize for them all, and we can target them on this page. And then we also want to be able to say: if you want to rank for this other group of terms, trying to do it on this page won't cut it. The terms are too different, and you need to target them on a separate page. So how do we figure out which keywords match and which don't? Well, the first thing you might try is some kind of textual similarity. Looking at the letters in a query, how similar are they? And this was the first thing I tried, and I immediately ran into some weird edge cases. Let's compare the term "men's jeans" to two other terms from that industry: "men's denim pants" and "women's jeans." Just looking at the letters, the phrase "men's jeans" is pretty close to "women's jeans." Two letters different, but no one would try to target both of these terms on the same page. Meanwhile, "jeans" and "denim pants" aren't all that similar in letters, but we know that they're basically exactly the same thing. The term that is 83% similar in terms of text is a completely different product. And the order of words matters to intent as well. "Fence repair" might be a query looking for a professional, but "repair fence" is more likely to be seeking a guide for how to do it yourself. Similar-sounding queries can be totally different. So if textual similarity is a red herring, what can we do to figure out which terms are actually similar?
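You can see this edge case for yourself with a quick character-level similarity check. This is just an illustration using Python's standard library, not anything from the tooling described in this talk:

```python
from difflib import SequenceMatcher

def text_similarity(a: str, b: str) -> float:
    """Character-level similarity between two strings, from 0 to 1."""
    return SequenceMatcher(None, a, b).ratio()

# Nearly identical letters, but completely different intent:
print(text_similarity("men's jeans", "women's jeans"))
# Quite different letters, but basically the same product:
print(text_similarity("men's jeans", "men's denim pants"))
```

The letter-based score rates "women's jeans" as the closer match to "men's jeans," which is exactly the wrong answer for our purposes.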
Well, we turn to the people who are experts at this, who've spent decades working on AI and NLP to understand user intent. We go and take a peek at Google. So I know it's small, but these are the search results for two of the terms I showed before, "men's jeans" and "men's denim pants." And seven of the top 10 pages ranking for these terms are the same. And at the top of the page, the top four are all the same. We can be pretty confident here that unless we're really not paying attention, we can do the same as these seven sites and have the same page ranking for both terms. And just to check, here are the actual SERPs for "men's jeans" and "women's jeans," and there's no overlap at all. It comes as no surprise that you shouldn't try to target these two terms with the same page, but this is how you confirm it. So this measure, SERP similarity, is the thing that's actually interesting for us. Let's be clear. If Google has basically the same list of pages for two queries, then it obviously considers them to be so similar that you can create a single page which will rank for both of them. If there's little to no overlap between the results for two different terms, then you should target them on two different pages. It's actually really simple. You can just follow the hints that Google is leaving for us. Now, it turns out there's an established algorithm that can actually put a score to how similar these two SERPs are. There's an equation called rank-biased overlap that tells you how similar any two lists are. And for two sets of search results, we can give an actual score. We can say that Google's results are a 72% match between these two keywords or a 0% match between these others. It looks at which pages are in both lists, but also how much they change position between the lists, and it weights the top of the rankings as a bit more important than the bottom of the page. And this is great, but it only gives us a score for the similarity between any two SERPs.
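To make the idea concrete, here's a minimal, truncated version of rank-biased overlap in Python. It's a simplified sketch of the published RBO measure, normalized so that identical lists score 1.0, and it is not the exact implementation Moz uses:

```python
def rbo(list1, list2, p=0.9):
    """Normalized, truncated rank-biased overlap of two ranked lists.

    p controls top-weightedness: the agreement at depth d is weighted
    by p**(d-1), so the top of the rankings counts for more than the
    bottom. Returns a score from 0.0 to 1.0.
    """
    depth = max(len(list1), len(list2))
    seen1, seen2 = set(), set()
    raw = 0.0
    for d in range(1, depth + 1):
        if d <= len(list1):
            seen1.add(list1[d - 1])
        if d <= len(list2):
            seen2.add(list2[d - 1])
        # Agreement at depth d: fraction of the top-d items shared.
        raw += (p ** (d - 1)) * len(seen1 & seen2) / d
    # Normalize by the score two identical lists would get.
    max_raw = sum(p ** (d - 1) for d in range(1, depth + 1))
    return raw / max_raw
```

Calling rbo on two lists of ranking URLs gives the kind of 0% to 72% to 100% match scores described above.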
So how does that help us with all of the keywords that we know our page ranks for? Well, we have to run those RBO calculations a lot, like a lot. You basically have to go through and calculate the RBO score for every pair of SERPs in this list. It's a lot, and it's not fast, but it is doable. And that lets you say, well, these keywords all have similar results, and these ones have a different set of results. And you end up with a bunch of groups like this. There's a group of keywords that are generically about men's jeans. They're grouped together because they all have pretty similar rankings. There's a group of keywords about stretch jeans, or high-waisted jeans, which, again, are clustered together because there are similar pages ranking for every term in that group. And if you make a page about men's high-waisted jeans, then you can probably get it to rank for "high-rise men's jeans" or anything else from that group on the right. I thought one interesting group here was the jeans fits terms. This is a bunch of queries all about different types and styles of jeans. These seem more like keywords that are best serviced by content pages that explain the different cuts, rather than the more commercial content that would target the other groups. But there's something else interesting we can do with these groups once we have them. You can take your page that targets all these terms and find out what the average ranking is for your page across all terms in that group. So here we go. What I've just added underneath there is the calculated average ranking for each group. And then you can say, well, this page is doing pretty well for terms in the men's jeans and branded queries groups. That's a relief. That's what this page was targeting. But it's typically ranking on the second or third page for these terms about specific styles. So that tells us that the page isn't well targeted for those terms.
But it also tells us that somehow Google thinks the site is still relevant enough to warrant ranking somewhere for those keywords. And that's a vote of confidence. So I say, look, we found some great keyword opportunities: things that Google wants to say we're relevant for. But we now need to break this page apart. We need to create some new content, new pages that can explicitly target each of those keyword groups. Google wants our site to rank for these terms, so we'll make the content that lets it. And if you want to discover these clusters, those groups, like I said, you have to do a lot of calculations. For six keywords, you've got to do 15 calculations; for 100 keywords, we have to run the RBO algorithm almost 5,000 times. And you know who's good at that? Computers are, especially this fella. So again, we took the billions of data points that Moz has stored, and our incredible team found a way to run this algorithm over the top for you to create those groupings. And now we have another alpha tool almost available to test that does exactly this. It's simply called the on-page keyword groups tool. You just put in the URL to a page you want to analyze and hit go. What you see are groups like these. What we're looking at is the name of each group and some stats like the number of keywords in each of those groups. And we can drill down into a group to see all the similar keywords that it found. For the individual keywords, we can see the search volume, we can see where the page currently ranks, and we can see how well optimized the page is for that term, anywhere from 0%, not well optimized, to 100%. And then if I switch back to the summary view, you can see that for each group, we have a weighted average of rankings for that group and a weighted average of the keyword optimization. So now, at a glance, you can say, hey, here's a group of keywords that we rank low for, that we're under-optimized for, and that we should break out and create a new page for.
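The pairwise grouping step can be sketched like this. This is a hedged illustration, not the Moz implementation: `similarity` stands in for whatever SERP-comparison function you like (an RBO score, for instance), and the grouping here is simple single-linkage over every pair of keywords.

```python
from itertools import combinations

def cluster_keywords(serps, similarity, threshold=0.5):
    """Group keywords whose SERPs look alike.

    serps: dict mapping each keyword to its ranked list of result URLs.
    similarity: a function scoring two SERPs from 0 to 1 (an RBO
    implementation, for instance); it runs once per pair of keywords,
    i.e. n * (n - 1) / 2 times for n keywords.
    Uses single-linkage grouping via a small union-find structure.
    """
    keywords = list(serps)
    parent = {k: k for k in keywords}

    def find(k):
        # Walk up to the group's representative, compressing the path.
        while parent[k] != k:
            parent[k] = parent[parent[k]]
            k = parent[k]
        return k

    for a, b in combinations(keywords, 2):
        if similarity(serps[a], serps[b]) >= threshold:
            parent[find(a)] = find(b)  # merge the two groups

    groups = {}
    for k in keywords:
        groups.setdefault(find(k), []).append(k)
    return list(groups.values())
```

For 6 keywords that loop runs 15 times; for 100 keywords, 4,950 times, which is exactly why this belongs on a computer rather than in a spreadsheet.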
And again, you've found some hidden gems that your site probably could rank much better for if you had well-optimized content to support them. So this tool will also be available as an alpha in the Moz lab pretty soon. I hope you can go and run your own top pages through it to find your own hidden gems. The same disclaimers apply and the same URLs will work. Use mz.cm/mozcon-lab to get into the lab, and when the tool is ready, you'll be able to play with it in there. If you don't have a Moz account, then create one using that second link. Okay, so there's my second trick: finding hidden-gem keywords to target that Google already thinks you might be relevant for. Link building, keyword research, and now I want to close by talking about technical SEO. The first thing that people usually do when they have a new site to work on is run a site crawl to find technical issues. And this is the first thing we get back: loads and loads of warnings and errors. And too often I see people look at these as a list of issues that need to be fixed, like a to-do list where each item has to be addressed. But that's like an electrician looking at a neighborhood and noticing that a hundred of the homes don't have any power and then saying, well, we'll go to the first home and fix their power, then we'll go to the second home and fix their power, and then we'll go to the third home. Instead they would say, something must have caused all these issues and we need to address the cause. They might even look and see what those houses had in common, like they're all on the west side of the street, and that gives us a hint to what the underlying issue is. And in terms of a website's structure, we'd look at it like that neighborhood. So often in SEO there's not an issue with a particular page; there's a root cause affecting many, many pages. And if we saw that the same issue was affecting all the pages in this section of the site, that would give us a big hint.
This part of the site is being created differently to others and we need to address that. If we saw that it was affecting all pages of a particular type, then this might be a different hint. Maybe in this case, our category page templates have an issue that doesn't affect product pages or homepage and we need to address this issue. So going back to this list of issues, we want to find out if there's an underlying cause and I'm going to walk you through an example of how we can do that root cause analysis. For example, on this site, it says there's 7,000 pages with metadata issues. That sounds bad, so we should dig in to see what the story is because if I have to rewrite 7,000 page titles for the site, that's a very different amount of work to if we just need to update a template somewhere. Moz makes it super easy to see the list of pages with these errors and then it's easy to export that list into a spreadsheet as well. So that's what we're going to do. We're going to hit this export button and the next thing we want to do is look at the list of URLs that have this error. Ideally, we'd like to try and group them into page types. What we want to do is go through this list and tag each page with a description of what kind of page it is, a category page, a product page, a blog post or whatever. Doing that manually will take forever. So this is where we say, let's let a computer do it for us and we're going to do that grouping using nothing more complicated than looking at the URLs of all of these pages. Actually, the only slightly complicated thing we'll do is use a bit of regex. If you've spent any time filtering pages in GA, you probably know how to do that already and if not, you can just use normal text filters and go from there. Now, you can absolutely do all of this in Excel. In fact, I've done it in Excel and I made a version of this workflow in Excel to show you, but it gets really cumbersome and gets bad really fast and I don't recommend it. 
Instead, I decided we can go one better. We're going to use my favorite method for tackling just about any problem like this: JavaScript. I gave a whole session last year at MozCon about using JavaScript in the browser to do funky things that you wouldn't be able to do otherwise. And as far as tagging up all of those URLs, I realized I had a bit of old JavaScript that could do exactly what we needed here. So I prettied it up, I hosted it for you, and I made it available to use. You can go to this link to access it, ousbey.com/tools, and I'll leave this disclaimer as a reminder. This isn't a Moz product; it's just a tool I hacked together that's been very useful to me over the years. I can't offer support or whatever else for it, but it's here for you, it's useful, and it's free. So let me show you what it does and how I use it. It starts with this input box where you can paste a list of URLs. You want to copy the list of URLs that you have in Excel and paste them in here, and when you import them, they'll all appear at the bottom of the page in a big, long table. And then you can start using the section at the top to tag these pages. In each row you just add the tag name on the left and then type some filter text in on the right. You can actually put any regex in there that you want. And then what the tool does is it takes every URL and goes down the list of tags. If it matches the first tag, then it labels it with that. If it doesn't match the first tag, it keeps going down until it's found a matching tag. Very quickly, you can get all the pages in this list tagged up, and up at the top, in the count column on the right there, you'll see a running total of how many pages have received each tag. So this gives you some insights right away: you can suddenly see there's a lot more of one page type than any other. Here's what the table at the bottom looks like as you start applying all those tags.
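That first-match, top-down tagging loop is easy to replicate if you'd rather script it yourself. Here's a hedged Python equivalent (the tool itself is JavaScript in the browser); the tag names and regex patterns are just made-up examples:

```python
import re

def tag_urls(urls, tag_rules):
    """Label each URL with the first tag whose regex matches it.

    tag_rules: an ordered list of (tag, pattern) pairs, mirroring the
    tool's top-down rows: the first matching pattern wins.
    """
    tagged = []
    for url in urls:
        label = "untagged"
        for tag, pattern in tag_rules:
            if re.search(pattern, url):
                label = tag
                break  # stop at the first matching tag
        tagged.append((url, label))
    return tagged

# Example rules: tag name on the left, regex filter on the right.
rules = [
    ("news", r"/news/"),
    ("category", r"/category/"),
    ("product", r"/product/|/p/"),
]
```

From there, counting the labels gives you the same running totals shown in the tool's count column.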
And then it's as simple as taking that nice, neatly tagged list of URLs and pasting it back into Excel. The last two columns here now are a copy of the URL and the tag we applied to each one. So then we do everybody's favorite spreadsheet transformation: we turn that whole data set into a pivot table. So now I have that data in a pivot table. I put the URL issues along the top: missing meta description, title tag, and so on. And then I've put the tags down the side: author page, reviews page, and so on. And then I just get the pivot table to count the number of times each issue comes up for each type of page. And voila, this is what we get. And in this case, I can see a few numbers jump straight out at me. It's predominantly the news pages that have metadata issues, and of all those issues, it's the titles and URLs being too long that we see most frequently. So now I've isolated the issue. We're starting to understand where to look, because we shouldn't worry about the comparisons pages or the deals pages. It's those news pages on this site. This should be the first place we go look, and we should laser in on that. Now, this was just for metadata issues. We can do the same for crawl issues, redirect issues, content issues, and find out if there's a portion of the site that is most affected. Moz now also has its new beta tool for measuring performance metrics from Google Lighthouse, and you could do this same analysis to find out if there are sections of your site that are much slower than others. So in summary: take any list of URLs, group them by applying tags, look at the aggregated metrics based on those tags, and find out what the root cause of your problems might be. Now, it looks like we have a couple of minutes left, so I'm going to share two other use cases for the same technique that do very different things. First, we're going to use it to do some competitive content and link analysis.
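If you'd rather skip the spreadsheet step entirely, the same pivot can be built in a few lines with pandas. This is a sketch on made-up data, assuming you've flattened the crawl export into one row per URL-and-issue pair and merged in your tags:

```python
import pandas as pd

# Hypothetical flattened export: one row per (tagged page, issue) pair.
df = pd.DataFrame({
    "tag": ["news", "news", "news", "deals", "comparisons"],
    "issue": ["title too long", "url too long", "title too long",
              "missing description", "missing description"],
})

# Count how often each issue occurs for each page type,
# the same aggregation as the Excel pivot table.
pivot = df.pivot_table(index="tag", columns="issue",
                       aggfunc="size", fill_value=0)
print(pivot)
```

The row with the largest counts points you straight at the page type, and therefore the template, most likely to hold the root cause.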
So tag up your competitors' pages to see if they have a particular type of content that performs well from a link acquisition perspective. Given that I work for Moz, I've taken a look at Ahrefs. They make content; some of it's good, some of it gets links, and I'm going to analyze it. I'll take a list of their most linked-to pages from Moz's Link Explorer and export it to Excel. Here it is. This is Ahrefs' top 500 pages by links. Then I'm going to put that list in the tool, and I'm going to look at the URLs of the blog posts to see what they're about. I've tagged up whether each post is about links or about keywords or content or Google or whatever else. We now have their top 500 pages all tagged up. We bring it back into Excel and put it in a pivot table. Now, this time I'm not doing a count of how many pages they have on each topic, but I'm using the pivot table features to look at the average page authority and the average number of linking domains for each of their page types. And what I see here is pretty interesting. Ahrefs does fine when it writes content about links or link building, but their blog posts about Google get twice as many linking root domains. That's pretty interesting from a content strategy perspective, and it's something I might consider if I was selecting other topics to write about at Moz. This only took a few minutes, and we were able to learn this about our competitor. Another alternative use case is to tag up your Google Analytics data. Here's the GA data from a retail store. This is actually the data for Google's own merchandise store. And I'm comparing the traffic over a four-week period, in blue, to the four weeks before that, in orange. Straight away we can see that traffic has dropped, and I want to know: is there a particular section of the site that is receiving less traffic now? Well, I can look at this report by landing page and export it.
The Excel download that GA gives me has separate rows for each landing page in the two time periods, and that's fine. We're just going to put it all in the tagger and create a regex for each section of the site. And here's what it looks like when those pages on the Google merch store are tagged up. We've got apparel, lifestyle, drinkware, office gear. Again, we copy and paste the tags back into Excel, and again we create a pivot table. This time the tags are down the side, but along the top I've put the two time periods. We let the pivot table add up all the traffic that the pages in each section received in both time periods, and then I've just added a column where we subtract one from the other. And actually there's immediately a huge red flag here for me. Some sections gained a little traffic, some lost a little traffic, but the apparel section lost half of its visitors. Now, was that because their rankings dropped, or they turned off a PPC campaign, or they lost some referral links? Who knows at this stage? But we know exactly where the root cause lies, and rather than try to analyze the whole site, we would laser in on that apparel section and find out why it lost so much traffic. Okay, this was all a lot, but to sum up: I've given you a workflow to find link opportunities that you may not have come across before now, and I've given you a Moz tool prototype to make it super, super simple. I've shared the powerful RBO technique that lets you use Google's understanding of keywords and intent to better structure your site and find opportunities to split pages apart so they can be more effectively targeted, and we've got another prototype there for you to play with as well. And I've shown you a technique to add an extra layer of insight on top of your technical site audits, or frankly any situation where you have a list of URLs that can benefit from being grouped into sections.
I'm looking forward to hearing any success stories you have using these techniques, whether large or small, and I'm looking forward to hearing your feedback about our new alpha products, so that we can continue making great SEO software for all of you.