This next speaker needs no introduction. He's my co-MC, and he's been a mainstay at MozCon far longer than almost anyone else. When he's wearing his yellow glasses, he looks like Seth Godin's identical twin. Even without the glasses, they share a level of marketing wisdom, intelligence, and quick wit that still makes it hard to tell them apart. Cyrus is the founder of Zyppy, an SEO software and consulting company. He also consults with Moz, where he teaches about SEO, and you'll have seen him in a bunch of our Whiteboard Fridays as well. He's a face you'll see many times over the next few days, just like mine. He's talking about mastering three clicks: engagement signals for higher rankings and traffic. I'm excited for this one. Take it away, Seth. I mean, Cyrus. Whatever, same thing.

Welcome, everybody, to my MozCon presentation. My name is Cyrus Shepard. I'm an SEO consultant with Moz. Believe it or not, this is my 10th MozCon that I've been involved with. Because of that, I wanted to do something extra special for my own 10th anniversary. So here's my topic today, which I've been working on all year long: mastering three clicks, engagement signals for higher rankings and traffic. I want to give practical advice about how to raise rankings and traffic using techniques that a lot of people don't talk about.

So you're an SEO, or you do SEO. Here's a common situation we often find ourselves in. You are an SEO rock star. Not me, not me on stage: you. You are the SEO rock star. You're killing it with your content. It's highly relevant. You are mastering all the ranking signals. Your content is fresh, relevant, and fast, and it looks good on mobile. Your robots.txt is in order. Your sitemaps are in order. You have nailed all the known Google ranking factors. But you can't get past page two. What is going on? We've all been in this situation, where the content is relevant, we know it's good, we know it's fast.
We've invested hours and money and people into the content. We're following the advice we get on Twitter, but we're just not ranking. So what if we're missing something? What if the Google ranking factors that Google talks about just aren't enough? What if the experts on Twitter, on Facebook, on LinkedIn aren't talking about the right things? What if we missed a piece of the puzzle?

This is probably very true, because we know that Google does not like to talk about its important ranking factors. They made a mistake when they talked about PageRank back in the day, and SEOs like ourselves almost ruined search by gaming links, until Google finally had to release their Penguin algorithm to help solve it. We really damaged web results because Google talked too much about their main ranking factor, links. And so they write in their How Search Works documentation: we have to be careful not to reveal too much detail that would allow people to game our search. So Google talks about ranking factors all the time, but they don't talk about the big ranking factors.

Core Web Vitals. Have you heard of Core Web Vitals? Yeah, of course you have. Core Web Vitals has dominated the SEO conversation for nearly a year and a half now. Everybody talks about it. You can find thousands and thousands of articles, because Google announced it as an official ranking factor, so everybody went crazy. But by Google's own admission, it is a tiny ranking factor. In fact, it is only a tiebreaker, a tiebreaker between two results. The important ranking factors, like links, they don't talk about that much, because they have a vested interest in protecting their algorithm.

So instead, we need to look at other sources. Myself, I love reading Google patents. A Google patent is not an indication that Google actually uses something, but it tells us where they're putting their time and money.
And there's one area of Google patented technology where they've invested 20 years of patents, dozens of patents, millions of dollars, and thousands of people-hours into trying to gauge how satisfied users are with certain content, and how relevant that content is, based on user behavior. Yes, I am talking about click signals.

Click signals are user interaction behavior, and they are a way that search engines can gauge how a user interacts with content and with search results. For example, what are they clicking on? How long are they staying on a certain page? Are they going back to Google to click on another result? Google, with their search results and with Chrome, the largest browser in the world, has access to billions of petabytes worth of data, all measuring user behavior signals.

Specifically, when we're talking about click signals and search engines ranking results, we're talking about three different types of click signals. I'm briefly going to go over each one, and then we'll dive into them and see how we might be able to optimize them and use them for our own benefit and for user satisfaction.

The first is being the first click. This is often known as having a high click-through rate. When Google or any search engine presents a list of results, it's basically Google's guess as to which is the best result. But what a user selects may be an indicator of quality. If users are consistently picking the number three result, that might be an indication that number three is more relevant to their query than number one. Regardless of whether it's a ranking factor or not, being the first click is usually good for traffic, because you're getting more clicks.

The second click signal we're going to talk about is long clicks. Long clicks are basically about how engaged people are with your content.
Once you get the first click and they come to your website, how long are they engaging with it? The idea is that content that's good and relevant will get longer clicks, rather than short clicks where people just click, say, "Ah, this isn't for me," and leave. And this varies by query. Some types of queries, like "how tall is Mount Everest," don't need long clicks. Other queries do, so it's very query-dependent. But the short version is: longer clicks are often better than short clicks.

And finally, you want to be the last click. If a user clicks on your search result, are they going back to Google to look for more answers? If so, it's probably an indication that you didn't do a very good job of answering the query. On the other hand, if they don't go back to Google to click on another result, you probably answered their query, you were probably a satisfying result, and maybe you should rank higher because of that.

The question is, though: does Google actually use these signals? We actually don't know. Google tends not to talk about it very much. When asked, they tend to say that these signals are noisy, or that it would be hard to use them. But they typically don't confirm or deny either way. We know that they use click signals in some capacity. In their How Search Works documentation, they say they use aggregated and anonymized interaction data, which is another term for click data, to assess whether search results are relevant to queries, and that they transform that data into signals that help their machine-learned systems better estimate relevance. So we know they use click signals in machine learning. We just don't know exactly how. But I would argue we don't necessarily need to know how. Nearly every major search engine in the world admits to using click signals. It's pretty obvious that almost everybody does it.
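To make the three signals concrete, here's a toy sketch of how one might label clicks in a search-session log. The field names and the dwell-time threshold are my own assumptions for illustration; nothing here reflects how Google actually labels or weights clicks.

```python
# Toy illustration of the three click signals: first click, long clicks,
# and last click. All names and thresholds are invented for this sketch.
from dataclasses import dataclass

@dataclass
class Click:
    position: int         # rank of the result that was clicked
    dwell_seconds: float  # time spent before returning to the results page

LONG_CLICK_SECONDS = 60  # assumed cutoff; in reality it is query-dependent

def classify_session(clicks: list[Click]) -> dict:
    """Summarize one search session's click behavior."""
    if not clicks:
        return {"first_click": None, "long_clicks": 0, "last_click": None}
    return {
        # "First click": which result won the user's initial selection.
        "first_click": clicks[0].position,
        # "Long clicks": visits engaged past the dwell threshold.
        "long_clicks": sum(c.dwell_seconds >= LONG_CLICK_SECONDS for c in clicks),
        # "Last click": the result after which the user stopped searching.
        "last_click": clicks[-1].position,
    }

# A user clicks result #3, bounces back quickly, then settles on result #1.
session = [Click(3, 12.0), Click(1, 240.0)]
print(classify_session(session))
# {'first_click': 3, 'long_clicks': 1, 'last_click': 1}
```

In this hypothetical session, result #3 won the first click but result #1 earned both the long click and the last click, which is the kind of pattern the patents describe trying to learn from.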
Bing uses click signals. Apple, when talking about building their own search, admits to using click signals. When you have the data, it makes sense to use it, and it would make sense for Google to use it, too. But still, it's kind of irrelevant whether Google uses it or not. I wrote a big post that came out this year, on April 7, 2021, "Three Vital Click-Based Signals for SEO: First, Longest, and Last," where I documented some of my findings. John Mueller of Google, who's a great friend of Moz's, came on for a fireside chat a few months ago, and we respect his word. He said something like, "I think Cyrus is using a long blog post to recommend you all make amazing websites that solve users' needs."

So that's where I want to stop and say this: it doesn't matter exactly how Google uses click signals, because John Mueller is exactly right. It's not about the signals. It's about making websites that satisfy the user's query. Google talks about user satisfaction over and over and over again, and they say they reward sites that satisfy user intent. Click signals are just kind of a proxy for satisfying that user intent. If you can improve your click signals, you're probably improving your user satisfaction. So we want to keep our eye on the ball, and that ball is user satisfaction, not the click signals themselves. But we're going to talk about click signals because they're interesting, they're something we can measure, and they're something we can experiment with.

So let's experiment. We're going to get into several experiments that we ran here at Moz, some large-scale, some small-scale. I think it's important to remember that small-scale SEO experiments, and even large-scale SEO experiments on a single site, are not proof of anything about Google's larger algorithm. Because Google uses so many signals, you can't change one thing and isolate it. You can only look at the overall result. All you can prove is whether your traffic went up or down.
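The basic readout of every experiment that follows is exactly that: did the experiment group's traffic go up or down relative to a control? A minimal back-of-the-envelope version looks like this; the session counts are made up, and a real split test also needs a proper significance test, not just a percent change.

```python
# Back-of-the-envelope split-test readout: percent change of an
# experiment group's traffic versus a control group. Illustrative only.

def percent_uplift(control_sessions: int, experiment_sessions: int) -> float:
    """Percent change of the experiment group relative to the control."""
    if control_sessions <= 0:
        raise ValueError("control_sessions must be positive")
    return (experiment_sessions - control_sessions) / control_sessions * 100

# Hypothetical numbers: 10,000 control sessions vs. 10,750 experiment sessions.
print(f"{percent_uplift(10_000, 10_750):+.1f}%")  # +7.5%
```

Platforms like SearchPilot, mentioned in the experiments below, wrap this same idea in statistical testing over hundreds of URLs; this sketch is only the headline number.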
So even though these experiments don't prove anything about Google's algorithm, they sure can be a hint. We can take those learnings and apply them to our own sites to see if we can get more traffic and rankings, because in the end, that's what we want, in addition to satisfying the user's query.

The first thing we want to experiment with is optimizing for first clicks. Get the first click. As I said, this is often known as getting a high click-through rate, and it's almost always good, because if you can raise your click-through rate, you're getting more clicks no matter how you rank. And Google gives us a number of levers to pull to influence click-through rate, the things that appear in SERPs: the favicon, very important on mobile; the brand name; the URL and its keywords; the title tag, probably the most important for earning clicks; rich snippets, if you can get them; and the meta description. These are all things we can use to influence earning the first click.

So what about meta descriptions? Meta descriptions are kind of the ugly stepchild of click-through rate. Everybody focuses on title tags, but people ignore meta descriptions. Could this be because Google ignores 70% of meta descriptions and just rewrites them itself? A lot of people give up on meta descriptions because Google rewrites so many of them. But I thought, what if we don't ignore them? What if we try to optimize meta descriptions in a way that earns more clicks? It sounds challenging, and I like it. So let's experiment.

I'll point out that we ran all of these experiments over several URLs. There was always a control group and an experiment group, analysis was done after the fact, and they were usually pretty simple experiments that anybody could run. For our first attempt at improving meta descriptions, we simply added keywords to the front of the meta descriptions.
We got the keywords from the top queries in Google Search Console and just inserted them, and we thought two things would happen. One, Google would show our meta description more often, because we were including relevant keywords. Two, because users could see those keywords, we'd get a higher click-through rate and eventually more traffic. So what do you think happened when we did this? Gosh, we lost 7.5% of traffic. Not only that, our engagement metrics were worse: 14% fewer pages per session, 37% lower session duration, and a bounce rate that was virtually unchanged.

By the way, when we talk about these user engagement metrics, we're pretty sure Google does not use these exact metrics: pages per session, session duration, bounce rate. Those are not click signals. But for our purposes, they are the best we have, so we use them as a proxy. What we want to do in these experiments is look at our traffic and see if there's any correlation with engagement metrics. It's not a perfect comparison, but it's the best we have, and it's what we're going to go with.

So this was a disappointing result. We actually tried this experiment again, not only inserting keywords but rewriting the entire description for better conversion, and got a very similar result. Turns out maybe we weren't that good at writing meta descriptions.

So next, we tried removing the meta description. We did this on a bunch of URLs. We thought, if Google is so good at rewriting our meta descriptions, maybe we should just remove them and let Google write them for us, and maybe that would earn a higher click-through rate. So we got rid of our fancy meta descriptions, let Google do its thing, and let it run over thousands of visits. And what would you guess happened on this one? Not as bad. Google did a pretty good job of rewriting our meta descriptions, but we still experienced a 3% loss in traffic. Not hugely significant.
And the engagement metrics were mixed. It's kind of like this: when you write the meta description, you set an expectation of what the content is. If the content matches that expectation, people might be engaged; if it doesn't, they aren't. So, mixed results. This was disappointing. I wasn't having any luck with meta descriptions influencing click-through rate.

Then I had this idea. I heard on a kind of spammy black-hat forum that people would write "GOOGLE GUARANTEED" in capital letters at the front of their meta descriptions. I thought this was kind of spammy. Google obviously isn't guaranteeing the result. Well, they kind of are. They're ranking it in search results, so it's a guaranteed third-place result. But for giggles, we tried this experiment, put "Google guaranteed" at the front of our meta descriptions, and Google displayed it. So what do you think happened in this scenario? This was our only successful meta description experiment. Traffic went up 7%. Not only that, engagement metrics were significantly higher: 20% more pages per visit, 139% more time on site, and bounce rate reduced by 10%. It's almost like we said, "Hey, Google guarantees this. This is great content." So visitors were more engaged, we got a higher click-through rate, and we ended up getting more traffic. How did I feel about this result? Oh, geez. This was disappointing. I had to resort to "Google guaranteed" to get more clicks. This is not what I was looking for.

But we do have other, more legitimate ways of optimizing click-through rate: title tag experiments. If you've been following Moz for a while, you know that we run a lot of title tag experiments. Working with the folks at SearchPilot, we run these massive A/B split tests over hundreds of URLs and thousands of visits. This was one title tag experiment we ran last year on our Q&A pages.
We simply put "Solved" at the front of the question if it was, in fact, solved, with the idea that more people would click if they knew the question had been answered. And how did this do? A 7% uplift in traffic, and that was statistically significant. We didn't have any indication that we ranked better, but we definitely got more of those first clicks, simply by adding "Solved" to the beginning of the title tag.

Then we ran an experiment where we removed title tag boilerplate. If you're familiar with boilerplate, you know it's the repeating part of the title tag that often appears on every URL. For instance, if your brand name appears on every URL, you could consider that boilerplate. We tried an experiment where we removed "Whiteboard Friday," because we found that people weren't necessarily searching for Whiteboard Friday, and if they were, they were looking for a very specific page. So we thought, what if we removed it? Would we get more clicks and possibly more traffic? A 21% uplift, and it just kept going and going and going. It's very common to see boilerplate in your titles, but if it's irrelevant to what the user is searching for, you might consider removing it. That's an easy way to get more clicks. I've done this experiment many, many times, and it almost always succeeds when the boilerplate is irrelevant.

Now, one thing you've got to be careful about when you're doing click-through rate experiments is what I call return-to-Earth optimizations. You can hack click-through rate. We ran an experiment where we simply capitalized every single letter of the title, and we got a higher click-through rate. But what happened after that? The content didn't support it. People were no more engaged with the content after clicking than they were before, and we returned quickly to Earth. So here's the point: Google says click-through rate is a messy signal, and they're right. It's noisy.
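The boilerplate-removal test above is easy to automate across a set of titles. Here's a sketch that strips a known boilerplate suffix from a title tag; the separator strings and the example title are my own assumptions for illustration, not Moz's actual templates.

```python
# Strip a repeating boilerplate segment (e.g. a series or brand name)
# from the end of a title tag, along with its separator.

def strip_boilerplate(title: str, boilerplate: str) -> str:
    for sep in (" - ", " | ", ": "):  # common title separators (assumed)
        suffix = sep + boilerplate
        if title.endswith(suffix):
            return title[: -len(suffix)]
    return title  # no boilerplate found; leave the title unchanged

print(strip_boilerplate("Title Tag Hacks - Whiteboard Friday", "Whiteboard Friday"))
# Title Tag Hacks
```

In a real test, you'd apply this to the experiment group only and keep the control group's titles untouched, then compare traffic as in the earlier experiments.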
Your content has to back it up, and we're going to talk about content backing it up in the next two sections. So that's earning the high click-through rate. Also, if you're interested in learning more about mastering title tags, I have an advanced title tag optimization webinar. You can look for it on Moz, or you can Google it. It's an SEO masterclass, and it contains all my secrets from 10 years of optimizing. It's free, and it's a good educational resource.

So how do you make your content back up the promise? After you get those first clicks, we want to optimize for long clicks. Long clicks simply mean your visitors are spending a little more time on your site, compared to your competition, for the same sorts of queries. Google has a complicated formula that we won't get into here, but it's all about your long-click ratio: how many long clicks are you getting? So you want to improve that. We can just think of it as improving the engagement on your site.

Everybody says, if you want to improve engagement, add videos. Adding videos increases time on the page. You hear that over and over and over again. So that's the first thing we tried. Moz has a huge video library spanning 10 years of content, hundreds of videos, so we have a great treasure trove of videos we can add to virtually any page. And that's what we did. We found some pages, put in a very relevant video that the user may not have seen before, and tracked a control group. How do you think adding videos affected traffic and engagement? Basically, it didn't do anything: a 1.39% change, which was not statistically significant. And the engagement didn't really change at all. Now, this is not an argument for or against adding video. It just didn't work for us in this particular situation. So we wanted to try something else. Video was a bust. How about related articles? This sounds cliche, but it's kind of interesting, so stay with me. You know what these are.
Related articles are often found at the bottom of an article: if you liked this, you might also like this. What always struck me about related articles is that they're in the wrong position. They're usually at the end of a blog post, where no one really reads them, and they seem like an afterthought, when sometimes they're actually more relevant to the user's query than what the article was about. So we thought, let's put the related articles at the top. We picked a bunch of posts and moved "You may also like" right to the beginning: if you're looking for this, you may also like this. Readers could still read the article and come back to those links. We even linked to our competitors if we didn't have a good article of our own. That last one links to Ahrefs, which is a Moz competitor, because it's a great resource. Our goal was to provide a great user experience, so that if we don't have the answer, users can see these links and go find it.

So how did it do when we linked out to relevant articles at the top of the page? Bueller? An 18.5% uplift. This was an amazingly successful result. Weirdly, though, engagement metrics didn't improve that much: time on site was down 26%, and bounce rate was worse. So one of two things is happening here. We've repeated this experiment, and it always does well, but not necessarily on engagement metrics. Either this doesn't have to do with engagement, and it's just because we added links to more relevant content, or it's an indication that Google is not measuring engagement this way, which is probably the likely case. And this experiment has been repeated by many others. SearchPilot did something similar recently, where they increased the number of related article links and saw an 11% overall gain in traffic. They used to have two per page, for example, and then they'd have four.
What's interesting about this: you have two kinds of pages here. You have the control, you have the donor pages that are giving the extra links, and you have the pages that are receiving the extra links. You would think that by linking to more pages, the linked-to pages would rank higher as well. But that's not the case. The SearchPilot study showed that it was the donor pages, the pages you're putting the links on, that actually got the ranking benefit. The pages being linked to saw no increase in traffic, possibly because they didn't change that much and their metrics didn't change. It's interesting to think about, but typically, linking to highly relevant content that your users engage with is a smart strategy.

So I want to present a case study we did here. We had a page on Domain Authority. This was not a controlled experiment, just a page we wanted to improve. Domain Authority is a Moz metric, and we rank very highly for it. But in our analysis, we realized that what people were really looking for on this page was a way to check their Domain Authority. So we thought, how can we make this page more engaging? How can we increase time on site and user engagement, and reduce bounce rate, by letting people check their Domain Authority? So we added that to the page: how can you check your Domain Authority? We added a bunch of links so people could explore different ways to check it, and a screenshot of a tool so you could easily analyze it.

And how do you think this did with engagement and traffic? It did so well we can hardly measure it. Bounce rate went down almost immediately, and within three months all user engagement metrics had improved. People were visiting multiple pages per session, and traffic has almost tripled, maybe two and a half times, since making that page more engaging. It was a huge win for us, and it's generated a lot of revenue, all because of one simple change to one simple page.
We simply made it more engaging, and that worked.

So finally, we want to talk about optimizing for last clicks. We got the first click, the high click-through rate. Yay! We got engagement, keeping people on the site longer and getting them to interact with more pages. Now we want to be the last click. Our final goal in optimizing for user click behavior is that we don't want users going back to Google to do more searches. We want to provide them with the answer. This is really hard to experiment with, but it's all about user satisfaction. If we can improve user satisfaction, perhaps that's a way to get more traffic and higher rankings.

So we wanted to try some simple things, starting with frequently asked questions. We added these to the bottom of very big pages: What is on-page SEO? What is on-page and off-page SEO? You often see these on sites, but we wanted to do it in a slightly different way. We simply scraped the questions from Google search results: Google "on-page SEO" and look at the "People also ask" box. A couple of interesting things about how we did it. We used no FAQ structured data markup, so we were not getting rich snippets for these results at all, because we didn't want that to influence anything. And we stuck the questions way down at the bottom of the page, so people who were interested in this content really had to work to get to it. We did this for several pages and had a control group. Was this enough? What do you think happened in this experiment? An 18% gain in traffic, and it was very statistically significant. Engagement metrics also improved across the board, which is what we want to see: pages per session increased, session duration increased, and bounce rate decreased 2%. This was an easy effort.
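Structurally, the FAQ block described above is just plain headings and paragraphs with no FAQPage structured data, mirroring the deliberate choice to avoid rich snippets. Here's a sketch of rendering one; the markup shape, class name, and example question are illustrative assumptions, not Moz's actual code.

```python
# Render a plain FAQ section for the bottom of a page. There is
# deliberately no FAQPage schema.org markup here, so Google would not
# show FAQ rich snippets for these questions.
from html import escape

def render_faq(items: list[tuple[str, str]]) -> str:
    parts = ['<section class="faq"><h2>Frequently Asked Questions</h2>']
    for question, answer in items:
        # Escape text so user-sourced questions can't break the markup.
        parts.append(f"<h3>{escape(question)}</h3><p>{escape(answer)}</p>")
    parts.append("</section>")
    return "\n".join(parts)

print(render_faq([
    ("What is on-page SEO?",
     "Optimizing elements on the page itself, like content and HTML."),
]))
```

The questions themselves would come from whatever "People also ask" research you've done for the page; this function only handles the rendering.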
It took 15 to 20 minutes per page and generated, on the whole, thousands of additional visits, just by answering a few questions that people were interested in. Every indication is that we satisfied these users a little bit more, and they were less likely to return to Google to look up something else. Hopefully they stuck around Moz longer. This was a great way to earn more traffic.

So here's our final case study of the day: Moz Q&A. We had an old Q&A section that we needed to migrate to a new platform. This presented several SEO problems for us. It was going to be an improved user experience, but technically it was a tough lift. Specifically, we were moving to a new platform and couldn't use the same URLs. That was an issue, because we had 100,000 URLs to migrate, and the new URLs weren't necessarily better. Now, we did have some technical improvements in the new platform: we improved the speed, and we had slightly better sitemap coverage. But most importantly, we had no new content. We just wanted to provide a better user experience, improving engagement while people were on the site, with the exact same content.

We migrated 60,000 questions. The questions and the answers all remained the same, but we made a couple of improvements to the user experience. One, we increased speed, which is always a great way to reduce bounce rate and improve time on site, because pages load quicker. But the major thing we did was increase the number of links on the page. As I said, the questions and answers all remained the same, but we included breadcrumbs to make the site easier to navigate, and most importantly, we added related questions. Remember the experiment earlier, where traffic went up like 20% when we added related articles? Well, we added very highly relevant questions to every page on the site. And I was a little worried, because we were migrating 100,000 URLs. I told everybody to hope for the best and expect the worst. But what do you think happened here? Nearly a 17% increase in traffic.
In fact, we never saw a decrease in traffic. It was the most amazing migration we've ever had. As soon as we launched it, people were engaging faster. Pages were loading quicker, and people were spending more time on the site. This was an amazing success. I had to go back on my dire predictions and eat my words and say, hey, this worked really well. It was an amazing SEO success for Moz, and we were all very, very happy with it.

So that's what happened to traffic. But interestingly, what happened to engagement metrics? Awesome things. Pages per session increased 10%. Session duration was about the same, but bounce rate dropped 30%. So basically, people were looking at more pages and staying a little bit longer, mostly, we think, because of the faster page loads and all those related questions we provided them with. This is exactly what we're looking for.

So keep in mind, we've talked about all these possible ranking factors: first click, long clicks, last click. It's not about the clicks. It is about user satisfaction. Google's Quality Rater Guidelines talk about user satisfaction over and over again. You have to provide satisfying content. The question is, how does Google measure satisfying content? We don't know, but these click signals could be a hint, and they can be used as a proxy for us to gauge how satisfying our content is. If you provide the best and the last answer a user is going to need, you're going to win time and time again. If you're linking to related content that may provide better answers, you're going to win time and time again. If you give people a reason to stick around your site, to share it, to not go back to Google, to search for your site specifically when they're looking for something, you're going to win again and again and again. Satisfaction is ranking factor number one. It's not about the signals. It's not about the clicks. It's not about the latest speed index ranking factor on Twitter.
It's satisfaction, and it all goes back to that. Google is getting better at measuring it, and we have the knowledge and the metrics to get better at it ourselves. That's all I have today. If you have any questions about this presentation, you can find me on Twitter, and I'll be hanging around MozCon for the next few days. You'll be seeing me as an MC as well. I hope you enjoyed this presentation. Thanks, everybody.