Hi, Moz fans. My name is Will Critchlow. I'm the founder and CEO at SearchPilot. We run tons of SEO tests, and if you've ever seen me speak before, on one of these or on a bigger stage, you'll probably have heard me talk about a lot of winning tests, right? Those nice situations where you run an A/B test, you get an uplift, and you get to celebrate.

Today, we're going to be talking about losing tests. These can be the negative ones or the ineffective changes: the ones where you just couldn't prove an impact in either direction. This is fundamentally the situation where you find an insight. It might come from keyword research; it might come from some technical auditing of the site. Whatever it might be, you have a theory, a hypothesis about something that is going to benefit your website. You implement the change as a result, and you fall flat on your face. You fail spectacularly, and your test result data looks a little bit like this.

Now, this is actually quite an exaggerated case. A lot of the failures that we see are minus 2%, minus 3%, or just a flat line. Those minus 2% and minus 3% type results can be really hard to pick up without scientifically controlled testing, which is what we spend a lot of our time on with really big websites. And they can really add up: if you are continuously rolling out those little negative changes through the course of the year, it can be a real drag on your SEO program as a whole. But they can get lost. You roll out a change, and its effect gets lost in the noise: seasonality, other site-wide changes, Google algorithm updates, things your competitors get up to. That's what we're trying to spot and avoid.

So what can we learn from losing tests, and when can they benefit us as a business? Well, one of the perhaps counterintuitive benefits is the drop in effort that you're asking of your engineering team. If you have all these ideas, previously you were asking your team to build all of them.
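To see why those small losses matter, here's a minimal sketch of how a handful of minus 2% and minus 3% changes compound over a year. The monthly figures are made up purely for illustration (a site shipping a few small untested changes), not real test data:

```python
# Hypothetical illustration: compounding effect of small negative
# SEO changes shipped untested over a year. The monthly impacts
# below are invented example figures, not real test results.
monthly_impacts = [-0.02, 0.0, -0.03, 0.0, -0.02, 0.0,
                   0.0, -0.02, 0.0, -0.03, 0.0, -0.02]

traffic = 1.0  # index the start of the year at 100% of organic traffic
for impact in monthly_impacts:
    traffic *= 1 + impact

# Six small losses compound to roughly a 13% drop by year end
print(f"Traffic at year end: {traffic:.1%} of the starting level")
```

Individually each change would be hard to spot against seasonality and algorithm updates, which is exactly why they accumulate unnoticed without controlled testing.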
But if you run tests and find that some of your ideas were negative, and some were ineffective and weren't going to benefit you, you're now only asking your product and engineering team to build and maintain the ones that turn out to have a positive SEO impact. We've seen that be up to an 80% drop in SEO tickets for engineering. So that's one business case right there. But of course, sometimes your tests look like this, and then the business case is about avoiding those negative impacts on your website.

I've got a couple of tactical examples that I thought would be good to run through, which might be useful in your situations as well. The first one is a case of removing SEO text. We've seen many cases where, say, on a category page on an e-commerce website, you've got a bunch of product listings, and then somewhere down at the bottom of the page there's a bit of copy. Maybe it's in a div with a class like seo_text. Maybe it's in a really small, gray font: not quite white text on a white background, but clearly not designed for human eyes. We have run experiments in situations like that, with pretty poor quality text on category pages. We tested removing it and actually saw a statistically significant drop in organic visibility, which is a shame, because we know this isn't high quality text. We know it's not where Google wants us to be. And yet removing it was a bad idea.

One of the things we can learn from that is, firstly, don't throw the baby out with the bathwater. You can't just knee-jerk react to Google's PR and all these kinds of things and say, "Well, best practices say this, so let's just do it straight away." You can't do that without testing, because you might be hurting your website. But it does point to a direction of potential future improvement, because if having terrible text is better than no text, having good text might be even better.
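As an aside, a "statistically significant drop" comes from comparing the variant pages (SEO text removed) against a control group of similar pages. Here's a minimal, hypothetical sketch of that idea. The session counts, the page groupings, and the use of a plain two-sample t-statistic are all illustrative assumptions on my part; real SEO testing platforms use more robust time-series methods:

```python
# Hypothetical sketch: compare daily organic sessions on variant
# (SEO text removed) vs. control category pages. The numbers are
# invented for illustration only.
from statistics import mean, stdev

control = [520, 540, 510, 530, 525, 515, 535, 528, 522, 518]
variant = [480, 470, 490, 465, 475, 485, 460, 472, 478, 468]

def t_statistic(a, b):
    """Welch's t-statistic for two independent samples."""
    se = (stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b)) ** 0.5
    return (mean(a) - mean(b)) / se

t = t_statistic(variant, control)
print(f"t = {t:.1f}")  # a large negative t suggests a genuine drop
```

The point is simply that a controlled comparison lets you attribute the drop to the change itself rather than to seasonality or an algorithm update.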
And so one of the things you benefit from with a losing test is that you get to learn: it points you toward insights that might be positive for you in the future.

The other example I've got for you here, well, you might be wondering what on Earth this is. Art is not my strong point. This is an Easter egg. Trust me, this is an Easter egg. We saw this example on a website that operated across the whole of Europe, with multiple different country and territory sites, testing adding seasonal offers. In this case, it was about Easter travel: Easter breaks, Easter flights, those kinds of things. The keyword research had suggested that there was demand for this, that the audience was searching in this kind of way. And yet adding those offers to the page was negative, and that was very surprising.

What it turned out was going on here was that the offers were diluting the relevance of those landing pages for the searches that were their bread and butter. So yes, the pages were ranking better for some Easter-travel-related searches, but they were doing worse for the bulk of the traffic, which was just "trips to city name" or whatever it might be, and the net impact was negative. That's the kind of thing you can only pick up by testing.

So I hope you've enjoyed this little journey into losing SEO tests and what we can learn from them. My name is Will Critchlow. I'm at SearchPilot. You can find me on Twitter at willcritchlow. I look forward to chatting to you soon. Take care.