Happy Friday, Moz fans. Today I want to talk about this question: how volatile are organic rankings? Now, when I say something like that, you might instantly think about core updates, the big helpful content update, all these big algorithm updates we've seen in the last year or two. But I'm actually talking about the more day-to-day, routine fluctuation in Google rankings, which I think is a lot more prevalent and a lot larger in magnitude than most SEOs tend to think.

The data I'm going to share with you today is actually taken from one of the quietest two-week periods we've had recently. From memory, I think it's from August, but the full data will be linked below. It's from a period where, relatively speaking, not too much was happening in terms of major announced updates, although obviously that's not saying much. There haven't been many quiet periods in recent times, but that's kind of the point.

This is all taken from STAT, our enterprise rank tracking platform, and it's the MozCast corpus tracked in STAT. MozCast, which you might have heard of, is 10,000 head terms. In this case, I'm tracking them in two suburban locations, one in the US and one in the UK, on both smartphone and desktop. That's 40,000 total SERPs per day, in this case over a two-week period.

So this chart shows the percentage of SERP positions that had one consistent URL for that entire time. For example, in position one, at the top of the SERP, about 70%, I think actually 69%, of position one results had one URL every day for 14 days on the trot in this period. Going down to position 10, only about 1% of SERPs had the same URL in position 10 for the entire two-week period.

Now, that's already quite interesting to me, because I wouldn't have predicted such a steep drop-off, or such a smooth, neat, unchaotic curve. I've got two pet theories as to why this might be. One, the more obvious one, is simply that if you're in position one, you can only move in one direction, and you can only move so much. Even in position two, you can only move a little bit in one direction. So maybe that kind of limit is part of why the higher positions seem more stable.

However, I think probably the bigger reason, and like I say, this is just a pet theory, and I'd be curious if you want to pop onto Twitter and share your own ideas, but my favorite explanation is that often there is an obvious winner for a SERP, or maybe even an obvious top three, where you can think that for this keyword there's one site which is going to be the right answer. Sometimes you can imagine it being Wikipedia, for example, or in many cases a query might not be branded, but the product is very closely associated with one brand, something like that. Whereas when you get further down, Google is guessing a bit more at what you might be after, and you get a wider range of slightly more chaotic results. So that's a potential explanation.

That chart was looking at the percentage of positions which had one consistent URL. But I was also interested in, if a position doesn't have one URL, how many does it have? Is it two, just swapping back and forth, or is it a larger number? Well, it turns out that looking across all of these SERPs, 40,000 SERPs over 14 days and all these positions, the majority of positions actually housed four URLs or more, which is a staggeringly large number.
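If it helps to make those two findings concrete, here's a minimal Python sketch of the kind of calculation involved: the share of position slots that kept a single URL for the whole window, and the number of distinct URLs each slot housed. The record layout and field names (keyword, market, date, position, url) are my assumptions for illustration, not STAT's actual schema or the MozCast pipeline.

```python
from collections import defaultdict

def url_stability(records, num_days, max_position=10):
    """For each position 1..max_position, return the share of tracked SERP slots
    that held one consistent URL on every day of the window, plus the number of
    distinct URLs each slot housed overall."""
    urls_per_slot = defaultdict(set)   # (keyword, market, position) -> URLs seen
    days_per_slot = defaultdict(set)   # (keyword, market, position) -> days observed

    for r in records:
        slot = (r["keyword"], r["market"], r["position"])
        urls_per_slot[slot].add(r["url"])
        days_per_slot[slot].add(r["date"])

    stable_share, distinct_url_counts = {}, {}
    for pos in range(1, max_position + 1):
        slots = [s for s in urls_per_slot if s[2] == pos]
        # "Stable" = observed on every day of the window and only ever one URL.
        stable = [s for s in slots
                  if len(urls_per_slot[s]) == 1 and len(days_per_slot[s]) == num_days]
        stable_share[pos] = len(stable) / len(slots) if slots else 0.0
        distinct_url_counts[pos] = [len(urls_per_slot[s]) for s in slots]
    return stable_share, distinct_url_counts
```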
So although some positions were stable with just one URL, the ones that were fluctuating were fluctuating a lot over the course of this period.

Why so much change? Well, there are a few answers, some more obvious, some less obvious. I think the most obvious is that Google have said in the past, I think about five years ago now, that they were doing on average seven algorithm updates per day. So as well as the big updates we hear about, there are all of these small ones all the time, and we can only assume that number has gone up since then. All of these daily changes, plus Google rolling out tests that affect just a small percentage of SERPs, are obviously going to cause a lot of fluctuation.

Site changes as well. If you think that SEO works, then you think that changing things on a website changes rankings, and people are changing their websites all the time. In any given SERP, the chances are that one of the URLs, or one of the sites that a URL is on, has been played with today. Maybe the internal links have changed, maybe it was linked to from somewhere externally, maybe some anchor text was updated, maybe some new content or a new product was added. There will be all of these small changes, and if the results are very close in their ranking ability, then you can imagine they're going to go up or down a little bit over a period like this.

And lastly, and maybe most controversially, we've been hearing a lot recently about the US v. Google case and about some of the exclusive data that maybe gives Google an unfair advantage. A lot of what's been coming out is about how they might be using user data to inform search results. Now, there have been experiments in the past by people like Rand Fishkin showing how you can affect real-time ranking changes using user behavior, as in, if you ask everyone in a big room to go and click on position two or three on a SERP, then that result will move its way upwards, this kind of thing. So maybe some of this data is filtering through and affecting SERPs in real time. We don't know, but it's possible.

What are the implications, though? What does this actually mean for SEOs day to day? Well, I think there are two big things. One is that this has to affect how you think about reporting. A lot of us, and I've done this in the past myself, report weekly or monthly on SEO performance, but that can be potentially misleading. Imagine the ranking chart for a given keyword is bouncing around, and you happen to report on a bad day: that's nowhere near the average for the period. Similarly, if you happen to report on a good day, you're not getting a particularly representative picture. And you can imagine reporting on a bad day one week and a good day the next and concluding there's been a big increase week on week, when there hasn't been; it was just all over the place. So in reporting it's necessary to have a more holistic overview, and not just look at rankings once a week or something like that.
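As a toy illustration of that reporting point, and not how STAT or any particular tool reports, here's a small sketch comparing a single point-in-time rank with an average over the same window. The keyword, rank series, and function names are hypothetical.

```python
from statistics import mean

def weekly_report(daily_ranks, window=7):
    """Compare the rank on the reporting day alone (a weekly snapshot) with the
    average rank over the window, which smooths out the day-to-day churn."""
    snapshot = daily_ranks[-1]                 # whatever the rank happened to be that day
    rolling_avg = mean(daily_ranks[-window:])  # more representative of the whole week
    return snapshot, rolling_avg

# Hypothetical keyword bouncing around position 5: the snapshot catches a good
# day and suggests a big win, while the average tells a much calmer story.
ranks = [6, 4, 7, 5, 3, 8, 2]
print(weekly_report(ranks))  # -> (2, 5.0)
```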
The other implication, I think, is about overreaction, and this is a difficult one for SEOs because we do want to be able to get on top of problems quickly. But when you're reporting that a given ranking has gone down or gone up, I think you have to ask: is it only that keyword? Is it normal for this keyword to move up or down by this amount? Is this something we should actually get excited about? Because a lot of people will want to say, oh, we made this change to a page and it's gone from position four to position two on this keyword we really care about. Great. Is it going to go back down to position four tomorrow? Is this just random? I think that's the way you have to think about it.

So that's all. I hope that was interesting. Like I said, there'll be a deeper version of this data linked below. Thank you very much.