All right, welcome, everyone, to the first Google Search Central Office Hours Hangout. My name is John Mueller. I'm a search advocate here at Google in Switzerland. And part of what we do are these Office Hours Hangouts where people can jump in and ask questions about their website and web search. This is the first Google Search Central Hangout, but essentially it's the same as the previous Hangouts that we've been doing. The name has changed slightly because we kind of rebranded everything. So that's been pretty exciting this week. A bunch of questions have been submitted already, but if any of you want to get started with the first question, you're welcome to jump in. Go for it. Hi, John. One of our clients has two websites. One of the websites they designed for targeting a Melbourne audience, and with the other one they're targeting the Western Australia state. Now, they have planned to get rid of their second brand, which they use for Western Australia, and they want to use the Melbourne brand for all of Australia and consolidate both brands. So they're going to remove the Western Australia website, and we're thinking whether we should use a 301 redirection or a rel canonical so that we can still get the ranking of the Western Australia website over to the Melbourne website. I think that's kind of up to you, depending on what you want to do there specifically. So with a 301 redirect, you're essentially saying we're combining these websites. And with the rel canonical, you're kind of able to leave the old website where it is and kind of focus your energy on the new one. And leaving the old website where it is might make sense if you want to keep it for other reasons, other than SEO, for example. So those are usually the two aspects that I would look at there.
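The trade-off John describes can be sketched as a small decision helper. This is purely illustrative: the function name and domain below are made up, and a real setup would configure the redirect in the web server or CMS rather than in code like this.

```python
def site_merge_directive(old_path: str, new_origin: str, keep_old_site_live: bool) -> str:
    """Sketch of the 301-vs-canonical choice when merging two sites.

    keep_old_site_live=True  -> leave the old site up, point signals at the
                                new page via rel=canonical
    keep_old_site_live=False -> full merge, permanent 301 redirect
    (the origin and path used here are hypothetical)
    """
    new_url = new_origin.rstrip("/") + "/" + old_path.lstrip("/")
    if keep_old_site_live:
        # Old page stays reachable for users; search signals consolidate on new_url.
        return f'<link rel="canonical" href="{new_url}">'
    # Old page is retired; browsers and crawlers are sent permanently to new_url.
    return f"HTTP/1.1 301 Moved Permanently\r\nLocation: {new_url}"
```

With `keep_old_site_live=True` the old pages stay up for users while signals consolidate on the new URLs; with `False`, the old site is fully merged away.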
So if you're purely thinking about SEO and you just want to combine everything and clean everything up, I would redirect and do a proper merger between those sites. But if you want to keep the old website alive for whatever reasons, then Canonical is a good approach. Thank you. John, I'm curious about something regarding actually regarding Google News. Because there are so many surfaces out there. You have the top stories. You have the News tab in Google Search. And you have the Google News app, I guess, or different surface. So are they all operating under different kind of algorithms, decisions? Does each surface? Because we've noticed we've been working with a publisher. And they seem to be doing very well in top stories and the News tab in Google Search. But when you search for the exact same query in Google News, they're not being shown almost at all. You can see them and you can see their articles if you go to their News source. But they're not being shown for regular queries, whereas for the top stories or the News tab, they just show up fairly well. I don't know all of the details around Google News. So that's kind of the one aspect there. But for top stories and the News tab in Search, from our point of view, those are essentially just specific variations of Search. So that's something where any website could appear. It doesn't necessarily need to be a news website. It doesn't necessarily need to be listed in Google News. So those are essentially just kind of variations of Search. And for these different variations, because we think the intent is slightly different, there are definitely different kind of ranking algorithms that play there. And I assume in Google News, they have a whole set of their own ranking algorithms that they use for kind of that part. But now, I think it's getting more and more complicated because of all of these different surfaces. 
And there's also Discover that kind of plays in there as well, which makes it a little bit harder than before, where you could just say, oh, I'm in Google News and I focus on this. And I'm also in Web Search and I see myself here. But on the other hand, it also keeps it a little bit exciting, because you have more opportunities, more different ways to appear, and more things that you could be focusing on slightly differently than your competitors could be focusing on. Right. I don't see this as a problem, having multiple surfaces, and each one takes its own decisions in terms of what they should show or not. I'm just worried that there's a big discrepancy when on a few surfaces everything seems to be OK, and on another one the site doesn't show up at all. So it's kind of hard to diagnose, OK, why? Yeah. I don't really have any great answers for that. And I think it's also tricky because we don't show all of those details in Search Console, where it's kind of harder to see exactly what is happening where. OK. They do have a support team, but unfortunately, they are not very helpful. OK, now that's unfortunate. Do you know if there are any office hours? I'm not sure. I don't think so. Oh, OK. So the support form is probably the only avenue. Yeah. That's usually the best approach. And every time I try to escalate news issues internally, I'm always told to tell them to use the form. So OK. Hi, John. Can I follow up on that really quickly? Sure. Speaking specifically to the News tab in Search, if I'm at a site and I search for my site specifically and I'm seeing the articles being listed not in chronological order, is there something I should be worried about with that? For other competitors, I see that theirs are showing at least correctly in my view. But from what I'm seeing, I'll see something from three weeks ago before something from 20 minutes ago. Is that something concerning I should kind of look into? I don't know.
I actually don't know if they're supposed to be in chronological order or if that's just how they sometimes end up. One thing you could do is try to double check that we can recognize the date of your pages properly. One way you can kind of guess at that is if we show a date in a snippet, then that gives you a bit of an idea. Another thing you can do is use the date restrict in Google Search, which I think is under Tools now, or in advanced search settings, where you can specify a date range and just kind of roughly double check that we're picking up the right dates for your articles. One of the things I've noticed is that on some sites, the article date is kind of less prominent or less clear for us to understand. And we might pick up a date within the body of the article or from a sidebar somewhere instead. And we have a Help Center article on how to specify dates on your pages. And the important part there is really that all of these things align, so that the visible date on the page matches the date that you have in the structured data. And if we can recognize the date properly, then usually we'll be able to pick that up accordingly. It's still sometimes the case that we have individual ones that are off, but essentially if you're doing things right, if you're doing things in a consistent way, then at least we'd be able to pick up the date. And if we need to use the date to kind of show these articles in chronological order in the News tab, then at least we'd be able to do that. But like I said, I don't know if they're supposed to be in chronological order. I would assume that it's not necessarily always the case, similar to in web search, where if you search for a website, sometimes we show the home page. Sometimes we don't show the home page as the first result. OK, thank you. OK, let me run through some of the submitted questions. And we'll definitely have time for more questions from you all later on.
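Circling back to the date question: the alignment John recommends, where the visible date on the page matches the structured data, might look like this in practice. The article values below are hypothetical; the point is that both outputs derive from one source, so they cannot disagree.

```python
import json

# Hypothetical article data; the visible byline and the structured-data
# date both come from this one value, keeping them in sync.
published = "2020-11-12"

visible_html = f'<p class="byline">Published {published}</p>'

structured_data = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Example headline",
    "datePublished": published,  # same value as shown on the page
}
json_ld = f'<script type="application/ld+json">{json.dumps(structured_data)}</script>'
```

Generating both the byline and the JSON-LD from a single field in the CMS is one simple way to satisfy the consistency requirement.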
And if you have any comments or questions in between the questions or answers, feel free to jump in. Do orphan subpages, so subpages without internal links, with duplicated or low-quality content, have a negative impact on a website overall? I wasn't quite sure exactly which direction this question goes in, so it's hard to say exactly what would be helpful for you in this particular case. In particular, if you have pages that don't have internal links pointing at them, so pages that are hard to find from crawling, then usually Google Search will assume that these are not very critical for your website, because you're essentially hiding them away from people who are clicking around within your website. And if we assume that you think they're not very critical for your website, then probably we won't give them as much weight in Search. And if we don't give them as much weight in Search, then it doesn't really matter that much what you actually have on those pages. So that's kind of the one aspect there. If you have pages within your site that you're not linking to at all, then we don't really know what you want to tell us with that. And if those pages are duplicate or low-quality content, and we don't give them much weight, then it doesn't really matter that much. On the other hand, if you want those pages to be findable in Search, then I would definitely make sure that you do have internal links pointing at them and that you do try to avoid duplicate and low-quality content on those pages. So those are kind of the two aspects that I would look at there. In Search Console, I'm seeing a CLS issue of more than 0.25 on mobile. How does this affect SEO, even if it's not affecting user experience? Good question. So CLS is a metric from the Core Web Vitals (cumulative layout shift, I believe), which is basically a measurement of how much a page shifts around while it's loading.
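For reference, the published Core Web Vitals guidance buckets CLS as good at 0.1 or below, needs improvement up to 0.25, and poor above 0.25, which is why Search Console flags "more than 0.25". A minimal classifier using those thresholds:

```python
def classify_cls(cls_score: float) -> str:
    """Bucket a cumulative layout shift score using the Core Web Vitals thresholds:
    good <= 0.1, needs improvement <= 0.25, poor > 0.25."""
    if cls_score <= 0.1:
        return "good"
    if cls_score <= 0.25:
        return "needs improvement"
    return "poor"
```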
So if you open a page and then, while it's loading, a new image pops up on top and everything shifts down, then it's hard to actually read the content on the page. So that, from our point of view, is a usability issue. We introduced the Core Web Vitals earlier this year, and our plan is to start using them as a part of the page experience factor for ranking in Search in May next year, I think we announced. So at the moment, that's not something that would be affecting your SEO. In May, that might be affecting your SEO. Because if you're not reaching kind of that good bar with regards to the Core Web Vitals, then that's something that could have a negative effect on your site's ranking. So yeah, my recommendation there would be to try to figure out where that's coming from and to try to find ways to improve that, so that you're a little bit ahead of the curve and don't have to rush anything when it comes to May. With regards to testing these things, one of the aspects to keep in mind is that in Search Console, the data that we show is based on data that users have seen. So we call it real user metrics. And this is aggregated over the course of a month. And it's based on what users see with their connections to your website. So in particular, if you're testing this locally to try to reproduce it and you're saying, well, it works for me, it's important to keep in mind that your connection to your website might be very different from the average user's connection to your website. So if you're seeing very different metrics when you test it internally, then that's something where you might want to kind of figure out, well, why am I seeing this difference? What do I need to do so that I can better reproduce what my users would be seeing? Which is best: rel=nofollow or rel=sponsored for outgoing affiliate links? From our point of view, both of these are possible.
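Both options, and the combination John goes on to recommend, are just values of the link's rel attribute in the page markup. A tiny illustrative helper (the URL shown is hypothetical):

```python
def affiliate_link(url: str, text: str) -> str:
    """Markup sketch for an outgoing affiliate link.

    rel="nofollow sponsored" covers search engines that understand
    rel=sponsored as well as those that only support rel=nofollow.
    """
    return f'<a href="{url}" rel="nofollow sponsored">{text}</a>'
```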
So for affiliate links, we generally recommend specifying that these should not be passing PageRank or any of the other search signals. And you can do that with rel=nofollow or rel=sponsored. You can also combine these. So in particular, some search engines don't use rel=sponsored yet, so you could easily use rel="nofollow sponsored" on these outbound links, and you're essentially covering all the bases. From our point of view, we prefer having a clear specification with rel=sponsored. But that's really just so that we understand your site a little bit better. It's not that there is a big SEO difference with regards to handling them with nofollow or with sponsored. Also, for the more common kinds of affiliate links, we've generally seen a lot of those already for a longer time, so we're able to handle those ourselves. It's not the case that you'll have any automatic penalty or any other search issues if you don't clearly specify your affiliate links. It just helps us, because you're kind of being a little bit more deliberate in saying, well, these links, I don't need you to pass any signals to. Two questions. Why does the organic click-through rate drop with the exact same ranking for two different time periods? This is a pattern we've started seeing with some of our keywords. Yeah, I don't think there's any particular reason from our side why the click-through rate would be dropping if nothing else is changing in the search results, but rather that sounds more like things are changing for the users. And that could be because maybe your snippet or your title isn't as clear to users or doesn't match what they are looking for. It could be that users are just searching for different kinds of content. It might be that people are searching more or people are searching less for those particular queries. These are essentially changes more on the user side than on the Google side.
Because if we show your result in the same position for these searches, then essentially, potentially, you could be getting the same click-through rate. But what users actually do is not something we can control. It's more up to the user. And oftentimes, click-through rate is something that you can slightly affect by working with your titles or your descriptions to make them a little bit clearer. The other thing to keep in mind is that click-through rate as an aggregate metric for the whole website overall is probably not a great metric, because many sites rank for things that are kind of irrelevant for the site. And it might be that you start ranking for something that is very visible because a lot of people search for it, but they're definitely not searching for your particular site to get there. That's something we sometimes see with our documentation, for example. We've been going through a lot of our documentation to figure out what we should be watching out for as we did the big migration, or started the migration. And one of the things we notice is sometimes our documentation ranks for things like video. And people searching for videos are definitely not looking for structured data markup that they can put on videos. They're probably looking for videos directly. And if we look at the overall click-through rate of our site when people search for video and we show our structured data documentation there, then probably our click-through rate really drops, because a lot of people are searching, we're kind of ranking in visible places, but we're showing up in search results that are irrelevant for our site, ones that we don't really care about. And that would change the overall metric. So if you're purely looking at the overall click-through rate, then probably you're missing a lot of details that matter a little bit more.
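John's point about aggregate CTR can be made concrete with some hypothetical Search-Console-style numbers: a single high-volume, irrelevant query can drag the site-wide average far below the CTR of the queries the site actually targets.

```python
# Hypothetical Search-Console-style rows: (query, impressions, clicks).
rows = [
    ("structured data video markup", 1_000, 120),   # relevant query for the site
    ("video", 50_000, 250),                         # high-volume, irrelevant query
]

def ctr(impressions: int, clicks: int) -> float:
    """Click-through rate as a fraction of impressions."""
    return clicks / impressions

overall = ctr(sum(r[1] for r in rows), sum(r[2] for r in rows))
relevant_only = ctr(rows[0][1], rows[0][2])
# The irrelevant query dominates impressions, so the site-wide CTR
# ends up far below the CTR of the query the site actually cares about.
```

This is why per-query (or per-topic) CTR is usually more meaningful than one site-wide number.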
Then the second question is: Google's domain or site diversity update launched during June 2019 and doesn't seem to be working, as we're continually seeing more than three to four results from the same site for certain non-branded queries. Please shed some light on this. Oh, no, I see people nodding. OK, so in general, we don't have a restriction on the number of times that a site can appear in Search. And it can happen that a site appears once in a search result. It can happen that it appears five, six, seven more times, even in the same search results page. And from our point of view, it's not a bug when a site appears more often. It's something that sometimes makes sense for users. The update that we did at that time was to try to reduce the number of times that we show sites multiple times in the same search results, but it's not to completely limit sites from appearing multiple times. So from that point of view, it's hard to say that this is not working or it is working, but rather what would be useful there would be to have specific examples. So if you're seeing things for generic queries where you're seeing the same site appearing very often in the search results and there are very good other results that we should be showing instead, please send us some examples. So that's the best way to handle this on our side. Especially if you have something that's really obvious, where you're searching for a generic term and you find the whole page filled with the same site, and there is a lot of other good content available for that, then that's something we can pass on to the search quality team here. And usually they take these reports and try to figure out why this is happening, what we can do to improve it, and then what algorithmic changes we could launch in the future to help make these search results a little bit better. Does Google have any? Is there an example provided in the question at all? No, it's like non-branded queries in the ed tech space.
I only mentioned it because I've seen that happening more now, similar to what happened four or five years ago. But I think we try and be honest with ourselves and separate what is a brand and what isn't, because, you know, our sites are experiencedays.com and experiencegifts.com. Those are brands, but they're also very generic terms as well. So sometimes it's very difficult for Google to determine whether that's a brand search or whether that's just a generic search. But we have seen an uptick in those first pages having four to six queries from the same site, sorry, results from the same site, compared to what's been going on over the last three or four years. But it also goes in cycles, where every three or four years this looks like it happens. And it looks like you guys are doing something and then go, oh, no, that didn't work, so it's unwound. I mean, if you have any examples, send them my way. I'm happy to pass them on. It's something where sometimes the examples that I see on Twitter are very kind of handpicked. And as soon as you kind of vary the query slightly, you realize, well, actually it's just one very specific case where that's happening. But I think especially if you have a site that is kind of with a generic name, or it's not just generic, I guess people know you by now, right? But still, those two words also sum up our industry. People can be searching for our competitors, but they're looking for a driving experience day. And it's very difficult for you to determine whether they're looking for us or just for something someone else sells that's similar to us. It's one of the things we have to cope with, but we get huge benefits from having that domain as well, not that an exact match domain matters anymore, but we still do get some benefit. So we're kind of, you know, on the fence about it. There's no point in us complaining because we get the good end of it.
OK, so we'll have to wait for your competitors to send me an email. OK, cool. I mean, if you find any bad examples in that case, send them my way. Does Google have any special criteria for search terms like best or top? For instance, might best surgeon Beverly Hills favor the site or page most optimized for the word or phrase best? Or are there other unique data sets used by Google for these particular queries, like number of Google reviews, stars, on-page appearance, et cetera? I don't know. This is an interesting question. My feeling is from what I've seen that we do try to recognize these particular kinds of queries and figure out what we should be showing there rather than just purely focusing on the actual words in the query. So that's something that you also see with a lot of the near me type queries, where for a long time our algorithms were, I guess, more basic, in a sense, in the way that people would optimize their site for near me queries, where it'd be like pizzeria near me and you would have a page on your site that is called pizzeria near me, which isn't particularly useful if you're just in one location. But I have seen some sites appear in the search results because of the things that they did there. And as far as I've seen, this is something that we've gotten better and better at understanding and figuring out, OK, this is actually the search query and this is more of a qualifier that doesn't necessarily need to be on the page itself. And I assume something similar like that would apply for queries like best or top, where we'd have to focus a little bit more on things that are outside of that page to figure out, is this really the best one to show or are people looking for a list of these particular kinds of businesses to try to figure out what we should be showing there? 
So that's something where I can imagine that sometimes it's useful to include these kinds of words on your site, but probably in the meantime, our algorithms are a little bit more focused on things that are outside of just purely matching those individual words. If a huge website updates all its data in a single day, why isn't that data crawled? Is it difficult for the crawler to go through it all at once? Yes, it can be very difficult to crawl a huge website all at once. So that's definitely tricky. I think there are multiple things that come into play here. On the one hand, we'd have to recognize that a larger website is updating all at once. And based on that, then try to figure out, how do we schedule the recrawling of all of these URLs? And that's something you can sometimes help with with a sitemap file, to let us know that all of these URLs have changed and that we should go off and try to refresh as much as possible of what we can do there. And in general, our crawling is split into two kind of general buckets, or maybe three, I guess. On the one hand, we want to discover new content on your website. We want to discover what is happening that is particularly new on your site. So we'll go off and crawl new things and things that we haven't seen before. And on the other hand, we'll also go off and try to update and refresh the crawls that we've done so far in the past and try to figure out which of these pages have changed. And balancing these two sides is sometimes a bit tricky. So that's something where we have to figure out, how much do we need to spend on refreshing? How much do we have to spend on discovering new content? How much do we need to spend on things like the home page of a website, so that we make sure that we don't miss anything along the way?
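The sitemap hint mentioned above is just an XML list of URLs with lastmod dates; when a CMS regenerates it after a bulk update, crawlers can see at a glance which URLs changed and when. A minimal generator sketch with hypothetical URLs:

```python
from datetime import date

def build_sitemap(urls: list[str], last_modified: date) -> str:
    """Emit a minimal XML sitemap marking every URL as changed on one day."""
    entries = "\n".join(
        f"  <url><loc>{u}</loc><lastmod>{last_modified.isoformat()}</lastmod></url>"
        for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )
```

A real CMS would write per-URL lastmod values rather than one shared date, but the shape of the file is the same.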
And all of that kind of comes together in the general topic of crawl budget, in the sense that we can figure out how much we want to spend on each of these parts, but how much can we actually crawl from your website without causing any problems? And crawl budget is something we've written a blog post about, I think a few years ago, that primarily focuses on two aspects. On the one hand, how much can your server kind of sustain? We don't want to cause any problems on your server. We want to make sure that users, when they go to your site, are not confronted with a severely slowed-down server because search engines are crawling all the time. So that's something we try to figure out automatically. And on the other hand, we need to balance the demand from our side: how much do we think we need to crawl from this website? And some of that comes from understanding how much we need to refresh. So essentially, what happens with a larger website is we'll try to refresh all of the data that we have from the website over the course of maybe a month to three months to maybe six months. So that's kind of the general crawling that we do across a website. So if we don't have any other signals at all from a website, we'll probably take somewhere between three months and six months to refresh all of the content there. So that's kind of the general crawling. And you can imagine, if you update the whole website on one day, then we'd have to do six months' worth of crawling on one day to try to get all of that concentrated in one go. So usually what happens when we recognize that a site does larger-scale updates is we'll try to refresh as much as possible, as quickly as possible. So we'll try to get that in within a couple of days. And essentially, we'll try to focus on the most important pages first, so the pages that users would end up seeing. Those are the ones that we try to refresh as quickly as possible.
And then there's a whole bunch of other pages that also updated, but which we know are very rarely shown in Search, so we don't really need to refresh them all at once. So because of that, what you'll see with a larger website when you make a bigger change, which could be something like a name change, it could be a domain change, it could be a change of titles or something like that, is that a bulk of the visible, important pages get updated very quickly, within a couple of days. And this longer tail of other pages takes a couple of months to be updated. So that's usually the approach that happens there, balancing all of those different parts. So unfortunately, we can't crawl the whole web at once every day. May I raise a question here? Sure. There's an option in Google Search Console to let you know that a page on the website has changed or updated or something like that. I forgot the name of that specific thingy. But it's been off for a couple of weeks now. You mentioned on Twitter that this is just for fixing purposes or something like that. So my question would be, do you have any updates on when that feature is going to be relaunched? And the other question would be, would you guys at Google think about bug uploading? How do you mean, bug uploading? Currently, it's only possible to give you one URL to crawl and send that for indexing. And frankly, when I have like 10 or 20 pages that I updated for some reason, I have to type in every URL alone, and then there is that captcha I have to go through and several clicks to be made. And I think even if I work with several tabs, 10 tabs is the max that is allowed or something. So a bug upload function would be really, really helpful sometimes. Oh, bulk upload. OK, I understood you. OK, yeah.
I struggle with this, because my general feeling is that if we have a way for sites to tell us that pages have changed or need to be updated, then we should try to find a way to automate that as much as possible. So instead of telling you that you need to go and fill out this form or upload a special file with all of the URLs or something like that, and then go through the captchas and all of this, I think our systems should be set up in a way that they can automatically handle this. And we essentially have a system to do that, which is all of the sitemap stuff, where you can say, well, this page has changed, and Google should refresh that as quickly as possible. And my general sense is that if you go to the effort and say, well, these 10, 20 pages have changed, and I will even fill out a captcha to prove it to you, then it feels like maybe we should find a way to trust your sitemap files a little bit better, so that you don't have to do all of this. And instead, your CMS updates the sitemap files and we say, oh, look, there are 20 new pages, and this is a great website. We will go and update that right away. So that's generally, from my point of view, the direction I prefer that we take. I think the tool for updating individual pages is something that definitely makes sense for other situations, where maybe you can't do a sitemap file, or maybe you have something really urgent and you can't wait for someone to create a new sitemap file for you, or maybe there are other reasons as well. I think that definitely makes sense. But I really prefer that normal updates of a website be something that can be processed as automatically as possible, because it feels inefficient to encourage you to fill out captchas just because you're making normal changes on a website. No, true. OK, thank you. What's the relationship between article length and ranking? I don't think there is any relationship between article length and ranking.
I think the aspect that most commonly comes into play with article length and ranking is that, for some topics, users expect longer articles. And when they reach your site, they're happier with finding some longer information. And for other topics, maybe they don't need a long article. And the one thing that I do kind of understand is if you're kind of working together with a team of writers and you're ordering content from them based on specific topics that you care about for your website, then sometimes you specify a length and say, well, I at least want an article this long. But essentially, from an SEO point of view, there is no requirement to have a certain length of an article on your site. But rather, it's more you need to be able to fulfill your user's needs. And sometimes you can do that quickly. Sometimes you can have a really long article with lots of details in it. There's a lot of room in between there. Is it necessary to use the keyword in exact match or not in the content? What's the best keyword strategy in the eyes of Google? It's not necessary to use exact match keywords within the content. And this is something that I think has been the case for a really long time now. In particular, we recognize things like synonyms. We recognize things like misspellings and kind of singular and plural forms of keywords. These are all things that our systems can recognize both in the query and within the content. And the newer machine learning-based systems that we have, they go even further and try to understand, well, what is it that you're actually writing about here and how does that match what the user is actually searching for? So with that in mind, you don't need to include all of the variations of all of the keywords that you care about within your content. And I think most sites have been doing fairly well with that. 
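As a toy illustration of why exact-match keywords aren't required: even a trivial normalizer matches case and plural variants. Google's actual query understanding (synonyms, spelling correction, machine-learned models) goes far beyond this sketch; the code is only meant to show that matching isn't literal.

```python
def normalize(term: str) -> str:
    """Toy normalizer: lowercase and strip a trailing plural 's'.

    Deliberately crude; real systems handle synonyms, misspellings,
    and meaning, not just surface forms.
    """
    term = term.lower()
    return term[:-1] if term.endswith("s") and len(term) > 3 else term

def matches(query: str, content: str) -> bool:
    """True if every normalized query term appears in the normalized content."""
    content_terms = {normalize(w) for w in content.split()}
    return all(normalize(w) in content_terms for w in query.split())
```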
So it's no longer the case that you look at the top search results and it's just filled with kind of synonyms or misspellings, typos of the same keywords. So that's, I think, kind of a nice development there. What is the best keyword strategy in the eyes of Google? I think that's something where it's less a matter of SEO, from my point of view, and more a matter of almost marketing: finding topics that your audience cares about and being able to be out there as early as possible, so that you can get your content in front of those people when they search for that information. So that's something where it's not the case that you need to find the exact keywords or the most important keywords out there, but really you need to figure out what it is that you have on your website that's unique to your website, that users will be searching for, where you can provide some unique value. And those are kind of the keywords to focus on. The one thing with keywords that I do want to mention, though, is that while it's not necessary to focus on exact-match keywords within your content, you should be a little bit specific about what you want to rank for. So if you have a news article about something big happening, then don't just use vague descriptive phrases to talk around what's happening, but rather mention the names, mention the places, be elaborate, and be specific about what is actually happening there. This is something that I see especially with smaller business websites, where you go to the website, and as a user, even if you look at the website, you're like, well, what is it that they do? Do they sell a product? Do they sell a service? Are they just a consulting company? Is this just a general marketing page? What is it that I, as a user, when I go to this website, am supposed to be doing? And if users can't really tell what exactly it is that you're doing, then probably search engines will have trouble, too.
And probably search engines will have trouble figuring out what they should be ranking this page for. It might look really nice, but if it doesn't mention what it is that you want to provide and what it is that you want to appear in search for, then it's going to be hard to show. Could you introduce a strategy for improving page indexing and reducing the time between the bot's first visit and the appearance of the material in the search results? I don't quite know which direction that goes. In general, when we crawl a website or when we crawl a page from a website, it can take a bit of time for those pages to appear in the search results. But it's not something that we kind of purposely slow down. It's more that sometimes things take a little bit longer to be processed than others. Sometimes we can show that within a minute or so of being crawled. So it's not that there is any kind of an explicit delay there. Who can we contact to receive the analysis of traffic and its fluctuations caused by the core update that occurred on September 24? I'm not aware of any core update that occurred on September 24. So that's kind of one thing with regards to receiving analysis data. The best way to get data for your website is, on the one hand, to collect it yourself, using any of the kind of normal analytics packages. And on the other hand, to use Google Search Console, where you can kind of track how your site is appearing in search and also get any information with regards to issues that might have popped up that our systems might be able to help flag for you. John, regarding these core updates, do you know if there are any plans to launch any new ones by the end of the year? Are there any? I don't know what the specific plans are there. So it's not the case that we stopped making core updates. I just don't know what the specific launch plans are. 
I think it's always tricky towards the end of the year because everyone's a little bit jittery, and probably this year more than ever. So there's a little bit of trickiness involved with, well, we'd like to make improvements in search, and we think our users deserve to see those improvements in search. But at the same time, we don't want to be super disruptive. So I don't know what will happen there. It's possible at some point maybe we'll still have an update. I don't know what the timing would be there because essentially, all timing is bad when it comes to the last quarter of the year. Right. But just curious, these kinds of updates are needed because there's no system to do this automatically on an ongoing basis. So is it like the old Panda and Penguin updates, where you kind of needed to aggregate a lot of data, then kind of crunch it and then push the results into Google? And then you kind of transitioned into doing that kind of ongoing automatically. Is it kind of the same with the core updates? I don't think you can really compare those that well. So that's kind of one thing. The other thing is a lot of times within the core updates we'll have bigger algorithmic changes. So it's not so much that we're changing, I don't know, the PageRank of all of these pages kind of thing. But rather, we're kind of making bigger algorithmic changes in the way that our systems show things in the search results. So it's something where if you change the algorithms with regards to how we rank things in search, then sometimes you just have kind of that jarring jump from one change to the other. But are these changes related to what happened between these months from the past core update to this one? I mean, does it take any data based on crawling data and indexing data and anything like that in order to kind of judge what the changes should be in the next core update? I don't know how to best answer that. Because we do look at what has been happening over time. 
And it's something where we obviously look at the data that we've collected until then. And we try to use some of that with the next algorithmic changes. And for bigger algorithmic changes, we do all of the usual tests, the A/B tests, with the quality raters, all of that. And essentially, that takes into account the data that's been collected until then. So yeah, I mean, I generally prefer if we could roll out these algorithms a little bit more smoothly, so that there isn't this jarring jump from one to the other. But sometimes that's just the way that it works out. This is why I was comparing it to the Penguin algorithms, because there was also that kind of big change overnight. So this is why I was wondering if it's something like that, in the sense that you are planning to smooth out these changes. So it's more an ongoing thing rather than all of a sudden. Now, I mean, if it were just a matter of changing the PageRank values or something like that, then that's something where you can smooth that out and easily kind of spread that out over the course of a couple of weeks or so, so that it doesn't look that jarring. I think, I mean, sometimes also having a jarring change is useful for site owners, because then you can understand a little bit where things have changed. But I generally prefer when things are smoothed out. One last thing. So are these changes usually related to kind of you understanding the data better? Or are they more related to maybe we should weigh this factor a bit more or this factor a bit less? Or is it kind of the same? But oh, now we better understand what this page is about or something like that. I think all of that comes together. Yeah. Makes sense, yeah. Yeah, I think there are always so many factors that are involved with these algorithmic changes. So it's always tricky. Cool. We're kind of running towards the end of time. Well, the end of this session, at least. Not the end of time, hopefully. 
If there are any questions from you all that you'd like to ask, feel free to jump on in. Hi, John. OK. You go ahead. You go ahead. Thank you. Hi, John. I'm Ben with my team. So now I'm a translator for him. We are Thai, so I have to say I'm sorry, my English is not that good. But I will try my best. Our problem is with Search Console, with the indexing, in the part of user decry. It's not our website. And we tried to figure it out, this problem, but we couldn't fix it. And we have a Google host in Thailand, and they couldn't help us as well. So do you have any suggestions for what we can do? And the second one is about our website, that we can't find our website on Google. It's the other website that I mentioned to you. So what can we do? Thank you. OK. That sounds a little bit more complicated. So my recommendation there would be, perhaps, to post the details as much as you can in the Webmaster Help Forum. We have a forum in English that I know is fairly active. Mihai, for example, is active there. I don't know if we have one in Thai specifically. But if you can post in the English forum, that would almost certainly, oh, yeah, the Search Central Help Forum. We changed the name. If you post there with the details of your website and maybe some screenshots of what you're seeing, that would really help. Because it feels like maybe there's something specific with your website or with the way that it's hosted or the way that it's shown in Search. And those are sometimes hard to kind of just give a general answer for. It would almost need to be that someone looks at your specific case. All right. Thank you. Sure. Hey, John. If I can just quickly jump in. This is all related to the question that someone else asked on the YouTube page, Solomia. 
It was around indexation and how you guys are serving pages after they run through your indexation process. Do you know of any delay that occurs between a web page being indexed and then served, a delay because of bad usability or mobile usability? Is there anything that exists like that, or does that just not happen? I don't think we have any delay based on quality issues or things like that. So I mean, one of the things that happens is we have to be able to index the content. And if we have the content in the HTML form, then we can pick that up right after the crawl. If we have to render that page first, then sometimes that causes a bit of a delay. But even that rendering delay for the most part is a matter of minutes. It's not a matter of hours or something that would be visible. And once we've indexed the content, then essentially it could appear in search. What might happen is that there's a bit of a delay until we rank it appropriately in search. So if you're searching for something and it's very competitive, then understanding, oh, this is a good page for that query, that might be something that has a delay. It might also be, depending on the query, an issue, well, not necessarily an issue, but a matter of understanding how timely this page is. So if you have a news article on your site and we don't recognize offhand that it's a news article, and it's a query where people are expecting news about that topic, then it might be that we say, well, this is more of a general article on this topic. It's not a news article, so maybe we won't show it for this query. So the indexing side is one thing where I think the timing should be fairly tight. And the ranking side is kind of a completely different question. Right, OK. I'll give you that on the ranking side, where it might not show because it's a really competitive term. 
But if you're searching for the exact headline, or if you're doing a site search and you're not seeing it there, would you say that's problematic, or is that normal behavior? I don't know. It seems weird, but it can happen sometimes. So for the most part, if you're explicitly looking for that page and you know that it's indexed, then it feels like we should be able to show that, unless there's something really kind of blocking with that page in particular. Yeah, that's kind of why I was circling back to the mobile usability and that potentially being a hindrance there. But is there an issue with Google being able to process things? I don't think so. I mean, if it's a matter of usability of the page but we have the textual content, then that shouldn't be an issue that would prevent us from showing it at all. It might be kind of going into more of the ranking side of things, where it's like, well, we could show it, but we're not showing it in the top position or something like that. But it shouldn't be something that would block us from showing that page. OK. OK, thanks, John. Sure. Hi, John. Can I follow up on that? Sorry. Sure. Yeah, so you just mentioned that if you recognize, or if you don't think an article is newsworthy, you wouldn't serve it for specific terms. I'm curious what you're looking at on that page that identifies that. Other than mentioning what people are kind of looking for at that time, are there any other things you're looking for on the site specifically? It's hard to say. So I don't think we have any explicit guidelines on what you need to do to make something look like a news article. Usually, I imagine that's something that we just pick up overall for a site. So it's less a matter of specific kind of meta tags or specific headings or specific site structure that you need to have there. 
It's more about understanding, well, there's an article associated with this page and we regularly see new articles on the site with new dates, kind of that general setup there. I don't think it's the case that we look for any one particular factor to say, well, this is a news article, or this is more of a reference article or an evergreen article. Thank you. Hi, John. Can I pop in with just a quick question? Sure. OK, so just a brief context. I have a website that was stolen about one year ago, and two weeks after that, the guy that stole my website redirected it to another domain. And two weeks after that, he redirected that domain back to my original domain. So meanwhile, I was able to recover my website with a court action in the US. So I entered a court action in Victoria's site. So I was able to recover my website, and that other domain is no longer working. So the domain that the other guy created is no longer working. But Google is still indexing that other site with the contents of my site, with the fresh contents of my site. So if I go to Google and I search for the other website, I will see the contents from my website from yesterday, from two days ago. So why is Google indexing a site that, when you go to it, returns a Cloudflare 522 error? But Google is showing my contents, my fresh contents of my real website. What kind of query are you doing? Is it the historical redirection from that site, where because it does not find the real site, it goes to that redirect? Well, what kind of query are you doing to see the other site? So if I do site, colon, and then the site, I see 13,000 pages with my contents. But for some specific queries, I see that other site also appearing before my site or at the same level as my website. OK. So usually, if you do a site query for a site that is redirecting, we may show the previous site for that. 
And that's something where our systems, when they look at the ranking, they try to understand what it is that you're searching for. And if it looks like you're searching for one specific website, then we'll show you that website in the search results, even if we know that it has been redirecting for a longer time. So you see this often with site moves. If you move from one domain to another, probably also in a case like yours. One way to confirm that this is the case is to look at the cached page of one of those results. And usually, if you look at that page, then the URL we show on top is from kind of the destination page. And that essentially means we're indexing your pages, but we know that these old URLs were also associated with them. So if you explicitly look for those old URLs, then we will show you those URLs. But if you look for the normal content, then we will show you your website. So that's kind of the one thing there. If you're still seeing it for generic queries, where someone is not explicitly looking for that old website, that seems more like a bug on our side, or an issue of us just not having processed it for those particular URLs. So you're welcome to send me some examples there, and I'm happy to take a look. But in general, also, that's something that might just depend on the timing, where if you've kind of updated your site and it's fairly fresh and we just haven't recrawled those old URLs and understood that actually these are separate now, then that's something that can still take a couple of months even after these changes are made. And one way to kind of help with that is to make sure that on your website you're submitting a proper sitemap file, so that we really understand this is your website and we can recrawl your website as quickly as possible with your fresh content and show that properly in search. Yeah, OK. So how can I send you some examples? Is it a Twitter message or? Let me just drop my email here. In the chat. 
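To illustrate the sitemap suggestion above: a sitemap is just an XML file listing the canonical URLs of your site, optionally with last-modified dates, following the sitemaps.org protocol. This is a minimal sketch, not taken from the transcript; the domain and paths are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap listing the canonical URLs of the site.
     example.com and the paths below are placeholder values. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-11-13</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/news/latest-article</loc>
    <lastmod>2020-11-12</lastmod>
  </url>
</urlset>
```

You would then point Google at this file either with a `Sitemap: https://www.example.com/sitemap.xml` line in robots.txt or by submitting it in the Sitemaps report in Search Console, so that the fresh content on the correct domain gets recrawled sooner.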
And then, yeah, but like I said, the site query results, I think those will continue to be there for a longer period of time. I've seen cases where we've kept that connection for several years, even though the site has moved to a different domain. But if you explicitly look for the old domain, our systems will try to be helpful and say, oh, we know what you're looking for. So if you're using it for diagnostic purposes, that's not very useful. OK, OK, thank you, John. Cool. OK, we're essentially out of time here. I have a little bit more time that I can stick around. So I will just pause the recording for the moment. If any of you all want to hang around a little bit longer and continue to talk about these topics, you're welcome to do that. Otherwise, I wish you all a great weekend, and hopefully see you all again in one of our future Hangouts. Thanks, everyone. Thank you, everyone. Bye. Bye. Bye.