 All right. Welcome, everyone, to today's Google SEO Office Hours. My name is John Mueller. I'm a search advocate at Google here in Switzerland. And part of what we do are these office hour hangouts where people can join in and ask their questions around their websites and SEO and Google search and all of that. Bunch of things were already submitted. But as always, if any of you want to get started with the question, you're welcome to jump on in. Hey, John. How's it going? Hi. So my question is pretty quick. There's this website that keeps showing up in search results in very prominent positions. And the problem is that when you actually click on the links from that website, it actually takes you to something else, completely irrelevant for those queries. And obviously, this is taking up valuable real estate from other websites that would be more relevant for those terms. So my question is, first of all, why doesn't Google pick up on these situations and remove those websites from the index? And because in this particular example, I've been seeing this publication for over a year. And if I may, send you in the chat box, perhaps, the URL so you can investigate. Sure. Yeah. Like, feel free to drop a link in the chat. In particular, also queries where you're seeing this kind of an issue. It kind of depends a bit on what is actually happening there. If it's more that they're returning content which isn't great, then that can happen. If it's a matter of being redirected to malware or to phishing content, then that's a little bit of a different situation. Usually, that's something where if you submit it to the appropriate forms, then someone from that team will take a look at that. But if it's just low quality content and you don't think it should be ranking like that, then sometimes that's a matter of discussion rather than something where there is like a clear yes or no thing that we can do or where the web spam team would jump in and make a manual action. 
But I'm happy to take a look at the examples and pass that on to whatever team needs to take a look there. Cool. Thank you. Hi, John. How are you? Hi. So my question is about the Web Vitals report in Search Console. So we are working on Web Vitals. And Search Console is showing that there are a few pages on the website that have a poor CLS score. So when we check those URLs in PageSpeed Insights or other tools like webpagetest.org, the score is pretty good, like 0.06. But Search Console is reporting it around 0.60 or more than that. So I'm not sure what we can do here, because Search Console is saying one thing and PageSpeed Insights is saying something else. So what do you suggest we can do here? Yeah, I think what you're seeing there is the difference between lab data and field data. So lab data is when you do a theoretical test and you're kind of theoretically looking at the website using your connection, using your browser, or using a standardized browser and standardized connection, or what have you. And field data is the data that is actually collected from users when they're using Chrome and accessing the website. And in many cases, the lab data is kind of predictive of the field data that you would see there. But there definitely are situations where what you see in your kind of theoretical tests doesn't match exactly what your users are actually seeing. And that can happen, for example, if a lot of your users are in locations where they have bad internet connectivity, or where they use devices that tend to be different than what you're testing for. And that's kind of where that difference tends to come from. Well, in our case, actually, Search Console is giving 0.60 for that URL. But PageSpeed Insights, in both the field data and lab data sections, is showing around 0.04 or 0.06. So only Search Console is flagging some kind of error; PageSpeed Insights and other tools are not. The field data is pretty good, 0.06.
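To make the lab-versus-field point concrete: both reports bucket CLS against the same published thresholds, so a disagreement like 0.06 versus 0.60 is a difference in the measured value, not in the scale. A minimal sketch of that bucketing, using the thresholds from the public Core Web Vitals guidance (good is 0.1 or less, poor is above 0.25):

```python
def classify_cls(cls_score: float) -> str:
    """Map a CLS value to the bucket the Core Web Vitals reports use."""
    if cls_score <= 0.1:
        return "good"
    if cls_score <= 0.25:
        return "needs improvement"
    return "poor"

# The two numbers from the discussion fall into different buckets,
# which is why the reports look contradictory at first glance.
print(classify_cls(0.06))  # -> good (the PageSpeed Insights value)
print(classify_cls(0.60))  # -> poor (the Search Console value)
```

Note the values John discusses come from the Chrome User Experience Report either way; the classifier above only illustrates why a 10x difference in the measurement flips the bucket.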
Lab data is 0.04. But Search Console is saying that it's 0.6. Yeah, I mean, the Search Console data is based on the Chrome User Experience Report data, which is the field data. So it's not that Search Console is doing separate tests there. And what might just be happening is that there's kind of a time delay that you're seeing there. But essentially, they're based on the same data. The other effect that you might be seeing is that we're grouping things slightly differently in Search Console than when you test them individually. But that's also something which tends to settle down over time. So should we wait? I would continue to keep an eye on it, but Search Console is not inventing these numbers. Like, it's not making these measurements itself. It's basing them on something that it has seen from the field data. So I would definitely take the time to investigate a little bit further, try other parts of your website, and try the Chrome User Experience Report field data directly, all of those things. Sure. OK, sure. Thank you. Sure. Hi, John. How are you? Hi, John. I don't know who was first. He can go. It's OK, no problem. OK. It's OK. OK. Thanks. So John, I have a question about international SEO. So we have multiple websites for different countries. And actually, they have similar content in English. We have used, of course, hreflang for each website. And now, actually, we are facing a major decision, whether we should keep it like this with different ccTLDs or just switch to a generic domain with subfolders. So we're not sure yet which one to choose. Yeah. From our side, both of those approaches would be valid. So it's not something where we would say there's an inherent SEO advantage of ccTLDs or a generic domain with subdirectories. Both of those would be valid approaches. The only thing I would keep in mind is, if you make a change, try to make a change that you can live with in the long run.
Because every time you restructure your website, you will see some fluctuations. It will take a bit of time for everything to settle down again. So it's not so much that it'll be better automatically or worse automatically. It's just during that transition time, which might be a month, might be two months, you will see fluctuations. And Search will have trouble understanding the full picture of your website. So if you make a decision, make it based on what you want to have for the long run and kind of realize that you will see fluctuations. So maybe plan the timing in a way that it doesn't affect your business overall. That you can say, well, people are not searching so much during this time, so I'll do the migration there. So just to understand, having this kind of structure of multiple websites is fine in terms of ranking factors. So it doesn't matter whether we switch to subfolders or keep it like this. Yeah, exactly. For SEO reasons, both of these are valid. Sometimes there are policy reasons to change. Sometimes there are practical reasons, like the infrastructure is more expensive or takes more time to maintain. All of those, I think, are valid reasons. For SEO reasons, you can do either way. OK, and you've said something interesting about change. Actually, our URLs, like for the blog posts, let's say the current path is URL slash the post name. So is it bad, for example, to change the structure to site.com slash blog slash post name? Could this change harm our rankings? It's very important for us to know this. You will also see fluctuations when you make that change. But afterwards, it can be the same as before. So it's not that you automatically have an advantage or disadvantage from changing the URLs. But it will just take time to settle down and be understood. OK, thank you. You can go, Mike. Sorry. All right, thank you. Hey, John, I have a question regarding search intent. How does Google handle search intent?
Does BERT have anything to do with it? And what can I do to optimize my website for better search intent? I mean, search intent is kind of based on what users are looking for, what they're trying to do. So it's not so much that you would optimize your website for search intent, but rather you would optimize your website for what it is that your users are looking for and how they're searching. So that's something where you could look at the queries that are going to your website. You could also do user studies for kind of a broader batch of users who maybe are not going to your website already. And based on that, try to figure out what aspects do I need to cover there. So for example, if a user is searching a specific query, a long tail query, then I should have that exact sentence on my website. Is that what you're saying? It doesn't have to be the exact sentence. But if you're seeing that as a pattern of something significant with regards to traffic to your website, then you can kind of assume people have this kind of a question and they're trying to find information on that. So it's less a matter of you have to match that sentence one to one. But if you think this is an important query, then have some content that kind of addresses those questions. So it doesn't have to match exactly, but it would be good if it helps to kind of solve the problem that you think these users are having. And BERT has nothing to do with that, or does it have something to do with that? I mean, BERT is more about better understanding the queries and better understanding the content. So it's not that you would be optimizing for BERT or anything like that. It's more that we just try to match the content and the queries a little bit better. OK. Thank you. Sure. Can I also offer a small point of view here? We generally try to figure out kind of the same thing, the search intent behind certain queries, in order to kind of understand what kind of pages should be created or optimized.
And this is specifically for more general queries where, not even knowing the audience, you're not really sure what people are looking for. So for example, somebody searching for something like progressive glasses. You're not sure if they're searching to buy something. You're not sure if they're searching to understand what they are, or searching for information. So I think a good way to go about it is basically to check the top 10 results and kind of, so to speak, reverse engineer what Google thinks the most relevant results should be, like maybe more informational results. Because if there are just informational results and you're trying to improve or optimize a commercial page, that might not have a lot of chance to rank, because it's not really relevant based on the intent; the user is looking for more information about the subject, not necessarily to purchase anything. And that can change. I've noticed, for example, around certain holidays like Black Friday and things like that, the search intent might actually change a bit, and the search results might change to be a bit more commercial maybe, rather than informational, or something that's newsworthy might change that a bit as well. But that's kind of the general idea of using what else is ranking in the top results to kind of understand what kind of results, what kind of page you should create in order to make sure you're relevant for whatever the users are searching for. I also noticed too, depending on the search query, that the tabs at the top of Google will change. So whether it's an image, a video, a map, whatever it is, that order will actually change based upon the query. So does that give me any insight into what information I should have, and the hierarchy of the information I should have on my page? I think it's useful to take a look at these kinds of things and kind of the existing top rankings.
But you need to also kind of keep in mind that you're kind of optimizing for the current situation. Especially with a query like Black Friday, if in the summer you're saying, oh, I want to optimize for Black Friday and you search for it, then you'll find informational content on that. Whereas during the time that you care about, that informational content is probably not the right thing to kind of focus on with your website. So it's something worth keeping in mind that the current results are kind of representative of the current state of our understanding of those queries, but not necessarily of the long term understanding. Gotcha. John, question about the SafeSearch filter and potentially content that I'm concerned is getting filtered by SafeSearch. And I can see why, potentially. But I want to try and understand, is there a way to definitively tell if content is being filtered by SafeSearch? And if so, once we can determine that and I make changes to try and get around that, do I need to wait just for Google to recrawl that content or those images, or does it need to pass through another algorithm and filter, or manual action, or something like this? The easy way to test if SafeSearch is kind of triggering for that content is to just turn the SafeSearch filter on and off. So doing something like a site query, and then, in the search URL on top, you can just add ampersand safe equals on, I think, or safe equals off. And based on that, you'll see if there are different results. And if there are no different results, then SafeSearch is essentially not affecting that site. Okay, so just a site query with SafeSearch on or off and analyzing those results. Okay, simple as that. Yeah, sure. Yeah, and with regards to kind of changes that you make on your website, that is something where we have to reprocess the content and we have to kind of re-understand it a little bit. So it's a little bit longer than just recrawling.
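The on/off comparison described above can be scripted by rewriting the query string of a site-query URL. A small sketch of that, with the caveat that the exact parameter value is John's from-memory guess ("safe equals on, I think"), so treat the value as an assumption to verify in the browser:

```python
from urllib.parse import urlencode, urlsplit, urlunsplit, parse_qsl

def with_safe_search(search_url: str, setting: str) -> str:
    """Return the same search URL with the safe parameter forced
    to the given value (e.g. 'on' or 'off')."""
    parts = urlsplit(search_url)
    query = dict(parse_qsl(parts.query))
    query["safe"] = setting  # overwrite any existing safe parameter
    return urlunsplit(parts._replace(query=urlencode(query)))

# A site query for the domain in question, run once per setting;
# differing result sets suggest SafeSearch is filtering the site.
base = "https://www.google.com/search?q=site:example.com"
print(with_safe_search(base, "on"))
print(with_safe_search(base, "off"))
```

The comparison of the two result pages is still done by eye, as in the transcript; the helper only makes sure both URLs are identical except for the filter setting.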
But if it's individual pages, usually that's pretty fast. If it's a broader part of your website that we kind of misclassified in that regard, then that can take a bit longer for us to kind of aggregate those signals on a broader level. So if it's an image, images we're specifically concerned about, that would affect the entire URL that those pages are on, or those images are on, or just, yeah. I mean, yeah, because we associate the image with the landing page. And if there's something problematic in the image, then it's very likely that we'll just say, well, kind of this pair of landing page and image is the thing that we have to filter for SafeSearch. Got it. And then I think earlier, at the end of last year, you had mentioned that if the site is being flagged in SafeSearch, or has been flagged as having adult content, that it won't be eligible for rich results. Is that across the board or on those pages specifically? Or did I misinterpret that? I think that's specific to the pages that are affected. So with regards to SafeSearch, it's something where we try to do it as granular as possible. But we can't always do that. So sometimes, if we have individual pages across the site that we think need to fall into a SafeSearch category, it's possible that our algorithms will just say, we'll take the whole domain and say, we don't know for sure. So that's something that kind of plays in there as well. It's like the individual pages are essentially what we try to focus on. But if we don't know for sure that these individual pages are OK or not, and it's not like a quality kind of assessment, it's just like SafeSearch or not, then it's possible that we'll filter that out for those pages. John, I think we talked about this in like 2014 or so. When all these female celebrities were being hacked, I had an entertainment site. And we were doing legitimate reporting about so-and-so being hacked. So naked photo or naked video were the words.
And just by reporting on it, the sheer number of those women who kept being hacked, we were getting bucketed, I think. You said that the systems were sort of accidentally bucketing. Not accidentally, but they were seeing, because it was over two or three days, a ton of that content, that you're writing repeatedly these words and this subject matter, that you could get bucketed. I think that's what happened about six years ago or so. Yeah. Hi, John. I might have a short and sweet question. Google earlier announced that the rollout date of the new page experience ranking signals, like LCP and FID and things like that, was May 2021. I was wondering if this is a global rollout in May 2021, or are certain countries going to be before that date or after that date? Asking specifically for the Belgian market. My understanding is that this is a global rollout. So it's something that we collect on a global scale. And there is no big reason to not do that on a global level if we can. So my understanding is it will be global. With regards to experiments before that, I don't know for sure. So in particular, we want to do that badging of the search results when we see that a site is within the good page experience bucket. And usually, badging is really hard to do, is really hard to get right. So we have to do a lot of tests before we figure out which way to badge these search results. And my assumption is that we will have some tests around the badging before that time. And some of those might get seen. Some of those might not get seen. But it's something where, when we make visible changes in search like that, we really have to be sure that we do them in a way that the average user can understand them fairly quickly. OK, thanks. Hi there, John. Yeah, I work at a camper van rental company. And recently, we have been facing a big problem that I've been losing a lot of time to over the last few months.
Basically, in specific regions of the UK, we see that for specific queries that have kind of the same terms, the search intent is exactly the same, I think. I'm going to give you an example: "campervan rental" and "camper van rental", so the same query with and without a space. We see that our competition has consistent results. They basically rank on the first page. And for the two queries, they keep the same position. While our company, for "campervan rental", for example, goes like top five results, and for "camper van rental", we go to the 60th or 80th position. And I went deep into this problem, and I checked how many times I use this word. I hate to go keyword by keyword, but I actually did it this time. And I see they don't use it. There's no discrepancy here. They use it as much as we do. And I'm trying to figure it out, and I'm losing my mind about this. OK, I suspect what is happening there is that we don't see these words as being 100% synonyms. And that's something that our systems try to pick up automatically. So it's not that we have, or at least as far as I know, I don't think we have linguistic models that are there to try to figure out what are the possible synonyms for these words, but rather we see these are unique queries. And we try to understand what the entities are behind that and try to match that as much as possible. But probably we don't see them as 100% synonyms. Otherwise, the results would be exactly the same. And that's something that usually just works itself out over time, in that when we have enough data to understand that, OK, people searching for the words together mean exactly the same thing as with the separate words, then that's something that we'll take into account and say, OK, this is more of a synonym rather than just kind of very similar. And with regards to optimizing for something like this, improving your rankings, I don't really have any super, super fancy tips there.
It's more a matter of, if you're seeing that a lot of people are searching in one particular way, then maybe use that way on your website as well, so that we can find that appropriately across your site. And if you're seeing kind of a balance between those two ways, and you're seeing that Google isn't kind of matching them as synonyms or as being kind of close together, then maybe it makes sense to mention both of those variations on some pages. It is kind of an awkward shift into the kind of traditional SEO picture of, like, oh, they're just optimizing for all synonyms. And usually, we figure that out fairly well. But sometimes, it's kind of more practical to say, well, Google will figure it out at some point, but I need to solve it for myself now. So I'll just kind of do the awkward thing. All right, thank you very much. And there was a video, I think, what was it, fall 2019, from the Webmaster Conference in Mountain View, from Paul Haahr, who is one of the top engineers that works at Google on search quality. And he mentions quite a bit around synonyms and how we pick that up, how we understand which words belong together, which words don't. It's not going to solve your problem, but it'll give you a little bit of background information on what might be happening there. All right, thank you very much. Sure. Hi, John. Our website has about 60,000 products, and we have a rather robust layered navigation that uses a number of product attributes. We're preparing to go through a migration to a new system where our search parameters are going to be changing from the current system to the new one. And we were hoping to preserve some of our SEO value for many of those parameters, since they do appear in the search results pages. What is your suggestion on the best approach to take to preserve the SEO value for those legacy URLs once we're on the new system? Should we preserve those? Should we translate them? How do you see that playing out? No.
I think any time you do a migration on a URL level, ideally, you would redirect the old URLs to the appropriate new ones. So if you're changing parameters, or the setup of how you use the parameters within your website, and you care about the old or the new URLs, then make sure that you have redirects set up. And with that, we can pretty much pass all of the value there. And if you have a very complex setup for your website where you can't just set up redirects for everything automatically, then I would at least take a look at the traffic that you're seeing from search for the URLs on your site, and make sure that the most important URLs, or the most important URL patterns, are covered with redirects. So we shouldn't do anything like noindex the legacy URLs or anything like that? No, I would not. I mean, if it's something where you care about the traffic to the old URLs, that it kind of gets forwarded to the new ones, then I would definitely make sure you have redirects set up. Or at least, if you can't do a redirect, then you have to set the rel canonical to the appropriate new URL, so that we can forward any of the value collected with those old URLs. If you use a noindex and just create the new ones, then essentially all of the value that was collected by the old URLs gets dropped. Oh, that's great to know. Thank you very much. Sure. OK, let me run through some of the submitted questions. And we'll have some more time for other questions from everyone along the way. I think some of them we might already have covered or gone through. So the first one I have here is: our business is going through a rebrand and a system update, or planning on executing quite a hefty migration. And then they list a ton of different things that they're doing, like a domain migration, CMS migration, consolidating pages, updating content, updating the site structure, template redesign, switching hosting providers. And kind of the question is, how should we do this?
From my point of view, these are all pretty big changes. So it's something where you will definitely see fluctuations along the way. And that's something where, if you plan to do so many different changes because you think the end state will be important, then you could theoretically spread that out and do it one after another. But practically speaking, that's probably not going to work. So practically, you're probably going to have to bite the bullet and say, well, I will switch from the old setup to the new setup, and make sure that you have all of those details of the migration kind of nailed down. In particular, all of the redirects that you need for the individual URLs, any internal linking that you have, structured data that uses URLs, all of that, that it matches the new setup. And usually that means creating tons of spreadsheets and really big lists of URLs and matching things, so that beforehand you have a really clear plan of what you need to do. And once you do the migration, you can do kind of a QA of the migration and really confirm that you have everything lined up. And kind of that confirmation that you have everything lined up is really critical, because you will see fluctuations, at least intermittently, while things are being reprocessed. And you need to be sure that you can kind of trust that these fluctuations will settle down. And you can only kind of trust that if you're sure that you have really done all of the details right. And that's something where you don't want to wait for everything to settle down to figure out, oh, I forgot to redirect my blog and now everything is gone. And with regards to what's potentially the most dangerous thing for the rankings there, it's important to keep in mind that you will see changes in ranking, at least during the migration. You will see different rankings afterwards as well.
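Part of the "tons of spreadsheets" QA step can be automated before the migration ever goes live. A sketch of that idea (the URLs and helper name are hypothetical): sanity-check an old-URL-to-new-URL map for self-redirects and redirect chains, the two mistakes that most often slip into large mappings:

```python
def audit_redirect_map(redirects: dict[str, str]) -> list[str]:
    """Flag common mistakes in an old-URL -> new-URL migration map
    before it reaches the server configuration."""
    problems = []
    for old, new in redirects.items():
        if old == new:
            problems.append(f"self-redirect: {old}")
        elif new in redirects:
            # The target is itself redirected: a chain that should
            # be collapsed to point straight at the final destination.
            problems.append(f"chain: {old} -> {new} -> {redirects[new]}")
    return problems

redirects = {
    "/shop?color=red": "/shop/red",   # hypothetical legacy parameter URL
    "/shop/red": "/products/red",     # oops: the target moved again
    "/about": "/about",               # oops: redirects to itself
}
for problem in audit_redirect_map(redirects):
    print(problem)
```

After launch, the same map can drive a crawl-based check that each old URL actually returns a redirect to its mapped target, which is the confirmation step John describes.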
If you have a really clear plan and you have a really nice setup for the final configuration there, maybe you will see positive changes overall. If the final setup is a lot more confusing than the initial setup was, maybe you will see lower rankings. So it's not possible to guarantee that the final configuration will be equivalent in terms of SEO to the initial one. And sometimes it's not even reasonably possible to determine what order of magnitude the final setup will be. In particular, if you have multiple sites that you're combining into a single site, it's not really possible to just add up all of the traffic the individual sites got before and assume that your new site will get it. It's possible that your new site will be so much stronger that you'll get a lot more traffic. It's also possible that the combination that you're doing essentially results in one site rather than a couple of sites and that you might lose some visibility in search. So if you're doing this for the first time, I would definitely recommend getting help from someone who's done it before so that you can hear some, I don't know, war stories around migrations, what went wrong, and make sure that you don't do those things. Do you have any information regarding web stories when they might be released in Australia? We want to be early adopters, so if we can get any estimation. I don't have any information on when the new features will launch, particularly web stories in Australia. With web stories, it's important to keep in mind they are normal pages too, so you can already adopt and use these web stories on your website, and we will crawl and index them like normal pages as well. And when we start to show them more visibly, you'll see them a little bit differently in the search results. You might see them more prominently in Discover as well, but essentially, you can go ahead and implement these already. 
With regards to us kind of turning on individual features in individual countries, sometimes that's a matter of working out the policies and working out the technical details. Sometimes that's also a matter of us being able to look into the data from that country and say, oh, lots of people are already implementing this structured data. Maybe we should just turn it on for them. So in that regard, if you're tempted to try something like this out, I would say go for it, even if you're not seeing all of the advantages yet. Please detail the difference between a doorway page and a landing page and why doorway pages are frowned upon. So usually, I think we have a Help Center article about this, but usually doorway pages are individual pages where you're just funneling people to the actual part of your website that you care about. And my assumption is that for the most part, our systems are able to understand these doorway pages better nowadays anyway so that when we look at these doorway pages, we see, oh, this is a kind of a low quality page. It mentions these keywords, but it's kind of a low quality page, so we will probably not show it as visibly in search anyway. But essentially, the idea is that instead of setting up a number of different pages for just variations of different keywords, make one really strong page that really kind of covers that topic really well. So that our systems can recognize this is a fantastic page. And appropriately, we can understand, well, this is a part of this bigger website, which is also really fantastic. And that helps us to kind of better rank those pages overall. Will adding history API functionality into infinite scroll pages, is that enough? Or do I need to add physical pagination links on top of this? Yes, if you're using infinite scroll, then having paginated links is very important for us, because then we can crawl and index those paginated links separately. 
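A sketch of what "having paginated links" alongside infinite scroll might look like, with a hypothetical ?page= URL pattern: the point is to render plain anchor elements, so a crawler that never scrolls or clicks can still discover every page in the set.

```python
def pagination_links(base_path: str, total_pages: int, current: int) -> str:
    """Render plain <a href> links for each paginated URL so a crawler
    that does not trigger scroll events can still find every page."""
    parts = []
    for page in range(1, total_pages + 1):
        if page == current:
            parts.append(f"<span>{page}</span>")  # current page, no self-link
        else:
            parts.append(f'<a href="{base_path}?page={page}">{page}</a>')
    return "\n".join(parts)

print(pagination_links("/blog", total_pages=3, current=1))
```

The infinite-scroll JavaScript can then sit on top of these URLs, loading the next page's items when the user scrolls, while the links remain in the HTML for crawling.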
History API is something we can sometimes pick up on, but often it relies on the user doing something specific on a page, like scrolling to a specific location. And Googlebot, in general, when rendering a page, does not trigger any of these actions, but rather renders a page in kind of a really long viewport. And maybe that will trigger your History API or not. Whereas if you have links to the individual paginated versions of those pages, then for sure, we can figure out, like, these are the different pages, they belong to this one list, and we can index those individually. So the History API alone, without using links on the page, is kind of hit and miss. And depending on how you implement it, it's likely that we won't be able to pick up on those paginated versions. The other thing to keep in mind is sometimes we don't need to trigger infinite scroll on a website. So in particular, if you have something like a blog and a category page and you use infinite scroll there, if we can already find all of those blog posts, then we don't really need to trigger infinite scroll. So it's not something where you'd have to say, well, if we use infinite scroll, we must make sure that it's crawlable, but rather it's a good practice. And if you care about those pages being indexed, then sure, you should watch out to make sure that they're crawlable. If you don't care, that's totally up to you. Please tell us about Google Sites search. So I don't really have a lot of information about Google Sites. I think we have a help forum specifically for Google Sites, and I would recommend going there. So I don't really know what the kind of things are to watch out for with regards to Google Sites. John, since you mentioned earlier regarding pagination, I know there is an old video from Maile Ohye regarding what kind of options you have in terms of pagination, basically the View All page, or it was rel=next and rel=prev back then.
And she mentions in the video that if Google does find the View All page on the site, it will usually prefer to show that page in the search results. So obviously, if you don't want that, then you can use the normal pagination and Google will kind of figure it out on its own. What I noticed is that for a couple of our clients, where we have a View All page, it's simply there because the platform allows you to see kind of all of the results. And there's a parameter like page equals all or something like that. Google will actually use that, index that, and rank that, despite us not wanting it to. It has a canonical to the non-parameter version. But Google simply seems to be ignoring it. And I wonder whether it is because Google kind of figured out, oh, this is a View All page, this is a strong signal for us to kind of use it, despite having a canonical to something else. And those kinds of View All pages, in some cases, can create a bit of a problem, because they load, in some cases, 200 or 300 products. So the page kind of starts to load fairly slowly. And other than a redirect, I don't see how else we could kind of push Google to stop using that page and just use the canonical version instead. Yeah, I don't know what we do with the View All pages at the moment. My guess is that you might be linking to that View All page from all of the pages within the paginated set. And then we'll say, oh, this must be a really important page. We should focus on it instead of the individual paginated versions. That might be where we're picking that up. But that then kind of depends on, well, do you want to keep a View All page or not? And if you don't want to keep a View All page, then obviously, don't link to it. And then we'll kind of work that out. If you do want to link to it, then maybe we will end up... Yeah, it looks like we're linking to it. I thought it was just a dropdown with no actual anchor links, like HTML a tags. But it looks like they are HTML tags.
So yeah, that seems to make sense. So removing it might lead Google to drop it and use the... OK. Cool. Then a question about backlink badges. So, I don't know, this seems like something that feels a little bit older. But the question is: a few days ago, I invited members of my website to use an HTML snippet that shows a badge on their websites, linking back to their profile page on my website. Users may edit the HTML snippet to their needs. The default alt text of the picture is one of the site's main keywords. Using the badge is entirely voluntary. There are no incentives whatsoever. And using the picture itself without a backlink is also possible. In other words, users will use the badges solely to show their appreciation for the website in question and promote their profiles on that website. Is there anything wrong with that? That sounds essentially OK. I mean, that's something where if there is really no kind of value associated with that link in terms of, like, you must link to my website with this badge in order to gain access to these features or whatever, then that's perfectly fine. Some sites do this in other ways. For example, they'll have a link on a page that says link to this page. And you click on that, it opens a little text window where there's an HTML snippet that you can use and just copy that to your website. That's also perfectly fine. The important part is really that this is completely voluntary, and that people are able to edit this kind of markup if they want to use it or not. That's essentially the important part. Then there is another question about CLS and kind of PageSpeed Insights and the field data. I think we talked about that a little bit beforehand as well. I think it also goes into kind of the difference between the field data that you see in PageSpeed Insights and the data in Search Console. So I'll definitely pass this on to the Search Console team to take a look at.
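A badge snippet along the lines described in that question, voluntary, editable, with the link pointing at the member's profile, might look something like this. All URLs, filenames, and names here are hypothetical:

```html
<!-- Hypothetical example of a voluntary member badge -->
<a href="https://example.com/members/jane-doe">
  <img src="https://example.com/badges/member.png"
       alt="Member of Example Community" width="88" height="31">
</a>
```

Members would be free to edit the markup or drop the link entirely, as described. If the badge were instead required or rewarded in some way, qualifying the link with rel="nofollow" would keep it within the guidelines.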
If you have any examples, feel free to drop them in the chat or send them my way, maybe on Twitter, for example. Having specific examples makes it a little bit easier. Does it negatively affect SEO for internal linking if an anchor tag only contains a button tag? Does Googlebot take the text inside the button tag into account as a signal for internal links? Or would it be better to use plain text inside of an anchor tag? So at least as far as I understand it, by default, a button element on a page is essentially tied to a form element. And you could use JavaScript to trigger kind of a navigation to a specific URL, which makes it kind of like a link. But essentially, Googlebot won't click on these buttons to see what happens. So we would not see that there's a link associated with another URL within your website. So in that regard, if you want to use kind of something that looks like a button for internal navigation, then I would use normal HTML links and just style them with CSS to make them look like a button, rather than use button elements in HTML and then add JavaScript that kind of makes them act like a link. Question about internationalization. We have the same version of the website in English for different countries. And we use hreflang. And I think this is similar to the question we had in the beginning. So currently, we're using different ccTLDs for the individual countries. And we'd like to move to a generic domain with subdirectories. And yeah, like I mentioned, we support both of those setups with regards to international websites, as long as the main domain is really a generic top-level domain. So you mentioned, or the question here mentions, .io. I believe .io is theoretically a country code top-level domain. It might be that we treat .io as a generic top-level domain. You can look that up in our help documentation. But I believe by default, that's a country code top-level domain.
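To put the button-versus-link advice from a moment ago into markup, the recommendation is a real link styled as a button, not a button scripted to act like a link. The class name and URL below are made up:

```html
<!-- Crawlable: a normal link that just looks like a button.
     Link discovery sees the href and the anchor text. -->
<a href="/products" class="btn">Browse products</a>

<!-- Not crawlable as a link: there is no href for link
     discovery to pick up, and the button won't be clicked. -->
<button onclick="location.href='/products'">Browse products</button>
```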
And for country code top-level domains, you would not be able to set geo-targeting. So that's one thing to watch out for. Sometimes the cute top-level domains are actually country code top-level domains. And it's worthwhile to double-check that you can actually set the geo-targeting in Search Console before you kind of invest everything to move all of your international versions into that setup. John, I had a query about international websites. So I just wanted to understand: say there is one page on the top-level domain, and we have decided to go with ccTLDs, or we have decided to create separate country pages for the same page. In that case, obviously, the older page was having a lot of backlinks, but now the new pages won't be having those backlinks. In that case, how does Google deal with backlinks to the main page if we have expanded to different country-level pages? So, for example, I have right now one page on .com. And we have decided to go ahead with three separate countries, because from them we are getting a good number of leads. So we have planned to go ahead with separate country pages for those countries, like PK for Pakistan, IN for India. But the problem is that the older page is ranking in India. And if I am introducing a new IN page, that won't be having any backlinks. But the hreflang tags would be mapped properly. Yeah. So if you create new pages and they're not linked internally, then we would see them as separate pages. With regards to hreflang, what would happen there is, if we understand that these pages are part of a set and we see a user from a different country doing a search where we would show that page in the search results, then with hreflang we would swap that out. But by default, if you expand your website to include more countries, then you're kind of, in a sense, diluting the value of your pages by expanding it across multiple country versions.
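For reference, a hreflang set of the kind being discussed is a group of annotations placed identically on every page in the set, each page listing all of its alternates including itself. The URLs and country targeting below are hypothetical:

```html
<!-- The same full set of annotations goes on all three pages -->
<link rel="alternate" hreflang="en-in" href="https://example.com/in/widgets/">
<link rel="alternate" hreflang="en-pk" href="https://example.com/pk/widgets/">
<link rel="alternate" hreflang="en" href="https://example.com/widgets/">
<link rel="alternate" hreflang="x-default" href="https://example.com/widgets/">
```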
So you really have to keep in mind that you should also focus on those country versions, that you shouldn't just say, oh, I can make like 50 different country versions of my website now. And you shouldn't assume that copying everything onto 50 different versions will make it more visible; it might even make it less visible, because you're kind of diluting the value across all of those. So if you're adding individual pages, then I would definitely make sure that you have normal HTML links between those existing pages and kind of the individual country pages, so that we can understand how those new pages kind of map within your website. And hreflang helps us a little bit, but hreflang is not the same as a normal link on a page. So in that case, internal linking is fine, we will be doing it. The hreflang tags we will also be doing. My only concern was that the older page was getting the backlinks from India. Now there is a new India page targeted for the India market. In that case, will the hreflang tag swap all external backlinks to the new page? It depends on what you do. Like, if you take that existing page and you link to your new India page, then we can forward some of that value. It's not the same as kind of redirecting or pointing everything at the existing page, but it does spread some of that value. So it's not the case that you have to kind of work to get individual links to all variations of all of your pages, but rather that kind of spreads naturally with the internal linking of your site. Thanks. Sure. All right, let me see. One, I guess, quick question here. If I gain dofollow backlinks by paying bloggers to write highly relevant review articles or paying for high-quality PR news articles, are they paid links that go against Google's guidelines? So I feel like this question is asked a little bit pointed. And I guess the quick and easy answer is yes, if you're paying people to create content with links, then you're paying people for those links, right?
And if you're paying for links, then that would be something that would be against our Webmaster Guidelines. So that's kind of the easy answer there. Of course, if these links do not pass PageRank, if they have rel=nofollow attached to them or rel=sponsored attached to them, then that can be fine. That's essentially a way of advertising your website. It doesn't pass any value to your website, but it still helps users to find your content, and it indirectly helps to promote your content and your website. So maybe that helps a little bit. Then a question. I think there are two similar ones like this. With regards to structured data types for tourism companies on the schema.org website, there are different suggestions for the travel sector, such as TouristTrip, TravelAgency, or Trip. Are the schema types mentioned above supported or relevant for Google indexing and ranking? If not, what would be the most suitable structured data type for a trip landing page? So it's important to differentiate between kind of the broader schema.org ecosystem, where there are lots of different ways that you can add structured data to individual pages, and the Google side of that in terms of Google search. So in particular, we support a subset of the functionality, or the different types of markup, from schema.org. And those are the things that we would show in the search results. And everything else is essentially kind of, I don't know, more like a nice-to-have, or more like something where you're adding a little bit of extra information to your pages, but I would not expect to see any change in search because of that. And I think that refers to all three of those types in that sense.
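To tie that back to the paid-links answer a moment earlier: a paid placement that stays within the guidelines qualifies the link so it doesn't pass PageRank. The URL is hypothetical:

```html
<!-- Paid or sponsored placement: qualify the link -->
<a href="https://example.com/" rel="sponsored">Example Widgets</a>
<!-- rel="nofollow" on the link is also acceptable for this -->
```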
So what I would recommend doing is going to our search developer documentation and looking at the rich result types that we have listed there and trying to work out which of these types map to the kind of content that you have, and then kind of focusing on those types first, so that you kind of have the visible effect nailed down. And you kind of know that if you spend time on that specific kind of markup, you will have a visible effect in search. And then if you want to go further and provide additional information through other schema.org types, you're welcome to do that. I would just assume by default that you would not see any effect in search at all. So there is a slight sense of, well, we understand the page a little bit better with more types of structured data on the page, but I would not assume that that's something that would have a visible effect in terms of ranking or better visibility in search. So that's kind of the direction I would go there. OK, we're kind of running low on time, so I'll switch to more questions from you all. And I have a little bit more time afterwards as well if any of you want to hang around a little bit longer, ask more questions, or whatnot. John, I saw some tweets this week with you and Gary about an old subject, but it's still a favorite: domain names. An old domain name, whether you're going to have problems with it down the road because you bought one that had bad content, bad links directed to it. I mean, the only way I know of is to look through the Wayback Machine, but that will give you a sense, OK, maybe it was parked or maybe it had content. But could you ever really tell if it had links? Is there a way to do that? And then I guess the next step would be to disavow if one needed to. Now, I mean, to disavow, you'd have to have the site verified in Search Console. So if you have it verified, then you'll also see links in Search Console. So that kind of helps a little bit.
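As an illustration of the "supported types first" advice from the structured data answer: something like breadcrumbs appears in Google's rich results documentation, so marking it up can have a visible effect in search, whereas a schema.org-only type like TouristTrip would not. A JSON-LD sketch for a trip landing page's breadcrumb trail, with made-up URLs and names:

```json
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {"@type": "ListItem", "position": 1, "name": "Trips",
     "item": "https://example.com/trips/"},
    {"@type": "ListItem", "position": 2, "name": "Alps Hiking Tour",
     "item": "https://example.com/trips/alps-hiking/"}
  ]
}
```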
If you're still at the stage where you're unsure if you should buy that domain name or not, then maybe it makes sense to look at some of the third-party link checking tools that are out there. I think there are a number of providers that essentially crawl the web similar to how we do it and provide some aggregate views of those domains. And usually, you'll see that it matches a little bit what you would see in the Wayback Machine, where you see, like, OK, this is, I don't know, a small business, and the links that you would find are probably the kind of links that a small business might have. Or maybe it was a casino or some crazy affiliate site. Then you can kind of assume that, well, probably the links will also be similar in that regard. Thank you. Cool. Hi, John. I'll just jump in. I do have a question about how Google's treating the date modified for certain articles. So we've tested LiveBlogPosting markup. And we've noticed that Google consistently either ignores or cannot correctly detect our last modified date. So what we tend to see is we've updated the post, let's say, five minutes ago, and Google's still showing in the SERPs that it was updated or published seven hours ago or four hours ago, something like that. So I think this puts us at a competitive disadvantage. We do have the proper schema markup. Our front end does reflect the correct last updated timestamp. And it's also in our sitemaps. I'm curious if you have any insights as to how we can kind of resolve that one. I'd almost need to see some examples where we get that wrong so that I could pass that on to the team. So one thing that's important is that we're able to recognize the same date on the page as well. So if you have a date set in the Article structured data for the modification date, then that should be something that's also visible on the page as well. So that's one thing to keep in mind. The other thing is to watch out for time zones.
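The time zone point is easy to illustrate: a timestamp without an explicit UTC offset is ambiguous, while an ISO 8601 timestamp that carries an offset pins down one moment in time. A minimal Python sketch (the date values are arbitrary):

```python
# Illustration: ambiguous vs. unambiguous modification timestamps.
from datetime import datetime, timezone, timedelta

# Naive timestamp: no offset, so "14:05" could be in any time zone.
naive = datetime(2021, 8, 6, 14, 5, 0)
print(naive.isoformat())  # 2021-08-06T14:05:00  (ambiguous)

# Aware timestamp: explicit +05:30 offset, unambiguous.
ist = timezone(timedelta(hours=5, minutes=30))
aware = datetime(2021, 8, 6, 14, 5, 0, tzinfo=ist)
print(aware.isoformat())  # 2021-08-06T14:05:00+05:30

# The same instant expressed in UTC; both strings name one moment.
print(aware.astimezone(timezone.utc).isoformat())  # 2021-08-06T08:35:00+00:00
```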
That's something that I've sometimes seen where people will specify a date and a time on the site, and we're kind of not sure about which time zone that refers to. And then we can't match that correctly to the structured data on the page. So that's kind of another aspect that sometimes plays in there. And I guess the third thing that I've sometimes seen is sometimes we get it wrong, in terms of maybe we'll show the proper date in the normal search results and then in the, what is it, I think the top stories or one of the different news blocks that we have, sometimes we'll show a date that is several hours off. And I believe we fixed that. But if you're still seeing something like that, that's kind of like a different timestamp shown in different places in search, then I'd love to have some screenshots of that. Yeah. So we are seeing that as well. So I'll definitely send those over. I'll send you an email, that works. I do have another question on the same topic. So our live version also has this AMP component, right? But for some reason, our AMP cache doesn't update with our current updates. And we've tried purging manually. We've tried it over the API. It should kind of work alongside the updates on the non-AMP version. But for some reason, this is not, at least, being accepted by Google. So I'm curious if you have any thoughts on that as well. I don't know how that is connected. So my general answer there would be to use the API to request those cache updates. But it sounds like you're already doing that. So I don't know what might need to be done there additionally. Is that specific to pages where you're updating things on the page, kind of like the live blog situation? Specific to the live blog. So for other pages, we can request a purge, and it will update. It may take some time, but it will update. But for this live blog, for some reason, this doesn't take. It does give us like a 200 response and everything.
But we don't see any kind of changes on the AMP version. OK. Now, if you can send me some examples of that too, that would be useful. OK, thanks so much. Hi, John. If I may jump in. Sure. I have a problem with a site and submitting the sitemap in Google Search Console. When I try to submit the sitemap, Search Console will give me a generic HTTP error. And when I have a look at the server log files, then I can see that Googlebot can fetch the sitemap with a 200 status code. And I can load the sitemap from a web browser and using all the other tools. The sitemap is structurally correct. I've tested it using various tools. And it's within the limits of 50 megabytes and 50,000 URLs. So do you have any idea of what I could try to make Search Console accept and download the sitemap? It sounds like you're testing for most of the important things. One aspect that sometimes plays in there is kind of the topic around crawl budget. I don't know if that's something that might apply to your site, in the sense that if we're having trouble crawling enough from your website, then it might be that we would deprioritize something like that sitemap file. But it's hard to say. So what I would do there is either send me the URL so that I can take a look, or post maybe in the Search Help forum so that some of the other people can throw an eye on it and give you some initial feedback and escalate it if need be. OK, can I send you the URL in the private chat? Sure. Sure. OK, I will do. Thanks. Sure. OK, let me take a break here and just pause the recording. You're welcome to kind of hang around a little bit longer as well if you'd like to. But just so that we have kind of a reasonable length for the recording, thank you all for joining in. Thanks for all of the many questions that were submitted and that were asked here. If there's something on your mind that you still need to find an answer for, I'd recommend going to the Search Central Help forum and asking the folks there.
The people there are pretty smart and have a lot of experience with websites, so they can generally help you along. And they can also escalate issues if something pops up that is completely weird or different. So maybe that will help. I'll set up more of these Office Hour Hangouts as well. So if you want to join in one of the future episodes, watch out for, I think, the YouTube community page, which is where we list them all. And they're also on the event calendar on the Search Central site. All right, with that, I'll pause the recording and see you all next time.