All right. Welcome, everyone, to today's Webmaster Central Office Hours Hangout on Air. My name is John Mueller. I'm a Webmaster Trends Analyst here at Google in Switzerland, and part of what we do are these Office Hours Hangouts. I thought I would try some new things out by announcing this through YouTube instead of using Google+, and submitting the questions through YouTube instead of Google+. So things are probably a bit confusing at the moment. But since Google+ is going away, we have to figure something out. So we'll see how it works here, and otherwise we'll try some other things out. All right. A few of you found the link, so that's good. I dropped the link in the YouTube description and in the community page where the questions were submitted. So maybe you'll find those there if you're watching this live. Maybe it's already too late if you're watching the recording. But I guess we can get started. Do any of you want to get started with the first question?

Yes, I would like to go first, if that's OK.

Go for it.

I did post some in the YouTube comments, but I'll say them out loud. So first one, would an X-Robots-Tag noindex, nofollow HTTP header stop Google from following a Location redirect, like a 301 or 302?

No.

No?

No. The nofollow would only apply to links on that page. And since it's a server-side redirect, we wouldn't even look at the content of the page, so there would be no links to follow. So that wouldn't change anything. It wouldn't stop it.

OK. And the second quick question I had was about a client with an e-commerce store. They've been blocked by several ISPs, mobile ISPs, and we wondered how that would impact their organic rankings.

So I guess the main thing from our point of view there would be whether or not Googlebot would also be blocked. If Googlebot is not blocked, then we would be able to crawl and index the content normally. But of course, if Googlebot is also blocked because it's, I don't know, blocked in the US, for example, then that would, of course, result in those pages dropping out of search. But if it's just individual users that don't have access, and indexing and crawling otherwise work normally, that's usually less of an issue. Of course, you have the indirect effects there that these users wouldn't be able to recommend the site, and we wouldn't be able to pick up those recommendations, because they're not there from those users. So obviously, if you're blocking most of the users that would go to your website, that would probably have a long-term effect on the site. But if this is just individual groups of users.

It's literally just mobile phone providers. Say T-Mobile has blocked the site for its users.

And this is something that some sites even do on purpose, not specifically mobile users versus desktop, but some sites will say, I don't have anything that I can offer users in these individual countries, or for whatever policy reason my content is illegal in Switzerland because it's not neutral enough, or whatever. Then that site might choose to block users in those countries, and that's still something we have to deal with. So those users wouldn't be able to recommend it, and there might be some long-term effects there. But if we can still crawl and index the content, then that's generally fine.

OK, cool. Thank you.

Cool. Any other questions before we jump into the submitted ones? No?
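As a side note on the X-Robots-Tag question above: a server-side redirect response is essentially just headers with a Location target and no page body, so a noindex or nofollow header there has no links to act on, and Googlebot simply follows the redirect. Roughly, the response looks like this (the URL is a placeholder):

```
HTTP/1.1 301 Moved Permanently
Location: https://www.example.com/new-page
X-Robots-Tag: noindex, nofollow
Content-Length: 0
```

There is no content in the response, so there is nothing for a nofollow to apply to.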
OK, hi. All right, the thing is, we have a website which recently came out from a manual action because of links. And it's been three, four months since that happened. What we are observing is, we used to have pretty good rankings, and we mostly had top positions. But now, after these changes happened, we are showing up on the first page at the very bottom, or sometimes on the second page. So just wondering, is it because Google thinks we don't have enough links to our website? Or does Google start thinking that we have certain bad links or unnatural links to the website? So I'm just wondering how we can gain that trust back, so that we can get back to our original positions.

It's hard to say sometimes. But in general, if a manual action is resolved, then that manual action isn't affecting the site anymore. So there's no kind of holdback effect that would keep a website held back, saying, well, it did some things wrong once, therefore we'll prevent it from showing up high in search. There's nothing like that. But in particular, when it comes to links, if you had a lot of unnatural links, and perhaps your site was ranking because of those unnatural links, then by cleaning those unnatural links up, there's a new situation that our algorithms have to adjust to first. So that's something that could, to some extent, have an effect there. The way I would approach this is just to continue working on your website how you normally would, and make sure you don't run into that situation again where the webspam team runs across issues with regards to your links. So don't go off and say, oh, I removed those links, therefore I'll go buy some links from another site instead. Make sure that what you're doing is really for the long-term good of your website.

All right, thanks.

Sure. OK, so let's jump in through some of the questions that were submitted. Like I said, this is a different setup, with the YouTube community thing where you can also post comments. So I don't know if we'll keep this. It seems a bit weird at first, but maybe that's just because everything is slightly different. So let me just run through these the way they're ranked on YouTube. It looks like someone got a bunch of people to thumbs-up their question compared to everything else, but fine.

So the question is generally about a website that moved to a different domain name, and they moved to a different framework, and also to a different platform, to an Angular site rather than a static HTML site. And since they did this move, they've been seeing a significant drop in rankings, or in visibility in search in general. So I looked at a bunch of threads from the website and tried to follow up with some teams internally to see what exactly is happening here. And it's kind of a complicated situation where I don't really have a straightforward answer. In particular, there's the move from a country code top-level domain to a generic top-level domain, so some geo-targeting information is a little bit lost there. There were some issues in the beginning with being able to index the content at all, in particular on the new site. And there are some significant changes in the way that the content is provided on the new site compared to the old site. So overall, there are lots of pretty significant changes along the way. And some of them were problematic, like not being able to index the content.
And some of them were just changes like they might normally happen on a website, where you would change the content significantly, for example. And I think what is happening here is that all of these changes have made it fairly hard for our algorithms to figure out what the new stable state should be for this website. And it's probably just something that's taking a lot longer than a normal site move would take. So this is, I think, one of those cautionary situations where you want to make a bunch of changes, and by making fairly significant changes all at once, you're causing a lot of confusion for our algorithms. It might make sense to take these changes step by step and test them individually, to see that every step along the way is working as it should, so that you're sure the whole chain of changes you make doesn't result in bigger negative effects with regards to search.

So I guess my recommendation here would be to keep at it and keep looking at the site, working on the site to make it better. I think from an indexing point of view, you're in a pretty good state there. You're using some kind of server-side pre-rendering, which makes it possible for us to pick up the content fairly quickly. It looks like you've made some significant changes to the speed of the website overall, so that's a positive change as well. And I suspect that all of these things will add up over time and be reflected in search as well. I wish these kinds of changes could be processed a little bit faster, so I've been pinging some folks on the search engineering side as well to double-check that everything is working as expected there. But in general, with a lot of bumpy changes on a site and moving to a different domain, all of these things can make it a bit tricky to do a site move.

We have a good First Meaningful Paint at two seconds, but our Time to Interactive is 12 to 15 seconds. It's affected by a large number of scripts, according to Lighthouse, even though it loads very fast for users. We're about to launch a new website, and we wonder if it's crucial to fix this before launch, or whether it can wait a few months. How does Google view Time to Interactive?

So a lot of these metrics that you're looking at from Lighthouse are primarily presented to you with regards to the user-facing side of things. From our point of view, from a search point of view, we take a variety of these metrics together to figure out how we should be seeing the site with regards to speed, but the absolute measurements that you see in Lighthouse are things that most likely have a stronger effect on how users see your site. So instead of asking Google, with regards to SEO, is this site too slow, I would look at it with users and get their feedback instead. And if you're saying that your site is actually pretty fast for users, that it provides the content fairly quickly, then you're probably in a good state there.
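The suggestion above is to judge speed by what real users experience rather than by a single lab number. One common way to do that, an assumption here rather than something mentioned in the Hangout, is field measurement with the open-source web-vitals library; the reporting endpoint below is hypothetical:

```typescript
// A minimal sketch of collecting real-user speed metrics with the web-vitals
// library (assumed dependency). Each callback fires once the metric is known.
import { onLCP, onINP, onCLS, onTTFB } from 'web-vitals';

function sendToAnalytics(metric: { name: string; value: number }) {
  // sendBeacon survives page unloads, which suits one-off metric reports.
  navigator.sendBeacon('/analytics/vitals', JSON.stringify(metric));
}

onLCP(sendToAnalytics);
onINP(sendToAnalytics);
onCLS(sendToAnalytics);
onTTFB(sendToAnalytics);
```

Aggregating these field numbers gives a picture of how fast the site feels to actual visitors, which is the feedback being recommended here.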
Google just explained in a white paper released a few days ago that it uses PageRank via links across the web to evaluate authoritativeness and trustworthiness algorithmically. Oh, god, these words. Can we assume that expertise is primarily evaluated via content quality algorithmically? Can you elaborate on this at all?

I don't have any insight there, so that's something where I don't really have anything specific to add. I just saw that white paper, I don't know, yesterday or the day before as well. So it seems pretty interesting. But of course, it's a fairly long paper and there are lots of different topics in there. And PageRank is just more or less a side comment there. So I wouldn't say everything is just PageRank.

A few months ago, the Google Chrome Labs team released Quicklink, which offers faster subsequent page loads by prefetching in-viewport links during idle time. I know this is beneficial for users, but would it have any effect on Googlebot in ranking, or is every page load by Googlebot viewed as a first-time visit, a clean slate?

You're correct. So every time Google loads a page, that's seen as kind of a first-time, fresh visit on that site or on that page. We generally don't keep any cookies. It's not that we keep session state where we would say, oh, we loaded this one page and we're following the links here, therefore we'll keep that state and click on those links to get to those new pages. Rather, we'll find those URLs and then fetch those URLs individually. So in a case like this, I don't see any positive or negative effect on Googlebot. It does sound like it's something that would significantly speed things up for users, so that's a good thing. And you probably see some indirect effects there, in that when users are able to use your site very quickly, they often spend more time on the site, they look around a little bit more, and in turn it's probably more likely that they'll recommend that site to others as well, which is something that we might be able to pick up on.

How much time can it take to restore rankings if I move the site from an old domain to a completely new one and ensure that all 301 redirects are in place?

So this is something that can happen very quickly. If you do a clean site move from one domain to another where everything is essentially the same, where we can recognize this old URL is redirected to the same URL on the new domain, then we can generally pick that up fairly quickly. And usually you'll see in the indexing overview in Search Console that one goes down just about as the other one goes up, and things shift over with maybe a day or so of a bump in between. On the other hand, if you make significant changes across your website, so different URLs, different frameworks, different kinds of content, then all of that can make it a lot harder for us, in that we really have to re-evaluate the new site completely and, based on the new website, figure out how we should position the site. Those are the differences there. But if it's really just a clean, one-URL-moves-to-the-same-URL-on-another-domain kind of change, that's really no problem.

Will a new domain inherit SEO juice and also rank for new keywords?

So essentially, if we can really transfer everything one-to-one to the new domain, then that new domain will be in place of the old domain. It can rank for the old keywords, it can rank for the new keywords. And again, if you make significant changes during that move, then of course that new state has to be taken into account. It's not that we can just forward everything to the new one, because the new one isn't just a moved version of the old one.
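A clean move of the kind described above, where every old URL returns a 301 to the same path on the new domain, might look roughly like this as a minimal sketch using Node's built-in http module (the new domain is a placeholder):

```typescript
// Minimal sketch of a 1:1 site move: preserve the path and query string so the
// old URL and the new URL stay equivalent, and always answer with a 301.
import { createServer } from 'node:http';

const NEW_ORIGIN = 'https://www.example.com'; // hypothetical new domain

createServer((req, res) => {
  const location = NEW_ORIGIN + (req.url ?? '/');
  res.writeHead(301, { Location: location });
  res.end();
}).listen(8080);
```

In practice this mapping is often done in the web server or CDN configuration instead, but the idea is the same: one old URL, one matching new URL, no content changes mixed in.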
A question regarding parameter URLs with UTM links. So UTM links are essentially tagged URLs that you might use for analytics tracking or just general tracking. And the question is, will these links dilute the link value if they're heavily linked internally? We're getting pages indexed with parameters, where the canonical is pointing to the preferred version. How will it affect us in the long run if we're linking within the website with 80% parameter URLs and 20% clean URLs?

I guess that's always a bit of a tricky situation, because you're giving us mixed signals, essentially. On the one hand, you're saying, these are the links I want to have indexed, because that's how you link internally within your website. On the other hand, those pages, when we open them, have a rel canonical pointing to a different URL. So you're saying, index this one, and from that one you're saying, well, actually, index a different one. So what our systems end up doing there is they try to weigh the different types of URLs that we find for this content. We can probably recognize that these URLs lead to the same content, so we can put them in the same group. And then it's a matter of picking which one to actually use for indexing. On the one hand, we have the internal links pointing to the UTM versions. On the other hand, we have the rel canonical pointing to the cleaner version. The cleaner version is probably also a shorter URL, a nicer-looking URL, which plays a role for us as well. But it's still not guaranteed, from our point of view, that we would always use the shorter URL. So the rel canonical is obviously a strong signal. Internal linking is also kind of a strong signal, in that that's something that's under your control. So if you explicitly link to those URLs, then we think maybe you want them indexed like that.

So in practice, what would probably happen here is that we would index a mix of URLs. Some of them we would index with the shorter version, because maybe we find other signals pointing at the shorter version as well. Some of them we'd probably index with the UTM version, and we would try to rank them normally as the UTM version. In practice, in search, you wouldn't see any difference in ranking. You would just see that these URLs might be shown in the search results. So they would rank exactly the same, with or without UTM, and they would just be listed individually in the search results. And from a practical point of view, that just means that in Search Console, you might see a mix of these URLs. In the performance report, you might see this mix. In the indexing report, you might see a mix. In some of the other reports, maybe around AMP or structured data, if you use anything like that, you might also see this mix. You might also see, in some cases, a situation where it swaps between the URLs. So it might be that we index it with UTM parameters at one point, and then a couple of weeks later we switch to the cleaner version and say, well, probably this cleaner version is better. And then at some point later on, our algorithms look at it again and say, well, actually, more signals point to the UTM version, we'll switch back. That could theoretically happen as well.

So what I would recommend doing there is, if you have a preference with regards to your URLs, make sure that you're being as clear as possible within your website about which version you want to have indexed. With UTM parameters, you're also creating the situation that we'd have to crawl both of those versions, so it's a little bit more overhead. If it's just one extra version, that's probably not such a big deal. If you have multiple UTM parameters that you're using across the website, then we would try to crawl all of these different variations, which would, in turn, mean that maybe we crawl, I don't know, a couple of times as many URLs as your website actually has, to be able to keep up with indexing. So that's probably something you'd want to avoid. So my recommendation would be really to try to clean that up as much as possible, so that we can stick to the clean URLs, to the URLs that you want to have indexed, instead of ending up in a state where maybe we'll pick them up like this, maybe we'll pick them up like that, and then your reporting could be like this or like that and you have to watch out for that all the time. So keep it as simple as possible.
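One way to keep the signals consistent in a case like this is to make sure the internal links and the rel canonical both use the same cleaned-up URL. A rough sketch of stripping utm_ parameters to derive that clean URL; the utm_ prefix is the usual convention, so adjust it for your own tags:

```typescript
// Strip utm_* tracking parameters so the site links to, and declares as canonical,
// one clean URL per page.
function canonicalUrl(raw: string): string {
  const url = new URL(raw);
  for (const key of [...url.searchParams.keys()]) {
    if (key.startsWith('utm_')) {
      url.searchParams.delete(key);
    }
  }
  return url.toString();
}

// Both variants resolve to the same clean URL, suitable for internal links
// and for the <link rel="canonical"> tag.
console.log(canonicalUrl('https://example.com/page?utm_source=news&utm_medium=email'));
// -> https://example.com/page
```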
Will there be any ability to test and view desktop renders with screenshots in the URL Inspection tool in Search Console?

I get these kinds of questions all the time. It's like, will you do this in Search Console, or will you do that? And in general, we try not to pre-announce things. So I don't really have any insight that I can give you there with regards to what will happen in the future. It does seem like something that is kind of a missing aspect of the tool, in that it's focusing on mobile at the moment, but it might be nice to actually test the desktop version as well. So I'll definitely pass that on to the team to make sure that it's on the radar somewhere.

On one website, with a couple million pages, we have a sidebar which has internal links in it. And next to each, there's a number which represents how many subpages that link leads to, sort of as an informative visual aid. Generating those numbers takes a lot of time. Is it OK to remove those numbers from the page when Googlebot is detected?

So Googlebot should be seeing the equivalent version of the page that users would be seeing. So if this is something that's useful for users, I'd recommend showing it to Googlebot as well. If you're talking about the speed that it takes to render these pages, it's worth keeping in mind that we look at speed in kind of a holistic way. We don't just look at the speed of a page as Googlebot is fetching it when it comes to using speed as a ranking factor. For crawling, of course, having pages that are available very quickly helps us quite a bit. So it makes sense to make things fast for Googlebot, for crawling at least. One thing I'd recommend here, though, is to try to avoid special-casing Googlebot if at all possible, because it does mean that maintenance is a lot harder. If you're doing something special for Googlebot, then it's a lot harder to tell if Googlebot is seeing errors on the page while users are seeing normal content, and Googlebot starts dropping those pages from search because of those errors. So ideally, treat Googlebot the same as users. One way you could do that here is to use some kind of JavaScript to just asynchronously load that extra information. That's usually pretty easily doable, probably less effort than cloaking to Googlebot in a case like this.
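A rough sketch of that asynchronous approach: serve the same sidebar to everyone and fill in the expensive subpage counts from the client after the page loads. The /api/subpage-counts endpoint and the data-count-for attribute are hypothetical names used only for illustration:

```typescript
// Fill in sidebar subpage counts asynchronously so the initial page render
// stays fast for both users and Googlebot.
async function fillSidebarCounts(): Promise<void> {
  const response = await fetch('/api/subpage-counts');
  if (!response.ok) return; // the counts are a nice-to-have, so fail quietly
  const counts: Record<string, number> = await response.json();

  document.querySelectorAll<HTMLElement>('[data-count-for]').forEach((el) => {
    const key = el.dataset.countFor;
    if (key && counts[key] !== undefined) {
      el.textContent = String(counts[key]);
    }
  });
}

// Defer the request until the page itself has loaded.
window.addEventListener('load', () => { void fillSidebarCounts(); });
```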
Does link equity pass through links on canonicalized pages, or is it just ignored, and everything flows to the canonical? So I think the question is more about this: you have one page that has a rel canonical pointing to a different page, and the question is, will the links on that original page pass PageRank, or will only the links on the specified canonical page pass PageRank?

So from our point of view, there are multiple things that come into play here. First of all, those pages should be equivalent. If you're saying that there's a canonical for that page, it should be equivalent to the final page as well. So it shouldn't matter which of these pages is used for forwarding links, because you're essentially telling us these pages are the same, and we can treat them the same. So it shouldn't matter for our algorithms whether we pick up the links on that page or the links on the other page. I would also assume, as with the other question about UTM parameters, that we would not always pick the URL that is specified as canonical as the one that we actually index. So if those links are different across versions of the pages, then that's something where we might not pass PageRank in the way that you're expecting. So all of that kind of comes together. And from my point of view, what I'd recommend there is just really making sure that those pages are equivalent, so that you don't have to worry about from where which links are passing PageRank, but rather assume that it could be this page, it could be the other page. The rel canonical is a strong signal for us, but it's not a directive that prevents us from using that page completely.

Is it a good idea to create a website for every country in the world?

So the short answer there is no, that's a terrible idea, especially if you don't really have content that's unique to every country in the world. In particular, if you're using hreflang between the different country and language versions, you need to keep in mind that hreflang does not make those pages rank higher in those countries. It just swaps out those URLs. So you still have to be able to rank in those individual locations, which means if you have one piece of content and you split it up across every country in the world and multiply it by every language, then you'll have diluted your content significantly across a large number of URLs, which means that any piece of content there will have a lot more trouble being shown in the search results. Because instead of one strong page that we could show potentially worldwide, we have a ton of different pages that all show more or less the same content, and none of them is particularly strong. So that's something where, when it comes to an internationalization strategy, I would strongly recommend getting help from people who have done this successfully for other websites, so that you can really weigh the pros and the cons there. On the one hand, you have the advantage of being able to target individual countries and languages with content that's really uniquely specialized for them. On the other hand, you have to weigh that if you're diluting the content too much, then all of those versions will have a much harder time being visible in search. So on the one hand, targeting well; on the other hand, making sure that the versions that you do have available are strong enough to actually be visible in search. And my general thought there is to err on the side of using fewer versions rather than more versions. So unless you have a really strong use case for having content specifically targeted at individual countries, I'd err on the side of, well, maybe one global website is better rather than splitting things up.

Can we use both rating schema and question-and-answer schema for questions and answers?

Sure. I don't see offhand why not.
The main thing to keep in mind is that the structured data should be focused on the primary content of the page. So I don't know how exactly you would combine the rating and the Q&A schema on a page, but maybe it's something like you have questions and answers about a product, and you have ratings of that product as well. That might be an option there. But in general, it's not that these types of markup are exclusive and you can only use one type. You can combine multiple types of markup.
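For the combined case sketched above, a product page carrying both rating and question-and-answer markup, the two types can sit together in one JSON-LD block. This is only a rough illustration with made-up values; whether any rich result is shown is up to Google, and the current structured data documentation should be checked for the required types and properties:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Product",
      "name": "Example Widget",
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.3",
        "reviewCount": "27"
      }
    },
    {
      "@type": "Question",
      "name": "Does the Example Widget work outdoors?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Yes, it is rated for outdoor use."
      }
    }
  ]
}
</script>
```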
Was there a question from the Hangout, anything? OK, just some random background noise. OK, cool.

I've got a question.

OK, go for it.

Thanks. So our site had maybe 5,000 Google visitors per day for several months. Now we're down to under 500. This was sudden, it just happened in one day. We think it's due to an automatic security penalty, which we got removed within one business day, and we just haven't seen any change within three weeks. So I'm wondering how long something like that might take to recover from, and if there's anything else that would cause a sudden one-day drop like that. Thanks.

Hard to say. So what do you mean with regards to a security issue?

We got a message in Search Console that social engineering content was detected. It turns out that was due to our email service provider. Their click-tracking software was automatically flagged, because I guess one of our clients was using it. So we got a manual review within a business day, and they said it was OK, but you know.

OK, so if there was a manual review, then that would be resolved. So it's not that there's kind of a holdback after a manual review that says, oh, we have to be careful with this website. That should be resolved there. It's kind of hard to say in a general case with regards to a site like that. There are sometimes algorithmic changes that roll out where a bigger change happens within a day or so, so that might also be playing a role here. It might also be that there's something else playing in with the site there. What I'd generally recommend doing is maybe starting a thread in the Webmaster Help Forum with the URL of your site and some of the queries where you're seeing changes. So not just, we had this many queries and now it's down to this, but rather, people are searching for our company name, they're still finding our company name, but for this particular kind of query, we're no longer visible anymore. That kind of thing is really useful to post in the Webmaster Help Forum. And usually, when people there can't find an answer, the experts are able to escalate it on to us so that we can take a look at it as well.

Great, thanks.

All right, let's see. One hreflang question. So they have multiple sites in German for different countries, and they have hreflang set up. It sounds like it's set up properly; just from the example URLs, it's hard to say. But they're saying that they're not seeing a lot of indexing of these URLs across the different country versions. So what could be happening here? Usually, when I see questions like this, it boils down to our algorithms looking at these pages and saying, well, these pages are essentially duplicates of each other. So we can index one of these versions, and we can show that one to users, because it's all the same. In particular in German, this is something I see quite a bit. I imagine in some other languages it's similar, probably in Spanish, where there are multiple Spanish-speaking countries, or English, of course, as well, where the content is essentially the same and is just targeted at different countries.

So what happens here is, from an indexing point of view, we would probably, or at least potentially, fold these URLs together. So for the multiple German versions, we pick one German version and say, this is the canonical for the German version. With the hreflang markup, we'd still be able to show the other URLs. So we might index maybe the Swiss German version, and if a user in Germany were searching, we would be able to swap out that URL for the German version in the search results. But in the indexing report in Search Console, you would only see the URL being indexed for the Swiss version, which is the one that we chose as the canonical. So you wouldn't see that for the German version. In practice, that can be OK if these pages are really exactly the same, because we're swapping out the URL and users are getting to the right version. That's pretty much OK. The difficulty can be that if you have something in the title, or if you have prices or other structured data on those pages, that might confuse users, in that a user in Germany might be searching, we show the German URL, but in the snippet it has Swiss currency as a price somewhere. So that could be confusing to people.

So there are usually two approaches to that. On the one hand, you could take the approach and say, well, folding these URLs together is fine; it kind of makes those URLs a bit stronger for me, and that's OK. Then you might go in there and say, well, I'll make sure that these pages are generic across all of the German versions, so that it doesn't matter for me which URL is actually indexed. The other approach is to make sure that these URLs are clearly seen as separate URLs. So making sure that the content is really significantly different across the different versions, so that our systems, when they look at these pages, say, well, this is one German page, and this is a different German page. There's an hreflang link between those two, so we can swap out those URLs, but they're unique pages, and they need to be indexed individually on their own. So those are kind of the two approaches there. On the one hand, you could fold things together. On the other hand, you could say, well, I want these explicitly separate.
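For reference, hreflang annotations for the German-language setup discussed above might look roughly like this. The domains and paths are placeholders; each version should list all alternates, including itself, and the annotations need to be reciprocal across the pages:

```html
<!-- Sketch of hreflang alternates for country-targeted German pages -->
<link rel="alternate" hreflang="de-de" href="https://example.com/de-de/" />
<link rel="alternate" hreflang="de-ch" href="https://example.com/de-ch/" />
<link rel="alternate" hreflang="de-at" href="https://example.com/de-at/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

As described above, this markup swaps the URL shown to users in each country, but it does not by itself prevent the near-duplicate versions from being folded together for indexing.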
What would you advise if a brand has an international presence and wants to shut down certain regions? What do you do with the sites that are targeting other countries? Should they redirect to the main site?

Essentially, this is kind of a site migration situation, I guess. If you have one country site and you're saying, well, this site is no longer available, I'll redirect it to another version of my site, that's something you can do. What you can't really do is specify in search that we should not show this site in particular countries. So you can fold your German versions together and say, Germany, Austria, Switzerland, they should all just be the one German website. That's fine. But if you say, I don't want to show my site to users in Switzerland, perhaps, you can't do that. There's no meta tag where you can tell Google, don't show my site in Switzerland. At most, what you can do is block users in that country from accessing the site. But you can't tell Google not to show it in the search results for that country.

Yeah. Yeah, so that was my question. So this is about a website that's all in English, but with unique content for the different countries. Because of some business decisions, basically, they're shutting down certain regions. So for that, I'm wondering if we can do some redirecting of the site. Will it impact my main website, I mean, will there be a negative or positive effect from that? We currently have hreflang across all of this website, which has common pages where we're offering one product in one specific country and a similar product in another country with different content, all local content for the region. So just wondering, what will happen to those pages eventually if we redirect them? Will my main website pages start ranking for that specific region? Is that something that's going to happen?

It could potentially happen. But again, that's not something you can really control. So if you're redirecting those pages, or those pages still exist, then they might rank in those locations as well. Again, there's no meta tag where you could say, well, my website should be indexed but not shown in this particular country.

OK. Yeah. Thanks.

Sure. Google has this UX playbook with best practices to delight users for different niches. Are these considered part of ranking? Or can you give some insight on that?

So those UX playbooks, as far as I know, are put out by the Ads team. And it's more a matter of, well, these are UX best practices and good things to do on a website, and not necessarily something that we'd say plays into ranking. But obviously, if you make a good website that works well for users, then indirectly you can certainly see an effect in ranking. But it's not that we would say, look at these UX playbooks and specifically use those as factors for ranking.

What's the impact of affiliate links when mixed with content targeting sensitive topics? We know that aggressive monetization can be problematic, especially when affiliate links aren't disclosed to users, and when the focus on monetization is more important than providing great content to users. But I'm wondering how that would work for topics like health, medical, financial, et cetera.

So in general, as far as I know, we don't explicitly go into the site and say, well, there are links that look like affiliate links, therefore we will treat this website as being lower quality. In general, the main issue that I see with regards to affiliate websites is that they tend to just be lower-quality websites. So it's not so much a matter of where the checkout flow is, but rather that, overall, the content is often kind of low quality. And because of the low-quality content, that's something that our algorithms might pick up on and say, well, this is probably not the most relevant result to show in the search results. There can be really high-quality affiliate websites as well, which are good to show in the search results. So it's not so much a matter of whether there's an affiliate link or not, but rather, what about the rest of the website? Is this something that would be relevant to show to users, or is there something perhaps problematic there? And I think, at least as far as I know, that would apply across the board. So it wouldn't really matter what the specific topic area of the website is.
But in general, there are some really good affiliate sites, and there are some really, really terrible affiliate sites. So it's more a matter of, is the site good or is the site terrible?

With Fetch and Render being deprecated from the old Search Console and being replaced with the URL Inspection tool, will the URL Inspection tool be updated to include a side-by-side visual comparison of how Googlebot sees a page versus how a user would see a page?

I don't know. As with the other question, we try not to pre-announce things. So that's not something where I'd be able to say anything specific with regards to what will happen in the future. I'm also happy to pass this on to the team to take a look at with regards to prioritization. From a practical point of view, I have always felt that this comparison is something that's less useful nowadays, because most sites are pretty good at making sure that a page can be rendered for search engines. At least the sites that I've looked at tend to be kind of on the good side. In particular, in the beginning, when sites weren't so used to search engines actually trying to render a page, that was a big thing. But nowadays, I don't know if that's really such a big problem, like JavaScript being blocked by robots.txt, those kinds of things. I feel that has changed significantly over the years. But I'm happy to pass this on to the team, and they'll probably double-check to see if there really are issues that we need to show there.

Can a link disavow boost rankings? Oh my gosh, such a big question. So in theory, if your website is demoted with a manual action with regards to links that you've bought, or that someone else has bought for your website over time, maybe a previous SEO, maybe something that was set up before your time with that company, then you can use the disavow tool to remove those links from our systems. And afterwards, you can use the reconsideration request form to let us know about this change. Then the manual webspam team will take a look and double-check that the current status is OK. And if the current status is clean, they will resolve that manual action, and in general, your website will be able to rank naturally again. So that's something where, in many cases, the website might go up a little bit in ranking. In some cases, it might also be that these unnatural links were unnaturally supporting the website in the beginning, so by removing those links, that support is gone, which is kind of the natural state. So I guess the shorter answer is, there is no yes or no here, in that the link disavow does change what we see as links to your website, and that can affect your website positively or negatively.

So my recommendation generally with the disavow tool is to use it if you have a manual action with regards to links and you can't get those links removed. Or when you look at your links and you realize, oh, there was this crazy thing that we did maybe two years ago, and the webspam team didn't notice, but maybe it's time to clean up our act and clean that up. Then it makes sense to use the disavow tool. I wouldn't just randomly use it and say, well, these are weird links and I don't want to be associated with them, and I hope Google might rank my website better when I remove those links. That's unlikely to happen.
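For reference, a disavow file is a plain text file uploaded through the disavow tool in Search Console. It can list individual URLs or, as comes up in the follow-up question below, whole sites with a domain: entry. A rough sketch with placeholder domains:

```
# Links cleaned up after reviewing the site's link profile
https://spammy-example.com/some-paid-link-page.html
domain:link-network-example.net
```

Lines starting with # are comments, and a domain: line covers every link from that site.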
John, can I ask a question based on disavows here?

Sure.

I've been listening along. So sometimes we see, I'm guessing maybe from competitors or whatever, literally one place that sent 1.1 or 1.2 million links to our homepage. Completely off topic, an instrument repair site out of Ireland, nothing to do with what we do. Obviously, I don't have the tools or ability to disavow all the links. Is it best to then disavow the site?

Sure, you can do that. That's what the domain entry in the disavow file is for. That way, for all of these millions or thousands of links, you can just say, everything from this site, I just don't want to have anything to do with it. Usually what happens in cases like that, where you see this completely random site linking, is that the site probably got hacked, and for whatever reason, someone decided to drop all those links there when hacking the website. And usually we're pretty good at picking that up and trying to keep the state from before the hack in our systems. So probably we're already ignoring those links. It's possible that maybe we're even indexing the website from before, rather than with the hacked content. So that's something where usually I wouldn't worry about it too much, because these kinds of hacks are really common, and we have a bit of practice trying to deal with them. But if you're worried about it, like, oh, this is really crazy. I think instrument repair links probably aren't something where you'd say, oh, this will kill my website if I'm associated with violins. Maybe adult content is something where you'd be more cautious. Then I would just use a disavow file, submit that, and then you can be sure that these are not taken into account.

Thank you.

John, can I ask something?

Sure.

Oh, OK, finally. I've been trying to speak for the past 55 minutes.

OK, your microphone works now.

Yeah. Two quick questions. So regarding this URL, it's a category URL for an e-commerce website that has kind of a view-all parameter. Basically, our issue is that that URL has a canonical to the version without the parameter. The version without the parameter is what's linked throughout the website, and it's in the sitemap without the parameter. But somehow Google decides, no, the parameter version is the canonical, while all of our signals are pointing to the other version. I don't think it affects us much, but I don't understand why Google doesn't pick our version when all of our signals are pointing in the same direction.

I don't know, hard to say offhand. So I think one thing to keep in mind is that it's under mobile-first indexing. So perhaps the mobile version is slightly different. Maybe there's a rel canonical there, maybe there's something with JavaScript that's changing the rel canonical. I don't know. But that's always something to keep in mind. So when you're testing it in a browser, make sure you double-check the mobile view. But as with the other questions earlier, it's also just possible that, for whatever other reasons, we're deciding, well, actually, this is the cleaner version.

So, I mean, it's not linked from anywhere else, the internal links are not pointing to it, and it's not in the sitemap. And with the URL Inspection tool, Google does see the correct canonical that we set. It's just that the Google-selected canonical is this one instead, rather than what we tagged. So I don't know if it would have any effect. I'm just worried that there might be something that we're missing that might affect other pages as well, and I just can't seem to find it.
I don't know. I'd have to take a look. In general, if it's the same content, then the ranking would be the same, so it's not that anything would change there. I don't know if hair care products would be particularly useful for me, but I don't know. So the ranking should be the same. Maybe post in the forums as well, and that might be a way to get more eyes on it. Yeah. So I mean, offhand, it's a situation where you have kind of a view-all page and the other version of the page, where it might be that we're picking up some signal saying, well, this is the view-all version, we should keep that one instead of the other one, because the other one might be paginated or something like that. But yeah, I don't know offhand. I'd have to dig into that.

The other thing maybe worth mentioning is that we'd like to put together a document with best practices around e-commerce sites in general. So if you run across weird things like this, where you think, well, with e-commerce sites you often have these category pages that have tons of content, and the pagination and the filtering and the view-all things are kind of tricky, that's something where I'd love to get feedback. So maybe examples, maybe just general questions with regards to e-commerce sites, all of that would be really useful to have. Ideally, maybe send those via Twitter so that I can collect them there. I started a tweet on that. And hopefully we can put something together with, I guess, best practices for e-commerce sites on how you can make sure that indexing and crawling work ideally.

Awesome. One last other thing. I've noticed a lot of our Search Analytics for Sheets users reporting timeouts with the API. Is there something that has changed lately? It just seems more and more users are experiencing timeouts when requesting data through the API.

That shouldn't be the case. I'll check with the team on that. That's using the Search Analytics report, the API for that?

Yeah.

Yeah. I don't know. That sounds like something we should be able to deal with better.

Yeah, I mean, it wasn't the case before. Just in the past six months or so, people have been complaining that they keep receiving timeout messages. I'm guessing it's pretty big queries, paginated queries. So I'm guessing I should send those to you directly, or is there someone on the API team?

You can probably just send those to me. And it might also be something where you can include some kind of fallback within your code so that it retries a few times before timing out completely. (There's a rough sketch of that kind of retry wrapper at the end of this transcript.) But the API should be able to deal with some amount of that.

Yeah. The Google Sheets add-on has a restriction of a maximum of five minutes of execution. So I can't keep falling back for half an hour.

OK. OK. That seems pretty long. Yeah.

Yeah, it should get the data in that time. Yeah. OK. Cool.

Looks like we're a bit out of time, and there are still some questions left. So, wow. OK. So submitting questions works. We'll just have to get faster at answering questions for the next time. Thank you all for joining in. I hope this was useful. I hope it wasn't too confusing with dropping a link to the live Hangout on YouTube. And hopefully, we'll see you all again one of the next times. Bye, everyone.

Bye. Thank you. Bye.
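As a closing aside on the API timeout discussion above, a retry wrapper along the lines mentioned there might look roughly like this. It is a generic sketch, not tied to any particular client library; fetchSearchAnalyticsPage is a hypothetical function that requests one page of query data from the Search Console API:

```typescript
// Retry a flaky async call a few times with exponential backoff before giving up.
async function withRetries<T>(
  call: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 2000
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < attempts; attempt++) {
    try {
      return await call();
    } catch (error) {
      lastError = error;
      // Back off 2s, 4s, 8s, ... between attempts.
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** attempt));
    }
  }
  throw lastError;
}

// Usage sketch:
// const rows = await withRetries(() => fetchSearchAnalyticsPage(siteUrl, startRow));
```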