All right. Welcome, everyone, to today's Webmaster Central Office Hours Hangout. I guess not Office Hours anymore, but kind of like, I don't know, People Hours. My name is John Mueller. I'm a Webmaster Trends Analyst at Google in Switzerland. And part of what we do are these Hangouts where people can join in and ask any question related to their website and web search. And we'll try to find some answers along the way. It looks like people are busy with other things, which is totally understandable. Not a ton of questions were submitted, but we have a bunch of people here. So if any of you want to get started with a first question, feel free to jump on in. Hey, John. Hi. Pretty good. So I have a question today about a single source AMP carousel on mobile. So it's my page in the results, and it has a carousel of AMP articles. We've had it for a while now, for almost two years or something. But since around Christmas, the articles started updating less frequently. So we stopped getting fresh articles in the carousel. It used to be that articles published maybe two or three hours earlier would appear in the carousel. Now we're lucky to get articles from within the last 24 hours. And there's a lot less content in there now. We used to have around 10 items, maybe a little more, and now we have around three, I would say. Since this started happening around Christmas, we've made a few changes to our schema. I noticed there were some inconsistencies in our timestamps, on the article versus in our sitemaps and our feeds and things like that, so we've updated all of that. And I'm not really sure what else it could be, but the carousel itself still hasn't updated. And I don't find much information online about single source carousels in the first place, so I'm not really sure how to diagnose this issue. Do you have any advice for how we might be able to fix it?
I'm not quite sure what kind of carousel you mean. Is it the top stories thing on top of the results? Actually, if you want to pull it up, if you type movie news, you'll see it in the results. It's a result for a single page, and it has a carousel of AMP articles just underneath the sitelinks. OK, I probably need to check that afterwards. It's kind of tricky to check that live. And on the article, of course. Yeah, if you could maybe post some links or a link to a screenshot in the chat here, I can pick that up afterwards and double check. Usually, I'd say the date is definitely something to watch out for. That's something I know a lot of sites get wrong, in that they have inconsistent time zones, for example, on the page and in the structured data. But otherwise, my hunch is that for a lot of these cases where there's just less content showing up in the search results, it's more a matter of quality, where our algorithms are not quite sure about the quality of this content, and therefore they're kind of reducing the visibility of that content appropriately. I'm sorry, I tried to send a screenshot, and it accidentally changed the page, so I just missed what you were saying. Sorry, could you repeat that? Sure, sure. So I think dates are probably a good point to look at. That's something that a lot of sites get wrong, especially if there's an inconsistency with time zones, for example. So that's definitely a good thing to check. But otherwise, for a lot of these cases where the site or its content is just not as visible in the search results, it's really more a matter of the quality of the content, where our algorithms are maybe a bit unsure how much of this content we should be showing. So on the one hand, double-checking all of the technical issues is definitely a good idea. But I wouldn't limit your review of the site to technical issues only; really be critical about the quality of the content as well.
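As an aside on the timestamp point above: the inconsistency John describes (the page says one time, the sitemap or feed another, often because of time zones) can be caught with a small check. This is a minimal sketch in Python; the function name and the example timestamps are made up for illustration.

```python
from datetime import datetime

def timestamps_consistent(article_iso, sitemap_iso):
    """Compare two ISO 8601 timestamps as absolute instants.

    Two strings can look different ("2020-03-27T14:00:00+01:00" vs
    "2020-03-27T13:00:00Z") yet describe the same moment; what matters
    is that the page's datePublished and the sitemap/feed entry agree
    once time zones are taken into account.
    """
    # datetime.fromisoformat does not accept a trailing "Z" in older
    # Python versions, so normalize it to an explicit +00:00 offset.
    a = datetime.fromisoformat(article_iso.replace("Z", "+00:00"))
    b = datetime.fromisoformat(sitemap_iso.replace("Z", "+00:00"))
    return a == b

# Same instant expressed in two zones: consistent.
print(timestamps_consistent("2020-03-27T14:00:00+01:00", "2020-03-27T13:00:00Z"))  # True
# Same wall-clock time but different zones: a hidden one-hour mismatch.
print(timestamps_consistent("2020-03-27T14:00:00+01:00", "2020-03-27T14:00:00Z"))  # False
```

Running a check like this across article pages, sitemaps, and feeds is one way to verify the fix the questioner describes actually took effect everywhere.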
And I haven't looked at your site. I don't know exactly what kind of content you have there, so I'm just kind of throwing this out there. But if you can post some links in the chat, then I can take a look at that afterwards. OK. The other thing that might thicken the mystery a little bit is that we do perform very well in top stories carousels in general. And we have other carousels for some other pages in the search results. For some other queries that we rank for, those carousels seem to update more frequently. And if you Google just the brand name of our site, there's a carousel for latest articles, and those are always very recent and plentiful, I guess. So it just seems to be for that query in particular. And it was a fairly recent change, only a few months ago. Does that shed some more light on the situation? It's hard to say, yeah. I mean, to me, that would sound like, from a technical point of view, you're probably doing it right. And it's really a matter of us trying to understand which pieces of content from your site we should be showing. So especially if you're seeing it in the top stories, if you're seeing it for other kinds of carousels, then that means we're technically able to crawl and index the content fast enough. So it's probably not an issue from a technical point of view, and really more a matter of us not knowing how much of your content to show for those particular queries, especially if you're saying for some queries it works well and for others it doesn't. So maybe more of a relevancy thing here? Could be, yeah. Because we did change our content operation slightly around that time, to expand into newer verticals, I would say newer genres of content that we didn't do before, whereas the movie news content was our primary content before. So that potentially affected it.
I think it's tricky when it comes to things like that, because when things change, it's often not based on the things that you changed on your website just moments before. From a technical point of view, you do see that relationship fairly quickly: if you make a big technical change, that'll take a couple of days and then be visible in search. But quality changes usually take a lot longer to be reflected in search. So if you're seeing significant changes in search and you just made some changes a couple of days ago, then unless there was a technical issue with those changes, they probably wouldn't be related. So if you're saying you changed the way that you created the content maybe a week before this change in search, then my feeling is that wouldn't be related to the drop in visibility that you're seeing. Okay. I'll send over a screenshot in the chat so you can take a look. Thank you. Sure. And then if you have time, I had a second quick question. Okay. Go for it. So, I work on publishing websites that get a lot of Google Discover traffic. And I know you said in the past that you guys currently don't have plans to add a referral source for Google Discover in particular. Recently we had a situation where an ad partner that we were trying to work with audited one of our websites that gets a particularly high amount of Discover traffic, and they assumed there was suspicious activity on the website because there were no keywords. The keyword data didn't match up with the amount of search traffic, or Google traffic I should say, that we were getting. In this case, we have a pretty good relationship with that advertiser, so they contacted us and we were able to explain the situation. But I don't know if this is something that you've heard affects other publishers.
Given that these situations arise from there not being a proper referrer for Google Discover, is it something you guys would consider adding in the future? I don't know. I'd like to have a clear referrer for Discover. But at the same time, from kind of a bigger-picture point of view, we see Discover as a part of search, as a part of this big Google family of ways to get traffic to a website. So other people are saying, well, we should just fold it all together into one big referrer and keep it like that. So, I totally get the advertising angle. That's probably tricky. The one place where we do have Discover data is in Search Console, where you see the total amount of clicks and impressions that you got, and the click-through rate there. So you could point at that, or you could export it as a CSV file and show that to them too, to show a little bit how the split is between normal search traffic and Discover traffic. For sure. And on that note, we do have the Search Console data, but in GA it can be hard to map the two numbers, the two sources, to each other. For us, our Google Discover traffic ends up as kind of dark direct traffic. It gets lumped in there, and then it becomes difficult to untangle, and the numbers for sessions and clicks never really quite match up. So it just becomes a little muddled. Even internally, I find there are issues with disambiguating that Google Discover data. And given that, I think Chrome article suggestions are properly distinguished, with Google APIs as a source. So it is a bit of a confusing situation from all aspects. If there's any way to pass this feedback on to the dev team or something, I would really appreciate that. Yeah, I'll push for that again. I think lots of people want more data with regards to Discover, especially when they see a lot of traffic going through there.
So I totally understand your point. Okay, thank you very much. Sure. All right, any other questions before we get started with the submitted ones? Yeah, sure, John. I got a question about FAQ schema, or QA page schema. We write a lot of content, a lot of articles, where the main topic is a single question, but there are often multiple answers. So I'm just wondering how to mark that up, because the way that I've marked it up, it doesn't really look good when the schema is generated. The question just shows up multiple times: I have the title of the question multiple times, each with a different answer. I'm assuming that's probably not super good to do. What would you suggest I do instead of listing the question five times with five different answers? What might I do differently? I don't know, hard to say. I think QA page is also more meant for user-generated content, right? Yeah, so then it's FAQ, the other one. Yeah, I'm always confused with those two. Me too. Yeah, I don't know. I would test things out. I think the nice part about this is that you can see fairly quickly if it gets picked up and shown. And that makes it a little bit easier for you to double check things like the clicks and impressions and click-through rate, whether you're getting what you'd like to achieve with those pages. I don't know, with regards to the policies, whether something like this is okay, to kind of say, well, it's the same question, but there are multiple answers, so therefore I have multiple entries in the FAQ. That's what I'm mostly worried about. Is it fine, or am I going against some sort of policy? Yeah, I'm not aware of anything particular with regards to the policy in that regard. And usually what would happen, if there were something where this would go against the policy, is that someone from the webspam team would take a look at that and say, oh, this is not compliant.
They'd apply a manual action for that, which would essentially mean that the structured data, the rich results, would not be shown that way in search until the reconsideration process has been completed. So it's not that your website would drop in ranking, or disappear from the index or anything. It's really just that those elements where the team says it's not compliant usage would not be shown. So it's something where, especially if the guidelines aren't completely clear, that feels like something where you can try things out a little bit and see what works well for you. And if someone from the webspam team were to look at that and say, well, this really doesn't match our guidelines, or we've changed our guidelines because we see people doing things in weird ways and that suddenly includes things that you're doing, then in the worst case you'd be going through that reconsideration process, which does take a bit of time, but it's not the end of the world. And might that only affect that page, or am I putting my site at risk? That would generally affect the site and that type of rich result. Got it. Okay, I'm gonna try using one question and maybe appending part of the first answer, so that it doesn't look... because right now it just looks really spammy, with question, question, question showing up in the FAQ schema. I'm gonna try something a little different, but thanks for your help. Sure. John, on the FAQ schema, they used to show emojis. Do you know if Google recently stopped showing emojis in FAQ schema, or are you not aware of anything with that? I don't know. So we're not showing them anymore? It seems like they're not. I tried to search for them for a long, long time, and then I found one French site that happened to show it, but I don't have every query. I have old screenshots, and those queries don't even show FAQ schema at all now. So you're not aware of any change there?
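On the markup question above, one possible approach the questioner hints at is emitting a single Question entry with one combined acceptedAnswer, rather than repeating the question per answer. This is only a sketch of FAQPage JSON-LD built with Python; the helper name and sample Q&A text are invented, and whether combining answers this way fits the guidelines is exactly the open policy question from the conversation.

```python
import json

def faq_jsonld(question, answers):
    """Build FAQPage JSON-LD where one Question carries one combined
    acceptedAnswer, instead of repeating the same question once per
    answer (which renders as 'question, question, question...')."""
    combined = " ".join(answers)  # could also be joined with allowed HTML
    data = {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [{
            "@type": "Question",
            "name": question,
            "acceptedAnswer": {"@type": "Answer", "text": combined},
        }],
    }
    return json.dumps(data, indent=2)

print(faq_jsonld(
    "How long does shipping take?",
    ["Standard shipping takes 3-5 days.",
     "Express shipping takes 1-2 days."],
))
```

As John suggests, the practical test is to deploy something like this on a few pages and watch in Search Console whether the rich result gets picked up and how clicks and impressions respond.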
I'm not aware of anything particular with FAQ and emojis. I know with emojis, one of the things that the teams always struggle with is misleading emojis. So if people have stars, for example, and they make it look like review stars, that's something where the snippets team might look at it and say, well, we should filter out these particular characters because they're misleading to people. But that's less a matter of the FAQ markup specifically, and more like, well, we've noticed that this is really being spammed and people are using them in misleading ways, and then that might be something where we take action. But otherwise, we have lots of weird characters in the search results and try to allow people to be a little bit creative. Barry, I'll throw a question mark emoji in one of my FAQ schemas and see if that shows up for you. Cool, okay. Let me get through some of the submitted questions and then we can get back to more questions from you all. Hang tight. If Google has moved my site to mobile-first indexing, do I have to optimize, or at least put more effort into, my AMP pages? So from my point of view, this doesn't change anything with regards to how much effort you should put into the AMP pages. The AMP pages should consistently match what users would see anyway. So the mobile-first indexing side is kind of independent of how much work you should be putting into your AMP pages. In general, the AMP team, for example, is not fond of very simplified AMP pages where a lot of content is missing. Obviously, there are some elements that you can't easily do on AMP pages, and that's fine. But in general, the AMP page should match as much as possible what you have on your normal pages. My site is no longer appearing in search results. There are no violations on my site, and it stopped appearing completely from March 12th. It's a Google News-approved site.
I received over 3,000 bad backlinks from my competitors, and fake DMCA complaints from my competitors, which I have countered and am awaiting Google's response on. Why is my site de-indexed or demoted in Google when there's no violation on my site? It's really hard to say without seeing the site. So what I would usually recommend in a case like this is to post this in the Webmaster Help Forum, where there are a bunch of people that are able to take a look at your site and give you some guidance on what you could be looking at there. And they're also able to escalate issues that they're not able to resolve in the Webmaster Forum. So that's kind of the normal escalation path there. They can also help you to figure out what kind of issue you're really looking at. Because on the one hand, you're saying your site is no longer appearing, and then the last question is, why is my site de-indexed or demoted in search? And from our point of view, there is a really big difference between a site that's de-indexed and a site that is demoted. De-indexed would be not appearing at all, not indexed at all, which is oftentimes due to either a very significant web spam issue or a technical issue. And if it's really de-indexed, there are oftentimes really clear paths to resolving that. Whereas if it's demoted, where it's just not ranking as well as before, then that's a lot trickier, because that can be due to a lot of different reasons: the quality of the content, the general setup of the site, all of that can play a role there. In general, I wouldn't worry too much about these 3,000 bad backlinks from competitors. Our systems are pretty good at recognizing those kinds of things and ignoring them, so that's something I wouldn't focus on too much. With regards to DMCA complaints, obviously, responding to them if they're invalid is the right approach there.
There seems to have been a very significant change that took place on January 14th. My traffic and revenue have been negatively affected, despite no penalty or notification of any problems. While I'm used to an ebb and flow of traffic and revenue, this goes beyond that. Are you able to give any advice on specific things to focus on to get back on track? So first of all, I'm not aware of anything specific that happened on January 14th, so I don't really know exactly what you're pointing at there. I also don't know your site, so it's really hard to say what might be involved. In general, I think there are a few things to keep in mind. On the one hand, there are lots of sites in the search results, so it's never the case that everyone is demoted and no longer visible in search, because there are always sites that come up as well. It's not that we don't show anything in search. So these kinds of changes are generally pretty normal, and they can come and go. They can be due to changes in our algorithms. They can be due to changes in the way that people search. So if a topic suddenly is no longer as interesting, then maybe you'll have less traffic for those kinds of queries, just because it's something that people don't search for anymore. It can also be due to changes in quality on the website, or the perceived quality of a website based on our algorithms. But all of these things are really hard to guess at without knowing your website. So that's something where I'd also start a thread in the Webmaster Help Forum to get other people's input on this. And when you start a thread there, try to be as transparent as possible about your site, the kinds of queries that you're seeing changes in, and maybe significant changes that you've made on your website in the past, so that people are really able to look at this problem and give you honest and direct advice.
There's no need to hide things that you're aware of but don't really want people to notice offhand. It's a lot easier for people to help if they have a clear picture of what's involved with your site. What are the major differences between any two posts that rank at number N and at number N plus 1 in the search results? That's really hard to say. It's not that we have this one factor that determines the ranking of pages in the search results. It's really more that we have lots and lots of different factors that are involved. And sometimes there are subtle differences between pages and the way our algorithms review them, and they can result in one page being ranked higher at one moment, and then maybe when our algorithms review them again, we want to rank them in the other order. It's not the case that there's this one magic thing where everything is sorted by one number. I don't think that would really make sense. So there's no one search ranking factor that trumps everything else. How do you hide text in an SEO-friendly way? For example, we have a big table of contents in an article, and we want to hide part of it for mobile users. So I think in general, this kind of question is tricky, because on the one hand, you're saying this content is important, otherwise you wouldn't have it on your pages. On the other hand, you're saying, well, you want to hide it from your users, so you're saying it's not that important. My general recommendation here would be to make up your mind and pick one: either leave the content on there and make it available to everyone, including users and search engines, or remove that content from your pages so that it's not on that page anymore. In particular, when it comes to mobile users, at least from the numbers that we see, most users are on mobile devices nowadays.
So if you're trying to hide content from most users, then that seems to me a pretty strong signal that you probably don't need this content at all. If you do want to keep that content and you really think mobile users don't need it, then I would put it on a separate URL, so that those who do want to see the content can navigate to that URL, they can click a link to go there, and those who don't want to see that content have the page without it. I'm working on an international site revamp and migration project, which takes a phased approach for launching different market sites. There will be a transitional period with different new and legacy market sites concurrently running with different URL structures on the same domain. The issue is that this situation will make implementation of hreflang very difficult, as each market site has around 1,000 pages. Every time a new site is launched, the update of hreflang tags has to go through thousands of pages. For this transitional period, would it be OK to just have geo signals, with the location targeting setting in Search Console and a subfolder structure differentiating countries, without implementing hreflang tags until all new sites are launched? From my point of view, that sounds like a reasonable strategy. In general, when it comes to hreflang, especially with large international sites, the one thing I would watch out for is that every time you add a new connection within your matrix of hreflang pages, that expands and adds a lot more complexity. So if there's a way, for example, for you to go from having all of these different versions to having fewer versions, maybe just country-based versions, then I'd try to go in that direction. Usually, fewer pages make everything a lot easier. Especially if you have pages that can be just purely geo-targeted, where you don't need hreflang, then that makes it a lot easier as well.
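To make the complexity point concrete: with fully cross-referenced hreflang, every page annotates every alternate version (typically including itself), so the bookkeeping grows with the square of the number of market sites. A rough back-of-the-envelope calculation, using made-up numbers that match the question's roughly 1,000 pages per site:

```python
def hreflang_annotations(num_versions, pages_per_version):
    """Total hreflang link elements in a fully cross-referenced set:
    every page in every version lists every alternate version
    (commonly including itself), so the count grows with the square
    of the version count."""
    return num_versions * num_versions * pages_per_version

# Ten market sites of ~1,000 pages each already mean ~100,000
# annotations to keep in sync; adding an eleventh site touches
# every existing page in the matrix.
print(hreflang_annotations(10, 1000))  # 100000
print(hreflang_annotations(11, 1000))  # 121000
```

This is why John's advice to reduce the number of versions, or to skip hreflang entirely where pure geo-targeting suffices, shrinks the maintenance burden so dramatically.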
One way you can think about this is, if you're already in this transitional period where you've removed the hreflang data, then you can track the usage of your site in Search Console or in Analytics, for example, and see if people are actually hitting the right version of your site. Because it might be, for example, that you only need to use hreflang on a limited number of pages rather than all pages. What I often see is that, especially when you're using hreflang just to differentiate between different language versions, then maybe for the brand name it will be hard for us to recognize which language version to show for queries. But especially when people are searching for kind of long-tail content, specific pages within your website, they'll be searching in ways that are easily recognizable as being in one particular language. And if we understand the user's language and we understand you have pages in this language, then we can just map them. We don't need hreflang to match that better. But for example, for home pages, or maybe for category pages, sometimes hreflang helps us to figure out a little bit better what people mean when they're just searching for a brand name. I was working with a website person who made himself an admin in Search Console, and we're no longer working together or speaking. But now I can't delete him from the console. It's him and me that are both admins. So I guess that's always an awkward situation, especially if you're no longer speaking. But in general, in Search Console, there are two main ways that you can add people as owners. On the one hand, you can do that by adding a token for verification, and that essentially makes them an owner of that site as well. Or you can use the delegation features in Search Console, which means essentially you have the site verified and you're just telling Search Console, by the way, this other person also has access.
Depending on how you set that up, that determines how you can revoke that access. If it's done through delegation, you can just remove that in Search Console, and that'll essentially be picked up, and then that user will no longer have access to your site in Search Console. On the other hand, if that access was created through a token on your site, so maybe a meta tag or a verification file or a DNS entry or whatever, then you need to remove that token from your website first, because otherwise Search Console will just be helpful and re-verify that owner if you were to remove them from Search Console. One way to figure out how that user is verified is to go through Search Console into the Settings section. There is a section, I think, for Users and Permissions, and there you should see a list of the users that you have. And next to the username there's, I think, a three-dot menu that you can click on to manage the verified users. That takes you to yet another page where you see the different verification methods that were used for that site. So that's where you can see which user has which type of verification access. And if there's a file involved, I think it shows you the file name; if there's a meta tag, it gives you the ID that's shown in the meta tag. And with that, you can go and remove that information, that token, from your server, and then you can unverify that user in Search Console in that place. So that's a little bit more involved. That's something where you kind of have to dig through a little bit to find it. But essentially, those are the two main ways that you can remove someone's access in Search Console. Usually, my recommendation is to use delegation as much as possible, because that makes it a lot easier for you to add people. You don't need to change anything on the server. And it also makes it easier for you to remove access once people move on.
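The token cleanup John describes, making sure no google-site-verification meta tag for the departed owner is still on the page before removing them in Search Console, can be sketched as a small scan. This is a simplification: the regex assumes the name attribute appears before content in the tag, and the sample page and token value are invented.

```python
import re

def find_verification_tokens(html):
    """Return the content values of any google-site-verification meta
    tags in a page. If one is still present, Search Console can
    re-verify that owner even after you remove them in the UI, so
    these tokens need to come off the server first."""
    pattern = re.compile(
        r'<meta[^>]*name=["\']google-site-verification["\'][^>]*'
        r'content=["\']([^"\']+)["\']',
        re.IGNORECASE,
    )
    return pattern.findall(html)

page = '''<html><head>
<meta name="google-site-verification" content="AbC123exampleToken">
</head><body></body></html>'''
print(find_verification_tokens(page))  # ['AbC123exampleToken']
```

The same idea applies to the other token types: check for stray verification files and DNS TXT entries as well, since any surviving token keeps that owner re-verifiable.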
We have many products that don't vary much in their details, but these are genuinely different products and therefore have their own URLs. One of my considerations is that Google may perceive this as a lot of thin content. While this was already the case before our domain move, and the rankings we previously had were good, my thought is that the loss of authority that occurred during the domain move, and still seems to exist, means that Google no longer overlooks these issues and treats us in a way similar to a new domain. In effect, it's possibly seeing this as spammy, therefore negatively impacting the whole domain's listings. Is this likely to be a problem? If so, how can I solve it? Should I, for example, noindex product pages, use rel canonical on product pages pointing to their respective categories, or something else? So I think I've seen your site a few times. I took a look at your site with some folks a couple of weeks back, when I think you were in one of the Hangouts. And from our point of view, we don't really see anything holding the site back. So it's not the case that there's anything problematic with the site per se, in the sense that you'd have to do something different than any other website. From our point of view, we're trying to rank this content the way that we would try to rank it on any other website. So it's not something specifically tied to the domain move or the changes that you had there. I think a lot of those fluctuations that you had with that domain move, which from what I recall weren't that great, in the sense that the redirects were kind of weird and I believe there were some JavaScript issues or something initially. I think from our side, all of those details are essentially resolved. And it's really just a matter of us trying to figure out how to rank your content appropriately. And there's nothing different that you would have to do than any other website.
So from that point of view, I would treat this as any kind of situation where you're creating a website and you have different products. And if you think these are unique products that people would be searching for separately, then keep them separate. If someone is looking for something that really matches what you're providing, then it feels like a good idea to have a landing page for that and to have that product be visible in search. On the other hand, if there are ways for you to reduce the number of pages so that you can focus on fewer pages that are stronger, then I think that's also a good idea. So you kind of have to balance those two sides: on the one hand, having pages that are really well targeted for individual queries; on the other hand, having pages that are really strong. And sometimes that can evolve over time with a website, where initially maybe you have fewer pages indexed and you try to make them as strong as possible, and as you build out your offerings, search engines and users understand which parts of your site are really important. And it helps you to focus on those pieces of content as well, so that you can create more pieces of content in that style, all of that. But that's not something where there's kind of a yes or no, like you always have to do it like this or always have to do it like that. You kind of have to find that balance yourself. One thing that's kind of useful about this situation is that you can also test it: if you're not really sure whether you should reduce the number of pages that you have indexed or expand them, you can take one part of your website and have a more limited set there, and you can have another part of your website with a more expanded set.
And if the overall numbers are otherwise fairly comparable, then you can pretty much see if, by making one of those changes, you're increasing the visibility of your content overall and getting better-targeted users, or if you're going in the opposite direction. And with that, you can try to refine things a little bit. Is it OK to expand from a narrow niche to a broad niche? To me, it's normal, but in different examples, I've seen bad results from Google when doing it. Is it because Google doesn't understand the expansion or the different approach by the webmaster, or is it an E-A-T issue? So you can definitely go from a narrow niche to a broad niche. You can go from a broad niche to a narrow niche. You can expand your site or reduce it or focus on more specific areas. All of these are completely natural and normal changes within a website. And they happen with any business, where you evolve and change the products and the services that you provide to match what users are currently looking for. It's never the case that you figure out what users are looking for once and can keep doing that forever. You have to continue to evolve over time. And usually what people do is kind of A/B testing, to figure out which of these variations works best, sometimes expanding things, sometimes folding things together. So there's really nothing special from Google's side that says you can't create more pages of this type, or you can't fold multiple pages together into one page. That's totally up to you, essentially. My question isn't related to PageRank. I'm asking about URL discovery and site architecture. We always keep our category pages as noindex, follow. Is Google seeing our product URLs in those categories? Or are they completely 404 in Google's eyes? So that's kind of hard to say without knowing your site. If the product pages do exist, then we definitely don't see them as 404s.
If these are category pages and you're blocking them from indexing, then we wouldn't be able to index those. And over time, we would have trouble finding the links that you have on those pages. So if those products are only linked from pages within this set of noindexed pages, then it's probably going to be hard for us to understand that those products are actually useful and important for your site. Your products can be linked in a variety of different ways, though. They could be in multiple categories, so maybe they're linked on one category page and not on another. That's also fine. It could be that your products are cross-linked between each other, so that you have something like related products linking together. That's also generally fine. They could be linked from your home page as well; if you have your best sellers there, we'd be able to find those products fairly well, too. All of these are different ways to discover these URLs on your website. From our point of view, we currently don't have any exact guidelines on how to deal with category pages. I think what we're probably going to evolve towards is to say that for category pages in particular, you should allow one set to be indexed. So you pick one sorting order, for example, and let that set of category pages be completely indexed. And all of the alternate variations of that, with different filters or different sorting orders, you'd probably want to set to noindex. That would allow us to go through one set of category pages, really find all of the products that you have linked there, and essentially go off and index your product pages directly. And if there are specific versions of category pages that you find critical, where maybe a specific filter set or a specific sorting order is very important, then maybe let the first page of that variation be indexed as well.
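As a rough illustration of that one-indexable-set idea: a minimal sketch, assuming a hypothetical URL scheme where sort order and filters are query parameters (the parameter names and the default sort here are made up, not anything Google prescribes):

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical URL scheme: /category/<name> with optional
# ?sort= and ?filter= query parameters. "default" is the one
# sorting order we choose to leave fully indexable.
DEFAULT_SORT = "default"

def robots_meta(url: str) -> str:
    """Pick the robots meta content for a category URL: the one
    canonical sort order stays indexable, while alternate sorts
    and filtered variants get noindex (links still followed)."""
    params = parse_qs(urlparse(url).query)
    sort = params.get("sort", [DEFAULT_SORT])[0]
    filtered = "filter" in params
    if sort == DEFAULT_SORT and not filtered:
        return "index, follow"
    return "noindex, follow"

print(robots_meta("https://example.com/category/shoes"))
# index, follow
print(robots_meta("https://example.com/category/shoes?sort=price"))
# noindex, follow
```

The output string would go into the page's `<meta name="robots" content="...">` tag; the point is simply that exactly one complete, crawlable set of category pages keeps every product reachable.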
But that's kind of the direction where we'd be headed. In general, just always setting all category pages to noindex, follow is probably not a great strategy. And that's one where I would recommend at least using some kind of a local crawler tool to double-check that we can actually find all of your category pages. Because if we can't reach them with normal crawling, then it's kind of hit and miss whether we'd be able to show them well in search anyway. Is the Google News algorithm now working in the old style? Because my news trends on Google... I don't really understand that question. I think in general with Google News, we've made a number of changes, and that's something where things will continue to evolve. So I don't think there are any plans to say, oh, all of these changes that we've worked on for such a long time, we need to drop them all. If you have specific feedback on specific kinds of queries or specific results in Google News, that would be really useful to have. And I think in Google News, you can also submit feedback directly. So that's kind of the direction I would head there. If this is specific to your site as a news publisher, I would go to the Google News Publisher forum and chat with the other publishers there. OK, I think we kind of made it through the submitted questions. So I'll just open it up to more questions from you all, if there's anything on your side. OK, John, I have one quick question. I have noticed on the Google Webmaster forums, and we have experienced it with some of our projects, that sometimes, for a very short time, the home page of a specific website just goes away from the index, and we don't see it in the results with a site: query. It's the same in Google Search Console, but once we reindex it with an indexing request, it appears back again. Could you please explain why this happens?
And I know that the reasons could be very different, but what are the most common reasons you have observed? I don't know. I'd love to see some examples like that if you see that, because then I can double-check to see what is actually happening internally. Usually this kind of fluctuation, especially with things like home pages, shouldn't be happening. What might happen is, if you do a site: query, that the home page is just not the first result. That's completely normal. From our point of view, a site: query is an artificial query, so there is no predetermined order where we'd say this is the order we should always give the results in. So just because the home page isn't the first one in the site: query results, from my point of view, that wouldn't be a sign of any problem. If the home page were not indexed at all, that seems a little bit different. That seems like something where it would be useful to look at examples. If you don't mind, maybe I can send you something. That's cool. OK, thank you. Thank you, sir. I have another question, John, if nobody else has one. Sure, go for it. I want to confirm something first, hopefully. I think you guys have always said that, when it comes to the speed of a site, from a ranking standpoint you're really only directly impacting the slowest of slow sites. Is that just an SEO thing that we say, or is that actually true? And then I have a follow-up to that. I think, for the most part, that's still the case. That was essentially the situation when we initially launched speed as a ranking factor a number of years ago. It was definitely the case that we tried to differentiate between normal-speed sites and really slow sites. I could imagine that over time, especially with mobile sites, we might find a more granular approach. But it's really still the case that if you're tweaking milliseconds, then that's probably not the best use of your time if you're only worried about SEO. Yeah, just from an SEO perspective.
OK, because you and Martin did a video, and the very first question, I think, was about speed. And then Martin said something like, I don't remember his exact words, but something like, we only look at the really slow sites and the really fast sites. So he did include that other end of the spectrum. So I don't know if that was just saying that you look at those things, or if you actually boost sites that are in the 95th percentile or whatever it may be. I think it probably all comes down to the same thing. If we reduce slow sites, then essentially we're boosting faster sites. There's no real difference between saying one set of sites is ranking higher and another set of sites is ranking lower. You can't do one without the other. So that's always kind of a two-sided thing. OK, so maybe, I know you can never say all things being equal, but between a site that was middle of the pack and a site that was 95th-percentile fast, the one that was faster might rank above the other for that reason? It is possible. I think if you're talking about sites that are reasonably fast, then probably you wouldn't see a big difference there. But if there's really a noticeable difference between those two, then I would expect there to be some differences. The tricky part, I think, with speed is that there are so many different ways of measuring speed. There's the theoretical approach, where you just run a series of tests, and then there's the field data, where we look at what people are actually seeing. And all of these things are kind of mixed together, and they always result in kind of a blurry number. So if you're just tweaking milliseconds, then you're still within this range of blurriness, where you're not really seeing any big changes. Gotcha. Thank you very much. Sure. Let's see. I think there is some stuff in the chat.
Is there an alternative to "crawl this URL and its direct links" in the new Search Console? No, not that I'm aware of. It's really essentially inspect the URL and then submit that URL for indexing. There's no mechanism to say, submit all URLs that are linked from here for indexing. What I'd usually recommend doing here is just using sitemap files to do all of this in a more programmatic way, rather than manually going through and submitting things. Let's see. Google is planning to eliminate third-party cookies entirely in the Chrome browser by 2022 for the sake of a more private web. What actions should be taken about it in general? Well, I think if they're removed in Chrome, then you probably want to make sure that your sites don't break because of that. So the primary thing that I would watch out for is that your site still works well for users. When it comes to SEO, this is usually less of an issue. However, we do use Chrome to render pages for indexing. So if your site behaves significantly differently with regards to cookies, then from an indexing point of view, if after rendering we can't get all of your content, we'll index the content that we do have. So if you're somehow reliant on these third-party cookies to create a page that contains all of the content for your site, then that's something you need to watch out for. The other thing, I guess, with regards to cookies, is that Googlebot doesn't keep cookies. So if you give cookies to Googlebot, those cookies might be used for that one-time rendering, but they won't be used for the next crawl, the next time Google reprocesses your pages. Sometimes people trip over that, in that they expect Googlebot to keep reusing the cookies that it has. We essentially try to crawl and render pages with a completely fresh slate. Let's see. Any German e-commerce? My domains are declining in the coronavirus season.
OK, I don't know what I can say there. I imagine that depends a lot on the type of your site. I don't know how it is in Germany at the moment. Here in Switzerland, we essentially closed all stores yesterday, except for grocery stores and maybe pharmacies, I think. So if people want products, they end up going online. So I would imagine that a lot of these things will be picking up, with more people going online to buy stuff because they can't go and buy things in person. That said, this is kind of a unique situation, and who knows how it will develop. So I'm not going to make any kind of prognosis on how that happens. What are all the places Discover content can appear? We're aware of Chrome article suggestions and the Discover feed in the Google app. Are there other places Discover content can be seen by users? I don't know all of these sources. I think there are three main sources, but apart from the two that you mentioned, I can't think of the third one. I think it's the Chrome new tab, the Discover feed, and then something else, but I can't think of it at the moment. I believe we have a Help Center article that covers this in a little bit more detail. We're a news site and we lost all of our traffic over the years because we disappeared from top stories and Discover. What can we do before we close? I don't know. That's, I think, a tough situation to be in. With regards to both top stories and Discover, these are algorithmic features. We use a number of factors to determine when we would show sites and when we wouldn't. And these kinds of algorithms can change over time. So it's really hard to say what exactly you should be doing there. It's definitely not the case that there's one meta tag that will automatically make your site appear or disappear in these sections. The thing I would generally try to focus on is quality. I think I've seen your site before, and it seems like a pretty reasonable site.
But it's still something where our algorithms focus a lot on quality, especially for elements like top stories, where we try to show sites that have recent content for specific topics, or Discover, where we're trying to highlight content that matches what we think users might be interested in at the moment. That's something where quality is very important for us. I think Discover is a bit unique because there's no query associated with it, so it's really kind of tricky for us to find the right results to show there. All right. Any more questions from any of you? Nothing at the moment. OK. That's good, too. John? Hi. Hi. Yeah, I have this very strange situation with a client of mine. They've had a website for, I believe, more or less five years. And then, within the last 12 months, we've lost approximately 80% to 85% of the organic traffic, which is very strange. We rebuilt the site a year ago, and for the first three months everything was going great; we were actually getting even more traffic, and the rankings were going up. And then at some point everything started to decline, to the point where, like I said, we've lost approximately 80% of the traffic. On the technical side of the site, everything seems to be working fine. We don't find any errors. We've maintained the same backlinks. The speed of the website seems to be OK; we don't see any slow pages. It's mobile-friendly. There's nothing that we can see that's not working, so we don't really know. There's only one thing. There was another website, I believe another version of the website, that someone accidentally uploaded, and it got indexed on a different domain with the exact same content. And that seems to be around the same time we saw the ranking drops. So what they did was they removed everything. They took everything down from that domain. But we never recovered the rankings, so I'm not really sure what to do.
OK, so usually what would happen if we see the same content being hosted again on another website is that we would try to canonicalize that content, which essentially means our systems would say, well, we have multiple URLs for this piece of content; which one is the right one to show? And usually we'd be able to keep the existing URLs and say, well, this is where it's always been and this is what we're going to show in search. It can happen that we switch to another domain. That can, for example, happen if you do a domain move and for whatever reason can't set up redirects; we'll try to figure that out and switch those URLs over. However, the switching of the URLs for canonicalization is purely a technical matter on our side. It's something where we're essentially just changing the URL that we would show in the search results. So if it's the same content and we switch the URLs, it would rank exactly the same way as before, just with the different URLs. And if you switch the URLs back, if we see that things should go back to the previous URL, then that's just a technical change of those URLs. We'll switch back over to the other version. That's not something that would have lingering effects. It's really just, well, this URL is the right one. Oh, wait. No, the other one is the right one. And from that point of view, that copy of the site that you accidentally had live, I don't see that as having any lingering effects there. If you're worried about that, if that's something that just recently happened, the one thing I would try to do is make the canonicalization as obvious as possible. So set up redirects from the old domain to the new one, so that if we were to crawl those old URLs where you had a copy, you clearly tell us, actually, this is the one that should be used by Google. If you just delete the old version, then that's a bit trickier. But I... That's the thing. The thing is, sorry to interrupt.
The thing is, the other website is live, but it's a completely different domain. It's actually a subsidiary company. And by mistake, they just copied the exact same content. It went live, it got indexed, and then, when they noticed that it was problematic, they took it down. So I'm not sure if that might be the reason, if that's still being canonicalized to something that doesn't exist. I don't know if that's the root cause of the issue. Yeah, I think one way you can double-check that is to take your existing website, kind of the original version, and use the Inspect URL tool in Search Console to see what we pick as the canonical for some of those URLs. And you should see there that either we don't have this page indexed, or we've picked your version as the canonical, or we picked the other version as the canonical. And if you're seeing that we picked your version as the canonical, then essentially that's resolved; that technical back-and-forth is resolved. And my hunch is that this is less of a technical issue and really more of a quality issue overall, where maybe our algorithms determined that, for whatever reason, this website is not as relevant for some of those queries as it was before. And that's something that you probably need to take a look at from a bigger-picture point of view. The one thing I always do when I see sites like that is to try to understand the situation before and after a little bit better. So see which pages received impressions and clicks from Search before and after, and which types of queries were there before and after. And sometimes what I'll see is that there's a change across the board, which to me is more a sign that, overall, we're not really that happy with the quality of the website, or we don't think it's as relevant as it was before. Or I see that just some very high-traffic queries have changed significantly.
And sometimes those high-traffic queries are things that are not really that relevant for your website. So one thing I've seen, for example, with some news sites is that suddenly you had a page on how to log in to WhatsApp, and that was really popular because it happened to rank really well for that, but you're a news website. So it's not that you're the authoritative source for logging in to WhatsApp. But because that page received a lot of traffic, a lot of impressions, if you just look at the overall traffic in total, then it'll look like a significant drop. But actually, you're getting more targeted traffic, traffic that matches better what your website is actually about. So looking into those details, trying to understand the picture before and after, can help you to better figure out what kinds of things you should be focusing on. OK, thank you, John. Some of that actually seems to be happening, because those high-volume queries are the ones that dropped the most. Those are the ones that are not ranking at all. And I think that might be the case. OK. Sounds good. Thank you, John. Cool. OK, wow. I just noticed we're a bit over time. So let me pause here. I have the next Hangout session set up for Friday in the morning, so maybe less useful for those of you in the US, but more useful for those of you in Europe or in India or Asia. I hope you all stay safe until then. Thank you all for joining in, and I look forward to seeing you again in one of the future Hangouts. Thank you, John. Bye, everyone. Thank you. Bye. Thank you, John.
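The before/after analysis described above, which pages and queries got impressions and clicks in each period, can be sketched roughly like this. The query names and click counts are invented for illustration; in practice you would export query-level data from the Search Console Performance report for the two date ranges:

```python
# Hypothetical query-level click data for two periods, e.g.
# exported from the Search Console Performance report.
before = {"whatsapp login": 9000, "movie news": 1200, "film reviews": 800}
after = {"whatsapp login": 300, "movie news": 1100, "film reviews": 900}

def query_deltas(before, after):
    """Per-query click change, biggest losses first, so you can
    tell whether a drop is across the board (a broader quality or
    relevance issue) or driven by a few high-traffic queries that
    may not match what the site is actually about."""
    queries = set(before) | set(after)
    deltas = {q: after.get(q, 0) - before.get(q, 0) for q in queries}
    return sorted(deltas.items(), key=lambda kv: kv[1])

for query, delta in query_deltas(before, after):
    print(f"{query}: {delta:+d}")
```

In this made-up example the overall traffic drop is almost entirely one off-topic query, which matches the WhatsApp-login scenario: the remaining traffic is smaller but better targeted.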