 All right. Welcome, everyone, to today's Webmaster Central Office Hours Hangout. My name is John Mueller. I'm a Webmaster Trends Analyst here at Google in Switzerland. And part of what we do are these Office Hour Hangouts. I hope, if you're celebrating Christmas, that you had a good break in between, and I hope you have a good start to the new year. And I thought it would be, I don't know, a good idea to put one of these Office Hour Hangouts kind of in the middle when we have a bit of time. Hope everything is going well for all of you. There are a few questions that were submitted. But if any of you want to get started, you're welcome to jump in and ask a first question. Thank you, John. My name is Armin Harian. I'm from the United States. One of the first questions that were submitted there is mine. So maybe we can take a look at that. I'm doing SEO for a large e-commerce organization in the United States. And I'm trying to help them with content, speed, and everything. John, my main question, we can take a look at the questions, is how is AMP for e-commerce websites? We know AMP because I have a news organization myself. And we have done AMP. And we have seen our traffic boosted, especially from mobile. But I would like to know, how is AMP for e-commerce? Is it recommended? And there are a couple of other questions, but maybe we can start from the AMP. I don't know specifically about e-commerce. So with regards to AMP, it's something where sometimes there are limitations on the website that make it a little bit tricky to implement AMP in a way that works well. So that might be something that you see with e-commerce; things like adding to cart and personalization and all of those things are probably a bit trickier with AMP. So that's kind of the main thing I would watch out for there. In general, making your pages as fast as possible, I think, is always a good idea. And AMP is a great way to do that. 
One of the things that also kind of falls into this, I think, is that some kinds of pages are just easier to make with AMP than others. And on a larger e-commerce site, you have all different kinds of pages. So you have product pages, category pages, search pages. You have a lot of informational content as well. And it might be that, if you're using something like WordPress as a backend, then a lot of the informational content is just a post or a page, which you can easily convert to AMP with an AMP plugin. So if there's a simple plugin that you can install on your backend that makes it possible to move some of these things to AMP, then that's what I would do. For all of the other content, it's something where you kind of have to weigh the pros and cons yourself. I think on the pro side, it's definitely an opportunity to make really fast pages. You can make really fast pages without AMP as well. So depending on how much work it takes to implement AMP versus how much time it takes to make the pages faster in general, that's kind of a call that you have to make. There's definitely no kind of bonus from having AMP when it comes to search. So it's not that you'll rank higher, that you'll be more visible for all of these e-commerce pages. It's just one way to make your content available really quickly. So I'd see it more as a mechanism to help you speed things up rather than a magic bullet that you have to implement. I see. Thank you very much. Let me ask you a question about this, please. So if a website already has AMP implemented, let's say a news website, when I go to Google PageSpeed Insights and I enter the website, under mobile, how come that tool is not picking the AMP version but picks the regular version and says it's slow, while the page has AMP already? Wouldn't it make sense if the Google speed tool took the AMP version and showed the desktop version as regular? I don't know how it's implemented in the testing tool. 
But if you have separate URLs for AMP, then you can always test the separate URLs directly. It's also something where, I believe, most websites don't redirect to the AMP version. So if you use a mobile device to access the regular URLs directly, then you'll get that regular content. Whereas if you access the AMP pages directly, then you'll get the AMP version. And in Search, we try to show the AMP version because we know that connection between the two. But it's not that there's kind of a redirect that goes to the AMP version automatically. So I imagine the testing tool just lets you test the individual URLs. And since there's no redirect, they just see that. And my last question, please, with that, which I asked there. Let's say a website has lots of errors reported in Google Webmaster Search Console. Will they negatively affect the rankings? Maybe. I think it really depends on the errors that are reported there. Let's say the price is missing. The price is missing because some products don't have a price, yes? So what happens with things like structured data, especially product markup: if there's a price missing, then what would happen is that there would be no changes in ranking. But we would not be able to show the product's rich result. So sometimes we have the rich result with the review stars, and the price, and availability, and things. And if we can't process the structured data properly, then we just don't show that. But it would rank exactly the same as before. All of the other pages that do have the proper markup, they would be used normally. For that specific page, it looks like you want to use this markup, but it's incorrect. So we tell you about that, so that if you want to have it shown, you can change it. What about breadcrumb errors? That's the same thing. So with all of these structured data types, if the markup that you have for structured data is not valid, then we would just not use that markup for rich results. The page would rank normally. 
It would show normally in Search. So that's something where we want to flag it as an error so that you know about it. But if you say, well, I don't care, I don't need this rich result on that page, then that's perfectly fine. Thank you very much. But in case of this, links will be followed? Only structured data will be ignored? Sure, sure. We will process the rest of the page. Everything will work normally. It's really just that specific structured data type that you have that's not valid; we would ignore that. And, John, I just tried to run the AMP page in the PageSpeed Insights tool, and it is showing me correctly. I just typed the AMP page URL, and it is showing me the performance of the AMP page. By the way, I don't know how Armin checked that one. I think if you enter the URL directly, then probably it would just test that. Yes, you're correct. When I entered the AMP URL, it shows correctly and shows very fast. But when I post the regular URL, let's say it's a news article, the tool shows mobile and desktop, right? It shows mobile very slow because it's regular. But I was thinking that Google should take the AMP version already because the AMP version already exists. That's what surprised me. There's also, I think, a speed effect that these testing tools don't show, which is that with AMP pages, we can pre-render them and pre-cache them directly from the search results. So you save that round trip to your server to get the initial HTML, to get the rendered version. And that's something that we do with the AMP cache, which is not something that would be tested in any of the speed testing tools. So that's why sometimes when you look at an AMP page in a testing tool, it looks like, well, it's just as fast as other mobile pages. But actually, there is still that kind of pre-caching effect that also takes place. Thank you very much. Do you think Google will provide support for AMP for e-commerce, like more tags or more tools in the future? It would be very helpful. I don't know. 
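The structured-data behavior described above — incomplete markup costs you the rich result, never the ranking — can be sketched as a toy eligibility check. This is not Google's actual validator; the field list is a hypothetical minimum based on the price example in the discussion:

```python
import json

def product_rich_result_eligible(jsonld: str) -> bool:
    """Toy check: a Product rich result with price needs an offer
    carrying both a price and a currency (hypothetical field list)."""
    data = json.loads(jsonld)
    if data.get("@type") != "Product":
        return False
    offers = data.get("offers", {})
    return "price" in offers and "priceCurrency" in offers

valid = json.dumps({
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Blue running shoes",
    "offers": {"@type": "Offer", "price": "79.99", "priceCurrency": "USD"},
})
missing_price = json.dumps({
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Blue running shoes",
    "offers": {"@type": "Offer"},  # no price: page still ranks, but
})                                 # the rich result is not shown

print(product_rich_result_eligible(valid))          # True
print(product_rich_result_eligible(missing_price))  # False
```

The point of the sketch is the asymmetry: failing the check only affects the rich-result presentation, not whether the page is indexed or how it ranks.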
I think it's hard with e-commerce. We've tried a bit with AMP for e-commerce, but I don't know what the team is planning to do there. Yes, thank you very much. All right. Any other questions before we jump into what was submitted? Short question: but speed definitely is a ranking factor, correct? Yes. Yes, thank you. John, can I ask about one problem which I am facing? Sure. OK. So actually, for internal linking, I have tried to watch many hangouts, but I could never find out whether any PageRank gets passed or not. Because you always say that Google tries to understand the context of the page from links in internal linking. But first thing, I still don't know; what I assumed is that Google probably does not pass PageRank. The second thing I am trying to understand: I just launched one blog page for one entity, and I have one product page for the same entity. The topics are quite different, like top 10 things before buying one product. This is the blog content, and the product page is my main content or main page content. The main problem started in the last three months, when Google started ranking the blog page for that product name only. And my main page just disappeared. So the first thing I assumed is that external links don't matter in this case. Why? Because my main product page had been getting links for a long time. I don't even know from when the page was live. And hardly six months back, we launched the blog page, and it replaced the ranking. So in this case, what should we do? Because content-wise, I cannot change the content. That is very valid content. This is also very valid content. I just added a banner in the blog page saying that if you want to buy this product, click here. But I believe there is still a lot of conversion rate difference. Yeah. So let me try to answer the first question. Let's see. So internally within a website, we do pass PageRank. 
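The internal PageRank flow mentioned here can be illustrated with a toy power iteration over a small internal link graph. This is the simplified textbook formulation, not Google's production algorithm, and the three pages are a made-up site:

```python
# Hypothetical internal link graph: page -> pages it links to.
LINKS = {
    "home":    ["product", "blog"],
    "blog":    ["product", "home"],
    "product": ["home"],
}

def pagerank(links, damping=0.85, iters=50):
    """Textbook power iteration: each page splits its rank evenly
    across its outgoing internal links on every step."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outs in links.items():
            share = rank[page] / len(outs)
            for out in outs:
                new[out] += damping * share
        rank = new
    return rank

ranks = pagerank(LINKS)
# Pages that more internal links point to accumulate more rank,
# which is one signal of which pages on the site are more important.
print(ranks)
```

In this toy graph, "home" ends up with the most rank because every other page links to it, which mirrors the point that internal linking signals which pieces of content are more important.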
So that's something which I think is really important in that when we see a normally structured website with internal links going to different parts of the website, then that's something where we would pass PageRank. And we try to understand which of these pages are more important. So internal links definitely play a big role in discovering content and also in recognizing which pieces of content are more important. So that's maybe one thing. With regards to different types of pages ranking, that's sometimes a bit trickier because it really depends on how we understand the query. And in some cases, we can understand the query fairly well and guide users to the right page. And in other cases, it's sometimes a bit tricky. So for example, if we think that a search looks more informational, that someone is looking more for information, then maybe we would point them more towards a blog post rather than a product landing page. On the other hand, if we think the query is more someone searching to buy a product, so something more transactional, I want to buy this thing, then we would send them more to a product page. And oftentimes, I think we get that fairly well. Sometimes we don't understand the query that well. And over time, we try to figure out what we should be doing there. And for cases like that, I think making it easy for users to find the product landing page is really important. But also, one thing I would consider is whether or not you're actually looking at queries that users are using and where you're sure that the intent is to actually go to a product page. Because it's very easy to get kind of hung up on specific queries where you think, for this query, I want that page to appear. And if you look at the search analytics for your website, you might notice that actually nobody really searches for this query. So it's a good match for that page, but it's maybe not something that people actually use. 
Or you see that when users come to your website, they tend to stay maybe on the blog post with more information on that product rather than actually going to the product page, which might also be a sign that actually they're getting the information that they need from that blog post. So that's kind of taking a step back and looking at the bigger picture first and then kind of trying to go from there. I think if you're really sure that lots of people are going to your website and landing on the wrong page, then having a banner or some kind of easy way for people to find the place where they probably can get more information, that's really recommended. Yeah, and in this case, external links don't matter? If one domain has two pages, one page is getting links, but the other page is new, and the new one replaces the main page. So the external links' value is positive for that one page. They do matter, but it's not that they override everything else. We use a lot of factors when it comes to ranking. So just because one page has more links doesn't mean that it will automatically rank higher. OK, thank you. All right, let's take a look at some of the submitted questions. As always, you're welcome to jump in in between, and we'll probably have time towards the end to go through some other questions as well. For the Cloud Natural Language API, will there be more support for different languages in the future, for example, Scandinavian languages? I have no idea. I don't know how the Cloud Natural Language API suddenly got so popular, but I think it's really neat to try things out with these kinds of APIs to see what comes out of them. But one thing I would kind of keep in mind with all of these things is that there are lots of ways to understand pages and to understand text on a page. And it's not necessarily the case that just because we have an API for something like this that we would use this in Search. We have APIs for lots and lots of different things. 
So these are really cool ways to build products on top of Google technology, but it doesn't mean that Google Search does things exactly like that. How can I make sure my users are having a good user experience on my site? So it kind of goes into, I'm seeing lots of ranking fluctuations and not really sure. So I think that's something where there is no absolute answer that works well for every website. That's really something where you kind of have to dig in on your website yourself to figure out what you can do to improve things significantly. I think one of the tricky parts here, at least with the sample URL that you link to, is it seems like something that doesn't really have a ton of content. And it's very easy to accidentally go forth and create millions of pages of content just with variations of keywords in the URL. And by creating millions of pages of content that isn't really valuable on its own, you make it really hard for us to understand what the value of the website is overall. So one thing I would tend to recommend there is that sometimes less is more. And sometimes it makes sense to really narrow down the target of your website so that you have something really fantastic to offer that is really clearly visible to anyone who comes to your web page. And from there they can dig in and get more information, even if those more detailed things aren't necessarily indexed; then you still have kind of a really strong basis and a strong foundation for people to find you. So that's kind of what I would focus on there: maybe not create so many thin pages, or not have your website automatically create them that much, and instead try to create something that's more structured, that has a little bit more information for people. Are non-Latin characters a good choice for URL paths with multilingual websites? Yes, sure. You can definitely do that. Nowadays, any kind of Unicode URL tends to work in most browsers. So I would go ahead and do that. 
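Under the hood, non-Latin path segments are simply percent-encoded as UTF-8 on the wire, which is why any Unicode URL works. A quick illustration with Python's standard library (the Chinese product path is a made-up example):

```python
from urllib.parse import quote, unquote

# A hypothetical non-Latin product path; browsers and crawlers
# transmit it percent-encoded as UTF-8, so it is just as valid
# a URL path as an ASCII one.
path = "/商品/蓝色鞋子"
encoded = quote(path)  # "/" is kept as a path separator by default

print(encoded)
print(unquote(encoded) == path)  # True: the round trip is lossless
```

The encoded form is what appears in server logs and on the wire; users of browsers that render IRIs still see the readable non-Latin form in the address bar.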
I think that generally makes it a little bit easier for users who don't understand Latin characters. Like if you have users in countries like China or India, then maybe that's something where using local characters makes it easier for them to understand your pages. So that's definitely an option. It's not that you have kind of a ranking advantage for doing that, but it makes URLs a little bit more understandable for users, so sometimes that's OK. We're planning a massive catalog update this month. So the question is whether I need to 301 redirect to the new associated pages, or keep the pages as 200, or create a new catalog by keeping the old catalog and pointing the canonical tag to the new catalog URL. So it sounds like a pretty big change. When you say this month, that sounds like you don't have a lot of time left, so I'm not really sure how useful it is to change things just at the last minute. But in general, anytime you're changing URLs on a website, I would strongly recommend using 301 redirects as much as possible from the old URLs to the new URLs. And if you can manage to do it somehow, then I would even try to look into ways that you can keep the old URLs. Because anytime you restructure a website like this by changing the URLs on the site, it means we essentially have to reprocess the whole website to understand it again, to understand how these pages connect to each other. So with 301 redirects, we could kind of follow that. We can re-crawl the whole website. We can see how things are reconnected. But if you don't change the URLs at all, if there's a way to do URL rewriting on the server side to reuse the old structure of the URLs, then that's something where, from our point of view, almost nothing changes. And it's a lot easier for us to just say, well, the content has updated here. And we can kind of re-crawl the individual pages and re-understand that content. But we don't have to reprocess all of the URLs and understand how these URLs are connected and how they belong together. 
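The 301 approach for a URL change boils down to a simple old-to-new mapping. In practice this lives in the server configuration (Apache `RewriteRule`, nginx `rewrite`, or the application router); the sketch below, with hypothetical paths, just shows the contract: known old URLs get a permanent redirect, everything else is served normally:

```python
# Hypothetical mapping from retired catalog URLs to their replacements.
LEGACY_MAP = {
    "/catalog/old-shoes": "/shop/shoes",
    "/catalog/old-boots": "/shop/boots",
}

def resolve(path):
    """Return (status, location): 301 to the new URL for a known old
    path, otherwise a 200 for the path as requested."""
    if path in LEGACY_MAP:
        return 301, LEGACY_MAP[path]
    return 200, path

print(resolve("/catalog/old-shoes"))  # (301, '/shop/shoes')
print(resolve("/shop/shoes"))         # (200, '/shop/shoes')
```

Keeping the mapping one-hop (old directly to final new URL, no redirect chains) makes the re-crawl John describes as cheap as possible.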
So that's kind of my order of recommendation there. First: if you can find a way to do it with 200 pages, maybe that's doable; keep the old URLs. If you can't keep the old URLs, then definitely 301 redirect from the old ones to the new ones. If you can't redirect at all because of technical limitations, then a canonical is a good backup. But a redirect is really much cleaner and a much stronger signal that you're moving things to your new preferred URLs. How to write a perfect meta title and description? Oh my gosh, I don't have kind of the secret recipe for meta titles and descriptions. The one thing I would just say is there is no absolute length that you need to focus on, so no maximum or minimum length. But rather, it needs to match the content that you have. And past that, we have a great Help Center article on titles and descriptions and how they work that I'd recommend taking a look at. So that's kind of the direction I would head there. Kind of avoid artificially focusing on just pure keywords and pure character length and things like that. And instead, try to understand what it is that you're actually trying to provide on your pages and how you want that presented in the search results. How breadcrumbs errors cause my blogger? I don't quite understand the question. So what I'd recommend doing here is maybe going to the Webmaster Help Forum and trying to elaborate a little bit on what you're exactly looking for. In general, like I mentioned in the beginning, if a specific type of markup isn't valid on your pages, then we would just ignore that type of markup and not use that to show the rich result type. So if, for example, the breadcrumb markup isn't valid on your pages, then we wouldn't be able to use those breadcrumbs or that breadcrumb markup to show breadcrumbs. We would still rank your pages exactly the same as before. We just might not show it in the same way in the search results. 
When it comes to breadcrumbs, we're doing more and more automatically trying to figure out what the breadcrumb should be anyway. So maybe that's not that critical for you in a case like that. On the other hand, if there's a specific type of structured data that you really want to have shown like that in the search results, so for example, if you have recipes on your pages and you really want that picture and kind of the nutritional or cooking information to be shown in the search results right away, then you definitely want to make sure that your markup is correct so that we can show it like that. We would still rank it exactly the same as before. But if everyone else has this really fancy recipe rich result type and you have kind of the plain text version, then maybe that doesn't make your pages that interesting for people to click on. So it would be a good idea to focus on that. With breadcrumbs alone, I don't think that would be that big of an issue. We're migrating seven store sites to the same platform using the same content, same title tags, meta descriptions, H1s, et cetera, despite my efforts to convince the developer to create different instances of the database and allow unique content for each store. They're moving forward, essentially creating seven duplicate sites for 500 brick and mortars on the East Coast. None of the stores are in the same exact geolocation, but all of them are on the East Coast. What can be expected? What would your general recommendations be to convince the developer to allow unique titles, descriptions, meta tags, et cetera? Yeah, I think this is probably the, I mean, maybe not the specific problem, but the general type of issue is something that a lot of SEOs struggle with. Namely, you have a bunch of really good recommendations that maybe you can even back up with comments from us somehow. 
And developers have a very limited time, and someone has to make a call and say, well, developers should focus on this, or they should focus on implementing these SEO recommendations. And sometimes you win, sometimes you lose. And so that's something where I think having clear data in your favor makes it a little bit easier to make that decision, where if you can really say, well, if we do it like this, then this is something that we would expect, then that's something that makes it a little bit easier to make a call there. But sometimes that's not so trivial. With regards to this specific situation, what I expect would happen here, I guess, first of all, I'm not quite clear how these seven sites for 500 businesses would work out if that's like 500 sites, or if that's just seven websites, and just local addresses. I think seven sites versus 500 sites is a significant difference. And so that's something where, depending on how that is set up, that might play a role there. With regards to multiple websites that have exactly the same content in general, there are two things, or I guess a few things that could happen here, I'm going to assume that these seven websites look slightly different. They have different titles on them. It's like, I don't know, website A, website B, and the products themselves are exactly the same. And maybe the categories, the search pages, all of those things are exactly the same. So at first, what would happen here is we would index all of these sites individually. We would recognize that the page as a total with the title, with the branding, with the logo, things like that is different across these seven sites. So we would be able to index them individually. However, we would also see that the primary content or large chunk of the content for individual pages is the same. 
And then in the search results, when someone searches for something that's within just the primary content of the page, like if you have, I don't know, blue shoes in your shop, and the blue shoe pages are exactly the same, and if someone searches for blue shoes, we would see that the snippet that we would show to users would be exactly the same. Therefore, it doesn't make sense to show multiple copies of this site. So in practice, what would happen is we would pick one of these sites and show that to the user. Looking at this from a business point of view, that's not necessarily bad. If someone is searching for blue shoes and one of your pages shows up there in that query for blue shoes, then you kind of have that opportunity to get that click, to get that user to come to your site, for that user to do a conversion on your site. And if these sites are all the same, then everything after kind of going to your website is essentially the same. So that's probably not that problematic in the sense that one of your pages is shown in search, and we can kind of show that properly. The alternative, if all of these pages were unique, if there were kind of different product descriptions across these different versions, it could potentially happen that we would show two, three, four, maybe up to seven of those sites in the search results and say, well, these are other sites on the same topic. From a purely ranking point of view, one of the things that can help here is if you pick a canonical version yourself. So maybe that's an option from the developer side in that if you pick a canonical version on your own, on your website, then what will happen is that canonical version will be much stronger than the individual version. So kind of if someone searches for blue shoes, then if there is a bit of competition out there, then we would know this is a really strong page on blue shoes, and we'd be able to rank it a lot better. 
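Picking a cross-site canonical as described here might look like the following in the page templates. The hosts, categories, and product slugs are all hypothetical; the point is that every duplicate copy of a product page emits the same `rel="canonical"` target, so the chosen version concentrates the ranking signals:

```python
# Hypothetical mapping: which of the sister sites should be the
# canonical home for each product category.
CANONICAL_HOST = {
    "garden": "https://garden-store.example",
    "indoor": "https://indoor-store.example",
}

def canonical_link(category, product_slug):
    """Build the <link rel="canonical"> tag every duplicate copy of
    this product page should carry, falling back to a default host."""
    host = CANONICAL_HOST.get(category, "https://main-store.example")
    return f'<link rel="canonical" href="{host}/products/{product_slug}">'

print(canonical_link("garden", "teak-bench"))
```

Because all seven sites render the same tag for a given product, Google sees one clearly preferred URL per product instead of seven interchangeable duplicates.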
So that's kind of one way to kind of mitigate this problem with regards to multiple sites showing the same content: to pick a canonical and kind of help us to rank that one a little bit better. You can do this canonical thing on multiple levels. You can do it per product if you want. For example, if you have one site that is focused more on garden furniture and another site that is focused more on maybe indoor furniture, then maybe you make the canonical for garden furniture products be your garden furniture site, and you do the other for the indoor content. That might be one thing to kind of do there. So that's kind of, I think, a smaller approach that I would take there: to do the real canonical. The other problem that can happen with a setup like this: with seven sites, I'm not sure it would happen, but if you have significantly more than seven sites that are set up exactly the same way, then it most likely would happen that we would recognize at some point from crawling these sites that actually these sites are the same, and we would completely skip over all of these sites except for one. So we would try to recognize, well, here's kind of the content, and on all of the other websites the full content is exactly the same as what we already have. Therefore, we don't even need to crawl or index all of those other versions because they're really exactly the same. And in a case like that, we would automatically pick one of those sites kind of as a canonical and focus our crawling and indexing on that one completely. That can sometimes be a bit problematic in the sense that if you have different variations across these different sites, then suddenly they would not be indexed at all. 
So for example, if these seven sites are kind of like how I mentioned, like garden furniture and indoor furniture, and actually they show exactly the same content, if someone is searching for the garden furniture variation of that site and we think it's actually exactly the same as the indoor furniture version of that site, then we would show the indoor version instead because that's the one that we kind of picked as a canonical for that setup. So that's kind of the risk that you take there, in that you create these seven sites and then in search, we just pick one of them and we focus on that one. So that's, I don't know, with seven sites, I suspect it might happen, might not happen. If you have significantly more than seven sites and they're exactly the same, then it's pretty certain that that would happen. If these sites are all equal from your point of view, if you don't care which one of them is shown in search, then maybe that's fine. On the other hand, if they're slightly unique and you really want to make sure that kind of things are uniquely available across these sites, then that's something I would try to avoid. And the best way to avoid that is really to make sure that they don't show exactly the same content. So that's kind of the worst case scenario. I'd say if you have seven sites that show exactly the same content, we would just pick one of those sites and we would not even bother crawling or indexing the other versions of those sites. So many things to keep in mind. Seems like it can get pretty complicated. But I guess to sum up, the one thing I would try to figure out is if you can at least get a real canonical for these individual pages. And if you plan on going to significantly more than seven sites, then you really, really need to make sure that you have unique content across those sites. One of my pages has been ranking number one for a long time on Google. And now when I fetch the page, I get an error. 
The URL is not on Google for indexing errors. So I didn't see the URL itself in the questions. So it's really hard for me to say what exactly you'd need to do there. My recommendation would be to take this and go to the Webmaster Help Forum and post the URL where you're seeing this problem and get some input from the community on that. Sometimes these are simple technical things that are going wrong on your website that you can easily fix. Sometimes these are just fluctuations that happen that can go away over time as well. A question, a long question on headers and H1, H2, H3, like what is the best setup? I think in general, headings are a bit overrated in the sense that it's very easy to kind of get pulled in to lots of theoretical discussions on what the optimal heading should be. We do use headings when it comes to search, but we use them to try to better understand the content on the pages. So this question of how should I order my H1, H2, H3 headings, and what should the content be? That's something that, from my point of view, isn't really that relevant. But rather, what we use these headings for is, well, we have this big chunk of text or we have this big image. And there's a heading above that. Therefore, maybe this heading applies to this chunk of text or to this image. So it's not so much like there are five keywords in these headings. Therefore, this page will rank for these keywords, but more here's some more information about that piece of text or about that image on that page. And that helps us to better understand how to kind of frame that piece of text, how to frame the images that you have within those blocks. And with that, it's a lot easier to find kind of the right queries that lead us to these pages. 
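The heading-to-content association described here can be sketched with Python's standard `html.parser`: each heading becomes the label for the block of text that follows it. This is a toy illustration of the idea, not Google's actual processing, and the page snippet is made up:

```python
from html.parser import HTMLParser

HEADINGS = ("h1", "h2", "h3", "h4", "h5", "h6")

class HeadingContext(HTMLParser):
    """Collect (heading, following-text) pairs: each heading frames
    the content that comes after it."""
    def __init__(self):
        super().__init__()
        self.sections = []        # list of [heading, text] pairs
        self._in_heading = False

    def handle_starttag(self, tag, attrs):
        if tag in HEADINGS:
            self._in_heading = True
            self.sections.append(["", ""])

    def handle_endtag(self, tag):
        if tag in HEADINGS:
            self._in_heading = False

    def handle_data(self, data):
        if not self.sections or not data.strip():
            return
        idx = 0 if self._in_heading else 1
        self.sections[-1][idx] += data.strip() + " "

parser = HeadingContext()
parser.feed("<h1>Blue shoes</h1><p>Lightweight runners.</p>"
            "<h2>Sizing</h2><p>Runs small.</p>")
for heading, text in parser.sections:
    print(heading.strip(), "->", text.strip())
```

Run on the snippet, this associates "Blue shoes" with the intro paragraph and "Sizing" with the sizing note, which is the kind of framing headings provide, rather than acting as a list of ranking keywords.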
So it's not so much that suddenly your page ranks higher because you have those keywords there, but suddenly it's more, well, Google understands my content a little bit better, and therefore it can send users who are explicitly looking for my content a little bit more towards my pages. So obviously, there's a little bit of overlap there with regards to kind of Google understanding my content better and me ranking better for the queries that I care about. Because if you write about content that you want to rank for, which probably you're doing, then being able to understand that content better does help us a little bit. But it's not that suddenly your page will rank number one for competitive queries just because you're making it very easy for Google to understand your content. So with that said, I think it's useful to kind of look at the individual headings on a page, but don't kind of get too dug down into all of these details and variations. And instead, try to find a way to make it easy for people and for scripts to understand the content and kind of the context of things on your pages. How can I get in touch with Search Console? I'm interested in website optimization, which increases the results in the search results. So what I would recommend doing here is maybe going to the Webmaster Help forums and chatting about specific issues that you're worried about or that you have questions about. The folks there are generally pretty helpful and can help you to kind of figure out which directions you should be looking there. How to copy content without copyright and rank at the top? I don't know. This sounds kind of sneaky, kind of like something that doesn't make that much sense, in the sense that if you're copying content from other websites, then why would Google want to kind of highlight your website on top when the other websites already have that content? 
So my general recommendation there would be to create unique and compelling content of your own rather than just copying content from other websites. And it doesn't matter so much whether you're allowed to copy it or not. If Google looks at your content and says, well, I've already seen this, and I already know the best place to send people for this content, then why should we focus on showing your version of that content as well? And the same thing applies when you take and rewrite content, whether you rewrite it yourself manually or rewrite it with a script, in that we would look at that and say, well, we already know this is the content that is out there, that people trust, that people go to, so why should we show your version of this content? And going from there, obviously, if this is a topic that you really care about, then I would try to make sure that you can create unique and compelling content of your own on this topic rather than copying things that other people are doing. And if you want to write about what other people are doing and you want to include a quote on your website, that's perfectly fine. But really make sure that the rest of your website has something that is significantly different, where when people search for information we would be able to say, well, this is by far the best source of information on this topic; it's not just the same as all the other ones, but really significantly better. And that's the direction that you should be heading.

What's the best way to distinguish the name of the business for Google My Business? I have no idea how Google My Business works, so I can't really help with that.

Last month, we migrated our server, and the number of indexed AMP pages increased from 1,700 to 7,000. And then, from the 4th of December, the number started to drop without any issue. And now we just have 3,700 AMP pages left.
The mobile usability graph is going down in the same pattern. We have more than 20,000 pages indexed, and all of them are mobile friendly and have an AMP version. Why is there a drop?

So sometimes this can be a little bit tricky, in that for all of these aggregate reports in Search Console, we focus on a significant sample of the pages from your website, and we report on those. So just because you have 20,000 pages indexed in total doesn't mean that you would see a total of 20,000 in the AMP report, or in the structured data report, or in the mobile usability report. Rather, we take a significant sample of your website and use that as the baseline number for these other reports. And the idea here is not to make these numbers mismatch and be as confusing as possible, but rather to highlight issues that you should really care about. If, of all the pages that we have sampled for your report, there are no issues at all, then you're all set: probably none of the pages have significant issues that you need to worry about. On the other hand, if a large part of the sample of URLs that we test for your website does have issues, then that's something that you should focus on overall. And usually, those kinds of issues are not unique to individual pages, but rather to a template or to a significant part of your website. So fixing the baseline problem with your website that's causing the issue would fix it across all of the pages of your website. So with that said, if the total number for these aggregate reports goes up and down a little bit, I would not worry about that at all. Instead, what I would look at is the ratio of errors that we flag for your website. So don't expect the total number to match the index count; rather, this is a sample that we picked to test for these issues, and of that sample, we found so many issues.
And that ratio, the number of issues that we found versus the total number of pages in that report, is what I would focus on.

URL removal tool: is removing a folder that has more than 10,000 pages considered a single removal or not? That is a single removal from our point of view. If you remove a whole folder or a subdomain, that's one removal request from our side. It can cover a lot of individual URLs. There are limits to the number of requests that you can make with the URL removal tool. So if you have thousands of URLs that you need to have removed, and you can narrow that down to a specific folder structure or a URL prefix on your website where most of these are included, then submitting that as a removal request will save you from having to do all of the individual ones, and you can still use individual requests to clean up the details.

Let's see. Then there's one about product images where you see the big version on click. I didn't check out the sample URL that you have there, so it's really hard for me to say. But in general, when it comes to images, we want to be able to recognize the large, high-quality images on your website, so that we can use those in Image Search to send people to your site. And if you have multiple resolutions, multiple variations of images on your website, it's fine for us to recognize all of them; we'll try to treat them appropriately. There are ways that you can use responsive images to let us know about that, so that's the direction we usually recommend. Using responsive images means that you send an image that matches what the user needs to see at the moment, and you still have the ability to point to the higher or lower resolution images as needed otherwise. So usually, that's a pretty good approach to take there.

Oh my gosh, we still have a bunch of questions left, and we're running low on time.
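As an aside on the responsive-images answer just above, the usual pattern is the `srcset` and `sizes` attributes on the image tag (a minimal sketch; the file names, widths, and breakpoints here are made up):

```html
<!-- srcset lists the available resolutions of the same image; sizes
     tells the browser how wide the image will render, so it can pick
     the best candidate. Crawlers can also discover the larger
     versions from this markup. -->
<img src="/images/product-800.jpg"
     srcset="/images/product-400.jpg 400w,
             /images/product-800.jpg 800w,
             /images/product-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 50vw"
     alt="Product photo">
```

This way users on small screens get a small file, while the high-resolution version remains discoverable.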
So maybe I'll just open it up for questions from you all first, to see if there's anything urgent on your mind that I need to get through.

Yes, John. I'm Sanjit. Hi. Yes, so actually, John, what I did is, in the crawl rate section in Webmaster Tools, I limited my crawl rate. Obviously it was at maximum, and I changed it to medium. And suddenly, our number of crawled pages dropped from six million to 600K per day. So after seeing that, I set it back to high. But our number of pages crawled per day did not increase. So could you please help me with that?

OK. So the crawl rate setting is really for when you need to limit the amount of crawling. It doesn't increase the amount of crawling. And there is, I think, a maximum that you can set there, so even if you set it to the maximum, it might be that that maximum in the setting is lower than what we would naturally crawl. So in a case like yours, where you're saying you'd usually have millions of pages crawled per day, I would recommend taking the crawl rate setting out completely, not using it, and just letting Google figure that out directly. Because even if you applied the maximum setting there, what would probably happen is that it's still less than we would crawl naturally from your website.

OK. But I have another question, actually. The day I limited it to medium, within a couple of days the number of pages crawled per day dropped completely. But it never got back up, even after I set the setting back to high.

So usually what happens with the crawl rate in general is that we try to adjust it automatically over time. We'll go up to the maximum of the setting that you have set there. And if you don't have a setting set, then we will try to go as high as we can. And this is something that happens over the course of several days, or maybe even weeks or longer, in that over time, we will test the website and see that it actually returns results very quickly.
There's a lot of content that we want to have crawled, and over time, we will increase the crawl rate again. So I could imagine, if you set it to something much lower for a short time, that it just takes a couple of days, maybe a couple of weeks, to get back to the previous maximum that it used to be. Thank you.

John, I have a question about Google Search Console. I see a discrepancy over the last six days. Google Search Console says that I have 64 clicks, and I want to see the queries and I can't see them. And Google Analytics says it only sees 10 clicks. So I don't understand why there is a discrepancy and how I can solve it. Maybe you can give me some advice about it.

OK. In general, Search Console tracks things differently than Analytics would. So I would expect to see some amount of difference between those two. And especially when you're looking at fairly low numbers, like 60 visits a day, that's something where I could imagine that the differences are pretty significant across the tools that you use there. So in general, I would expect the trend to be fairly similar, but the absolute counts would definitely be different. Because in Search Console, we track what people use to go to your website, what they click on. And Analytics tracks things a little bit differently. And depending on how your website is set up with redirects and things like that, it might be that it loses the referrer, and then it doesn't know that this actually came from Search, or came from organic traffic. So that's something where I would always expect some differences.

Yes, but I want to ask: I see 64 clicks, but I can't see the queries. I can only see three or five queries that people clicked on, and then a whole bunch of rows that show me zero. But if it says 64, I don't understand why I can't see exactly the queries that people clicked on to get to my website. Why does this kind of thing happen? OK.
What we do is filter some queries out for privacy reasons, especially queries that are not used very often to access your website, and with 60 clicks, that probably covers a lot of queries, so we would not show those. So that's something where, over time, as more and more people search for your content, you would start to see more and more of the queries as well. OK, thank you.

John, can I ask two quick questions, please, before we end? OK. In the HTML, is it important that the title tag is above the meta description tag? Because I've seen some sites doing that, but news websites always have the title tag on top. So it doesn't make a difference? No, it doesn't.

And second question, maybe you have an answer, I know this is a difficult question. In e-commerce, let's say a vendor which sends a product to a retailer's website comes up with a description of this product. They're not always great, but let's say it's OK. But because it's a large vendor, they send the same product description to 200, 300 websites. And Amazon and others may all have the same description, selling the same product. Now, when a retailer's website is dealing with one or two million products, what is the best thing they can do? Because it's impossible to sit down and write each one, since they add like 5,000 products every day. What can they do to improve the content as much as possible? I know the reviews are part of the main content also, to improve the E-A-T factor. What are some best practices?

I don't know specifically about E-A-T for e-commerce. E-A-T is something that we have in our Quality Rater Guidelines, and it's more focused on websites where the type of information is critical for the user, where they really need to know that they're getting the right information there, so probably less the case for most e-commerce websites.
In general, what I would recommend doing is trying to find ways to improve or expand the content that you have otherwise on these pages. So you will have this one block of text, the product description, which is the same across lots of different websites. But like you mentioned, you have reviews, where people write about this product in their own words, where they write about how it worked or how it didn't work, where they give some additional context for the product. You might have people within your organization who can give more information about a general class of product, for example, where maybe you have comparisons between different products, or you have more information on what to watch out for when you're buying running shoes, for example. This kind of informational content, which is unique to your website and can be reused across it, is another approach that you can take there. Sometimes local information helps as well: if you can tell that someone is looking for a local supplier of this product, then making sure to highlight that on your pages makes sense too. I think this particularly plays a role when you're doing multi-country targeting, where, I guess, users in Switzerland who want to buy something will try to find a local supplier, and they specifically try to find a local website. So if you can highlight that on your website, if that's a critical factor, then that's another thing you can do. But all of these things where you essentially provide more context would add value there, in that we understand this piece of text is the same, but actually there are lots of other elements that are unique to your website.
And if someone is searching for some combination of something in that shared piece of text, plus maybe implying that they're looking for something local, maybe asking for more information, maybe asking for reviews, then suddenly your page is not the same as all of the other ones. Rather, it's the unique version that has that chunk of text, but also has a lot of other things. Thank you.

John, a quick question. Sure. The crawl settings page says the settings will take effect two days from now and will be valid for 90 days. Can you explain to me what that means?

So it takes a while for the crawl rate setting to be applied. We usually take the settings, I think, at midnight in Mountain View, or one of those time zones, and we reprocess that. So those two days are the maximum time that it takes for us to pick up the new settings. And the 90 days limit is there so that if you accidentally set a wrong crawl rate setting, then after 90 days, our systems will go back to automatic, and we'll try to figure out how much we can crawl from your website. And that could be that maybe it goes down, because we see that your website is fairly loaded and we can't crawl as much as we want. It could also be that it goes up again, where we see that actually your website is very fast, and you artificially limited the amount of crawling that we can do, to the detriment of your website in general, where you could be much more visible with your newer content if we could pick it up faster.

OK, OK. So within that 90 days, I can still change the crawl settings, right? And it will start from there. OK, that works. Thank you.

All right, let me take a break here and pause the recording. You're welcome to stay on a little bit longer if you want. We can chat kind of off the record, I guess. Thank you all for joining in. Thank you to all of the people who joined in over the course of the year.
It's been fantastic again and been really helpful for me as well to kind of understand where the problems are, what we should be focusing on more. So I'm looking forward to more of these hangouts next year. I'm sure we'll find different variations over the course of the year, but looking forward to doing more. All right, and with that, let me pause here.