All right. Welcome, everyone, to today's Webmaster Central Office Hours Hangout. My name is John Mueller. I am a Webmaster Trends Analyst here at Google in Switzerland, and part of what we do is these Office Hours Hangouts with webmasters and publishers like you all here. As always, I'd like to give those of you who are new to these Hangouts a chance to ask the first questions. It looks like we have a bunch of new faces, so feel free to jump in and ask the first couple of questions.

Hello, John. Can you hear me? Yes. My name is Tom Goreng, and Baby Cyberspace is my site. I started it back in February of 2004, and it was a prototype that we worked on with Navy Recruiting Command to find out how to recruit within internet gaming. During that time, we put up parts of the site that had to do with Counter-Strike and some of the games that were pretty prevalent at the time. I still have custom maps and those types of things on the site that are way outdated, and I have probably 100 or so pages that I just want to delete because they're no longer germane to what my site does. Should I delete those piecemeal or do them all at one time?

Whatever you want. From our side, it doesn't really matter which way you do that. One thing that might be worthwhile is to think about whether there are any pages on your site that replace those pages, so that you could set up a redirect. Otherwise, just removing those pages and having a 404 page is perfectly fine. Well, thank you.

I might have to jump in. Can you hear me OK? Go for it. My name is John. We have an inbound marketing agency, and we have a client that we're working for. We did not develop their site; we came in much afterwards, creating content and all of that. About three or four weeks ago, they received a manual penalty. The response looks like it's automated, and it gives you four options, and we had no idea which of those four was the problem. We thought perhaps it was duplicate content, because their site was meshed with another one: there was data feeding to a company in Canada that they also owned, so they were sharing content between the two. So we said, OK, that's probably it, and we pried those two apart and separated them. Then it came back and said that our appeal was rejected and that it was for doorway pages. So we went back to the site and looked at it, and they had a whole bunch of city pages that they had set up. We weren't even aware of it, actually, but we were like, holy cow, OK, we've got to get rid of those, that's probably it. And we asked, why did you do that? And they said, well, those are our best pages for getting people to the site, because people want to know if we service their area; it's exactly what all our competition is doing, and so on. So they did this without malicious intent, but we think that's what it was. We're now in review again, waiting however long it takes, and it's kind of nail-biting, because this is a company that depends heavily on the internet to get business. So I'm just curious what your thoughts are. Is there anything else we could possibly be missing?

That sounds about like doorway pages. Especially if you have the thinner city pages, where you're basically taking boilerplate text and just swapping out the city name, and it's essentially the same page over and over again, then that would probably be seen as a doorway page by the web spam team. And that might be something that they would take action on.
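Coming back to the earlier question about deleting outdated pages: the advice above boils down to redirecting a removed URL where a genuine replacement exists and serving a plain 404 otherwise. A minimal sketch of that pattern, assuming a small Flask app and hypothetical paths; in practice this logic would usually live in your server or CMS redirect configuration:

```python
from flask import Flask, redirect, abort

app = Flask(__name__)

# Hypothetical mapping of outdated pages to their closest replacements.
# Pages with no sensible replacement are simply left out and will 404.
REPLACEMENTS = {
    "/maps/old-counter-strike-map.html": "/about/site-history.html",
}

# A real site would register this handler only for the retired section.
@app.route("/<path:old_path>")
def removed_page(old_path):
    target = REPLACEMENTS.get("/" + old_path)
    if target:
        return redirect(target, code=301)  # permanent redirect to the replacement
    abort(404)  # no replacement: a plain 404 is perfectly fine here

if __name__ == "__main__":
    app.run()
```

Whether the old pages are removed piecemeal or all at once doesn't change anything in this setup.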
So if you've cleaned those city pages up, then that sounds like the right step there. OK, do they ever provide any sort of insight when they get back to you? Do they ever come back and say, no, it wasn't that, it was this? Sometimes, yeah. There are two ways they sometimes give a bit more information, depending on the situation. On the one hand, sometimes they'll give sample URLs, so that helps narrow things down a little bit. And sometimes they can give a bit more free-text information as well regarding an issue, to explain what is happening here. Usually that's something they turn to when it ends up being this kind of back and forth, where they can tell that you're not a malicious spammer and you're not trying to manipulate things in a crazy way, but you've kind of run into this trap — maybe someone else set it up a couple of years ago, maybe you listened to that advice, all of that. So those are the situations where they will try to get some more information back to you. OK, great. Thank you. So I need to wait, basically. Yeah, I mean, it sounds like you already submitted it, so it's probably not too far off. OK, thank you.

Can I jump in here? Sure. Can you hear me? OK, so I'm Allie. I work for caseinate.com, and we just did a site replatforming, and it did not go so smoothly. We had problems getting indexed for a little while. Now we're back to being indexed, but our organic rankings have just plummeted. Another problem we're having is that when we replatformed, we went from Shopify to Magento, and Magento automatically creates these segmented URLs based on price and color and things like that, which we didn't set up. So we changed our robots.txt to block all those extra pages, because they were coming in as duplicate content — they're just subpages of one page. That being said, now Webmaster Tools is saying that our robots.txt is blocking an important page, but it's not an important page; it's just one of those subpages. So it is being blocked, but on purpose. Is that hurting us? Is that part of the reason why our organic traffic is taking so long to come back, or is that a totally separate thing?

Yeah, so I guess there are two aspects there. On the one hand, it sounds like you changed the URL structure of the website completely? Yes. Did you set up redirects? We did. The problem is we tried to set them up for all of them, but our site's been around for a very long time, so a lot of the old ones are for the iPhone 4 and the Samsung Galaxy — very, very old phones. We got all the newer stuff done, and I'm still working on the old stuff. But for example, before this we were ranking number four for iPhone 7 Plus cases, and now we're at 70-something, and it's been like three weeks.

Yeah, I mean, you'll always see some fluctuations when you do a redesign of a website, and especially when you move from one CMS to another, because essentially the whole layout of the pages changes, the URL structure of the website changes, the internal linking of the website changes, and all of that takes time to be reprocessed. So that's definitely one thing where you should see some fluctuations. It sounds a bit more extreme in your case, and it sounds like it's been that way for a while, so that's something where I'd probably dig in a little bit more to figure out exactly what's happening there.
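One concrete way to dig into a replatforming like this is to spot-check that the old URLs really do return a single permanent redirect to their new equivalents. A minimal sketch, assuming Python with the requests library and a hypothetical mapping of old Shopify-era URLs to their new Magento counterparts:

```python
import requests

# Hypothetical old-to-new URL pairs; in practice these would come from the
# old sitemap or from analytics data for the most important landing pages.
EXPECTED = {
    "https://www.example.com/products/iphone-7-plus-case":
        "https://www.example.com/iphone-7-plus-cases.html",
}

for old_url, new_url in EXPECTED.items():
    resp = requests.get(old_url, allow_redirects=True, timeout=10)
    hops = len(resp.history)  # each 3xx response along the way is one hop
    first_status = resp.history[0].status_code if resp.history else resp.status_code
    ok = resp.url.rstrip("/") == new_url.rstrip("/") and first_status == 301
    print(f"{old_url} -> {resp.url} "
          f"({hops} hop(s), first status {first_status}): {'OK' if ok else 'CHECK'}")
```

Old URLs that come back as 404s, or that bounce through several hops before landing, are good candidates to fix first.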
With regards to the faceted pages, with the price or other attributes in the URL, those are the kind of pages where we'd recommend allowing crawling and using rel canonical to combine them on your preferred page. So that's what I would aim for there, to guide Googlebot to the pages that you actually want to have crawled and indexed. Because by blocking them, you're preventing us from crawling them, but we'll still find links to them, and we'll essentially try to index a robotted page where we don't know what the content is, and we can't really rank that well without the content. So if we could crawl them and see the rel canonical to your preferred page, I imagine you would see a change there. It looks like Mihai also recommended another tool to look at for Magento. I don't know anything specific about Magento, so I can't really say how that will work. Do you want to say anything about that, Mihai?

Well, it's basically an extension. We've worked with multiple Magento stores, and it's an extension we found that achieves the goal you just explained: using a canonical tag for the filtered categories, because it's usually the categories that are the problem whenever you apply price filters or other facets. The canonical tag points to the non-filtered version of that category, which achieves both goals: forwarding any ranking signals to the non-filtered category, and avoiding indexing and reducing the crawl frequency of those pages. And what was that tool? What was that extension? I've mentioned it in the chat. It's MageWorks SEO Suite. OK. Sorry, it's my first time doing this — I don't know how to find that. OK.

John, can I just ask a question on that? Sure. You know, when you talked about different CMSs and the site having to be re-evaluated, are different crawling rules built according to, if you like, popular CMS platforms? So for instance, if Magento has certain site structures that are common, with parameters and filters and so forth, and you then move to a new platform — say Shopify or vice versa, or PrestaShop or whichever — does that have to be reprocessed, so that the patterns are built and Google understands that this is a different platform? It's almost like rules to understand which are the important types of URLs there.

Yeah, that's something we try to do on a per-site basis. So as far as I know, we wouldn't try to figure that out based on a CMS. We wouldn't try to say, oh, this is Magento, therefore this will be the common pattern, because you can set up all kinds of crazy server-side rewrites and then the URLs look different. It's something where we try to look at the site and see what we find with normal crawling — which patterns make sense for us to focus on, which ones can we ignore, those kinds of things. And especially for smaller sites, we'll probably not even think about patterns at all; we'll just try to crawl everything. So URL parameters are probably one of the best places to try and handle that a little bit, maybe? Yeah, I mean, depending on the type of site and the way you have it set up, URL parameters are a great way to make it easier for search engines to crawl your site.
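To illustrate the approach described here — let the filtered category URLs be crawled, but point their rel canonical at the unfiltered category — one way to compute the canonical target is simply to strip the filter parameters from the URL. A minimal sketch, assuming hypothetical parameter names like price, color, and sort:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical parameters that only filter or reorder a category page.
FILTER_PARAMS = {"price", "color", "size", "sort", "order"}

def canonical_url(url):
    """Return the unfiltered category URL to use in the rel=canonical tag."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FILTER_PARAMS]
    # Pagination parameters are deliberately kept here, since canonicalizing
    # paginated pages away is not ideal (see the Magento note below).
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

url = "https://www.example.com/phone-cases?color=blue&price=10-20&page=2"
print(f'<link rel="canonical" href="{canonical_url(url)}"/>')
# -> <link rel="canonical" href="https://www.example.com/phone-cases?page=2"/>
```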
If it's a complicated site structure and all of these parameters are in the path of the URL somewhere, then that's a lot harder for us to figure out than if they're individual URL parameters, where we can say, oh, we can ignore this parameter here because it just changes the order of the items on the page — it doesn't change the content of the page. OK, thanks.

Just a little addition regarding Magento: Magento actually has a canonical feature out of the box. It's just that it's applied to pagination URLs by default, which is not ideal, because Google will kind of stop crawling those pagination URLs and won't really find new products anymore. OK, awesome. So do you think that might have to do with the rankings as well — why it's taking so long to come back, because of all this mess? Or is that a whole separate thing? I imagine that probably plays into it. Especially if we have trouble crawling the site, then it's really hard for us to reprocess everything and understand how it should all fit together again. What I would do is take a look at those plugins and those settings to see if there are easy changes you can make, and otherwise maybe go to one of the Webmaster forums and explain what you're seeing, with specific examples, so that other people can take a quick look there too. OK, great. Thank you. Sure.

All right, let me run through some of the questions that were submitted. As always, feel free to add comments or more questions along the way. We work on a huge website, justanswer.com, and since late April, early May, we've experienced a big drop in Google's index. This happened across all of our international sites. We weren't changing anything significant in the last couple of months. What could that be? Or is this just a regular index cleanup that Google does every spring?

So from our point of view, it's not that we do spring cleaning in our index. We try to recrawl and reindex URLs as normal. I took a quick look at the website to see what type of site it is, and it looks like it's heavily based on user-generated content. So this seems like one of those sites where, from our point of view, our algorithms might go and say, well, there's lots of low-quality user-generated content here, we have to be a bit more careful. Or it might be that you're working really hard on cleaning things up and have really high-quality user-generated content. And depending on how things evolve over time, it's also conceivable that our algorithms sometimes switch from thinking it's higher quality to thinking it's lower quality. So my recommendation would be not to take this as something that just randomly happened, but rather to think about what you could be doing on your side to make sure that the quality of the content that you're providing for indexing is as high as possible.

We haven't heard about Panda in a while. Is it still being handled in the same way? Is there any direct connection between Panda and other quality algorithms? So essentially, Panda is still running the way that it normally runs. It's something that's automated now, in that we don't really have to worry about individual refreshes anymore, which makes things a little bit easier. There's nothing like the jumps that would happen with Panda in the past.
So that's something where we don't have any big news to share, because it's essentially working the way that we think it should be working. With regards to connections to other quality algorithms, in general we try to keep these algorithms as separate as possible, so that each of these quality algorithms has its own data set to work on. One thing that does happen, though, is that sometimes quality information from one algorithm is used in other algorithms when it comes to understanding how we should be treating this website. So for example, if we think this website is generally lower quality, then that could flow into other things, like maybe crawling and indexing speed, something like that.

So I have a question. Hello. Sure. Yeah. How can I target my website to multiple geolocations? I mean, I tried using hreflang, but it's not working. Hreflang should be working — that's essentially the right tool there, especially if you have different language content. If you're just targeting individual countries, then I would use the normal geo-targeting tool, which is built into Search Console, so you can say this part of my website, or this domain, should be targeted to this specific country. hreflang is a little bit different in that it'll swap out URLs. So geo-targeting will change the ranking locally for people who are searching locally for local content, and hreflang will just swap out the URLs — it doesn't change the ranking itself. And if you see that the wrong URLs, or the wrong language version, are being shown in the search results for your content, then I suspect the hreflang markup might not be set up properly, in that you might not be linking directly between the canonical URLs. You really need to link between the URLs that we actually have indexed. And do I need to make the changes in the XML sitemap as well after I include the hreflang? No, not necessarily. You can put the hreflang in the sitemap, or you can put it on the pages themselves. So wherever you have the hreflang markup, that's where you should be making those changes. Thanks. Sure.

All right. Can external links from my page to another domain in any way harm the rankings or quality of my pages, besides bleeding PageRank? All of these links will be relevant and useful for the user that might land on this page. So since it's asking "in any way," that's something where I would watch out for things like unnatural links. If you have, on the one hand, maybe paid links on your site, if you have advertising on your site, then those are the types of links that should have a nofollow attached; otherwise, we might lose trust in the links on that site. If you have user-generated content on that site, with links from random people on the internet leaving them there, I'd also make sure to have a nofollow attached to those links, so that you can be sure the normal links on your site are really relevant, they're natural, and they're there because you placed them there as a webmaster. So that's what I would aim for there.

What specific advice do you have for webmasters whose websites lost rankings at the end of last week? So this is kind of a tricky question, in the sense of: what specifically could you offer webmasters who have one of the millions of websites that changed rankings in the last couple of weeks? Obviously, there are lots of different things that affect different websites.
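For the hreflang question above, one of the two supported placements is the XML sitemap, where every URL lists itself and all of its alternates, and each href has to be the canonical, indexable URL for that version. A minimal sketch that prints such a sitemap, assuming hypothetical URLs for a UK English and a German version plus an x-default:

```python
# Hypothetical canonical URLs for each language/country version.
ALTERNATES = {
    "en-gb":     "https://www.example.com/uk/",
    "de":        "https://www.example.com/de/",
    "x-default": "https://www.example.com/",
}

def sitemap_entry(own_url):
    # Every version repeats the full set of alternates, including itself.
    links = "\n".join(
        f'    <xhtml:link rel="alternate" hreflang="{lang}" href="{href}"/>'
        for lang, href in ALTERNATES.items()
    )
    return f"  <url>\n    <loc>{own_url}</loc>\n{links}\n  </url>"

print('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"')
print('        xmlns:xhtml="http://www.w3.org/1999/xhtml">')
for url in ("https://www.example.com/uk/", "https://www.example.com/de/"):
    print(sitemap_entry(url))
print("</urlset>")
```

If the hreflang annotations live on the pages themselves instead, the sitemap doesn't need to repeat them.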
There's not one specific thing where I can say that all websites that saw a change in ranking last week, or in the last couple of weeks, had this specific problem. In general, rankings do fluctuate. That's something where I wouldn't assume that rankings will always stay the same if you don't change anything on your website. On the one hand, our algorithms change all the time; on the other hand, the rest of the web evolves as well. So these are things that do change over time. There's nothing really specific I'd be able to point out there.

Some websites linking to me have a lot of backlinks, but don't rank for any keywords — totally junk sites from a backlink-profile standpoint. From what I can tell, they have been penalized or suppressed so that they don't show up in search results. Do these sites still pass a vote to me? I want to disavow them, but I'm not sure if they're helping me rank. So I guess, first of all, if these are totally useless sites, then they're probably not helping your website rank. From that point of view, that's something you can probably ignore. If you're worried about these sites linking to your website, then I would just go ahead and disavow them. If these are just normal spam sites — they're out there in the millions anyway — then I wouldn't worry so much about it. But as soon as you're worried about those links and you just want to make sure that they're not causing any problems, I would go ahead and just disavow them.

A negative SEO attack: I have a client who is a nonprofit charitable organization, and they help to get people out of debt. They're the largest organization of their kind and have a high success rate. I believe they're under a negative SEO attack, and we need someone at Google to manually take a look. So what I would recommend doing there is maybe posting in the Webmaster Help Forum with the details, so that other people can take a quick look and pass it on to us if that's needed. In general, just because it's a good organization, or a large organization, and they're successful doesn't necessarily mean that their website will automatically rank well. That's something to keep in mind. I think it's really important for a website in general on the web to offer something that's compelling and useful for users. But just because they're large and they've been successful doesn't necessarily mean they'll rank well, too. So it's not the case that I would say they would automatically be ranking high. From that point of view, I'd be skeptical about saying that it's automatically a negative SEO attack just because their rankings have changed, even if you do find a bunch of crazy links pointing to that website. But it's always useful to have examples, so I'd really recommend posting in the Help Forum or sending that to me directly so I can take a quick look.

On our website, which has had steady daily organic traffic for six months or so, let's say about 40% of the pages are displaying rich results. Can a temporary peak of organic traffic have an impact on the way the pages are displayed in the search results? I'm trying to figure out if there's a correlation between the organic traffic to my website on one specific day and a change in rich results on the same day. In general, that wouldn't be related — at least I can't think of any situation where that would be related. And usually any kind of correlation between different factors takes a bit of time to be propagated.
So even if there were something tied in there, it wouldn't be something where you would see a change on the same day. In general, a reduction in the number of rich snippets — rich results — that we show for a website, if everything on the website stays the same, if they're using valid markup, if they're complying with our policies, would point at our algorithms looking at that website and thinking maybe this isn't as high-quality a website as we thought. And that's the kind of thing that's totally unrelated to a spike in organic traffic. So what I'd recommend doing there is taking a step back and thinking about what you can do to increase the quality of the website overall.

The issue of omitted search results has been bothering us for over a year. OK, I think I looked at this in the past. So this is essentially: search for a specific address in the country, and some sites we don't show there. We have a property profile page for every address in Australia, and every time the property is listed, we create another listing, similar to what our competitor does. So in general, what happens with the filtered results in the search results is that we filter them out when we think that this is essentially a duplicate of something we've already shown in the search results. So if you have pages that are essentially very similar to other pages, or that don't have significant unique content, or don't have anything that we would be able to highlight as a difference in the search results, then it would be normal for us to filter those out. And that's not a matter of a site being penalized or otherwise being treated badly. From our point of view, it makes sense to show users unique content in the search results, and if we see things that are essentially just a duplication of what we've already shown the user in the search results, then that's something that we would filter out. So my recommendation there would be to rethink the strategy of creating pages for every single address in the whole country, and to think about what you really have content for — where is the value of your content, and what do you want to rank for there? So that's my recommendation: take a step back and don't just automatically generate one or two pages for every single address in the country, but rather try to find ways to actually create something of value that you can maintain on your website over a longer period of time.

John, you talked earlier about doorway pages and how pages that target different cities can be seen as doorway pages. I was thinking about, for example, Rob's site with the experience gifts that he has to offer in various states and cities. If he had a filter or something that would create a separate URL listing the services in that city or state, could that be seen as a doorway page? Or is that not exactly what Google understands by this — doorway pages would be more if you have one general service? Oh, speak of the devil. You must have heard me. Excuse me. Yeah, so doorway pages would be more if you have one service that you're providing across the whole country, and you're just creating pages for each individual city listing essentially the same content.
So if you have one online store, or one service that you provide across the whole country, and you just create pages for every single location even though you don't have anything unique to offer there, then that would be a doorway page. But if you have services that are specific to individual locations, where you can really say, in this area I have these services, I have this address, I have this specific thing to offer, then that would be a normal page. Right. What I would just watch out for is to avoid splitting that up too much, because you're diluting the value of your pages there. So if you create separate pages for each individual part of a city and say, oh, well, in this part of London you can also drive 20 miles to see this, then that's probably not that useful, and it will probably result in each of those pages having so little value overall that they wouldn't rank on their own. So in cases like this, it's basically like a store that sells t-shirts having pages for each of the sizes, something like that? I mean, something that is useful for users searching for that specific size of t-shirt — just as it would be for a sub-selection of your entire services, particular to that filter or whatever it is. OK.

Can I ask an unrelated question — well, more questions? Sure. So back in January, February, you guys rolled out the thing about mobile pop-ups, or just pop-ups in general. Have you seen sites being penalized by that? And what's another option instead of a pop-up for subscriptions or coupons or whatever have you? Yeah, we did roll it out. It's specific to mobile, so it's something you wouldn't see in desktop search. For some sites it does have a fairly strong effect; for other sites, especially if people are primarily searching for that site's name, you wouldn't really see a strong effect there. So if, let's say, Google News had this issue, and people were searching for Google News, then obviously we'd still show Google News. But if they were searching for some article that's somewhere on Google News, and Google News had this issue with the pop-ups, then we'd probably try to show some other variation, because we think we'd have something better fitting for the user in that case. So that has rolled out. What I would recommend doing is thinking about what you can do with banners instead of pop-ups. So instead of blocking the whole page, maybe just use part of the page and say, hey, we have this special offer, or hey, we also have an app, or whatever you want to do there, to give users the option of going there without blocking access to the normal content. Awesome, thank you. Sure.

All right. In order to have the SiteLinks search box displayed in Google search results, do we need to allow Google to crawl and/or index the website's internal search results pages? No, you don't need to do that. The tricky part with the SiteLinks search box markup specifically is that we only use this markup when we would show a SiteLinks search box anyway. So if you notice that we're showing this kind of search box within the snippet when people are searching for your site's name, for example, then using that markup lets you swap that out for whatever search results page you want to provide.
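The markup being discussed here is the schema.org WebSite/SearchAction structured data placed on the homepage. A minimal sketch of generating it, assuming a hypothetical internal search URL where {search_term_string} is the placeholder that gets replaced with the user's query:

```python
import json

markup = {
    "@context": "https://schema.org",
    "@type": "WebSite",
    "url": "https://www.example.com/",
    "potentialAction": {
        "@type": "SearchAction",
        # Hypothetical internal search results URL.
        "target": "https://www.example.com/search?q={search_term_string}",
        "query-input": "required name=search_term_string",
    },
}

print('<script type="application/ld+json">')
print(json.dumps(markup, indent=2))
print("</script>")
```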
So that's something that, from our point of view, doesn't need to be crawled and indexed, but it does need to be a valid URL. And in order for us to show that, we do first need to come to the conclusion that showing this kind of search box in the snippet for your site would make sense. One other tricky aspect with the SiteLinks search box is that, at the moment, we don't differentiate between mobile and desktop URLs there. So if you have separate URLs for your mobile and your desktop site, then I would think about setting up an automatic redirect on that search results page so that people get to the version that matches the device they're using — or maybe even switch to responsive or dynamic serving if you can do that. Can I follow up on that one and just ask if that page needs to have a canonical tag that is self-canonicalizing, or if canonicals affect that in any way? That doesn't matter; that's totally up to you. So if these are pages that you mean to get indexed separately, then using a canonical tag is a good way to do that. I'd be careful with getting search results pages indexed, because that quickly balloons, and suddenly you have millions of URLs indexed that don't provide that much value. So I'd think about maybe getting the main search results page indexed — like, this is where you can start searching within the site — but not the individual searches themselves. Perfect, thanks.

How many external links can we use on a single page with the nofollow tag? As many as you want. So if these are nofollowed links on a page, put as many as you want on there. Obviously, for usability reasons, at some point you'll probably want to make a cut and say it doesn't make sense to add even more links to this page, because nobody's going to be able to find anything specific on there. But there is no limit from our point of view. This is something that commonly comes up, for example, if you have a really popular blog and lots of people leave comments, and those comments have links to their websites, and you have nofollowed all of those links. Sometimes a page like that has a lot of links on it, but these aren't necessarily links that you need to care about.

Many blogs say that redirect chains are bad for ranking — that authority is reduced by 10% for every redirect — but I've not been able to find any official word from Google. For example, if we redirect from example.com to www.example.com, and then to https://www.example.com, would that impact the rankings? No, it's not a problem to have those kinds of redirect chains there. In general, we do recommend reducing the number of redirects that you chain, just for usability reasons. Every redirect that you do, especially if it goes to a different host name — so from example.com to www.example.com — has a cost associated with it on the client side. So keeping this chain as short as you can is what we'd recommend. If you can set it up, I would just redirect to the final URL as much as possible. But there's no issue with regards to PageRank being passed, or signals being passed, when it comes to redirect chains, provided we can crawl through them. One thing that we have mentioned in the past is that we'll follow up to five hops automatically, right away. If we need more than five hops, then we have to pick that up in the next crawl cycle, which might be a day later.
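A quick way to see how many hops a given URL actually goes through is to follow the redirects one at a time and count them. A minimal sketch, assuming Python with the requests library and a hypothetical starting URL; the five-hop threshold comes from the explanation above:

```python
import requests
from urllib.parse import urljoin

def redirect_chain(url, max_hops=10):
    """Follow redirects one hop at a time and return the full chain of URLs."""
    chain = [url]
    for _ in range(max_hops):
        resp = requests.head(chain[-1], allow_redirects=False, timeout=10)
        location = resp.headers.get("Location")
        if resp.status_code in (301, 302, 303, 307, 308) and location:
            chain.append(urljoin(chain[-1], location))  # resolve relative Locations
        else:
            break
    return chain

chain = redirect_chain("http://example.com/")  # hypothetical starting URL
print(" -> ".join(chain))
if len(chain) - 1 > 5:
    print("More than five hops: worth collapsing into a single redirect.")
```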
But in general, if you're seeing more than five hops in a redirect chain for URLs that you think are important on your site, that's something I'd fix anyway. It's just way too slow — especially if people are on mobile devices, all of these redirects take an enormous amount of time. So really, if you can avoid that, I'd try to avoid it as much as possible.

Let's see. Can you help me understand Search Console? There's a question about why a specific URL shows so many impressions and what it ranks for with these specific queries. I think there was some help already given in the comments there, and if you're unsure about the details, this is really the kind of thing I'd recommend posting about in the Webmaster Help Forum, because people there can really guide you through the details.

Is it possible to use the hReview microformat for customer satisfaction reviews acquired through Google Surveys, to pull star ratings through on my website? So I'm not sure how Google Surveys pulls in customer satisfaction reviews, so that's one thing to watch out for. In general, if you use review markup on your pages, it should be specific to the item that's being discussed on that specific page. It shouldn't be broadly for your business or for your website; it should be specific to one specific item. So if you're selling, I don't know, blue shoes, then the reviews that you include on your site should be specific to the kind of blue shoe that you have on that specific page, not to your website in general. So that's one thing to watch out for. The other thing is that it should be obvious to any user coming to your website how to leave a review. So that's another thing where, if you have too complicated a setup, where you're pulling reviews from an aggregator that's pulling reviews from somewhere else, it's really hard for users to follow up and see, oh, where is this review actually coming from? Can I look at it? Can I leave a review myself? And that would be something that, from a policy point of view, the web spam team might say is not set up properly.

I tried the Data Highlighter for this page, and I see a completely different page between the fetched page and the live page in the tool; I am unable to figure out why. So I don't know about that specific URL, but in general, the Data Highlighter is built based on the indexed content. So if you have pages that change fairly regularly, then when you try to mark up one of these pages with the Data Highlighter, it'll use the indexed version of that page. In particular, if this is a lower-level page on your website that we don't crawl and index that often, it might be that this page is a couple of weeks or even a couple of months old when you mark it up. From our point of view, that's less of a problem, because what we're trying to learn with the Data Highlighter is what kind of items you're marking up, not the specific values that you're currently marking up on that individual page. So that's something where we're using machine learning to understand that, for this type of page with this type of content on it, you're saying this is the date, this is the location, this is the description, this is the name, this is the price, those kinds of things. And that's something we can learn even from older pages. So from our point of view, just because the content that's displayed there is older isn't necessarily a bad sign.
I would watch out for situations where you've done a complete redesign of your website and then used the Data Highlighter — there you obviously want to make sure that you're marking up pages that reflect the current design of those pages. That said, if you're doing a complete redesign of your website, then maybe it makes sense to put the markup directly on the page anyway, so that you don't have to worry about the Data Highlighter picking up the right things on your pages.

Let me see. We observe the Fetch tool is not working. Is there any update to the Search Console tool? As far as I know, there is nothing specifically broken, stuck, or changed with the Fetch as Google tool. There are some limitations on how often you can use that tool for your website, so that might be something that you're running into. It might also be that we're just not able to crawl much more from your website, because we think your website is limited from a capacity point of view. But in general, as far as I know, there is nothing stuck with the Fetch as Google tool.

How come a homepage that is indexed in Google and ranking for numerous keywords is no longer ranking for the first keyword it used to rank for on page one of the search results? Another, deeper page is ranking instead, but far below where the homepage first ranked. These kinds of things can change over time. It's not the case that because a page is ranking number one now, it'll always be ranking number one; in the future, that can change. Fluctuations are normal in Search. What I would recommend doing is thinking about finding ways to make sure that people come back to your site on their own as well. So instead of just relying on traffic from search, think about what you can do to take that traffic from search and turn those visitors into recurring visitors, so that you don't have to rely on your page always ranking in a specific location in the search results. Obviously, that's not always that easy — that's what everyone wants to do, right?

We're an international, multi-language e-commerce startup. For some of our products, thumbnails show up in mobile search, while for others they don't. How could we optimize our pages so that more products will be displayed with this thumbnail? I don't know how you're seeing these URLs being shown with a thumbnail, so that's something where I'd probably need a bit more detail — maybe screenshots, maybe example URLs. But this is probably also something where other people can help out as well, so I'd take this kind of question to the Webmaster Help Forum and see what you can do there with feedback from other people. And if all else fails, feel free to send me the link to your thread, and I can double-check as well.

With mobile-first, should we start including mobile pages in sitemaps? No, you don't really need to do that. When it comes to mobile-first, our primary goal is to try to keep everything the same as much as possible, with the exception that, of course, we're indexing the content of the mobile pages. So the sitemap continues to list your normal canonical pages — your desktop pages — the rel canonical stays the same, all of that stays the same. Obviously, if you have a responsive setup where you keep the same URLs for the different kinds of content, or use dynamic serving for mobile content, then that makes all of this pretty much a non-issue. You don't really have to worry about it.
So if you're on the fence with regards to moving to a responsive layout, maybe that's something worth doing.

We have a very large site with a number of products going offline every month with a 404. If there were an increase in this amount — say, 1 million 404s in a month — how would that affect us? And what would your best advice be for handling this? From my point of view, that's fine. Some sites do fluctuate quite a bit. If you have a lot of 404s, that wouldn't be a sign, from our point of view, that the site is lower quality. Obviously, it makes crawling a little bit trickier, because we can't recrawl everything that quickly. One thing you can do is maybe use the unavailable_after meta tag to let us know ahead of time which URLs will go 404, so that we can drop them from the index without having to recrawl them that often. But in general, yeah.

I have a follow-up question to the 404 issue. There's a remove-URL option in Search Console. If I remove pages, should I follow that up with a remove-URL request? The reason I ask is that each of the pages that I'm going to be deleting links to the others within the same directory or folder, and the pages get crawled at a slow rate, which they probably do because they're not that important anyway. Should I be deleting them all at once and then removing the URLs at the same time? And is there a bulk way of doing that through Search Console, or do I have to request URL removal individually?

Yeah, so the URL removal tool there is specific to the search results. It doesn't change the indexing of those URLs; it basically just hides them in the search results. So that's something you can use for situations like that, where you have a handful of URLs that you want to have removed. If you need to remove them quickly, that's an option. In general, however, you can also just let us recrawl those, and we will drop them when we see a 404 — that works naturally as well. There's no bulk option in Search Console for URL removals, unless these are all in a specific folder, where you can say everything in this folder should be removed from the search results as quickly as possible; then we can do that. But if it's individual URLs here, there, and there, you'd have to submit those separately. In general, we recommend not using that tool for normal maintenance. So if you're just removing individual pages, and you do this every now and then, I wouldn't use that tool. I'd primarily use it if you have something really critical on your site that you want to have hidden from search immediately — maybe someone's full address or name, or something that you accidentally put on your site and want to have removed as quickly as possible. That's a good use for this tool.

Well, I guess my only question was: if they're all interlinking with each other, and Google de-indexes, or gets rid of, or sees 404s for half the pages, the other half of the pages are still linking to those pages that no longer exist. Would those broken links signal any kind of quality problem with my site? No, that would be perfectly fine. I mean, from a usability point of view, it's not great to link to a 404 page, but it's kind of natural. It just happens. The web isn't perfect. Most websites aren't perfect. We have to deal with that. Maybe it's the OCD in me that wants it to be perfect. Thank you. Yeah.
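None of this is required, as the answer above makes clear, but for anyone who does want to hunt down internal links that now point at deleted pages, a small crawler-style check is enough. A minimal sketch, assuming Python with the requests library and a hypothetical page URL:

```python
import requests
from urllib.parse import urljoin, urlsplit
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def broken_internal_links(page_url):
    """Return same-host links on page_url that respond with a 404."""
    collector = LinkCollector()
    collector.feed(requests.get(page_url, timeout=10).text)
    host = urlsplit(page_url).netloc
    broken = []
    for href in collector.hrefs:
        target = urljoin(page_url, href)
        if urlsplit(target).netloc != host:
            continue  # only check internal links
        if requests.head(target, allow_redirects=True, timeout=10).status_code == 404:
            broken.append(target)
    return broken

print(broken_internal_links("https://www.example.com/old-games/"))  # hypothetical page
```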
I mean, it's always good to have high goals, but it's not the case that if we run across a bunch of links on your site that go to 404s, we say, oh, this is a bad site. We have tons of 404s on our pages as well. I recently went through some of our Help Center, and we linked to a bunch of really old pages that haven't existed for a while, and nobody had noticed. So it happens to everyone. As a matter of fact, I think the page that links to the Webmaster Hangouts that you're doing now actually has a broken link on it. Hey. Yeah. OK. Thank you. I hope not. These kinds of things happen. It's not that we would take any kind of technical issue like this and say, oh, it's a sign that this website is lower quality.

All right, we have a couple of minutes left. Do you all have any questions left over? What can I help with? Can I? Sure. So my question is — one second — when do you need to declare the language of a site in Webmaster Tools? It makes sense to declare it when you have a website with local content, or when you have a physical business in that specific country. So you probably mean the location, not the language? Location, yeah. Because language is something you don't need to declare in Search Console at all. Location is something where you can set the geo-targeting, if you want to do that. What happens is, when we recognize that users in that country are searching for something where we think they're looking for something local, we'll show your site a little bit higher in those search results. So that can make a lot of sense if you do have local content for users like that. If your site is primarily generic, reference-material-type content, where people aren't searching for something local specifically, then that geo-targeting setting probably doesn't have a big effect. But if they're looking for something local and you have something local, then I would definitely use it. OK, and even if you have video content with descriptions in a language like Romanian or Czech or whatever, you think it is better to use it? I mean, you don't have any disadvantage from using it, so that's something where I'd at least give that extra information if you have it. When it comes to different language content, we can usually map that anyway. So if someone is searching in French and you have a website in French, then we will recognize that they probably want the French version, not the Spanish version, for example. So the language itself also helps us to filter automatically. But if you have country-specific content, then that definitely helps. I think with a pure video site it's kind of hit and miss. But if you really do have country-specific content — maybe you do have something specific to individual countries — then in that case I would definitely just go ahead and do that. OK, thank you very much.

John, if you have something that's country-specific — let's say a rent-a-car service — how does that play out for users who are outside the country but are searching for something specific, like "rent a car Romania," when you offer rent-a-car services in Romania? How does your targeting work in that situation? We use it based on the user's location. So if you were targeting users in the UK who are searching for a rental car in Romania, you would have to geotarget the UK, which probably doesn't make a lot of sense here.
So this is something where, if we can recognize that the user is searching for content in a different country, then I think we would pretty much ignore the geotargeting there. Because if you're searching for "rent a car Romania," it doesn't matter where you're located — you want a site about renting a car in Romania; you don't want a random rent-a-car site that happens to be geotargeted to the UK. OK, so relevant content plays more of a role there than geotargeting? Because I know geotargeting plays more of a role if you're looking for something local to your specific area. So if you're searching for pizza and you want local pizza places — pizza is maybe a bad example unless you're in a really small country — but that kind of thing, where you're searching for something where we can guess that you'd prefer something local to your country or your area. OK, but if I have a .com site, which is a generic TLD, and I offer rent-a-car services in Romania, but I also want users from the UK and the US to find me when they're searching for that — it's also for users in Romania, of course — should I geotarget for Romania, or should I just leave it open? I think you probably wouldn't see any big changes there either way. OK, OK, got it.

All right, so with that, we're kind of at the end of our time. Thank you all for joining. I have another Hangout set up on Friday morning, European time, if you're all around. Otherwise, I'll set up the next batch of Hangouts probably early next week. Thank you all for joining, and thanks for all of your questions and comments. It's been a fun Hangout again. All right, see you all. Bye, John. Bye.