OK, welcome, everyone, to today's Webmaster Central Office Hours Hangout. My name is John Mueller. I'm a Webmaster Trends Analyst here at Google in Switzerland, and part of what we do are these Office Hours Hangouts with webmasters, publishers, and SEOs, who are here to ask questions and confirm things with Google directly. All right, as always, if you're relatively new to these Hangouts and want to get started with a question, feel free to jump on in now.

All right, I can go first. So John, my question specifically was around Top Stories. I just put my website here in the chat window. Just a quick introduction: we cover Indian Bollywood news, and we've been in the industry for around 10 years now. We are pretty much number one in the industry. We've been doing really well in search for a while now, for a few years, with roughly 140,000 daily visitors coming from organic Google search, and a lot of it coming from Top Stories. Around September 16th or 17th, we saw a big, plummeting drop: 140,000 users went down to something like 35,000 or 40,000 users. Occasionally we see drops like this, but the next day we pick back up. Now it's been two weeks and there's been no change at all. Our first inclination was that something was technically wrong with the site, so we went and looked at Google Webmaster Tools for console errors and server log errors, we went through the AMP validation and structured data testing tools, and we just couldn't find anything wrong anywhere. We consulted with a lot of SEO people, and we talked to Google's ads and support teams, and finally someone suggested joining this, so maybe we can get some insights. So my question is: how do we figure out what's going on, and what can we do to fix it and get back into Top Stories?

So I think, first of all, one important aspect is that the Top Stories section is algorithmic. It's not tied to any manual setting or meta tag or anything like that on a site; the search algorithms determine algorithmically whether or not to include individual pages from a site in those results. So that's sometimes tricky to figure out. If you're sure that there are no technical issues, and the new pages are especially important for a news-type site, so if the new pages are being indexed as they were before, then I'd assume that this is more a matter of our quality algorithms rethinking the way they were showing your site in the search results. If that's the case, then it's always kind of tricky, because there is no simple meta tag that you can just tweak and say, oh, this is what needs to be different. So what I'd recommend doing there is, first of all, double-checking that the new pages really are being indexed normally.

They are. We actually show up both in Google Search and in the Google News search, and if you go and check, I think our site is getting indexed every 15 minutes, so articles are coming up right away. That's something we've already verified.

OK. So if that's the case, then it's really just a matter of our quality algorithms trying to figure out how we should show your site. Sometimes that's a combination of different things that comes together there, with regard to things like ads on top and all of that, which might be relevant to look at. So I don't know your site at all.
I'm not sure what it looks like, what kind of content you have, what the general quality is there. But that's something that I sometimes see sites struggling with.

OK. Given the fact that we've been around for a while and we've been on top of all Google News searches for many years now, and then a sudden drop like this... With the news that we cover, we see that a lot of other competitors cover the same news after we've covered it, and they show up there. So in terms of quality and things like that, we feel we're right up there. Nothing has really changed on that front; that's something we've verified against a lot of other competitors as well. And things like the Google News guidelines, everything Google News has put up about the number of articles, not too long, not too fragmented, we've been following every single guideline that Google has put up there. So it's just mystifying what's going on.

Yeah. In general, just because a site has been around for a long time and has been doing things consistently for a long time doesn't mean it'll continue to rank the same way. So that's one thing to keep in mind there. It's not the case that if you've always been doing it like this, then you will always appear in the search results like this. The web changes, our algorithms change, all of that evolves over time. So it's sometimes worth thinking about what you're providing and how you're providing it, so that you make sure you're always on top of things rather than chasing them. I suspect a lot of these are things that you've been doing already. But it's still perhaps worth taking something like this and really trying to get objective input from users directly, to see what you could be doing differently. That's what I'd recommend doing there.

Yeah. Along the same lines, we've been measuring our traffic against social and all the other referral traffic, and nothing has dropped anywhere. Even in Google, what we've noticed is that desktop traffic has been consistently good; nothing has changed on that side. It's only the mobile traffic from Top Stories that has dropped, and everything else looks solid. Is there anything else you can suggest we do here? We're just totally lost; it's been two weeks and it's just flat.

So what I'd recommend doing there as a first step is maybe going to the Webmaster Help Forum and getting some objective feedback from peers, just as a rough check to see if you're looking at the right things, or if they also poke around in the dark and can't figure out what it might be. And that's also a place where the top contributors in the help forums can escalate issues to us directly, if they see that something is really totally weird and not behaving the way that it should. But usually, getting some general help there and getting some ways of really pinpointing where it is that you're seeing these changes, that's really useful.

Yeah, I think we did post a couple of messages in the Webmaster forums. I think everyone came back with pretty much the same answers, so we haven't really been finding great insights there either. Do you think the number of ads that we have on the pages might have any effect on this?

That's something we do look at. I think that's one of the algorithmic factors we look at, where we want to make sure that there's actual content above the fold when we open a page. So that's perhaps one thing to look at.
Also, when it comes to mobile, things like interstitials, if you have these notification interstitials or sign-up-for-your-newsletter interstitials, those are things we also look at, all of that. But if you have a link to the forum thread and you could post that in the comments of either the chat or the Google Plus thread, we can take a look and see what was going on in the forum thread.

OK, I can follow up with the couple of threads that we already have. OK, great. Thank you.

All right, who else is new and has a question on their mind?

John.

Yes, go ahead.

Pretty good. OK, actually, I have four questions for you. The first question is: one of our clients has recently released a website. He has two websites, one for New Zealand and another one for Australia, and both have the same content because the products are the same, but the domain extension is different. For Australia we have .com.au, and for New Zealand we have .co.nz. So to avoid the duplicate content issue, we used hreflang tags. Is that enough to avoid a duplicate content issue, or do we need to do anything else?

That's pretty good. I think that's a great start. What I would still recommend doing is making sure that there's some unique content on these pages, so some information that tells us these are pages that we shouldn't be folding together. If they're for different countries, then things like local addresses make a lot of sense, local currencies, all of that helps us to make sure that we understand that some content on these pages is duplicate, and that's fine, but we should keep them separately indexed.

Okay. And then our next question is: some of our clients have e-commerce websites, and they usually remove products from the website and add products to the website. This is a common matter for e-commerce websites. But when they remove products from the website, it creates 404 errors. Sometimes they remove 10 or 12 products at the same time. So what is the best way to deal with this issue? Usually what I do is point all those pages to the 404 page, a custom 404 page, which contains links to other products and other pages. Is that okay? Or should we redirect those pages to the product category page? What is the best option?

I think having a clean 404 page is perfect. So if you have more information there, that's great. What's important is that the 404 page returns a 404 result code, though.

Okay. And another question is: sometimes a product has a small description. So when we use some SEO tools to check the website's health, they identify those product pages as a duplicate content issue, because the supplementary content is the same and only the main content, which is very small, is different. If the tools check the ratio, it looks like the content is almost the same. So will that be a problem from Google's point of view?

That should be fine.

Okay. And one of our clients is planning to add a review section to their product pages, so any client or any user can post a review on a specific product page. Now the question is: will the reviews we get on those pages be considered main content or supplementary content?

I think it could be either. So that's kind of up to you, depending on how you want to structure these pages. I don't see a disadvantage either way; it's just a different way of putting content together. Whether you have it as main content or not is a decision you can make.

Okay. Thank you, John. Those are the questions I had.
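As a quick aside on the hreflang setup discussed above: the annotation itself is just a set of reciprocal alternate links that each country version carries in its head (or the equivalent sitemap entries). Here is a minimal sketch that prints those tags; the domains and path are hypothetical placeholders, not the questioner's actual sites.

```python
# Minimal sketch: emit reciprocal hreflang link elements for two country
# versions of the same page. Domains and paths are hypothetical examples.
ALTERNATES = {
    "en-au": "https://www.example.com.au/widgets/blue-widget",
    "en-nz": "https://www.example.co.nz/widgets/blue-widget",
}

def hreflang_tags(alternates):
    """Return the <link rel="alternate"> tags that every listed version
    should include in its <head>, so the annotations are reciprocal."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in alternates.items()
    )

if __name__ == "__main__":
    print(hreflang_tags(ALTERNATES))
```

Each version carries the full set of tags, including the one pointing at itself; annotations that are not reciprocal may be ignored.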
All right, great. So let me jump into some of the questions that were submitted, and we'll have more time for other questions from you all here as we go along. As always, if there's anything in between that you want to add, or anything that's unclear with regard to a question or an answer, feel free to jump on in as well.

Hey, John. Hello?

All right. One more before we get started with the list. Okay. Go for it. Or maybe not. Okay.

Hi, John. Could I jump in?

Okay. Go for it.

Thanks, John. I have a client that has a website averaging about 1.5 million sessions per month. We currently have approximately 600,000 404 errors. To give you a bit of background, this client has been using prerender.io. What prerender does, and you're probably very familiar with it, is cache and render the website. The problem is that the website uses quite a lot of JavaScript, so they've used this service to make crawling more efficient. We've cleaned up the cached version, so all the old URLs are out and the new URLs are all in the cached version. But when we go into Google Search Console, we still can't see the 404 errors going down; they're still increasing. So we're not really sure what's going on. Is there a time delay? Could the data be a couple of days behind, or what do you think is going on here?

It's hard to tell. So what I would do there is look at some of the sample URLs that Search Console shows. You can sort them by date in the URL list at the bottom, and look at some of those to see: are these old pages that are being found again, is this a new issue, or are these pages that should have been indexable in the first place? So just double-check to make sure that these 404 errors that are coming in are really errors that you don't need to care about. And if they're errors that you don't care about because these pages shouldn't exist, then that's perfectly fine; having a large number of 404 errors is essentially no problem. The one thing I'd be a little bit worried about there is where they actually came from. In particular, it sounds like you have so many 404 errors that they almost match how many pages of content you have, which to me suggests that maybe you restructured your website, that you changed some of the URLs on the website and didn't redirect from the old URLs to the new URLs. And that's something that, from my point of view, is definitely worth doing. Without those redirects, we essentially just see errors for the old URLs, and we discover random new pages on your website and treat them as new pages, so we don't know what the value of these pages is. Whereas with redirects, we'd be able to pass all of the value and all of the information we have from the old pages to the new pages, we wouldn't see those as 404s, we'd just pass everything on, and we'd be able to focus on the new URLs fairly quickly. So if you have done a restructuring of your website, even if that was maybe a couple of months back, I would still go back and try to set up redirects from the old URLs to the new ones.

Thanks for that, John. What we found out was that they had sitemaps for all the different locations that they were targeting. So they had multiple sitemaps, and all these sitemaps had old URLs. They went through and fixed that up. But yeah, that's where all those 404 errors came from.

OK, so these are essentially sitemap URLs that shouldn't have been submitted like that.
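To illustrate the redirect advice above: after a restructuring, it's worth spot-checking that each old URL really does return a permanent redirect to its new counterpart before assuming the cleanup is done. A rough sketch using the Python `requests` library; the URL pairs are hypothetical placeholders.

```python
# Rough sketch: confirm that old URLs 301-redirect to their new locations.
# The URL pairs here are hypothetical placeholders.
import requests

REDIRECT_MAP = {
    "https://www.example.com/old-category/product-1": "https://www.example.com/products/product-1",
    "https://www.example.com/old-category/product-2": "https://www.example.com/products/product-2",
}

def check_redirects(redirect_map):
    for old_url, expected in redirect_map.items():
        resp = requests.get(old_url, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location", "")
        if resp.status_code in (301, 308) and location == expected:
            print(f"OK   {old_url} -> {location}")
        else:
            print(f"FAIL {old_url}: status {resp.status_code}, Location {location!r}")

if __name__ == "__main__":
    check_redirects(REDIRECT_MAP)
```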
I guess in a case like that, I wouldn't really worry about it too much. If you've cleaned up the sitemap files in the meantime, then these errors will disappear over time. But it's likely to take a couple of months, maybe even up to a year, for this number to go down significantly, just because of the way that we recrawl a website. Some pages we recrawl quite frequently, maybe every day, maybe every couple of weeks, and other pages just take a couple of months, maybe up to half a year or even longer, to be recrawled. So those 404 errors will probably remain fairly high for quite a bit of time, until we've really been able to drop them out of our systems.

Right, right. That's interesting, because when I look in Google Search Console, I can see the crawl rate. But what you're telling me is that there's a certain priority to how Google crawls it. So it wouldn't just completely recrawl every single page from scratch; it would start with the most popular pages and stay on those for a while, and then maybe somewhere down the line, a couple of months later, go and crawl some of the less popular pages on the site.

Exactly, yeah. And especially if these are URLs that never existed and were just submitted in the sitemap file, we're probably not crawling them much at all, and that means they'll stick around as errors for a fairly long time. And that's not something you really need to worry about. From our point of view, we give you this information in case there's something there that you thought should have been indexed and is suddenly a 404; that's something you should definitely take action on. But if these are URLs that shouldn't exist at all, that's perfectly fine; you can ignore that.

OK, and even if we mark them as fixed, would that contribute to the crawling at all?

Marking them as fixed only happens in the UI, which means it just hides them for you so that you don't see them anymore there. On our side, we still see it as a 404; it doesn't change anything.

Right, gotcha. OK, cool. Thanks for that, John, you've cleared a lot up.

Sure. Awesome.

Hey, John, I have two questions.

All right. You're very quiet.

One is about duplicate content. I just want to know how Google treats duplicate content.

How Google treats duplicate content. That's a giant topic, so it's hard to start there. Go ahead.

I just want to know, is duplicate content always a penalty?

No, duplicate content is not always a penalty. For the most part, we recognize that pages are duplicated and we just see that as a part of the web that we have to deal with; it's perfectly fine. The main issue with duplicate content is if a website is only duplicate content. If the whole website is just copying things from other sites or rewriting things from other sites, then that's spammy, and that's a problem for us.

So what about content stitching?

Content stitching. So you mean you take different pieces of content from other websites and put them on your website?

Yeah.

I think we would probably see that as something similar. So depending on how you're doing that, if it's not adding significant value to the content, then the web spam team might look at that and say, there is no need to index these pages; we have this content already.

So for example, I have created one piece of content and posted it on a few websites. So it cannot hurt my website?

If you take your content and post it on multiple websites, it generally wouldn't hurt your website.
But what might happen is that these other websites rank instead of your website. So I don't know if you're OK with that; that's kind of up to you. But it's something you have to take into account.

The second question is, in Search Console I can only see data up to the 1st of October. I haven't seen the more recent data, like the 2nd or 3rd of October.

In Search Console, you're missing some of the newer data. There's always a bit of a delay in Search Console, so that can be perfectly normal. Sometimes there is a bit more delay, sometimes a bit less; it depends a bit on the technical setup and how things are processed on our side.

OK, is there any tangible date? When can we see the recent data?

When can you see the new data? I don't have any dates. Sorry.

Thanks, John.

Sure. OK, let me run through some of the submitted questions, and then we'll have more time for questions from you all as well.

I show a delivery image on my branch page. Do you think it's spammy if I were to mark up the image's alt tag with something like: we offer delivery of whatever in this location?

I think you can do that. From my point of view, that's less of a problem. For us, in general, the alt attribute should be descriptive of the image, and if the image is an image of you delivering something in a specific location, that's kind of the description. So that's something I would say you can go ahead and do. In general, I wouldn't recommend stuffing the alt attribute with keywords and filling things out there, because that does look a bit spammy. But if you're describing the image in a short way, that's perfectly fine.

Does the Google algorithm treat a migration to HTTPS any differently than, say, normal page updates?

So the thing with a normal migration to HTTPS, where you just switch the protocol, is that all of the rest of the URL and all of the information that we have there remains the same. So if the rest of the URL path is the same, if the query parameters are all the same, if the host name is the same, then that's a fairly easy change for us compared to a site that changes its URL structure or moves to a different domain name. With that in mind, an HTTPS migration is usually a lot easier and a lot faster than any other change or bigger migration on a website.

Let's see here. I need to remove websites with incorrect information from my search results. How do I do that?

In general, if these are not your websites, you need to contact the webmasters of those websites and ask them to remove that content. Depending on the type of issue that you're looking at here, that's sometimes easier or trickier to do. If it is your website, you can just remove that content from your website, and once we recrawl those pages, we can show that appropriately in search. You can also tell us that you've removed this content through the URL removal tools; from there, if the page is removed completely, we can drop it fairly quickly. You can also do that for pages that were removed on other websites that you don't directly control. So if a page was removed completely, you can use the URL removal tool for that.

I have rel=next and rel=prev and also use a noindex tag on paginated pages. Does using the noindex tag have a negative SEO impact with regard to the crawl rate? As in, if a page is set to noindex, does it decrease the likelihood of Googlebot crawling the page the next time?

To some extent, yes.
If it has a noindex tag, then that essentially tells us that this page doesn't need to be indexed, and if it doesn't need to be indexed, we don't really need to crawl it that much. So if you want a page to be crawled and indexed, then you wouldn't have the noindex tag on there; and if you do have the noindex tag on there, then that helps us learn that this isn't such an important page on your website.

John, sorry to interrupt. For pagination, if we use a canonical tag instead of the rel next or previous tags, will that be OK? Or do we need to use the next and previous tags?

With the rel canonical, you can do that, and I assume you mean a rel canonical to the first page of the whole set. But what generally happens there is that we will focus on that first page, and if there are any links or any information on the second, third, or fourth pages, then we would lose that. So if, for example, you have a list of products and some of those products are only listed on the second or third page, then we might not find a link to those products on your website if we directly follow the canonical and only focus on the first page. So that's something where you can weigh the pros and cons. For some websites that's perfectly fine: if you have a good structure and these pages are well linked with each other, that might be perfectly fine. With other websites, maybe you need to let us crawl a certain number of pages within a series of pages that you have there.

OK, and another question is: sometimes when we install a certificate on a website, we use the HTTPS version for the website, but there are some links which are HTTP. For example, the interlinking we did before, those links are HTTP. Will it be a problem if we have HTTP links on an HTTPS page?

That's usually no problem. What you would be doing with an HTTPS migration anyway is redirecting from HTTP to HTTPS, so you would have that internal link to the HTTP page, and it redirects to HTTPS, and that's perfectly fine. So on an HTTPS page, you can definitely link to HTTP content within your website or outside of your website.

OK, thank you.

Sure. All right, a question from François. Last month, my host updated Apache, and that introduced errors in my .htaccess, and the whole site went 500. I wasn't warned by Search Console; I had to discover it by myself. Is that normal? I'd like to receive a warning email.

I don't know if we send notifications for this type of error, but that's definitely good feedback to take back to the team, to see what we can do there to make that a little bit better. This type of situation is always pretty tricky, especially if it's returning 500 errors. If it were returning 404 errors, then I assume we would send a notification saying there's a rise in 404 errors. With 500 errors, it's possible that we'd also send a notification; I'm not completely sure. But it wouldn't be sent exactly when the problem starts. It would be sent a little bit later, saying, hey, a lot of the URLs that we crawled recently have been returning 500, that's probably something you want to take a look at. Whereas you probably want something that monitors your website a little bit more frequently than just that.
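A minimal sketch of the kind of independent monitoring mentioned here: poll a handful of key URLs on a short interval and flag server errors, rather than waiting for Search Console to notice. The URLs and interval are placeholders, and in practice you'd send an alert by email or chat instead of printing.

```python
# Minimal monitoring sketch: poll a few key URLs and flag 5xx responses.
# URLs and the interval are placeholders.
import time
import requests

URLS = [
    "https://www.example.com/",
    "https://www.example.com/sitemap.xml",
]

def check_once(urls):
    for url in urls:
        try:
            status = requests.get(url, timeout=10).status_code
        except requests.RequestException as exc:
            print(f"ALERT {url}: request failed ({exc})")
            continue
        if status >= 500:
            print(f"ALERT {url}: server error {status}")
        else:
            print(f"ok    {url}: {status}")

if __name__ == "__main__":
    while True:
        check_once(URLS)
        time.sleep(300)  # check every five minutes
```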
So if you're worried about this kind of issue in the future, I would set something up to double-check your website at a higher rate, rather than waiting a couple of days for Search Console to let you know about this broader issue.

John, I have a question about webmaster warnings, or notifications. We got an email this morning saying, we wanted to let you know you've lost your website URL claim. Have you heard of those emails? And now, suddenly, the domain isn't verified anymore in Webmaster Tools. This is for our UK site.

I don't know.

It says it's about the website URL claim in our Google Merchant Center account. It seems to be product related.

Oh, that sounds like something related to Merchant Center, though. But it also dropped out in Search Console?

Yeah, it's now no longer verified in there.

OK.

It was yesterday, because I was working on 404s.

OK. I'd double-check what verification setup you were using there and make sure that's still in place. Could that have been via Tag Manager? Did you do that with Tag Manager?

I don't know. That's what I'm asking. Could it be that we did everything at one time and something's changed?

Maybe. So what I usually recommend doing for important websites is to make sure you have two verification setups in place, something independent. That could be, on the one hand, the file that you place on the server, and on the other hand, maybe a DNS verification. That way, if either of these falls out, you still have that backup and the site remains verified. At least for Search Console, once you verify it again, you'll have all of the data back, so it's not that you'll lose anything. I don't know how these kinds of connected systems like Merchant Center deal with a situation like that. I assume websites drop out of verification every now and then; things go wrong everywhere. It's not specific to any particular site.

Well, the weird thing is, it says you've lost your URL claim because the Google account associated with this Merchant Center is no longer verified in Google Search Console. So the client is saying it's the other way around.

Yeah, that sounds like it just dropped out in Search Console. I don't think we send notifications for that in Search Console.

Why is he picking on me? Man.

It's not me personally. It's computers. Computers say no.

All right. OK. I'll check it out, obviously.

Yeah, usually verification is quick to set up. If you know what to do, you just drop that file on the server and it's done.

Sorry. I'll try another couple of methods as well, just to be sure.

OK. Sure. Thanks.

All right. A website took an image from my site, and now that image shows up as a featured snippet. Their site is newer than mine, and it shows up on the first page while my site seems to be stuck on the second page. How come my site doesn't get to reap the benefits of having original content?

That's always kind of tricky. I think, first off, the featured snippet that we show there is essentially a snippet from one page, and then maybe an image in general. And that's something that's generated algorithmically; it's not specifically tied to your pages or tied to any meta tags on your pages. It's something where, as you improve your website over time, that will improve as well. So I'd say there is no simple magic trick to making that work; rather, you just need to keep working on your website and keep focusing on that.
If they're using your image on their website and it's a copyrighted image that you own, you might also want to look into the DMCA process to see if that's relevant in a case like that. That's something where you might need legal advice, and I can't help you with legal issues, but it's a process that's available for websites that have content online.

I have a question regarding links and canonicals. Imagine three pages on the same website: A links to B, and B has a canonical set to C. Is there any value left for page C, or is all of the value lost due to canonicalization?

So I think, first of all, in a case like this you'd want to clean that up as much as possible. If you recognize this type of situation, with links or redirects to specific URLs and canonicals set to something else, then I would clean that up so that you have a clean selection of canonicals and you're giving us really clear signals, saying: this is the page that I want to have indexed. And not leaving it open to us, where everything links to this page but it has a canonical to this other one, so which of these is actually the one that we should be showing in search? That's always a frustrating setup for the algorithms, because they don't know what to do, and then we do something that you don't expect, and then you're frustrated. So the clearer you can give us this information, the better. With regard to links, in general we see them as being between two canonical URLs. So if A is a canonical URL and C is the canonical for B, then we would see that link essentially the same as a link from A to C directly. But again, it also depends on how we choose that canonical, and maybe we'll choose B as the canonical instead of C. So that's something I'd definitely clean up.

A sitelinks search box isn't appearing for a client, and the schema markup is valid. They're e-commerce, and brand searches are important, so it would be useful to have that. How do we get it?

So the sitelinks search box is something that's generated algorithmically and shown algorithmically. When we recognize that it makes sense to show it, we'll show it, and when we do show it, we'll take into account any schema markup that you might have to tell us how you want it shown. The markup itself doesn't force it to be shown; rather, when we do show it, we'll take your markup into account and try to use the setup that you provide there, assuming it's valid markup. So it's kind of tricky in the sense that when we show it, we'll use the markup, but we won't show it just because you have the markup.

Is it possible to include in your posting a link so that we can download it directly? I think we had this. So let's see.

What's the difference between soft 404s and normal 404s? How can soft 404 errors impact a website's crawling and indexing?

Normal 404 errors are when you tell us explicitly that a page doesn't exist. A soft 404 is when a page returns 200 and, from a technical point of view, essentially tells us this page exists, but when we look at the content we realize, oh, it's an error page, or it's empty, or it has a noindex, or something like that. So essentially we look at the page, we get normal content back, and then we have to make a judgment call and say, this is probably the same as a 404 page. So returning a 404 error directly is the more correct way to handle pages that were removed.
And a soft 404 means that our algorithms thought that there's actually nothing here to keep in the index. Soft 404 errors also mean that sometimes we think there's content and sometimes we don't think there's content, and other search engines might treat them slightly differently. So if you're removing things, I'd really recommend making sure that they actually return a 404, so that we don't have to crawl the page and guess whether it's actually missing or not.

Is it possible to see reports about mobile click-to-call traffic from organic search results?

I don't think we have that information in Search Console. I believe some of this is shown in the reports that Google My Business sends out; I believe I've seen that in a report that was forwarded to me at some point. I don't think that's actually part of the Search Console information that we provide, so I don't have any tips on how to handle that better. What I would recommend doing there, if you're really curious about this and wondering what is happening, is maybe posting in the Google My Business Help Forum and seeing how other small businesses, or other businesses that have these types of entries, deal with that. Maybe there are some things that you can do to track the usage of these click-to-call buttons.

The whole SEO community here in Romania thinks that by writing without the diacritics and special characters, you can rank better. Mihai says no. So some of the SEO community in Romania thinks this, apparently. The top argument that people use is that people search without the diacritics, so we should write without the diacritics and we will rank better. Is this true? How does Google handle this in German and French? What's your take on this?

So I don't really know what's unique about the Romanian search results, or whether that's somewhere you do see this kind of difference. In general, in German and in French, where I looked into this a little bit, this is not the case. Some people do search without the special characters, and a lot of people search with them, and if you create your content to actually be readable, it'll essentially rank for both of these variations. It's not the case that the search results are exactly the same, because sometimes people mean subtly different things. Especially if you're looking at French, for example, sometimes having an accent or not makes a difference in the word that's actually meant, so it would be normal to see subtly different search results. But I wouldn't take that as saying that you should be writing in a way that is essentially not correct for your local language. And especially when users come to your pages and read your content, if it looks like you aren't really writing in their language, that's something they're not really going to appreciate, I assume, at least in most cases. So with that in mind, my recommendation would be just to write normally, write your content in your language properly, and that's something that we should be able to pick up accordingly in the search results. If you see situations where we're really ranking things wrong, where people commonly search with a variation of a word that should be showing some pages but isn't showing them at all, then feel free to let me know. Send me some example queries and some example pages where you're seeing this happening, and I can pass those on to the search quality teams here, who can take a look at that specifically.
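Going back to the soft-404 distinction a little earlier: a crude way to find likely soft 404s is to fetch pages you expect to be gone and flag any that answer 200 but look like an error page. This is only a rough sketch; the URL list, the size threshold, and the error phrases are made-up examples.

```python
# Rough soft-404 check: pages that should be gone but still answer 200
# with error-page text are likely soft 404s. URLs and phrases are examples.
import requests

REMOVED_URLS = [
    "https://www.example.com/discontinued-product",
    "https://www.example.com/old-landing-page",
]
ERROR_PHRASES = ("page not found", "no longer available", "nothing here")

def classify(url):
    resp = requests.get(url, timeout=10)
    body = resp.text.lower()
    if resp.status_code == 404:
        return "proper 404"
    if resp.status_code == 200 and (len(body) < 512 or any(p in body for p in ERROR_PHRASES)):
        return "likely soft 404 (returns 200 but looks like an error page)"
    return f"status {resp.status_code}"

if __name__ == "__main__":
    for url in REMOVED_URLS:
        print(url, "->", classify(url))
```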
I think the more generic the queries, the better, of course, because that way we can really be sure that this is something that people actually do search for, and not one long sentence in quotes that nobody actually searches for in real life but that has a weird match in the search results. So the more generic the queries, the better, and the more clearly you can show that the results being given are really bad, the better as well. So again, feel free to send me some examples if you're seeing that happening. I definitely wouldn't recommend writing your content in a way that doesn't actually match your language.

OK, then I have a long sitemap question, which I think boils down to: we have a lot of URLs on our website, and some sections of the website tend not to get indexed as well as the rest of the website, with regard to the sitemap files.

It's really hard to say what that could be. In general, this could be perfectly normal. We do try to understand larger websites a little bit better, to understand which parts of a website it makes sense to index more and which parts it doesn't make that much sense to index in depth. It just might be that we've discovered over time that this part of your website is really useful, and we'd love to index it and show it to people, while another part of your website has a lot of URLs, but we haven't really recognized that it's actually worth the time to index in depth. So just because a URL is in the sitemap file doesn't guarantee that it's actually indexed, and we might split things up subtly differently. If you don't agree with that assessment of your website, then one thing I'd recommend doing is really making sure that the part of your website that you do care about is consistently front and center within your website, and that you don't dilute things with hundreds of thousands of other sitemaps or URLs that aren't really your primary focus. So maybe it makes sense to restructure things subtly, so that the part of your website you do care about is a little bit more in the center than the part you don't care about that much.

I'm curious if poor AdWords performance will impact organic search results ranking.

That answer is easy: no. We don't take AdWords performance into account at all when it comes to organic search results. That's essentially completely separate on Google's side; it neither affects it in a positive way nor in a negative way. So that's totally up to you. One thing we do sometimes see is that people use AdWords or other types of advertising to test things out. That could be to test landing pages to see how they perform, or to test titles and snippets to see how they perform and find the best variation. I think that's a great way to try things out, and when you test things like that, some of these tests will work and a lot of them won't, and that's perfectly normal; that's how things should be. From my point of view, that's more of a way to help you figure out how you could be structuring things for your organic search results, but it doesn't mean that the organic search results will mirror that completely. So again, how your site does in AdWords is totally unrelated to how it does in normal web search.

Let's see. I think we had the 404 list here.
Will applying the schema LocalBusiness type to a local business with a global range reduce its potential to rank in other countries? For example, a local real estate developer with projects in many countries, or a local clothes shop with clients all over Europe. Is it better to use Organization or LocalBusiness?

I think that's totally up to you. Some global businesses do have a local office that you can visit, and that's the way you want to be visible, in that you let people come and visit you in person while you're still globally active. So that's something where I don't see much of a problem there.

With regard to Googlebot crawl rate, we have 48 global subdomains, of which a large percentage receive only a tiny crawl rate relative to the top five big countries. If we were to move to ccTLDs, is it likely we would see a higher total crawl rate, or would it make no difference compared to our current setup?

It's hard to say. I think if you're already using subdomains, you already have unique host names, and that probably means we're figuring out the optimal crawl rate per subdomain anyway. So you would probably see no big change in the overall crawl rate. And especially when you're looking at 48 different sites for individual countries, I think it's completely normal that some of these will be crawled more often than others; that's not necessarily a sign that anything is going wrong. The one thing I might watch out for there is if you're creating something unique and new for one of these countries and you're seeing it not being picked up at all for crawling and indexing, then that's where you'd have to step in and figure out: what can I do to encourage Googlebot to crawl this content a little bit more? But on the other hand, if this content is essentially static for the most part and we're just not crawling it that often, that's perfectly fine. Crawling isn't necessarily related to ranking; once we've indexed that content, we don't need to recrawl it all the time in order to show it in the search results.

We recently switched to SSL and created a new property in Search Console, but missed updating the disavow file, and now we see a big drop in rankings. Could this be the cause? If yes, will it come back after we update the disavow file?

I don't think that would be a big problem there. We'd probably pick that up anyway from the other version and use that disavow file normally. So my guess is that the drop in rankings is unrelated to the move to HTTPS and unrelated to the disavow file. However, it's still something you should clean up, and if you missed the disavow file, then I would go back and double-check the Help Center page that lists all of the other things you need to watch out for, and make sure you really have all of those checked off as well, so that you're sure the move to HTTPS was done cleanly and you didn't miss anything else that might be important.

Someone's confused with regard to sitemap indexing: we've been trying to figure out why all of our pages are not being indexed by Google, and whether there is any kind of issue with the sitemap file.

I don't see the link to the photo there. But in general, if there's an issue with the sitemap file, we'll show that in Search Console, so you'll see an error when you submit the sitemap file. Like I mentioned before in the other case, we don't guarantee indexing, so just because a URL is listed in the sitemap file doesn't mean we'll index it.
And especially, this is really common in situations where you have a lot of URLs on a website that just come up, the sitemap file lists them all, and when we look at the website we're like, I don't know how good this website actually is. Then we probably won't go off and spend too much time crawling all of these hundreds of thousands of URLs on the website, regardless of whether they're in a sitemap or not. The other thing to keep in mind is that the indexed count for the sitemap file is based on the exact URL. That means if you're linking within your website to a subtly different URL, or if you have a canonical set up on your website to a subtly different URL, that won't count in the sitemap's indexed count. So that's one thing to watch out for: double-check some of these sitemap files and see which of these URLs are indexed and how they're actually indexed. If they're indexed under a subtly different URL, then that can mean that we chose a different URL for the same content, and we wouldn't count that as indexed for the sitemap file. Usually that's a sign that you can go and figure out where you're giving us conflicting signals. Are you linking to this URL within your website and using a different one in the sitemap file? Do you have the rel canonical set up properly? Are you using hreflang? Are you using redirects? Are all of these signals consistent and pointing at the same URL? That's essentially what you should be aiming for.

Oh, wow. More and more questions. We're kind of running low on time, so maybe I'll just take some questions from you here in the audience, if there's anything else on your mind.

Yes, John, I have a question.

All right.

Yeah, I'm just wondering if you can give me the lowdown on shared IP addresses and CDNs. I've seen a lot of reports of websites that have been completely dropped from Google when using CDNs with shared IPs; then, when the CDNs have been disabled, the websites have appeared back in the search results within a few hours. I'm just wondering how Google deals with shared IP addresses.

Shared IP addresses are generally no problem, in the sense that they've been around for a really long time. Lots of sites are on shared hosting; not everyone has their own IP address. And especially when it comes to CDNs, you potentially have different IP addresses all the time, in that depending on the user, they might see this IP address or a different one, and that's essentially completely normal. It's not something where I'd say we look out for that and say, oh, this is on the same IP address as this shady site, therefore this site must also be shady. So I wouldn't see any issue with regard to shared IP addresses there. The one thing with moving to a CDN that sometimes plays a role is that you're essentially moving to a different type of hosting setup, and when Googlebot sees that, it gets a bit cautious with crawling. So it might slow down crawling a little bit until it recognizes that it can actually crawl a lot faster than before. That's one aspect that some sites see, but it wouldn't affect the ranking of the site. Like before, crawling is something where, once we have the content indexed, it's not necessary to recrawl it all the time for the page to rank really well. So my guess is that the shared IP address or CDN change is unrelated to any ranking change that you might have seen there.

OK, thank you.

OK, wow, a really long question in the chat.
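As a brief aside on the sitemap consistency point from a moment ago: a rough way to catch sitemap-versus-canonical mismatches is to fetch each URL listed in a sitemap and compare it with the canonical the page itself declares. This is a sketch only; it assumes the `requests` and `beautifulsoup4` packages, a plain (non-index) sitemap, and a placeholder sitemap location.

```python
# Rough consistency check: does each sitemap URL declare itself as canonical?
# Mismatches mean that URL won't show up in the sitemap's indexed count.
# Assumes requests and beautifulsoup4; the sitemap URL is a placeholder.
import requests
import xml.etree.ElementTree as ET
from bs4 import BeautifulSoup

SITEMAP_URL = "https://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_url):
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

def declared_canonical(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    link = soup.find("link", rel="canonical")
    return link["href"] if link and link.has_attr("href") else None

if __name__ == "__main__":
    for url in sitemap_urls(SITEMAP_URL):
        canonical = declared_canonical(url)
        if canonical and canonical != url:
            print(f"MISMATCH: sitemap lists {url} but page canonicalizes to {canonical}")
```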
I don't know if you're here in the Hangout directly; maybe you can just sum it up very briefly. Otherwise, let me take a quick stab at it. We're an Indian video hosting website planning to go international, and we decided to use subdomains. The only difference is the amount of video content hosted on the international subdomains, as well as the paywall that will be introduced for certain content and languages. What would the optimal setup be, with regard to things like hreflang, Search Console, sitemap files, IP redirects, et cetera?

So I think the tricky part here is probably that Googlebot generally crawls from the US, and that means we index the content like a user in the US would see it. So if you're doing anything special for US-based users, that's what we would end up indexing and showing in the search results. In particular, if there's content not available in the US and you don't show it to users in the US, then we wouldn't be able to index that content. Or if you have a paywall set up on your website for users in the US before they can ever see any content, then that also blocks us from crawling and indexing that content, and we wouldn't be able to use it. So the ideal situation, if that's possible, is to have some amount of content available in the US. If that's a possibility, it can be something like a teaser, or whatever you think is relevant for these individual pieces of content. That would be something we'd be able to pick up and use for crawling and indexing, users in the US would be able to see it as well, and that way we'd be able to show at least something in the search results globally. So again, it's really important to keep in mind that Googlebot crawls and indexes from the US and serves the results that we've indexed like that to users worldwide. If you have a lot more content in India and you don't provide it to users in the US, we wouldn't be able to index all of that additional content and show it in the search results, even to users in India, because we never see it when we crawl and index those pages. Once we do have those pages indexed, assuming you have an English or US version and an Indian version, using hreflang is the perfect setup there: you can tell us which of these versions is for which languages and which countries, and we can swap those out in the search results appropriately. But for that, we really need to be able to index the content first. The last thing, I think you also mentioned IP redirects; that again falls into the same bucket as before. If you're redirecting users in the US to the US version from maybe a Hindi version or a general Indian version, then we'd never be able to see that Hindi version or that Indian version. So instead of using IP redirects, maybe it makes more sense to use a banner on top, so that we can actually index both of these versions and show the Hindi version to users that are searching in Hindi, for example, or that are searching in India. So that's one thing to watch out for. Sometimes this gets really tricky, especially when there are legal issues involved, or policy issues involved, along the lines of: I'm not allowed to show this content to users in the US. And sometimes there are creative solutions that you can use, maybe like showing a thumbnail instead of the video preview, those kinds of things.
But sometimes you just have to bite the bullet and say, I'm not allowed to show this content at all in the US, so I have to accept that this content won't be visible in search at all.

All right. What else is on your mind?

I have a question.

All right. Go for it.

We moved all of our blog content from a subdomain, blog.experiencedays, to a subfolder. Because I can't do the site move in Webmaster Tools to that subfolder, since it's not set up as a separate property, is it worth setting it up as a property and then doing the move, or are 301s just fine?

You wouldn't be able to set up a site move in Search Console for a subdomain-to-subdirectory move anyway. But sometimes it's useful to have that section listed separately in Search Console; for things like search analytics, sometimes it's useful to have just that section of the site. It's kind of up to you how you want to deal with that or how you want to manage it.

But not doing an actual site move has no difference in benefit compared to just basic 301s?

Exactly, yeah.

OK. All right. What else is on your mind? Everything answered?

I can try a question.

All right, go for it.

So I don't know if you've seen, there was a Moz post regarding JavaScript indexing and rendering, and how the value of HTML links that are rendered by JavaScript might be somewhat less, especially if maybe Googlebot cannot render them in time or anything like that. So I guess my question would be: how much do you recommend things like the pre-rendering services that were discussed earlier? Do you think those can bring significant benefits to sites that rely a lot on JavaScript?

I think at the moment that's probably a useful thing to do. It's something where I see our systems, over time, moving more and more in the direction of us being able to render these pages just as well. I do sometimes see these kinds of edge cases where, if you pre-render it yourself and you make sure those pre-rendered versions are really comprehensive and complete and an exact match of the actual JavaScript content, then sometimes that does make sense. So I don't know exactly what this Moz post says, but we do sometimes see a subtle difference there. More and more, though, you'll see that if you follow the best practices for a JavaScript-based website, it'll be fairly competitive with a normal website. There might still be some subtle differences, but you're making a trade-off with regard to complexity: if you pre-render things on your side, you have all of the complexity of pre-rendering, and you can do more things wrong when it comes to pre-rendering; on the other hand, if you let Googlebot handle that, then you don't really have to worry about it too much. But it's definitely, I'd say, a situation where we're pretty close in the way that we handle it, but it's not exactly one-to-one the same yet.

OK. So if it's not too big of a headache, one could use something like a pre-rendering service, if it's not too hard to set up. Otherwise, going into the future, Google will probably get better and better at this kind of stuff.

Yeah. I know when talking with the rendering team here at Google, they don't like it when I say we're kind of OK with pre-rendered content, because they'd prefer to just get the content directly. I think, in general, the absolute best approach is probably to pre-render things for all clients, especially on the first visit.
So essentially, when someone loads a page for the first time, you serve the pre-rendered version, which loads a lot faster than anything else anyway, and when they start interacting with the page, you swap in the JavaScript version. That's kind of the ideal situation. At the moment, that's not always easy to implement, so you have a lot of complexity there. But from my point of view, that's probably the best approach, because that way everyone has the advantage of the pre-rendered version: it loads really quickly on all devices, there's no JavaScript that needs to run back and forth and do all of the fancy stuff, and you also don't need to worry about things around search. For some platforms, there are tools out there that do this. I believe for Angular there is a setup called Angular Universal; for React there is something else whose name I forget. But some of these platforms already have tools available that make this possible, so I'd check that out as well.

All right. I still have a little bit more time, and I just want to drop two or three notes. One is, if you've been following along with this Hangout for so long and made it to the end, or watched the video, there's also Deep Crawl, which does fantastic summaries of these videos. So if you just want a five-minute summary of some of the issues that we talk about, I'd definitely check out their blog; they do great summaries there. Another thing is that we're going to do another round of question-and-answer videos, where essentially we focus on one question and one answer, rather than a million questions like in these Hangouts. For that, I tweeted a link to a form where you can submit your short questions, and we'll try to get you some answers. Short questions are better than long questions; they make it a little bit easier to answer. Site-specific questions, I'd take to the forums instead. And finally, we're still looking for more Webmaster Trends Analysts like me who are interested in joining Google. So if you're a developer, if you've developed some websites, if you're into SEO, if you're interested in talking about the web and moving things forward, feel free to drop me a note and I can send you the links for that.

All right, I think there are a few questions left. I think there's one in the chat, but I don't quite have the full context there, so if you can write that up briefly, I can go through it.

We're concentrating on quality of content. We never try to build links. Our keyword rankings are falling continuously, and visitors are reduced. How do we rectify what's wrong?

That's really hard to say. What I would recommend doing in a case like that is going to a webmaster help forum, maybe our forum, and getting objective advice from peers with regard to your website in general and the quality of your website overall, and trying to really listen to that hard feedback and see if there's anything you can do to improve things overall. Sometimes these are things where you've looked at your website for so many years that you think, this is my baby, it is completely perfect, and anyone who complains about it is wrong.
But sometimes you need to take that input from other people and wonder: if these people think my website is ugly, or structured badly, or the content is not properly written, or there are just too many ads on these pages, maybe there's something I can do to make them happy and still not give away my baby. So that's what I would recommend doing there. It sounds like the rankings are just dropping, and usually, from my point of view, that's a sign that our algorithms are looking at the site and thinking, from a quality point of view, maybe we shouldn't have ranked it like this in the past. So let me just double-check to see what else was submitted, and maybe, I think, the video site.

Our website has always ranked well for a majority of keywords on Google. I've attached a photo of a huge drop in ranking in Google for Auckland strippers; that's the location name and the service we provide. We rank really well for other cities in Google, but our home page, as well as that subpage, no longer ranks in Google at all. Another page is ranking better in Google, from a post we put out last year. Any advice on what to look for? Search Console says we're not penalized, but we've gone kind of down in the rankings.

My guess is this is probably something that would need to be looked at in a little bit more detail, so I don't know if I have something offhand to provide there. One thing to watch out for is that I suspect we will be seeing this website as having adult content, so the SafeSearch filter might be something that's playing a role there as well. Depending on how your pages look, that might be something to look into, to ask: am I going too far? Might it be that Google is assuming this is a really crazy adult content site when it's actually just kind of an adult-ish business? That might be something to look into there. If you maybe post in the Webmaster Help Forum, you can probably get some tips on that part as well, to see: is there something I need to tweak with regard to the text and the video or image content that I have on these pages, to make sure that it's still reasonable and doesn't look too crazy in the eyes of Google's algorithms? The other thing that we do sometimes see is that sometimes there are technical issues on pages that cause us to drop those pages from the search results completely. So especially when you're saying that the home page is not ranking, sometimes that's a matter of the home page occasionally dropping out of Google's index completely, which might be because we're folding it together with some other page on your website, or might be because we're seeing it as a soft 404 error. So I'd double-check Search Console for those kinds of issues. And if we see it as a soft 404, then perhaps there's something like an error message or some error text that you sometimes have on your home page that we're picking up, saying, oh, this says it couldn't build a database connection, or something like that. In a case like that, making sure that these errors, when they occur, result in maybe a 503 result code on the server, or are not displayed in the public version of the page, would be a good idea.

All right. So I think, whoa, more keep coming. I think that's pretty much all of the ones I saw here. Any last questions from you all before we sign off for the weekend?

Oh, what's the best way to contact you?
Probably easiest by Twitter or Google+. On Google+, you can send me a note privately, and we can move to email from there.

All right, great. So with that, let's take a break here. Thank you all for joining, and thanks for all of the many questions and comments and things that you brought in. I hope this was useful, and maybe we'll see each other again in one of the future Hangouts. Have a great weekend, everyone.

Thank you, have a good weekend. Bye.