All right, welcome, everyone, to today's Google Webmaster Central Office Hours Hangout. My name is John Mueller. I'm a webmaster trends analyst here at Google in Switzerland. And part of what we do are these Office Hours Hangouts, together with webmasters, publishers, SEOs, trends analysts who want to join in. As always, if any of you want to get started with a question, feel free to jump on in now.

I'll start if that's OK. Go ahead. Hi there. My name's Darren. I'm from the UK. I run a couple of travel businesses. We've just acquired a fairly large website called travelgay.com. It was previously two different websites, travelgayeurope.com and travelgayasia.com. We decided to merge them together, starting with travelgayeurope.com becoming travelgay.com. And since we did that and put in all of the 301 redirects, we've seen the traffic drop by about 80%. We've done all the correct redirects. We've used the site move tool. We think that the problem might be around this domain previously having been used, about 10 years ago, as an adult domain. And potentially, when we've looked at safe search, it's being blocked quite a lot by the safe search filter. Bing have now lifted that and the traffic's recovered. But we're just trying to find out how we deal with Google. We submitted a safe search request about 10 days ago.

Yeah, those requests sometimes take a bit of time to get processed. But that's definitely the right place. I pinged the safe search team about that just before the Hangout as well, when I saw your question. So I hope that'll get processed there. Usually, it can take a couple of weeks for us to process these kinds of requests. So if you just submitted it maybe 10 days ago, then a couple of weeks is kind of reasonable. And it takes a bit of time for all of these changes to bubble up into the search results as well afterwards. So maybe it was just a matter of time, or maybe this additional nudge from our side helped a little bit.
But do you think that 80% drop is probably correlated to that? It's hard to say. I looked at the Europe site that you linked there as well. And that one didn't seem to be affected by safe search. So I don't know if it's not affected anymore because there's no content there, obviously, now. Or if it was never affected there. If it was never under a safe search filter, then that could definitely have an effect, in the sense that when we can tell that people aren't searching with that setting or explicitly looking for this kind of adult content, then that could have an effect. I don't know what percentage would be reasonable there. It probably depends a lot on the website.

OK, yeah, that's great. Thank you very much. I think most of it is organic traffic, which is why we're just keen to see that 80% recover, obviously. Yeah, I don't know if all of that would be from this. Especially since you now mentioned that you actually merged two sites there, that's also something that takes a bit longer for us to actually process. I think the second site is untouched at the moment. So travelgayasia.com is still untouched, and that hasn't seen any effect in that sort of traffic. So the plan over the next few months is to merge it once we've got the first one out of the way, basically. Yeah, I'd wait for it to kind of settle down first. But that's always tricky. I think, in general, when you're merging or splitting sites, that's something that takes a little bit longer. If you can do it step by step, that makes it a little bit easier, and a little bit easier to figure out where things went wrong if something does go wrong. Great, thank you. Sure.

All right, Arman, I think you had a question. Yes, thank you very much, John. I actually have three questions. One is, I just posted a short message in the group. Maybe Darren can answer it, or somebody else, or you, John. I didn't know that there is a way to submit a safe search request to Google.
If there is, would you please pass the link in the chat, and maybe I can explore that. My second question is, and I think you may have addressed it in the past, I just wanted to ask if there is any update on this. Several months ago, actually more than six months ago, we did AMP for all our articles. And we see that our AMP pages are not being actively crawled. I don't know why. So do we need a special sitemap for AMP pages, accelerated mobile pages, to rank better? That's my second question. And the third question, what is the best, should I ask the third question now, so I don't forget? Yeah, let's take these first before we go further. OK, yes, because it is most important. OK.

Yeah, so I guess the link I have to look up. It's in the Webmaster Help Center. I think sometimes it's harder to find. So usually what I do is I search in the Webmaster Help Forum for one of these threads that occasionally come up. And then someone will post the link there. And then I follow that link.

For crawling AMP pages, usually you don't need to do anything. So what happens is we re-crawl the AMP page when we re-crawl the normal web page. When we see the link rel AMP HTML on those pages, then we'll re-crawl the AMP page as well. And we'll also re-crawl the AMP page when a user views the page in the cache. So if someone goes to the AMP cache, then we'll re-crawl it after the user views the AMP, the cached page, so that we have a chance to update the cache that we have. So it's not that they need to be indexed on their own. Usually they're not indexed. Usually the normal web page is the one that's indexed. And the AMP page is just kind of swapped out for that. And we crawl the AMP version for two reasons. One is to make sure it's really a valid AMP page and that it has the canonical back to the web page. And the other is for the AMP cache, so that we can show it in the AMP cache. OK. Thank you so much. And my third question.
Does an average news website, which is publishing original news stories, have a fair chance to appear in Google Top Stories, or AMP News Top Stories? Until about six months ago we appeared there. My website is torknews.com, where we cover car news. I made a comment, which you have probably seen, John. About December of last year, we stopped appearing in top stories. For example, I mean, you're talking about branding, that people need to talk about your website. And we are the only car website on the net that, for example, covers Subaru news, Subaru car stories, every day. No other website does that. But we never appear in top stories. Even if you search for Subaru news, we are at the top of the search results. What can we do to have a fair chance to appear in Google Top Stories? Not appearing is killing our traffic. Please, can you take a look at our website?

OK. I think that's always a bit tricky, because top stories is, from our point of view, an organic search element, which we treat like any other part of the search results, and we don't have an explicit turn on, turn off button that we can manually adjust there. It's really based on what we've crawled and indexed from your website. But I can definitely take a look at your site and pass that on to the top stories team to look at. I know the people working on the top stories feature have been quite active and are working on various iterations to make it, I don't know, possible to show different kinds of content. It's already something that's not tied to Google News, for example. So this is something where we're trying to figure out what the right experience there should be and what kind of content we should show for which kinds of queries. Thank you so much. That's something they're still working on. It's a fairly new feature. So I would certainly expect to see some changes there.
I don't know if, for your specific case, that would be enough to say, OK, it's right enough to kind of go through top stories and we start showing a lot more, or for different kinds of queries, we start showing your results. I don't know if that would be the case, but I do know that they're working on this. And they do take the kind of escalations that we send their way about this seriously.

Is it OK if I post my website in the chat of this Hangout so you can see it? I think you posted it in one of the questions, right? Yes, because I also have a second website, which is a health website. Same situation. At least you can send it to them too. OK, sure. Thank you so much, John. I greatly appreciate this. I don't know if they'll be able to change anything, but at least they can take a look at it. Because it's original, unique content, and I have no idea why we're not appearing there. OK. Thank you so much.

All right, let's take a look at what all was submitted. And if any questions from your side come up in between, feel free to jump on in. And probably we'll also have some time for more questions on your side. Don't mind it. I'm missing a graphic. An issue of factors like your ranking and backings. Sorry, I didn't quite get that. How to change one domain to another domain without losing traffic? How to move from one domain to another? We have a Help Center article on that. I will double check on that. I think it's called "changing your domain name." Let me just double check. Is a page-to-page 301 redirect required, or is the change of address enough? You have to set up the redirects and all of that. So not just the change of address request. I can pull that out and post it in the chat.

I have another question. If I have an AMP page that's an exact duplicate of another page, will it hurt SEO? That's kind of the normal situation, that you have one version as the normal web page and the same content as an AMP page.
So that's ideally the way that you would have it set up. Yeah. Thanks. Sure.

All right. So let's run through some of these questions. I have a question regarding the crawl anomaly report in the new Search Console. I've got a lot of very old, like 2015, 2016, 404 error pages reported as crawl anomalies. Why are they still reported as crawl anomalies and not 404s? The documentation is kind of minimal, so it's kind of confusing, I guess.

So the first thing I would do there is submit feedback in the new Search Console. I know the Search Console team is actively watching out for all of the feedback in the new part of Search Console. So if anything is confusing in there, always make sure to submit feedback to them so that they're aware that this is confusing, and you don't know what this means, or why is Google showing you this and not something different. So that's kind of the first thing I would do there. The other thing that I'd keep in mind, especially with crawl errors, is that we remember URLs for a very long time. And even if they return 404 for a long time, then we might still remember them and retry them from time to time. So it can certainly happen that a URL that used to exist on your website a long time ago is still occasionally re-crawled. And that's not a bad thing. So from our point of view, we show that in Search Console specifically as a crawl error or as a crawl anomaly or whatever it happens to be, just so that you're aware of this, that we tried this URL and it didn't work. And if that's OK with you, that's fine. If you wanted to actually have content there, then you might need to double check your server configuration and make sure that it actually has content and is returning a 200 code instead. So that's kind of what's tied in with there. So in general, I wouldn't worry too much if you see old crawl errors in Search Console.

All right. Does Google unofficially support hreflang attributes in anchor tags?
So for example, an A tag with an hreflang link attached to it. No. Well, as far as I know, we don't officially or unofficially support hreflang links in anchor tags. If you want to use hreflang links, then I'd really recommend just using them the way that we have them documented: in the head of a page, in the HTTP header, or in a sitemap file. Usually, setting up hreflang is complicated enough with that basic setup, especially if you have a lot of different language pages or a lot of pages that you use with hreflang. So I'd be really worried if the team came to us and said, how about we also support hreflang in this completely different place on a web page? Because that just adds so much more complexity and makes it so much harder for most webmasters to actually get it right. So I'd really recommend using it in the traditional and normal place.

We recently moved our website. Oh, this is the one we talked about before. We're facing a problem with declining click-through rate caused by the video snippet in the search results. We consulted with you in a previous Hangout session, and you recommended blocking Googlebot from crawling the video file, which seemed to work at first, because we lost the video snippet and gained the click-through rate back. But then it reappeared a few days later. What could be happening here? Is this like a Google test, or what?

So my guess, and I don't know which site this is at the moment, is either we saw the video file from another location, or it's embedded in a slightly different way, or it's a matter of things kind of just settling down into the final state, in the sense that maybe some of the data centers have the updated content with the non-video snippet, and others still have the older content with the video snippet that they were showing. So that might be happening there, too, in that sometimes it just takes a little bit longer to settle down.
The other thing is you disallowed the video file with a dollar sign at the end of the URL, according to the comment that you have there. That means if the video is embedded in a way that has a URL parameter or anything attached to it, it might be that we still try to crawl that video and that we'd be able to crawl it. Because a disallow in the robots.txt file like that is very specific, in that only URLs that end with lowercase mp4 would be blocked from crawling. So any other variation might still be crawlable. And depending on how that's embedded on the page, that might still be something that's being picked up on. I don't know if you'd be able to see that directly in the video snippet, but you should be able to double check that in your server logs, if you have access to those, to really confirm: are we crawling any other video files and showing those instead?

Let's see. Here we go. I'm creating an Ajax-based data visualization tool. Names are loaded with infinite scrolling. SVG and text elements change in response to clicking on names. In addition, the URL is sometimes modified with a hash fragment to enable back button functionality and to assist crawlers. What's the best way to assist Googlebot's crawler to traverse these hash fragment pages? Should I include them in the sitemap? Will it be punished for duplicate content?

So I think, first of all, we generally don't crawl URLs with just a hash fragment. So if you just have the hash sign and then some text afterwards, usually we drop that and only crawl the version without the fragment. There are extremely few cases where we do recognize that this is something that's critical to a page, and then we try to pick that up and actually follow that. But for the largest part, you can assume that we are not going to crawl or index any hash fragment URLs. So what I'd recommend doing there is using the HTML5 History API instead.
With that API, you can change the normal part of the URL and use that for navigation within a JavaScript app, for example. So that might be an option there to make it a little bit easier to crawl through a website like that. The other thing where I'm not really sure how you're going to be handling this is with regards to the links on the page, the infinite scrolling part, the text elements that you have on a page, SVGs that you might use with text in them as well or with links in them as well; all of these things make it a lot harder for us to actually crawl and index a website. So in particular, if we have to use rendering to view all of this content, that means it takes a little bit longer for us to kind of pull in all of this content and be able to show it in the search results. And additionally, if you require rendering so that we can crawl through the rest of your website, all of those crawling steps each individually take a little bit longer as well. And finally, depending on the way that you're setting up infinite scrolling and SVGs and text elements, it might be that we don't even recognize that there's content here that we need to index properly. So my recommendation here would be to kind of be cautious with regards to this setup and to kind of get advice from other people with regards to how you can set it up in a way that works well for users and that works well for search. That could be to have something like pagination on the bottom rather than just infinite scrolling. You can also combine the two, of course. It could be to make sure that you have some elements that are within static HTML on these pages as well, so that when you open a page for the first time, you see kind of a pre-rendered version right away, rather than that the client has to process JavaScript and do all of this fancy stuff to just show the primary content.
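Two crawling details from the answers above can be sketched in code: how a dollar-anchored robots.txt disallow only matches URLs that end exactly as written, and how a hash fragment is dropped from a URL. The matcher below is an illustrative sketch of the documented robots.txt wildcard rules, not Google's actual implementation:

```python
import re
from urllib.parse import urldefrag

def robots_pattern_matches(pattern: str, path: str) -> bool:
    """Match a URL path against a robots.txt-style pattern, where '*'
    matches any run of characters and a trailing '$' anchors the match
    to the end of the URL (illustrative, per the documented rules)."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape regex metacharacters, then restore '*' as a wildcard.
    regex = "^" + re.escape(pattern).replace(r"\*", ".*")
    if anchored:
        regex += "$"
    return re.match(regex, path) is not None

# A '$'-anchored disallow only blocks URLs that end exactly in .mp4;
# a query string defeats the anchor, as described above:
print(robots_pattern_matches("/*.mp4$", "/videos/clip.mp4"))       # True
print(robots_pattern_matches("/*.mp4$", "/videos/clip.mp4?t=10"))  # False

# Hash fragments are dropped before crawling; urldefrag shows the split:
print(urldefrag("https://example.com/app#section-2").url)
# https://example.com/app
```

Dropping the trailing `$` from the disallow line would make it a prefix pattern again and cover the parameterized variants too.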
So I wouldn't say it's impossible to set up a web app to be crawled and indexed like this, but it's definitely the harder level compared to something that's more static HTML-oriented. And there's nothing wrong with taking the hard approach to a problem like this. I think it's something you would learn a lot from, and it's something that's extremely valuable, because lots and lots of sites are using these JavaScript frameworks nowadays, and they also need help with regards to search. So having experience on a JavaScript-based website with regards to search, I think, is something that's really useful to have.

A quick question regarding titles and HTML improvements. I asked a while ago in the JavaScript Sites Google Group, but I didn't get any help. The problem is how Google identifies titles for JavaScript sites, or at least for our Angular implementation. Google warns us about pages with duplicate title tags, but the pages have different titles in our back end and front end. And the problem is that Google seems to get the correct title, but in a different language. So for page A in English, it takes the title for page A in German, for example. And I think Google is penalizing us for duplicate content.

So I think that last part is definitely not happening. Google is definitely not penalizing a site for duplicate content with regards to things like titles. I think the main issue I would watch out for here is, if we're switching titles between individual URLs, that sounds like something on your back end is not working the way that it should be working, in the sense that we should always be able to fetch a page and render a page and get exactly the same title every time. It should not be the case that, depending on how pages are crawled and rendered, suddenly a page's title would be in a different language. So in particular, we don't use cookies for different languages. We don't use cookies when crawling and rendering.
So that's something where you need to have it set up in a way that when we crawl that specific URL and look at that page, we get a specific title in that language, and not that there's any kind of crosstalk happening there. Usually, what can happen when the content itself is very similar is we might fold pages into one, where we say this set of URLs has exactly the same content as others, and then we have one version. Usually, that doesn't happen across languages, but more within the same language. So if you have a German page for Austria and a German page for Germany, and the content is exactly the same, then we might say, well, this is a German page. So in a case like that, you would have kind of the same content on two pages. It wouldn't be the case that we would swap the content. So if you're seeing that we're actually swapping content, especially across different languages, from my point of view, that's a really strong sign that this is an issue on your side, on your back end, and so on. I'd kind of try to dig into that first.

We've seen a lot of emphasis being put on reviews and star ratings, including Google Surveys. How much are search results affected by these on-site surveys and, in turn, the star ratings? For example, if I run a survey on my site and we see mostly lower ratings, will this hurt my search rankings or prevent me from climbing higher in the search results? Or does this only matter for local searches, or should brands focus on surveys and reviews?

So I think there are a few aspects there that we can take a quick look at. On the one hand, we don't use the content that we see in surveys as a ranking factor. So if you have a survey on your site and currently the status is like two stars, we're not going to rank your site lower because you have something that says two stars on there, because we also don't know what these two stars mean. It might be that one star is the best, and fewer or more stars mean something different.
It's really hard to tell. We don't know offhand there. But in general, we wouldn't rank pages differently in the organic search results based on the star ratings. It might be different with local search, where you might have your Google My Business listing and you have reviews attached to that Google My Business listing. But that's completely separate from the normal kind of organic search results that we have there. So those are kind of the first things there.

If you do want to have reviews or kind of star ratings on your pages, that's something you can add with rich snippets, with the schema.org markup that you can put on your pages. We have a lot of information on our developer site on how to set up these reviews on your pages, with what the requirements are there for those reviews. There's also a code lab for setting this up on a set of HTML pages by yourself. And these are things that we would just show in the search results. So if we decide to show your review ratings in the search results, we will just kind of show that as a part of the snippet. It wouldn't be the case that we would rank those pages higher or lower because of the content there. So those are kind of the options that you have available there and their different effects.

And it's not something that every site has to do, obviously. For some types of content, it makes sense to have some reviews on there that users place. It really kind of depends on the type of content you have. And that also drives whether or not it's even possible for you to use review snippets. So in general, we expect that the structured data reflects the primary object that you're talking about on that page, and not that you just put structured data on every page of your website and kind of have it set up like that.

Hi, John. May I ask a question? Sure. So we are a big Q&A website with verified experts answering questions.
And for the last two years, we have been implementing an E-A-T strategy to drive authority, express authority, and so on. And for some reason, since September 2017, every time Google does a core update, we see a significant hit in traffic. And we're not doing anything wrong, we think. So I was wondering if I may ask you to maybe take a look and maybe tell us what is the main thing we have to fix, if anything.

Sure, you can post your URL in the chat, maybe, but it's really hard to look at these sites in a live situation, because you really need to dig into the details to see if there's anything that we could even say is happening there. But you're welcome to post it in the chat there. In general, a lot of the more regular updates that we do are not specific to any particular factor on your website. It's more a matter of where we think your website would normally fit in with regards to the search results. And sometimes that goes up for a website, when we think, well, we probably didn't treat it as well as we could. And other times it goes down a little bit, where we say, well, maybe other results are actually more relevant for these kinds of queries at the moment, according to our systems. So these kinds of changes are not tied so much to a specific page on your website or a specific technical setup that you have on your website. It's more that we're looking at the bigger picture and trying to figure out what is actually relevant, or more relevant, for users. And these things can change a bit over time.

That makes sense. We were just wondering if there is a real issue and we are just blind and don't see it. Yeah, I'm happy to take a look. So feel free to post the URL in the chat here. I don't know if I'd have something that I can pass back to you and say, you need to fix this or you need to change this. But I'm happy to take a look. Maybe there is. Hard to say. Thank you. All right.
We have competitors with deep pockets who've done everything in their power to hurt or discredit us. One has even linked to us hundreds of thousands of times from Russian porn sites, generally directed at our home page and certain tag pages. I don't understand disavows, and there are even warnings beforehand that concern me. I've had others put them in for me in the past. How important are disavows, and how long does it take to see positive effects after putting them in? And should we keep doing this over and over?

So in general, these are the types of things that we recognize as well. We've seen tons of these kinds of spammy links come and go. And it's something that our algorithms try to just ignore. So instead of saying, well, this is a problem and this website did this on purpose, and we kind of need to demote them to take care of this, it's something that we find works better when our algorithms can just recognize these issues and ignore them completely on our side. What I would do, however, is if you notice this type of thing is happening and it worries you and you're not sure, like, is Google really recognizing this or not, then that's the situation where I might go and just put them all into a disavow file, ideally by domain. That way you don't have to do that much work, and you can just submit that. And that way you're sure that Google will not take those into account. In general, because we already ignore most of these, probably you won't see a big change up or down with a disavow file like this. But at least you're sure that Google really doesn't take these into account at all. So that's kind of where I would focus there: if you're really worried about this problem, and you're like, I don't know if Google is treating my site properly or not, then go ahead and put them in the disavow file. If this is something that you've just seen and you think, oh, this is kind of weird, then I would just skip over that and say, well, it's no big deal.
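The by-domain disavow John recommends is just a plain text file: comment lines start with `#`, a `domain:` line covers every link from that host, and bare URLs disavow single links. A small sketch that writes such a file (the spammy hosts here are invented for illustration):

```python
def build_disavow(domains, urls=()):
    """Build the text of a disavow file: one 'domain:' line per spammy
    host (covers all links from that host) plus any individual URLs.
    Lines starting with '#' are comments and are ignored by the tool."""
    lines = ["# Disavow file: domain lines cover all links from a host"]
    lines += [f"domain:{d}" for d in sorted(set(domains))]
    lines += list(urls)
    return "\n".join(lines) + "\n"

# Hypothetical spammy hosts, for illustration only:
print(build_disavow(["spam-links.example", "bad-neighborhood.example"]))
```

Listing whole domains rather than the individual URLs is what makes the "84,000 links from one host" case a one-line fix.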
I mean, one of them linked close to 84,000 times. I mean, there are thousands and thousands of these. I mean, they're paying people to do this all day long. Linked 84,000 times to our home page, which is more than we've been linked by MSN. And we get linked by MSN a lot. So it's one of those things where, when you look at links to your site, it's legitimate places, and then it's something with a rather graphic name dot r-u, and you go, wow, what's going on here?

Yeah, I mean, the easy part there is that you can just put that one domain in the disavow file. And then it's like all of these 80,000 links are just gone. So that makes it a little bit easier. You don't have to kind of keep following up with that. But in general, we do take care of these. So these are things that we've seen day in and day out for many, many years. And we basically just ignore them. But again, if you're ever worried about this, if you're ever in a situation like, I'm losing sleep and I don't know how Google is treating my site, then I would just put them in the disavow file. You don't have any downside to putting those kinds of links in a disavow file.

And some of the sites, as I go through link by link by link, it's not that they're offensive content, but it's completely unrelated, like horse farming. One of them, really in the tens of thousands, is horse farming. And I don't know whether at some point that site had a connection to entertainment. I doubt it. But is that something you would disavow also? Because it has nothing to do with what we do. No, I wouldn't worry about that. I mean, what would happen in the worst case is that suddenly, for horse farming terms, your site would also show up. And I'm pretty sure, who cares? Yeah, I'm pretty sure we might be interesting. No, that's good to know.

And also, I asked, so we have lots of old content that we did redirects for. And some of it, we actually decided, let's just remove it; it's thin content, it's nothing.
Usually, when I do the removal of URLs, within 24 hours it says removed. And then maybe at 48 hours it says expired. It's been stuck at pending now for four days, which is odd. Yeah, so I don't know, in your situation, if this happened there. But in general, there are two ways to submit URL removals. One is within your Search Console account, where it's clearly tied to your account, and we have the understanding that you're the owner and you get to make these decisions. Those are the ones that we generally process within less than a day. And you can also submit them with your normal Google account in general, or with anyone's Google account. And those are the ones that generally take a little bit longer, because we want to make sure that this URL is actually really gone. We do it through Webmaster Tools, through Search Console. I mean, it's correct. Yeah, just for some reason, this past week, it's been... Is it an individual one? Or is it a whole bunch? No, it's about 125 that have been just sitting there. OK, that sounds kind of weird. I'll take a look at that. Yeah. Thank you.

Oh, and one last thing, if I may ask. Sure. It's sort of related to the disavows. We're clearly an independent company. We go up against, honestly, conglomerates that are in the billions. Where they're in the billions, they have a lot at stake. They use the feedback. We know this. We have moles everywhere. We know they use the feedback section. We know they use their interns to denigrate us. Does that have any effect on ranking, and how one is trusted or rated in any way?

No, no. That's something that goes to the search team, to the search quality team, to the user experience people. And they take this feedback in a way to kind of better understand where people's thoughts are and what kind of content they would like to see in individual situations.
So that's something that usually is more of a longer process, more of something like, oh, we've seen people complain about AMP pages because they don't like them, or they don't like the logo, or something. And it's like, we've seen lots of people complain about this over time, so maybe we should change something slightly there. But on the basis of individual sites, that's not something that we would use that feedback for. So it's really a matter of, this is a general trend, and they're not focusing on one specific site, but rather on this whole different area that we're kind of getting wrong in search. And we get feedback from all kinds of people on that general area. And because of that, we should figure out, I don't know, some approach that we can take to make it easier to show more relevant results to the bigger group of people. Thanks.

All right, so more questions that were submitted. Two questions about impressions in Search Console. If a website is on page two, and the person searching for a particular keyword phrase does not visit page two, would it still be counted as an impression? No, it's really only counted as an impression when the user would actually see that in the current search results page. And I think the question goes on to say there are some impressions that probably look a bit weird, I guess, in Search Console. I think in general, depending on the type of website that you have, you will sometimes see these weird kinds of edge cases where someone is searching on page 10 of the search results for this particular query. And that just happens sometimes. So that's not something I would say is a problem with the data. We really only track these when we actually see them happening in the search results. Maybe people just search in unique ways. Sometimes you're really desperate to get information on a specific topic, and you can't find the right term to get exactly what you're looking for in the search results. So you just click through a couple of pages.
It happens. And the second one is, when I check the AdWords Keyword Planner, I can see that a certain keyword is searched for 2,000 times a month. But my impressions only show 300 per month, and on page one. Why is there a big difference? So first off, I don't know how the AdWords Keyword Planner calculates these numbers, so that's really hard to say. It might be that it's bucketing individual numbers there. It might be that it's looking at a global count, and your site is not ranking number one everywhere for those terms. All of these things can come together. And in general, depending on the way that you look at impressions, the way that you track them, for example, if you also look at Google Analytics or other analytics tools, you'll always see some general difference across these different ways of tracking information. And especially if you're looking at something with relatively low numbers, where you're talking about hundreds or low thousands per month, even small changes on a day-to-day basis can skew this number quite significantly. So that's also something worth keeping in mind, in that some amount of fluctuation is completely normal there. And even when you're looking at trends, it's tricky to look at something which has a couple hundred per month when it goes up 10% or down 10%. Those changes, in absolute counts, are very few. So those are the kinds of things I would watch out for with regards to checking across different tools and different ways of measuring things. And that doesn't mean that any of these tools are less useful than others. It's just that you have to understand what they're measuring, and maybe look at the bigger picture and look at relative differences as well, where you see this tool says 2,000 for this term and 200,000 for another term. That's obviously a pretty big difference. So that's something that might be more useful to focus on instead.
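That point about low counts can be made concrete with a little arithmetic. A minimal sketch in Python; the impression counts below are made-up illustrative numbers, not figures from any real report:

```python
def pct_change(old, new):
    """Relative change between two counts, as a percentage."""
    return (new - old) / old * 100

# The same absolute swing of +30 impressions at two monthly volumes:
low_volume = pct_change(300, 330)       # 10% of a 300/month baseline
high_volume = pct_change(20000, 20030)  # 0.15% of a 20,000/month baseline
print(low_volume, high_volume)
```

The same swing of 30 impressions reads as a 10% trend change on a base of 300 per month, but as noise on a base of 20,000 per month, which is why day-to-day comparisons across tools at low volumes fluctuate so much.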
Continuing the impressions question, if a search result is in the 10th position and the user has not scrolled down to see that result, would that still be counted? Yes, that would still be counted. So if it's on the current search results page, then that's counted. We have a Help Center page on clicks, impressions, and position for Search Console specifically that goes through a lot of these edge cases, also the situations with regards to what if there are images on top, or what if there is a carousel on top of the search results page, how does that actually get counted? OK, a question on multilingual SEO without duplicate content issues. In general, if you're doing content in different languages, it's not a matter of duplicate content. So if you translate your content from one language to another and you publish that under a different URL on your website, we don't see that as duplicate content. It's essentially the same information that you're providing there, but it's completely different content. It's a different language. It's a different target audience, essentially. So we would treat that as unique content. So in that regard, if you feel that you have users, or might have users, that would be interested in seeing your content in a particular language, then I would definitely go ahead and try to get that set up with a localized version of your content. What's the difference in ranking between a new page on an old website and a new page on a new domain? Let's see. With these suggestions, does it mean that we shouldn't go with a ccTLD, but rather create separate folders to target individual countries, or also m-dot subdomains or not? In general, these are, I'd say, kind of different situations. If you have an existing website and you add a new page to it, then for a large part, we understand that this website already has a lot of context on the internet.
And it makes it a lot easier for us to understand those new pages that you're placing on an existing website. Whereas if you start over with a completely new website, then obviously we need to understand that website first. So that's something that always plays in there. That doesn't mean that the end effect is better or worse. It just means that in the beginning, we have a lot more context, so we can use that context to understand those new pages a little bit better. With regards to ccTLDs or subdirectories or subdomains when you're going international, from my point of view, this is something that you can take into account if you want. But it's probably a really tiny effect in the long run. So in general, when it comes to things like ccTLDs, or using a subdomain, or using subdirectories, that's something where I would focus more on what your technical capabilities are, what you can do on your side. Can you put completely localized content within a subdirectory on your website or not? In many cases, the answer is really obviously yes or no. So you can do it or not. If you can't do it, then obviously you go with an approach that you can do a little bit easier. Sometimes there are also situations where you need to use a ccTLD for legal reasons. And obviously in cases like that, if you need to use it, then you need to use it. And from our point of view, we think that in the long run, any kind of internationalization setup that you have, we should be able to pick that up and just go with that naturally. With regards to mobile-first indexing and m-dot subdomains, that's something that, from our point of view, is completely separate, in the sense that, at the moment, an m-dot version is something that's attached to the desktop page. We understand that this is the desktop URL and this is the mobile URL. It can be on the same domain. It can be on a different subdomain. It can even be on a completely different domain.
And that's something that we just use for recognizing where the mobile content is and swapping out the URLs. So that doesn't have any effect at all. With regards to mobile-first indexing, later on when we shift to the m-dot versions, essentially what happens there is that we just shift the indexing to that other version. It's not the case that you would see any kind of ranking change that would be similar to moving to a different website. So with that said, I would primarily focus on what you need to do on your website. For mobile-first indexing, we recommend using responsive web design, if that's something that you can do. So I'd try to avoid setting up new m-dot sites. It just makes everything so much more complicated. But if you currently have an m-dot site, it is the way it is, and we should be able to just deal with that. Seems like the new Search Console is incorrectly flagging pages and posts marked as noindex. I don't quite understand what you mean in that regard. Maybe you have some more information somewhere. Or what might make sense is to link to a help forum thread with that more information. I can double-check there after the Hangout. What's the best practice to handle localization of domain suffixes? My question has to do with internationalization. We have a .com domain. Our goal is to break into the EMEA markets. And we'd like to obtain domain suffixes relevant to the countries we will be targeting. However, we don't want to manage multiple websites. So we would like to have the same or very similar content on all of these domains. How will this affect our web rankings? I think it's kind of tricky if you have exactly the same content and you're just saying this is localized for different markets, because it's not really localized. So what would happen in general here, if you have the same content on your .com as you have on your .de or .fr website, is that we would look at it and recognize that it's the same content.
And we would pick one of these URLs and treat it as canonical. And that's the one we will focus on for Search. So probably that would be the one that you currently have, if you're just copying the content from there. So that means you would have these different country code top-level domains, but in Search, we will just show the .com version, because we're trying to do you a favor and say, well, probably you mean for us to just index your main version of your website, so we'll try to do that. What you could do to change that a little bit, if you wanted to, is to make sure that you actually have localized content for those individual country markets, so that we can clearly recognize that there is actually something unique here that we should index separately. From a practical point of view, with the first setup, if you just copy the content, then users can, of course, go to those URLs directly. You can use those URLs directly for things like ads or links or anything else that you have happening with your website. And that might be perfectly fine. That might be good enough for you. We still index the .com, but maybe you have an advertisement running for .fr because you're doing some local promotion, or whatever you're doing there. If you do want to use the country code top-level domains, again, you'd need to have the unique content there so that we actually recognize that. And what also helps us a little bit is to have the hreflang links between the different language versions or the different country versions, so that we understand which version we should show in which individual country. For a large part, we already understand that based on the domain ending, of course, by the top-level domain. So if you have .de, that tells us pretty clearly that this is the one that you want to have shown in Germany.
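As an illustration of those reciprocal hreflang links, here is a minimal sketch that generates the annotations for a set of country versions. The domains, locale codes, and the helper function are all hypothetical examples, not anything from the question:

```python
# Hypothetical country versions of the same page; the domains are
# placeholders, not real sites.
versions = {
    "en": "https://www.example.com/",
    "de": "https://www.example.de/",
    "fr": "https://www.example.fr/",
}

def hreflang_tags(versions, default="en"):
    """Build the reciprocal hreflang link elements: every version lists
    every version (itself included), plus an x-default fallback."""
    tags = [f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
            for lang, url in sorted(versions.items())]
    tags.append(f'<link rel="alternate" hreflang="x-default" '
                f'href="{versions[default]}" />')
    return tags

for tag in hreflang_tags(versions):
    print(tag)
```

Each country version would carry the full set of tags in its head (or the equivalent entries in a sitemap), including the self-reference, since hreflang annotations only take effect when they are reciprocal.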
In general, though, what I'd recommend doing there is, if you're really just getting into this for the first time, I would recommend getting help from some expert who's done this a few times, because there are some things that you can run into which make things a little bit more complicated than they need to be, especially if you can handle them ahead of time rather than responding to something actually breaking. And some of these consultants and experts can also give you a bit of advice about whether or not it actually makes sense to do that in your situation. Because a lot of times, it's perfectly fine to have one large global website and to just use that internationally. Having a .com doesn't prevent your website from showing up for users in France or in Germany. They can still find your website just fine. So all of these things are topics that are kind of complicated. They depend on a lot of individual situations, and specifically on your website and the capabilities that you have with regards to localizing content or not. So I'd try to get some help there before you just run off and buy all these different top-level domains and set them up. We have a portal which has English and French versions of the same content on the same URL, separated by a query parameter, set language=fr. Can we assume that all the English and French versions of our URLs are naturally indexed? Probably. Probably. So if we see links to these individual language URLs, so in this case, they wouldn't be the same URL, but they would have the query parameter at the end, with the language set to French or English. If we see links to those pages, we'll probably crawl and try to index those pages separately, and that will be perfectly fine. So what you can do is double-check in Search. Do a site: query for your website.
Maybe use an inurl: operator and include set language, or set language=fr, depending on how you have that set up, and see what you actually have indexed from your website already. And my guess is, if you've set this up in a way that you have links to those language versions, we'll probably pick that up properly, and you don't have to do anything special there. All right, let's see. Any more questions from your side? Hi, John, I've got a lot of questions. That's OK. All right. It's in the chat, but it's about the 301 redirects we've put in place on the original domain. Obviously, that domain, the Travel Gay Europe domain, has now dropped off the Google index. As and when we start re-ranking, hopefully when the Safe Search block is lifted, will the weighting that was behind that domain still pass through to the new domain? Yeah, yeah. Does that persist? Is there a time where that drops off? So usually, I'd recommend just keeping those redirects in place as long as you can, definitely at least a year, so that we can actually pick that up. Yeah, that's fine. So I mean, if these are your domains, I would just continue to keep that redirect set up, so that it keeps passing everything on. Yeah. But in general, once we've been able to understand the content of your new domain and understand that it's the same as the old one, then that should pretty much be similar or the same with regards to ranking. There are always some possible fluctuations that happen when you do a site move. But in general, it should settle down at about the same place. Yeah, I mean, we've noticed that some keywords are still in the same place if you've got Safe Search off, while others, which were some of our most popular pages, have disappeared to page 11 or 12. But I'm sure that'll come back up. I mean, we moved about a month ago, six weeks ago, probably, but obviously the Safe Search thing has really... Yeah, I'd take a look when things settle down there.
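A site move like the one described can be sanity-checked by confirming that every old URL answers with a single permanent redirect to the new domain. A minimal offline sketch; the URLs and simulated responses below are hypothetical, and in practice you would fetch each old URL without following redirects and read the status code and Location header:

```python
# Simulated responses for old URLs: old URL -> (HTTP status, Location header).
# All URLs here are hypothetical placeholders for the sites being merged.
responses = {
    "https://old.example.com/":      (301, "https://www.example.com/"),
    "https://old.example.com/bars/": (301, "https://www.example.com/bars/"),
    "https://old.example.com/blog/": (302, "https://www.example.com/blog/"),
}

def broken_redirects(responses, new_host="www.example.com"):
    """Flag old URLs whose redirect is not a permanent 301, or points
    somewhere other than the new host."""
    bad = []
    for old_url, (status, location) in responses.items():
        if status != 301 or new_host not in location:
            bad.append(old_url)
    return bad

# The 302 (temporary) redirect above gets flagged; everything else passes.
print(broken_redirects(responses))
```

Running a check like this before and periodically after the move, for at least the year the redirects stay in place, helps catch redirects that silently turn temporary or start pointing at the wrong host.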
It's always tricky to check the rankings when you have such a change with regards to Safe Search on and off. OK, thanks. Sure. All right, so it looks like the URLs are in the chat here, so I can copy that out and keep a copy so that I can double-check those URLs. All right. Can I ask a quick question, John? A quick question, please? All right. Go for it. First of all, thank you very much for introducing our URLs to the Top Stories team. In what time frame, let's say they say, oh, these are some great websites, good stories, we're going to include them, realistically, when can we expect a change? Because this is so important for us. Our entire business is at stake because of this. We're in the news business, but we never appear in Top Stories, despite publishing seven to ten original stories every day by professionals. So when can we expect any change, if at all? I can't make any promises there. So that's something that really depends on where they see the problems. For the most part, these kinds of issues are more algorithmic changes that we have to make. And that's something that depends a lot on when they're able to update those algorithms. And one last question. Yes. Usually that includes a lot of testing time as well. So it's really, really hard to say. OK. And one last question, please. Let's say there is a problem on our end. This is one piece of feedback I would like to give for Google Webmaster Tools. I know you have changed it a lot, and it gives us a lot of feedback. But let's say the problem is on our end and we're not able to see it. Is there a way you can at least message me something to say, Armin, this is the problem, a caching issue here or something, so we can fix it, please? Yeah. Thank you so much, John. I appreciate it. That's something we sometimes do as well. Thank you so much. I really appreciate it.
If the indexing or the ranking teams notice, like, this person is making some simple mistake and probably not recognizing it, then we will send them a note in Search Console. Thank you so much. Sure. All right. So thank you all for joining. It's been great having you here. Lots of good feedback, lots of good questions. And I hope to see you all again in one of the future Hangouts. Thank you very much. Have a great day. Bye, everyone. Bye-bye, everybody.