All right, welcome, everyone, to today's Webmaster Central Office Hours Hangout. My name is John Mueller. I am a Webmaster Trends Analyst at Google in Zurich, and part of what we do are these office-hours hangouts, where webmasters and publishers can join in and ask any questions they might have around Search or Search Console, their website, all of that.

As always, a bunch of questions were already submitted, but maybe some of you want to get started with the first question directly. Nobody? OK. We should have time for more questions in between as well, so if something's on your mind, feel free to jump on in. I hear a little bit of noise in the background, so I am going to mute some of you, but feel free to unmute if you have something that you want to add.

All right, let's look at what came in this week. A bunch of questions are here, not a whole ton of them, but that's OK too. The first one is around reviews and branch pages. We have different branch pages for our depots, and we have different Google My Business profiles for each branch, and they have reviews on them. Can we put the text from some of these reviews onto the relevant branch pages and mark it up? We already have a link from the branch page going to the Google+ page, but could we also do a link going to the review section, if need be? What do you think?

So I find the use of reviews on a website is really a good way to highlight what you're doing uniquely that people really like. But the way you've framed this is that you're going to take the reviews that you like from your Google+ page, copy those onto your normal website, and mark those up as an aggregate review. And from the point of view of our guidelines, that wouldn't be OK. It's important for us that if you do use aggregate review markup on your pages, it reflects the whole aggregate, that you have the full set of the reviews there, not just individual ones that you're picking and choosing. It's not meant to mark up things like testimonials, for example. So that's one thing to watch out for there.

You can still put reviews on your pages. If you don't mark them up, that's perfectly fine. We might still show those in the snippet, probably not as rich snippets and not with the star ratings, but we could still essentially show that in Search if people explicitly search for it. In a case like this, I would recommend just linking to your reviews and not marking them up on your pages directly.

John? Sure. Google My Business now has a new option where we can embed our reviews directly in our website. What do you think about this? I haven't seen that, to be honest, so I don't know. It launched maybe 20 days ago, not more. OK. And they're dynamic; they don't just sit there with some fixed history. If somebody leaves a review today, it will show up live on our website. That sounds pretty cool. I don't know if that would be marked up with the review markup, but if you have some examples of that, I'd love to take a look. If you can send me a link where this is done, then I can double-check with our Search systems: is that OK with our guidelines? Is that something that we could potentially pick up? Do we need to add markup to embeds? All of that. Because maybe that would be a nice way to make it easy. OK, I will send you things. Cool.
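To make that distinction concrete, here is a minimal sketch of what aggregate review markup looks like, built as schema.org JSON-LD with Python's standard library. The business name and numbers are hypothetical; the key point from the answer above is that the aggregate values must cover the full set of reviews, not a hand-picked subset of testimonials.

```python
import json

# Minimal sketch of schema.org AggregateRating markup as JSON-LD.
# All values below are hypothetical. Per the guidelines discussed above,
# ratingValue/reviewCount must reflect the FULL set of reviews for the
# branch, not just the individual testimonials quoted on the page.
markup = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Depot - Zurich Branch",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": 4.3,   # average over ALL reviews
        "reviewCount": 127,   # total count, not just the quoted ones
    },
}

# This string would be embedded in a <script type="application/ld+json">
# block on the branch page.
print(json.dumps(markup, indent=2))
```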
Hi, John. Hi. I have a question. In the new Google Search Console, I see a lot of old, deleted sitemaps. Is this part of Google's history or something? I don't know how it influences the website.

OK. So that shouldn't cause any problems, in the sense that if the sitemap file returns a 404, we will just view it as a 404. It's no problem; that happens. You can remove those in the old Search Console, in the sitemaps section there.

It's removed. This is the problem. It's removed in the old Search Console, and in the new Search Console it's still there.

OK, that sounds like a bug. That sounds like something that is our problem, not your problem. So if you can send me a screenshot, maybe on Google+, then I can take a look at it and send it to the team to make sure that we can improve that. From an SEO point of view, that wouldn't cause any problems.

But it's confusing. It's a little bit difficult, because in those old, removed sitemaps there are some tens of millions of problems, and I hoped we'd repair the website by removing the sitemaps and leaving only the naturally indexed pages online. This is why I'm worried about it.

Yeah, I think either way it shouldn't cause a problem. So if these are 404 pages, the sitemap files, or the old pages that were included in the sitemap file, that's perfectly fine. 404s are not a problem for us in web search. We have to crawl them to see that they're 404, and we will tell you that it's a 404, but it won't affect the rest of your website's ranking. It's not the case that we would say there are a lot of 404 errors, therefore this website is bad. We see this as a technical thing, essentially. So if these are 404, they're 404. That's fine.

And just another thing: how long will Googlebot crawl an old sitemap? If it's a ten-year-old sitemap there, and I see it was still crawled one day ago, I don't know why.

We have a big memory. We retry those all the time. I think it depends on the way that the sitemap file was submitted. If it was submitted through the robots.txt file or with an anonymous ping, so if you ping the sitemap to us, then after a while, if that link is no longer in the robots.txt file, or we haven't seen it pinged for a while, and we see that it returns 404, we will drop that sitemap file automatically. But if you submitted it in Search Console, then we will assume that you meant to submit it, even if it returns 404 for a long time.

Thank you.
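As a small practical aside, you can check for yourself what an old sitemap URL actually returns before worrying about it. Here is a minimal sketch using only the Python standard library; the sitemap URL is hypothetical.

```python
import urllib.error
import urllib.request

# Hedged sketch: check what an old sitemap URL returns. A 404 is treated
# as a plain "not found" with no ranking penalty, and per the answer above,
# sitemaps known only via robots.txt or anonymous pings get dropped over
# time, while Search Console submissions stick until you remove them.
SITEMAP_URL = "https://www.example.com/old-sitemap.xml"  # hypothetical URL

def sitemap_status(url: str) -> int:
    request = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(request) as response:
            return response.status
    except urllib.error.HTTPError as error:
        return error.code  # e.g. 404 for a long-removed sitemap

print(sitemap_status(SITEMAP_URL))
```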
All right, let's see. Is there any benefit or harm in putting some of our USP in the title tag if we already have the keyword phrase there first? So USP, I believe, is the unique selling proposition: what your page uniquely offers to users. From our point of view, you can put that in your title if you want. If you feel that it makes clear to users what your pages are about, then that's something you can put in the title. I wouldn't focus so much on the ranking aspect there, but think about what encourages users to actually click on your pages when they see them in the search results. If people are searching for a general term, a term you might find in your Search Console account, and they were to find that particular page with the title that you provide, does that give them enough information to know whether or not it matches what they're looking for? If so, that's probably a good title. If they see a title that sounds like a marketing phrase and have no idea what the page is actually about, then that's probably a suboptimal title, because if users don't know what the page is about, they'll probably click on something that's a little bit easier for them to understand.

Yeah, in the context of the same question, I just wanted to ask: users might be looking for information ranging from generic to very specific questions. So when optimizing a specific page, what do you recommend for crafting the titles, or the descriptions, or even the content itself? We have to target generic queries as well as very specific ones. Sometimes that ends up as a very long piece of content, padded with material that isn't really required. So it's kind of a tricky situation for us: how should we go about a topic that gets generic requests as well as very specific questions?

Yeah, I think that's one of the areas where your experience comes in and helps your clients figure out what the best approach is. From our point of view, you can do either of these approaches, and your experience will tell you which of them is probably the best one. It's also something you can test. It's not that we would say you need to do it like this or like that, or that you need to put both of these variations into a long title and it will be cut into pieces in the right place. Rather, you can test this; you can try things out. After a while, you'll probably understand your client's audience a little bit better, and their websites a little bit better: what people are looking for, and how they're looking within the website. With the experience you gain over time, you'll be able to make these decisions a little bit faster. Until then, I would continue to test. Sometimes you can A/B test with the same page and give it a couple of months to settle down. I've seen people also use ads to A/B test in a much quicker way, so that might be an approach too. Cool, thank you. Sure.

OK, there's a question in the chat. Does Google have any issues with, or preference for, indexing and ranking URLs that have special characters, for example the accented characters used in French or Spanish or German? So for normal characters and all of their variants, that's absolutely no problem; we can deal with pretty much all of that. With regards to things like commas, ampersands, and parentheses, that can sometimes get tricky, in the sense that these URLs are sometimes hard to pick up properly. For Google in particular it's less of a problem: if you link the URL in a way that is valid HTML, it will continue to work. But if someone copies and pastes a URL into a forum, some forum software isn't able to recognize which part is the URL and which part is the start of a sentence, or something in parentheses, anything like that. So my recommendation would be to stick to the characters that you would use within normal text, and that might include special characters. If you have a website in Japanese, then obviously use the appropriate Japanese characters instead of Latin letters. But avoid the trickier symbols like commas, colons, semicolons, ampersands, parentheses, spaces, those kinds of things.
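To illustrate the distinction, here is a quick sketch with Python's standard library showing how these characters look once percent-encoded; the paths are made up. Accented letters encode cleanly and predictably, while commas and parentheses, though valid once encoded in an HTML link, are exactly the characters that plain-text parsers in forums and chat apps tend to split on.

```python
from urllib.parse import quote

# Hypothetical paths. Accented characters percent-encode via UTF-8:
print(quote("/catégorie/über-uns"))   # -> /cat%C3%A9gorie/%C3%BCber-uns

# Commas and parentheses also encode, but in raw pasted text they often
# get mistaken for punctuation around the URL rather than part of it:
print(quote("/report(2018),final"))   # -> /report%282018%29%2Cfinal
```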
How can I improve the indexing of sites developed on a single CMS, having the same catalog of products with the same descriptions on the product cards? Google indexes the sites and then excludes pages from the index within about a month. The sites belong to different organizations, but they were ordered from one developer.

So I don't completely understand the background to this question, which means maybe this answer is not as useful for your specific case. But in general, if these sites contain the same content, if they're based on the same database of products, for example, and there's nothing really unique about any of the different sites that were generated, then what we'll probably end up doing is index these different variations, and at some point we'll recognize, oh, this is all the same content. It's not worthwhile for us to keep all of these duplicates in our index, so we'll just pick one of them and index that. So if all of these different organizations are using the same site generated from the same database of products, and they contain the same content, then usually it doesn't make sense for us to index all of them separately. There's no additional value in having all of them, so we will just pick one and focus on that.

What I would recommend doing in a case like this, if you have some products that are the same, is to make sure that you're also providing something really unique and compelling on your site, rather than just the same set of things that everyone else has. That could be something like: you have a specific set of products that are the same as other people's, but you also have a bunch of things that are completely different, that are really unique and valuable. Or you have a unique spin on all of these products; for example, you deliver them in person rather than having them shipped from Taiwan or some faraway place. Anything you can do to really make your site stand out as a unique version of all of this content, rather than just another copy of what we already have, can help us understand that it actually does make sense to index this site separately, and to show it in the search results separately, because there is some unique value being provided here.

Is there a way to auto-disavow some TLDs? It would help with the .ru domains that are linking to me, which are all spam; new ones show up every couple of days. So as far as I know, there is no way to disavow a whole TLD with a disavow file. In general, I think that's a bad idea anyway, because there is a whole bunch of content out there on some of these TLDs, and just because you see some spammy sites on a TLD linking to your site doesn't mean all of the sites there are spammy. So that's one thing to keep in mind here.

The other thing to keep in mind is that our systems are generally pretty good at dealing with links that are completely irrelevant to a website. Just because random new sites that appear to be spammy are showing up on some TLD doesn't mean that they're causing any issues for your website. So instead of checking your links every day to see whether any weird links are showing up for your website and disavowing all of them right away, I would take a calmer approach: look at this when you realize there's actually a problem, and then think about where the bigger sources of the problem are coming from, or act when you know that there is some background that you need to clean up. Maybe a previous SEO did a big paid link campaign at some point, and you're tasked with cleaning that up; then obviously that's something you can clean up. But usually it doesn't make sense to proactively disavow all links coming to your site from sites that you don't really know that well. Because if they are spammy and they just randomly show up, then we've probably seen them a million times already and we're already ignoring them.
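If you do want to sanity-check whether one TLD really dominates your link profile before reaching for the disavow file, a rough sketch like this, run over an exported list of linking domains (a hypothetical one-column CSV here), is in the spirit of the calmer approach described above: look at the bigger picture first instead of reacting to every new link.

```python
import csv
from collections import Counter

# Hedged sketch: group linking domains by TLD from a hypothetical CSV
# export with one domain per row, to see whether a problem is actually
# concentrated somewhere before considering any disavow action.
def tlds_by_count(csv_path: str) -> Counter:
    counts = Counter()
    with open(csv_path, newline="") as handle:
        for row in csv.reader(handle):
            if not row:
                continue
            domain = row[0].strip().lower()
            if domain:
                counts[domain.rsplit(".", 1)[-1]] += 1
    return counts

for tld, count in tlds_by_count("linking_domains.csv").most_common(10):
    print(f".{tld}: {count} linking domains")
```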
Is it OK to have a .cf domain extension? And what does Google say about .cf or .tk or similar domain types? Yes, of course. I don't know offhand which country .cf would be for, but if it's a valid top-level domain where you can host your site, of course that works. One thing to keep in mind is that some of these domains are country-code top-level domains, which means we automatically associate a country with them, so we automatically apply geo-targeting. That means we will give you a slight boost for users in that country when they're searching, and you won't be able to set a different country for geo-targeting. So if it's a ccTLD, you'll probably get a small boost in that particular country when people are searching for local content that would match your website.

In general, if you think nobody really cares about your website within that particular country, then that's probably a non-issue. If you want to target a global audience, it's also a non-issue, because you want to target everyone the same, so geo-targeting isn't something you really need to worry about. But if you want to target a different country with geo-targeting, that would be tricky. For example, say you pick a .es domain and you think, oh, this matches my brand, it's such a cool domain ending, and of course .es is for Spain, which is an awesome country too, but your website might be targeting, say, France. That would be trickier, because you wouldn't be able to set geo-targeting for France with a .es; you'd automatically be geo-targeting Spain, and if people in Spain are searching, you'd get a slight boost there. So that would be something to watch out for.

John? Yes? Our official Bulgarian registry will now start to sell two-letter domain names. I know we won't say these are better than three-letter domain names, but what do you think? Because a lot of brands use these domain names as a kind of shortener, some strategy like this. What do you think about this?

What kind of domain names are these? Two-letter domain names. Two-letter domain names; that's, I don't know, that's pretty cool. Some people, I think, do that. Again, I would watch out for the global top-level domain versus country-code top-level domain part. But if it's something that matches the brand and that you think is cool, then, I don't know, why not? Go for it. I think the tricky part is always that if you do a domain move, you want to find something you can stick with for the long run, so it's not something where you say, oh, I will move to this domain today, and then a couple of months later you move to a different domain. Every time you do a domain move, you have some fluctuations, some issues that come and go. So you don't want to keep things on a rough path all the time; you want things to settle down.

Some people on the Google Webmaster forums really want to do this. Sorry? Ready for the next question. Yeah. I never know if that's on my side or on your side. It's hard to tell.
All right, let's see. There's another comment in the chat. At the end of my every post, I put a common disclaimer, four to five lines, so that it comes to my visitors' notice, rather than making a special page, which many people might miss. Is that some wrong kind of plagiarism? So if you're copying your own content, I don't think that's technically plagiarism, because if you wrote it, who's going to get upset? Yourself? Probably not. What happens on our side when we see this, when we see a block of text that's repeated across the website, is that we tend to treat it more like boilerplate text, which means we don't give it that much weight. When we look at an article and we see this block of text that's repeated across your website, we know that it's not so critical for this particular page. It's probably relevant for the website, but it's not that important for this particular page. It doesn't mean that we would demote your page if you have that content on there. It's just that we say, well, the main piece of text in the middle is what's most important on this page, so we will focus on that for indexing and ranking.

All right. I have about 60 pages with duplicate titles and descriptions because they're part of WordPress categories. They all have rel=next and rel=prev, and they have canonicals, but they still show up in Search Console as duplicates. Is that a problem? No, that's not a problem. We realize that this can happen, and from a Search point of view, that's perfectly fine. We highlight it in Search Console primarily so that if you aren't aware of it, you can take a look and see, oh, is this something I need to clean up or not? If you are aware of it, and particularly if you know that it comes from these paginated pages or individual category pages that are all more or less the same, then I would just ignore it. That's perfectly fine to have; it's not a sign of any problem.

I made more backlinks a month ago, and Webmaster Tools is not showing them. What should I do? So in general, Webmaster Tools shows you a sample of the links; we try to make it a representative sample of the links to the website. Sometimes it takes a bit of time for all of that to catch up and show, but generally speaking, they should show up there. The, I guess, almost more important thing to keep in mind here is that if you're making these backlinks yourself, then from our point of view that's not really a natural link. That's not the kind of link we're looking for, because it's not really a recommendation for the website if you're placing it yourself. So that's one thing where I would recommend taking a look at our Webmaster Guidelines and double-checking that you're not accidentally running into any issues there, and making sure that what you're doing to promote your website is actually in line with what we think makes sense. If you're unsure about some of this and you have specific cases, like, I don't know, this link that I got from this directory site, is that something that's a problem? Am I doing the wrong thing? Am I doing it right? Then that's the type of thing I would take to the Webmaster forums and get advice on from the other experts who are there. Usually they can tell you fairly quickly: that's OK, that's not something to worry about; or, you're doing way too much, you need to back off, it looks like you're spamming.
So those are kind of the things that I would watch out for.

Then there is a post, I guess a question, around the mobile-friendly test and the label we show in the search results, where basically in the search results we're saying your site is not mobile-friendly, but if you test the page, it looks mobile-friendly. So what might be the problem there? It's hard to say without looking at the specific case. There are two general types of issues I've run across where this can happen, where there's a mismatch between the mobile-friendly test and what we show in the search results. The first one is something that can happen in general: you do a redesign, you change something on your web pages, and somehow that's enough to make the page mobile-friendly now. But until we re-index and reprocess that page, we still kind of have the old association that maybe it's not mobile-friendly. So sometimes it's just a matter of time. The other, trickier one is that sometimes we've had problems rendering the page, which can happen if there's embedded content that's blocked by robots.txt, for example, or that's hard for us to fetch. Those are the trickier cases. Usually this also catches up over time, but sometimes it takes a bit longer to settle down. You can usually see this in Search Console in the Blocked Resources report, where you'll see which resources are blocked on your website and which pages they were found on; sometimes you'll find insights there. There's also a mobile usability report in Search Console that gives you more advice on general issues that we found, so I'd take a look there too. Maybe there are other pages as well that you could look at.

In which cases does Google recognize a French translated page, like fr.example.com, as a duplicate of an English one on example.com? The hreflang tags are correct, but the info: query for the French URL shows the English page and not the French one. So it would be useful to have the example URLs here, the actual URLs that you're looking at; sometimes that makes it a little bit easier to understand. But what is probably happening is that for some reason we're recognizing these pages as duplicates and treating them as one page. That's something that can happen in some cases. With the hreflang markup, we can still pick up that these are different pages to show at different times to users, which makes sense if you have the same-language content for different countries. For example, with English for the UK and English for Australia, you might have the same-language content, but you want the right one to be shown. In that case, it's OK for us to recognize that these are duplicates, as long as we can still show the right URL at the right time. So I wouldn't blindly focus on the info: query.

With regards to pages that actually have visibly different content or a different language, it's a bit trickier. What sometimes happens is that we proactively recognize that something is probably a duplicate, even before crawling it. This happens when we see that the difference is, for example, somewhere within the URL, in a place where we've generally noticed that the content shown in that part of the URL is not so relevant to the content that's shown on the page. That could be something like a language parameter that can be set to any kind of term. We might have gone through and tried language=English, language=French, language=German, language=, I don't know, BMW or some random other word that we found. And if we find that all of these pages show the English content, except for maybe language=Spanish, which shows the Spanish version, then we might assume that this language parameter is actually irrelevant to the page, and we might miss that one page that actually has unique content. That's something that's a bit trickier to watch out for, and something you sometimes see when you look at the URL itself. So if language=1, language=2, all the way to language=700 show the same content, then language=701 showing Spanish would be something that we would miss. What I'd recommend doing there is perhaps posting in the Webmaster Help forum and getting some tips from the people there. They would also generally be able to escalate this to us if they see that there's actually something really weird here, and that probably Google is picking something up wrong. Sometimes that does happen. It's not the case that only webmasters create problems; sometimes we have our own bugs as well.
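Purely as an illustration of the inference described above, and not of Google's actual systems, here is a small sketch that fetches the same hypothetical URL with different values of a language parameter and compares content hashes. When nearly every value yields identical content, the parameter looks irrelevant, and a lone exception like language=es is exactly the page that's easy to miss.

```python
import hashlib
import urllib.request

# Hypothetical URL pattern with a language parameter.
BASE = "https://example.com/page?language="

def content_hash(url: str) -> str:
    with urllib.request.urlopen(url) as response:
        return hashlib.sha256(response.read()).hexdigest()

# Try a few plausible and a few nonsense values, as described above.
hashes = {value: content_hash(BASE + value)
          for value in ("en", "fr", "de", "bmw", "es")}

if len(set(hashes.values())) == 1:
    print("Every value returned identical content; parameter looks irrelevant.")
else:
    for value, digest in hashes.items():
        print(value, digest[:12])  # e.g. only "es" standing out with a unique hash
```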
Any questions? Yes, go for it.

Yeah. So we have a website, basically, that's available in different international markets: one in India, one in Singapore, one in Malaysia. We're doing the hreflang tags with the connections between our India, Singapore, and Malaysia websites. But the thing is, we also have content in a local regional language. So how should we go about linking this local-language content with the English version? Can I link my English plus my local language from the same page?

Sure, that's something you can do. With hreflang, you can specify the language and the country, so you could say this is English for India, English for Singapore, English for Malaysia, and you also have maybe a Hindi-language page or a Malay-language page separately. All of those can be part of the hreflang set that you have. If you have a lot of different hreflang values, I'd recommend putting them in the sitemap file, because it makes them a little bit easier to maintain. But otherwise, that's certainly possible: you can mix these and say, this is English for these different countries, and this is just that language for all countries. You can also specify an x-default, where you say anything that doesn't match should use this one. And you can use the same page for multiple targets as well. So you could say this is English for India and also English for, I don't know, Pakistan; maybe you have the same page for India and Pakistan, probably not, but theoretically you could, and you could have different hreflang values pointing at the same URL.

So for a bigger website, I think it's worth taking the time to find a strategy around this. And if you haven't worked on this kind of site before, I would definitely advise getting help from a consultant who's done bigger hreflang implementations. There are a few people out there who really know what they're doing, and they can give you advice on whether it makes sense to split things up, or to combine things more, or to separate things out completely, for example. These are all things that you pick up over time with experience, and that are maybe not so obvious in the beginning, where you might say, oh, I have 50 countries and so many languages, I will just multiply it out and make everything for everyone. And suddenly you have a gigantic mess that Google doesn't know how to crawl and index. So finding the right balance there is sometimes tricky.
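Since maintaining many hreflang values in page headers gets unwieldy, here is a minimal sketch of the sitemap-based approach mentioned above, generated with Python's standard library. The URLs and locale codes are hypothetical; note that one URL can carry several hreflang targets, that x-default catches everything else, and that every URL in the set has to list all of the alternates, including itself.

```python
import xml.etree.ElementTree as ET

SM_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
XHTML_NS = "http://www.w3.org/1999/xhtml"
ET.register_namespace("", SM_NS)
ET.register_namespace("xhtml", XHTML_NS)

# Hypothetical pages: one URL may serve several hreflang targets, and
# "x-default" catches anything that doesn't match.
alternates = {
    "https://example.com/en-in/": ["en-in", "en-pk"],  # one page, two targets
    "https://example.com/en-sg/": ["en-sg"],
    "https://example.com/hi-in/": ["hi"],
    "https://example.com/":      ["x-default"],
}

urlset = ET.Element(f"{{{SM_NS}}}urlset")
for page_url in alternates:
    url = ET.SubElement(urlset, f"{{{SM_NS}}}url")
    ET.SubElement(url, f"{{{SM_NS}}}loc").text = page_url
    # Every URL in the set must list ALL alternates, including itself.
    for alt_url, langs in alternates.items():
        for lang in langs:
            ET.SubElement(url, f"{{{XHTML_NS}}}link",
                          rel="alternate", hreflang=lang, href=alt_url)

print(ET.tostring(urlset, encoding="unicode"))
```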
Hey, John, could I ask a question? Sure. I actually posted it in the questions, but we only have about 20 minutes left, so I want to make sure we don't run out of time to get to it, because they're building up. I'm the one who posted it, and I'm hoping you've had time to look at the forum thread that I've already started, about a site that is non-adult but is showing heavy adult ads from AdWords. And these ads are really strong. I actually spoke to a representative in the AdWords department yesterday, who said that these ads should actually not even have been approved; they're going to look into that side of things. But beyond that, on the natural, algorithmic search side, all the domains on this same host or IP address, it's a VPS, are showing adult ads, no matter what add-on domain I go to on that server. So it's starting to make me think: was the IP flagged as adult at one point? I just got it a week or two ago. Is it one domain that's causing everything on the server to be flagged as adult and to show this type of stuff? And the other thing is, for the longest time the homepage hasn't been displaying for any kind of keywords; even the exact-match domain with .com on it won't come up, and that's when the adult ads come up too. If you search for it separated as a phrase, they don't come up; it only comes up when you use the .com on it. I don't know if you've had a chance to read the forum thread, but there's a lot more detail there, with all the URLs and screenshots. Any help would be appreciated, because some of these domains are my kids' names on this server. Their friends are searching for this stuff and seeing these ads while looking for them, and they're kids, minors. So it's pretty bad, and I'm trying to figure out how to fix it.

Yeah, I saw the thread, but I need to double-check some things on that. In particular, with regards to ads, that's something where I don't really have much insight. But if there's something from the Search side that maybe we're picking up, that might be something we could look at. What I've sometimes seen in cases like these is that maybe the domain name was used before, or maybe the domain or the site was hacked in the past, and we still have that lingering association somewhere. But if you're talking about various sites on the same server, that seems kind of weird. That shouldn't really be happening like that, as far as I know.

Yeah, if you look at the screenshots that I put in there, I put seven in there of all kinds of different domains that are on the same server, and they're all showing very similar ads. I've cleared my cache, thinking, well, is this potentially a personalized thing? I've tried it on multiple devices, on multiple machines at libraries where all this stuff is blocked, and these ads are still coming up. It's really bad.

Yeah, I'll take a look at that. The SafeSearch team is based in Zurich as well, so it's easy to chat with them about these issues. That'd be great. And any feedback to help me know if there's something I need to do would be fantastic.
If you can, I'll just watch the forum; hopefully you can put something in there if you find anything. That sounds great. Cool. Thank you. Thanks for putting all the details there; that's really useful. Absolutely.

All right, there's a question in the chat. Would Google consider the following duplicate content: the same PDF uploaded to the same domain under multiple different file paths? They're all the same PDF, and they're linked from, like, the blog and the bell page. So what would happen? I mean, technically, it is duplicate content: it's the same PDF on multiple URLs, which is the same as having the same web page on multiple URLs. What would happen there is we would pick that up and see it as duplicate content, we'd take one of these to index, and we'd try to show that version in the search results. It's not the case that we would penalize your website for something like that. It's not that your site would be demoted or seen as worse in Search because of it; we would just pick one of these and show it in the search results. Nothing really crazy there, essentially. It's fine to have the same content on multiple URLs; it's not something that you need to avoid or that will cause any problems. It might cause problems if we have a ton of different URLs that all lead to the same content and we download a lot of data from your site that's all the same thing; that would make it harder for us to crawl your website. But if you're talking about a handful of PDFs that are essentially just duplicates, then that's a non-issue. That's not something I would worry about.

Any change in how Google views subdomain URLs, for example example.uk.com? AdWords struggles to understand that; what might be the problem there? So I don't know how AdWords handles URLs, so I have no insights from the ads side there. In general, we view these URLs as URLs, and we try to pick them up and deal with them appropriately. What would not happen here is any kind of automatic country targeting, because it's a .com domain, and .com is a gTLD by default. So even though it says .uk.com, it would still be a generic top-level domain for us. The advantage there is that you can set geo-targeting in Search Console. Obviously, if you want to target a specific country, you'd probably want to remember to do that, rather than just hope that Google will figure it out. But otherwise, that's fine.

The one thing I would watch out for here, from a personal point of view, is that if you're essentially buying a subdomain from a website, then you're dependent on the provider of that subdomain continuing to exist and continuing to let you host your content there. It's technically not your own domain name; it's not something that you can just take to any other provider and continue using for your website. You're tied to that domain provider. So that's something I generally try to avoid, because if you want to host your website on your own domain, then you might as well make sure that it's really your own domain, and not something that's tied to someone else's domain where you're just a subdomain. That's my personal take from an SEO point of view. Obviously, you can do this however you want; some people just take an existing domain and host on a subdomain there for practical reasons, because they don't want to buy a separate domain. Totally up to you.

My website got a penalty, a manual action for unnatural links.
I also submitted a reconsideration request, but it's been 20 days and I haven't gotten a reply. What can I do? Should I resubmit, or should I wait? So if you get a manual action for unnatural links, then I would recommend cleaning that up before you submit a reconsideration request. If you have cleaned it up, then you need to wait until that review is processed. Usually this happens within a couple of weeks at most, so I think at 20 days you're probably on the long end of things, and things will probably pick up quickly. But in general, it can take a bit of time for these to be reviewed by the team. If you've been going back and forth with reconsideration requests, that might be something where the team says, OK, you need to settle down first, figure out what the actual problem is, and make sure you clean it up for good before we spend too much time double-checking this individual case. That's particularly the case if we see that a site cleans up a manual action, goes through the review process, the reviewer says, OK, it's cleaned up, and then the site does the same thing again. And then there's a review again, and then it happens again. At some point, we're kind of wasting our time going in circles; you're just playing a game with us. That is not a serious engagement in trying to fix a website and make it come clean. That might be something where we say, OK, we're going to back off a little bit; settle down first, and then, if you're sure that you want to clean things up for good, obviously we will double-check that. But we're not going to play this kind of back-and-forth game.

Let's see. How can I submit my URL to Google for automatic indexing through the Search Console tool? You can submit it there. It's something we do take into account in our systems, and we try to process those indexing requests as quickly as we can. We don't guarantee indexing of any URL that we get there, so it's not the case that you can just take any random spammy URL, shove it in there, and we will index it right away. Rather, our systems are sometimes a bit picky and want to make sure that the content we index is really awesome, really something that we would want to show to users in the long run. And sometimes that's not so clear. So even if you use this tool, it might be that we say, well, this is good feedback to have, and we'll take it into account, but we're not going to jump and crawl and index it right away. That, I think, is kind of normal.

In general, for most websites, you shouldn't need to use this tool at all. Most websites should just be crawled and indexed normally: we come and visit your web pages, we find links to other content on your website, and we pick that up and crawl it right away. There should be no need for you as a normal webmaster to submit any individual URLs for normal day-to-day site life through Search Console's submit-URL tool. If you're seeing that we're not picking up new content as quickly as we could, then I would make sure that you're linking to it from a visible place on your website, and also try to put it in a sitemap file or an RSS feed or anything like that, because these are more automated, scalable ways for us to pick up content on your website and know that it's actually useful to crawl and index as quickly as possible.
The one time I would recommend using the submit-URL tool is when you have real issues on your website that you urgently need to fix, and you urgently need Google to reflect that in the search results. For example, maybe you removed something private that you accidentally published; that would be a good candidate for this tool. Or maybe you had a wrong phone number in your title in the search results, and people were accidentally calling someone else; that would be something you'd want to get re-indexed and reprocessed as quickly as possible. But for normal day-to-day life, for normal site maintenance, you shouldn't need to use this tool.

How will GDPR affect the use of Google Analytics? I have no idea about Google Analytics, so I can't really tell you how that works. They do have a product help forum, I guess, where you can go and ask these kinds of questions. I would definitely check with them, because that's obviously an important topic to think about, especially if you're hosted somewhere in Europe. I'll look for it. Yes.

May I ask you something? Sure, go for it. OK. So how important is the overall layout of a page for Google? Is it more important from a user-experience perspective, or does it play any specific role for SEO? It's mostly a user-experience thing. We do try to take it into account for things like the mobile-friendly test, so if we can't see the page properly on a mobile device, then it's a bit trickier to say this is a mobile-friendly page. That's something we take into account. Also, I believe the mobile interstitial change, and maybe the ads-heavy changes we made a couple of years back, where we look at the page and see just ads on the page, those are things that fall into the layout category. But whether you have the sidebar here or on the other side is totally up to you. Which colors you use is totally up to you. Your font choice, where you put text on a page: totally up to you. Maybe from a conversion-rate-optimization perspective it matters; it's more about user-centric tactics. Exactly. You probably care about conversions; you're running a website for commercial reasons. So it's not that I would say you can do whatever you want and it will just work, but you should focus on what makes sense for your users, and in general that will work well for Search as well. OK, thank you.

Let's see, there's some clarification for the PDF question. I don't believe I have any other option than to host the PDF on multiple URLs. John said that a demotion is unlikely, but could more than a handful of links be an issue? I believe there are over 20 different URLs for the same PDF. I don't think that would be a problem at all. If you're talking about 20 links to a normal PDF file, we can crawl thousands of links on a site every day, so that shouldn't be a problem. It definitely wouldn't cause a demotion, because essentially pretty much all web servers let you access any static page through an almost infinite number of URLs, using a question mark and various URL parameters. So theoretically, any website that hosts a PDF on a server like this could have a ton of URLs generated for the same PDF file. That shouldn't be a problem; we should be able to just deal with it automatically, and that should just continue to work.
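One hedged aside that isn't part of the answer above: Google also documents support for a rel="canonical" HTTP header on non-HTML files such as PDFs, which lets you hint which of the duplicate URLs you'd prefer to see shown. Here is a minimal sketch with Python's standard library; the canonical URL and port are hypothetical, and in practice you'd set this header in your real web server's configuration instead.

```python
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Hypothetical preferred URL for the duplicated PDF.
CANONICAL = "https://example.com/downloads/brochure.pdf"

class CanonicalPDFHandler(SimpleHTTPRequestHandler):
    def end_headers(self):
        # Add a rel="canonical" Link header to every PDF response, hinting
        # which of the duplicate URLs should be treated as the main one.
        if self.path.endswith(".pdf"):
            self.send_header("Link", f'<{CANONICAL}>; rel="canonical"')
        super().end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), CanonicalPDFHandler).serve_forever()
```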
OK, let's see, a serious question; this is to settle an argument with a developer. Is it considered cloaking, something that would incur a penalty, if the href values of links are blocked by disallowing access to a JavaScript file? For example, blocking the script that adds the href value to links in a calendar that would otherwise create an infinite crawl path over the years, or blocking the sort-by links in a table. Prior to the advent of JavaScript rendering, this was a way to stop bad URLs from being crawled.

So technically, that wouldn't be considered cloaking; we would still see the same content. I think you're getting into an area where it gets really tricky, though. If your JavaScript explicitly changes your pages in significant ways, and we can't view that JavaScript because you're not letting us view it, then it's arguable whether the version Googlebot sees is actually representative of the version that the user sees, because we might miss out on a significant amount of content. So it is something where one could argue either way and say that this is a problem or this is not a problem.

In general, when you're looking at it for individual links like this, when you're essentially trying to do PageRank-sculpting-type things, I suspect the work you put into this is time better spent on other things, with the exception of maybe really gigantic websites, where accidentally crawling some of these areas causes a significant problem with regards to the overall crawling of the website. For example, if we start crawling into different faceted-navigation areas and you give us no information at all about which of these facets is important, then it might be that we easily crawl another 20 million URLs before we figure out that, oh, actually, this parameter is irrelevant. But for the most part, it's irrelevant; it's not something where you win any value by blocking us through this convoluted approach of blocking JavaScript files that place links on a page. Instead of doing that, what I would recommend is finding more legitimate ways to block us or discourage us from crawling these pages, for example by using the URL parameter handling tool and telling us about these parameters there.

For the most part, I don't think this is something where the web spam team would say it's a problem, because you're trying to deal with a technical issue in what I would consider a non-optimal way, but you're not cloaking to us and providing completely different content. It would be different, I would argue, if your JavaScript file turned your page about comics into, I don't know, a page about pharmaceuticals. Then you could say, well, this is a significantly different page, and what the user sees is not at all representative of what the search engine sees. That might be more of a problem, where the web spam team would take a look and say, actually, we need to take manual action here; this is not something that we would want our users to run into. We wouldn't want to promote a page in the search results saying this is a nice page about comics, and then when people click on it, they get a bunch of pharmaceuticals. That would not be OK.

So I guess that means I don't have a clear yes-or-no answer to settle your argument with your developer. But I would say that the solution you've described there is suboptimal, in the sense that I would try to avoid it, and I would try to use normal mechanisms to guide Googlebot into the right sections of your site, rather than these fancy techniques to hide things from Googlebot. The other thing here is that any time you add additional complexity like this, you're going to suffer extra maintenance overhead, in the sense that at some point another developer is going to get involved and have to fix this website. And if you have this crazy JavaScript setup that tries to figure out which links on a page need to be placed after the page is actually loaded in a browser, and it's blocked by robots.txt, then you can imagine that other developer scratching their head and saying, I have no idea what's happening here. And chances are they're going to break something so that it doesn't work properly for users. So keeping it simple and focusing on the traditional ways of dealing with this stuff is what I would recommend.
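For what it's worth, you can see how a robots.txt-honoring crawler treats such a setup with Python's standard-library robots.txt parser; the file contents and paths here are hypothetical. The HTML stays fetchable while the script that injects the calendar links does not, which is exactly why the rendered page a crawler sees may be missing those links.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks the script which injects calendar links.
ROBOTS_TXT = """\
User-agent: *
Disallow: /js/calendar-links.js
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# The script that places the links cannot be fetched...
print(parser.can_fetch("Googlebot", "https://example.com/js/calendar-links.js"))  # False
# ...while the page itself remains crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/events/2018/05/"))       # True
```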
All right. Wow, we kind of made it to the end, and actually there are a handful of questions still left. Maybe I can refresh here and see if I can pick up some more. There's one about the adult ads; I'll have to take a look at that separately.

There's a thread on Twitter about translating content: if I translate pages from one website into another language, do I need to use hreflang? No, you don't need to use hreflang if it's a complete language change. If it's the same language for different countries, then hreflang makes it a little bit easier.

How does Google index text generated by computers based on unique proprietary metadata, such as hotel descriptions on booking.com? I don't know, like, what is unique about the text here? In general, we do have a webmaster guideline that says auto-generated content would not be OK. If this is a small part of a web page, then usually it's less of a problem, because we can pick up enough other content on the page. If the whole page is auto-generated content, then that's something where we might consider taking manual action and saying, well, actually, there is no real value being added here. In particular, if the auto-generated content is more like gibberish, just a bunch of words put together to match some algorithm, then that would be something the web spam team might take action on.

All right, maybe any last questions from you all before we hand things over to the evening?

John, my last question is about Google Search Console. You know, in Google Analytics you have one great option, annotations: you can add a note for a specific date, like, on this day I added some tracking code, or something like this. If we had this in Google Search Console, it would be amazing, because we would have a history of the project.

I would submit that as feedback in Search Console. Please submit it, John. No, you need to submit it. And, I don't know, find your friends and tell them to submit it too. If you think it's important, then let the team know. It's always a lot more valuable for the team when they realize that their actual users care about this, that people feel strongly about this specific feature. That makes a big difference for the team in prioritization, compared to me going to the Search Console team and saying, hey, you know, it would be really nice to have this feature, I have a friend who would really like it as well. And then they're like, oh, come on, John, you have crazy ideas.
Nobody really cares about this, right? But if they look at the feedback list and they see, oh, everybody wants a different color in this graph, or everybody wants this button here, or everybody is confused by this thing, then that makes it a lot easier for them to say, OK, maybe that idea from John is actually not that crazy; maybe we should actually spend some time to implement it.

And John, let me tell you something with a little bit of a sense of humor: I'm still waiting for the mobile app of Search Console. The mobile version of Search Console, yeah. Maybe a PWA. Yeah, they're working on building the new reports in the mobile kind of layout. So as the new reports start popping up and replacing the old reports, I think it will get easier. But it will still be a while before you can do everything on mobile, by the way. Yeah, we know it's not easy. It's also sometimes tricky because we have such established infrastructure for some of these features, and it takes time to actually shift things over. But, well, what am I saying? You have the same problem as well, right? It's not unique to us. Yeah, but step by step, Search Console will become more useful. It's already much more useful than the old version, and I think it will be even more so going forward. Cool. Yeah.

All right, great. So thank you all for joining, and thanks for submitting all of these questions. We have another English hangout, I believe, on Friday morning European time; if you're in the Asian area, then that's probably a good time zone for you as well. We have two German ones lined up this week on Thursday, I believe: the first one for news publishers, for news-publisher questions, and the second one is the general German hangout that we always have. So if there's anything on your mind, feel free to jot it down in there, and we can take a look at it then. All right, thanks a lot, and I hope to see you again. Thank you all, and have a good rest of the night. Bye, everyone. Thank you, John. Bye. Bye-bye.