All right, welcome, everyone, to today's Google Webmaster Central Office Hours Hangout. My name is John Mueller. I am a webmaster trends analyst here at Google in Switzerland. And part of what we do are these Hangouts together with webmasters, publishers, SEOs. People have questions and comments around web search. So as always, if any of you who are kind of new here would like to ask the first question, feel free to jump on in. Hello, John. All right, go for it. A few days ago, I noticed that there is a Google Developer Style Guide for content, like how to use capitalization, how to use punctuation, everything. So are those things connected to SEO ranking? If I apply those things in my content properly, will they affect my ranking? No. So I think you're probably looking at the writing style guides that we have for the developer side. And these are just best practices that we've worked out for internal things. And we're trying to make that public so that others also have insight into that. But it's not the case that this is an SEO ranking factor. It's essentially just good practices. Thank you. All right. So let me run through some of the questions that were submitted. And then we'll definitely have more time for your questions as well, if there's anything specific on your mind. Hey, John. Hi. All right, one more. Sure, one more. Actually, whenever I search in the Google search engine, on the right-hand side of a URL, there are two options called cached and similar. Sometimes it shows cached. Sometimes it shows similar. And sometimes it shows both. What is the difference between these? So similar shows other pages that our algorithms feel might be similar to that page. Sometimes we have some information we can show there. Sometimes not. The cache is the cached version of the page.
You can control that yourself for your own website using the noarchive robots meta tag, if you want. So that's one reason why sometimes we don't have a cached page. And other reasons are sometimes just technical reasons, where we simply don't have a cached page that we can show anymore. OK, thank you. All right, so the first question on top is... there's a bit of background noise somewhere, so let me just mute you all. How would you define high-quality backlinks and low-quality backlinks? What's the difference? So I feel we've talked about this topic pretty much in almost every Hangout. So instead of taking too much time on this, I'd recommend really just going back and looking at the other ones, reading our blog posts. We've written lots of blog posts around links, and digging into the information there. Let's see. A question about flexible sampling. Since escaped fragment URLs will not be supported in future by Google, would it be OK to show Googlebot the non-paywall pages with class names and schema markup based on the user agent? Yes, you can do that. So that's one way to recognize Googlebot, by looking at the user agent. You can also combine that with a lookup of the IP address. We have some information in the Help Center on how to do that. So that's pretty much the best approach to recognize when Googlebot is looking at a page versus some other random user or other crawler. Presume a page has been demoted, I guess, for low-quality content. Once it is fixed in full, what is the approximate response time for Google to update all of the signals to bring the site out of the demotion? So I don't think there is any fixed time that we would have for something like this. In general, we try to recognize the quality of the content overall, and recognizing bigger changes in the quality of the content means we have to reprocess everything first, which means some pieces of content will be reprocessed fairly quickly, and others take quite a bit of time.
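As an aside, the Googlebot check John describes above, looking at the user agent and combining it with an IP lookup, is usually done as a reverse DNS lookup followed by a forward lookup. A rough sketch, not an official implementation; the function name and its injectable resolvers are illustrative:

```python
import socket

GOOGLEBOT_UA_TOKEN = "Googlebot"

def is_verified_googlebot(user_agent, ip,
                          reverse_lookup=None, forward_lookup=None):
    """Return True if a request both claims to be Googlebot and comes
    from an IP that really belongs to Google.

    The DNS lookup functions can be injected for testing; by default
    the standard library resolver is used.
    """
    if reverse_lookup is None:
        reverse_lookup = lambda addr: socket.gethostbyaddr(addr)[0]
    if forward_lookup is None:
        forward_lookup = socket.gethostbyname

    # 1. Cheap check: does the user agent even claim to be Googlebot?
    if GOOGLEBOT_UA_TOKEN not in user_agent:
        return False

    # 2. Reverse DNS: the IP should resolve to a googlebot.com or
    #    google.com host name.
    try:
        host = reverse_lookup(ip)
    except OSError:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False

    # 3. Forward DNS: that host name must resolve back to the same IP,
    #    otherwise the reverse record could be spoofed.
    try:
        return forward_lookup(host) == ip
    except OSError:
        return False
```

The forward-confirmation step matters because anyone can publish a reverse DNS record claiming to be a Google host; only Google can make that host name resolve back to the original IP.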
So this is something where I wouldn't expect you to upload a new copy of the website now, and then tomorrow everything will be updated and it'll rank completely differently. This is a longer-term thing that will probably take several months for everything to be reprocessed and everything to be re-understood, and sometimes it can take even longer. So that's something where there's no fixed time frame; there's no stopwatch you can set and say, I uploaded my new version of my site, and this time it's actually good. That's something that takes a bit of time, and our algorithms really have to understand how things have changed. Does domain or brand popularity help in rankings? Competing with well-known brands or established sites is tough and challenging. Most of the time they rank easily. Why is that? So I think from a pure quality point of view, it's not that we would say a brand is really a ranking factor, in the sense that we try to recognize there's a brand here, therefore we will rank those pages higher. But often there is a very indirect relationship there, in that if you've built up a brand and people know about your website, about your brand, then oftentimes they'll just be searching for you directly. So if instead of something like sport shoes, people start searching for Nike sport shoes, then obviously we're going to show content from that specific brand. So that's something that I feel is worthwhile at least to look into, because all of these branded queries where people are explicitly looking for your website are things that are really easy for you to rank for. It's kind of like free traffic, because people are explicitly looking for your site, and there is very little reason for Google to say, oh, people are looking for this site, therefore we'll show some other site. That doesn't really make sense.
So from my point of view, this is something where, if you can really work on being known for yourself and being known for providing a good service and having high-quality content, high-quality products, and you have a name that people can search for, I think that's worth trying to do. John, with that question also, when you're saying people are searching for the brand queries, basically let's say somebody is searching for the Nike sport shoes. So does Google give any preference to Nike? And if somebody is searching for just the sport shoes, can you say it's a normal thing, that Google can rank any website, unless people are actually searching for Nike or maybe any other brand? Yeah, I mean, if it's just a generic query, then we rank them normally. There is nothing special that we would do there just because there's one brand in there, because for generic queries, there are usually a bunch of brands that are active there. But of course, if you're known for providing a high-quality product or services and people are recommending your website because of that, then that's something where, for a well-known business, a well-known brand, it's a bit easier to attract links like that, because people are already talking about you. People are already recommending your product. So it's kind of like you're promoting everything that you're doing automatically. Hello, John. Hi. Let's say a website is demoted by Panda due to a high number of duplicated content pages copied from other websites. What is best for SEO, meaning a better chance for a Panda recovery: to delete those duplicated pages or to noindex them? Is there any difference in Google's eyes? For us, it's essentially the same. We drop them from the index, so we don't show them in the search results. So that's kind of the same. But you still know about them, right? Well, when we reprocess them and we drop them from the index, we don't care about them anymore. OK.
Gone is gone. I think there's perhaps more of an indirect effect there, in that if your users are going to your site and they're still seeing a lot of this low-quality copied stuff, then, I don't know. It's about the author biographies, which are still useful for users. Sure. Yeah, OK. OK, thank you. All right. How important is it to make use of keywords as anchor text in SEO when linking internal links or backlinks? Does it have the same value if we make use of some random relevant words as anchor text? For example, if I use "for more info on SEO updates, visit my post" as anchor text instead of just "SEO updates". Both of those essentially work for us. So that's completely fine. I think what's important is that you use something like "more information on this topic" as part of the anchor text, rather than just writing "for more information on this topic" and then the link is "click here". Because "click here" really doesn't tell us a lot. We can try to figure out a bit more from the context, but it's really a lot easier if you actually tell us what the link is about. And this is something that we run into as well on our side. When I look at our Help Center or when we write blog posts, we sometimes fall into that trap, because it's really easy to just describe something and then have "click here" as the link anchor. And it makes a lot more sense for users and for search engines to have a clean anchor text. Google is pushing for faster web load times; assuming the load times are a possible metric, how does Google calculate them? So at the moment, we essentially differentiate between really, really slow sites and everything that's in the normal range. So if you're looking at milliseconds and calculating time to first byte and looking at the difference between data centers and different locations of the world, then you're probably already way into the normal speed area. You're not in that really slow site section. So from that point of view, you're probably fine.
This is something where, from my personal point of view, you will definitely see a lot of indirect effects, in that load times change user behavior quite a bit. When people go to your website, it really changes how they interact with your website, how they interact with your content, what they look at, how many pages they look at on your website, whether they look at other products and services that you offer, whether they actually complete the checkout flow or whatever you have that you want to do as a conversion on your website. So that's something where a lot of e-commerce sites have done studies on this. And they've really found quite significant differences when it comes to even a couple hundred milliseconds of difference, where you'd think normal people wouldn't really notice that. But it does have a measurable effect on the general conversion rate there. So that's something where, if you're already in this area, that's fantastic. I wouldn't worry about it from a Google point of view. But you can still use this as a metric to help improve your website further. And with regards to local servers or CDNs, that's something where you know your users best. You know where they are. You have the metrics. And based on that, you can optimize for that. Another one on speed, which is essentially the same, like how do you calculate speed? I suspect we'll have a bit more information on speed in general when we get closer to announcing things around mobile, where maybe we'd be able to take speed into account a bit more. So at that point, we will definitely let you know if there's something more specific that you can actually look at. But at least for the moment, we really differentiate just between really slow and reasonably fast sites. Is content in tabs good or bad for mobile-first indexing? As long as this content is loaded when the page is loaded, that's perfectly fine.
If the content is loaded only after you click on the tab and it stays on the same URL, then we wouldn't be able to use that content for indexing. So that's one thing to keep in mind. I have heard of sites that actively want to use it like that, in the sense that they have some content that they want to load in tabs separately and that they don't want to have indexed at all. And that might be an option as well. But if you do want to have your content indexed, it needs to be loaded when the page is loaded. A question regarding archived websites: in Search Console, I can see backlinks coming from archive.is. Is that considered duplicate content? As the links are picked up by Google, so maybe the content will be as well. I don't know what archive.is is, but there is, for example, the archive.org website, where they keep old copies of websites as well. As far as I know, we don't index the content there. I believe they block it by robots.txt. So that might be something to think about. Is this really something that is picked up as actual content? One really easy way you can check to see if Googlebot can access a page is to just use the mobile-friendly test. You can plug any URL in there. It'll fetch that page with a Googlebot user agent and show you a screenshot of what it found. And with that, you quickly see: does Googlebot see the same content? Is the content blocked by robots.txt? Is it just the frame that's visible, and the actual content isn't? That's a really simple way to check. How does Google handle it when different pages have identical products? So we generally try to index both of these product pages. And then when it comes to serving, so when someone is searching for something, we'll try to pick one of these to show if we can recognize that they're searching for something within that duplicated section. And that's something that usually works out fairly well. We have a lot of practice with this duplicate product scenario.
You're not the first one to run into that. With mobile first... Yes. I'm sorry. Can I jump in with this question, which has probably been posted a million times before, about spam? I'm working for a German bank, and now suddenly for the last few weeks, they're getting hundreds of thousands of really bad links from very bad porn sites. And those are not really porn sites; they're garbled shit. I mean, they're just stupid. They don't even have a purpose except for getting us in trouble. So I know that normally you will say don't worry about it, those are just dumb links, and we don't even take them into consideration. I guess this would be the normal official Google standpoint on that. Still, the people who I work for at the bank, they say, whoa, isn't this dangerous? We hear about negative SEO. Is somebody trying to hurt us, and why only us, and who's doing this? And so could you just, just for the record, say it again: what should we do with these links? We have no control over them, and we fear that if they're getting more, and if they're getting better, maybe... So if I wanted to hurt somebody, I would go into bank sites and the bank forums and place hundreds of links there to hurt somebody else. I mean, what should we do about these links? We see them, and we know they are just stupid. We didn't do it, of course, but still they're there. So I guess you touched upon the main point there, in that if they're really obviously just made-up, random links, then that's something our algorithms would generally just ignore as well. So that's kind of... You said generally just ignore. This is not something I can tell a banker. They tell me, so what should we do? Is this good or bad? I read this a lot: generally we shouldn't care about it. But we need a digital answer, zero or one. I know it's difficult, but still, if I tell them generally it should be okay, they would be worried.
So I think there are two other aspects there. On the one hand, if it's a bigger website anyway, a bigger company, then we have so much other information from links anyway that these spammy links are like this small, tiny thing compared to this big collection of other things. So that's one thing that probably helps here. The other thing is, if you really are worried that these are getting better or getting to be too much, I would just drop them in a disavow file and just put those domains in there. Maybe, I don't know, once a month, collect them, throw them in there, and then say, okay, fine. And then we really don't take them into account at all, and you're taken completely out of the question there. Okay, the disavow, of course we thought about that, but didn't Google originally say that we should, before we disavow, try to delete those links? And obviously in these cases, there's no way we can run after hundreds of domains like that. So I mean, I guess when we disavow something, we have to upload the file, and we have to click a button that says we have tried everything to get rid of those things, right? I don't think so. I think you can just upload the file and be done. There's a big disclaimer, like, be careful and you have to know what you're doing, but I don't think you have to confirm anything, because with a lot of these spammy sites, why would you want to contact them, or how would you even contact them? It's impossible. No way, okay. So basically your answer is, if we do worry about it a bit, then disavow is a good answer. Yeah, yeah. Okay, good, thanks. And the disavow doesn't have any admission of guilt or anything associated with it. So it's not that, oh, they were buying links and now they're trying to clean it up. For us, it's really a technical thing. It's like, you put them in there, we take them out. No questions asked. Okay, that's a good point, thanks. John. Yes.
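For reference, the disavow file John mentions is just a plain text file uploaded through Search Console, one entry per line: a `domain:` entry disavows every link from that domain, while a bare URL disavows only links from that page. The domains below are placeholders:

```text
# Spammy links collected 2017-11; disavow everything from these domains:
domain:spammy-link-farm.example
domain:another-bad-site.example

# Disavow links from a single page only:
http://some-blog.example/bad-post.html
```

Lines starting with `#` are comments and are ignored when the file is processed.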
We recently got a client who came with a unique issue. When people search for their name, their company name or brand name, their website appears for the search, but their Facebook page, Twitter page, or LinkedIn page does not appear on the first page. So what can we do so that those pages appear on the first page when someone searches for their brand name or their company name? So we don't do anything special for pages like that. Essentially, these are normal web pages that we found on the internet. So you could do the normal things with regards to pages: maybe link to them from your homepage and make sure that they're known, promote them among your users so that they can link to them as well. Kind of the usual things that you would do there. And for some sites, you see the social pages ranking really well. For other sites, you see the social pages not ranking so well. From our point of view, that's kind of normal. So sometimes it makes sense to show them; sometimes it doesn't make that much sense. Okay, and if we use structured data, for example, if we add structured data on the website and use the sameAs property to add a social link, will that work? That helps us to understand the connection, primarily when we show, I think, the knowledge graph entry on the side; we can pick that up and show those social links as well. But it's not a replacement for a normal link from a website. So for example, if you just have JSON-LD markup and you say, this is the same as this one and this one, that's not the same as having an actual link on the page saying, visit our company on Facebook or visit our company on Twitter, something like that. Okay, and there is another issue. Some of our clients have an online reputation problem. For example, maybe there is some negative news which was published three or four years ago. When someone searches for their company, that news appears on the first page, which is not good for the reputation.
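As an illustration of the structured data discussed above, the schema.org sameAs property in JSON-LD typically looks something like this; the company name and profile URLs are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "https://www.example.com/",
  "sameAs": [
    "https://www.facebook.com/examplecompany",
    "https://twitter.com/examplecompany",
    "https://www.linkedin.com/company/examplecompany"
  ]
}
</script>
```

As John notes, this helps Google connect the profiles, for example for knowledge panel links, but it is not a substitute for a normal, visible link to those profiles on the page.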
So how can we deal with such a situation? Because a lot of clients come to us and ask about this when it happens. What can we do in this situation? Because that news is usually from three or four years ago, not recent. So what is the best way to deal with such a thing? Yeah, I think that's always tricky. I mean, it's essentially reputation management. From a user point of view, I can understand how sometimes it might make sense to actually show what was in the news about a specific company when someone is searching for it, because maybe you want to have a critical view of the company. Is this really the company I want to work with? What are the pros and cons of this company? So from a user point of view, I'm happy that we show things like this. But obviously, from a website point of view, you might not be so happy to have all of this shown. And in general, this settles down over time to some extent. Things that you can also do, I think we have covered in a blog post a while back around reputation management: things like working to make sure that your social profiles are more visible, that you've cleaned up those old issues, all of those things. So I don't think there is one magic trick that you can do and say, I don't like this page at all, I want to have it removed from the internet because it's saying something bad about me. I don't think that's really the approach we should be taking there. But usually a lot of this cleans itself up over time, even if it sometimes takes a bit longer. Does this negative reputation affect ranking? Does it affect ranking? It's hard to say, because there are probably a lot of indirect factors involved there in that, I don't know, depending on what was involved, how your users react to that. Do they still recommend your website? Do they still link to your website? All of these things are very indirect.
Because one of our clients, whose ranking was pretty good, said that when negative news came up on the first page for his name, the ranking dropped a lot. So that's why I'm asking whether there is any direct or indirect connection between reputation and ranking. I don't think you would see such a direct effect there. So it's not like one news article shows up, and therefore the other website is suddenly no longer relevant for these queries. I don't think you'd see such a direct relationship. OK. Thank you, John. All right. Let me run through some more of the submitted questions. And then we'll get to you folks here as well. With mobile-first indexing, how is Google going to rate sites with a separate domain, dynamic sites, or responsive sites? We're going to treat these essentially the same, in that we'll take the mobile version, whichever version that is, an m-dot domain, dynamic serving, or responsive, and use the content from there for indexing and ranking. And all three of these options are definitely variations that you can do. From my point of view, in general, if you can put your mobile site on the same URL as your desktop site, that makes things a little bit easier, because you don't have to worry about which versions of the site you link together, or how you tie things like hreflang or canonicals together. You basically just have one URL, and you don't have to worry about all of the rest. Obviously, sometimes that's not so easy. Sometimes you have infrastructure in place that needs a separate domain. And for those, just continue to follow our guidelines. But you need to keep in mind that some of these cross-linking things might be a little bit trickier if you have a separate domain, or separate URLs in general, for your mobile pages. You just need to watch out for the technical things a little bit more than with one URL. Are Google rankings run in real time? Yes. We have to do this because we get about 15%.
I believe the number is 15% of our queries are completely new every day. So we can't generate the rankings ahead of time and kind of be prepared for every possible query that comes in. We really have to do the rankings when people are searching. And with personalization and all of that, it always plays a role in there as well. Can I steal your Google t-shirt? That will be hard through a hangout. Theoretically, it's possible, but I bet it'll be hard. It's a lot easier to just buy these, so I bought this one as well. There is a Google online store where you can get t-shirts and coffee cups and all kinds of stuff. That's where I've been getting my t-shirts recently. Do all links show up in Search Console? If not, what factors determine whether a link shows up in Search Console? So not all links show up in Search Console. We try to pick the relevant ones to show there. You can see this fairly obviously when you have a site-wide link from one website to your website. You'll see in Search Console the number of links on top will be something like 100,000 links from this domain. And you click on it, and you open it up, and you see five listed. So that's kind of the obvious situation where you can kind of see, OK, Google is trying to just give me the relevant information there. So that's something to kind of keep in mind. I don't think it always makes sense to look at all of the possible links just because there's such a mass of things that are essentially the same. And it's very easy to drown out the actual information that you're looking for if you're just looking at the quantity of the links. We recently started using Fetch as Google, and new pages have been ranking within minutes. Is this a true ranking position, or will it adjust over time? Yes, it does adjust over time. So especially when it comes to new pages, we don't have a lot of signals for these pages. So we start somewhere, and then over time we figure out, oh, it actually fits in a bit here. 
So that's something that does change a bit over time. Is there a schedule for future Hangouts? It looks like it's full already. So these fill up very quickly. So if you're curious about joining one of these, you need to get in and get ready very quickly, and click on the join link when it shows up. We do put them on our site. I think it's google.com/webmasters/connect. There's a calendar on the bottom. I tend to put them in maybe one or two weeks before they actually run. Sometimes the schedule is a bit tricky, but usually we try to do them on, I think, Tuesdays and Fridays for English. One in the morning, one in the evening, my time, to try to cover a few more time zones. So hopefully one of the future ones works out for you. With a sitemap, how far will Googlebot crawl with internal links from the URL submitted? Say you submit a high-level category page, and it takes four links to get to a detail page. Will Googlebot be able to navigate there? So I'd recommend submitting all of the pages of your website with the sitemap file, not just the high-level category pages. That way you allow Googlebot a chance to actually go to that product page directly. However, we do try to follow links as well. So a sitemap file helps us, but it doesn't replace the normal crawling. So with that in mind, we do start somewhere. Sometimes we just see a link to a home page somewhere, and then we start crawling from there. The crawling activity depends a lot on the website and how we judge that website, how much we think it makes sense to actually dig into all of the details on the specific website. And crawling four levels down to find all of the product links is something that's usually less of a problem. Of course, if you have 100,000 pages, then that's something where Googlebot might at some point wonder, does it really make sense to index all of these individual product pages?
It looks like nobody's actually recommending this website, so are we wasting our time kind of digging too much into this website? That's something to kind of keep in mind. The other thing is if you have too long of a click trail from one part of your site to another part of your site, users are going to have a hard time as well. So if your product pages are linked, I don't know, let's say five or six clicks from your home page, then that's a lot of kind of clicking that people will have to do to actually get to something that they can buy. And maybe they get bored or tired of searching around on your website before they actually make it to that product page. So that's something that I recommend trying to test out and see how users respond to different navigation models. Maybe there are ways that you can make your important products more visible and the less important products less visible. That's something definitely worth trying out. Does dwell time depend on the query's intent? Can you understand how it's calculated for a website or query? So I suspect this is more something analytics related. And that's something where I'd recommend understanding your users and understanding what it is that they're trying to do on your website. So obviously, you know what kind of content you have on your pages. And you can kind of guess how long you expect users to stay on this content. And you can probably also guess how the conversion rate should be for this type of content. And these are things that you know best, that you need to be able to judge, and that if you want to measure that in analytics, you need to kind of have some kind of a bar for your content and say, I'm aiming for this. And if I don't achieve that, then you need to work on getting there. And it might depend on the type of page that you have on your website. 
So you probably have some pages that are very short, that have just very short and concise information, and other pages that have a lot of detailed information. And obviously, the time on site that you expect users to have for these types of pages will vary quite drastically. So that's something where you really need to use your expertise to find the right answer. What should you do to rank a website nationally or internationally that ranks number one in a state? So this is, from my point of view, kind of a natural progression for a website or a business in general. And there's no simple trick where you can say, oh, you just have to put this meta tag on your page, and suddenly your website is not just locally relevant, it's internationally relevant. So that's something where you really need to build things up over time. And you also need to understand that just because your website is ranking number one locally doesn't mean it'll rank number one internationally, or that it'll rank number one in individual other countries or regions as well. That's something that is more of a natural progression of a website and has a lot to do with marketing and understanding your users and building up a service, building up something that works well internationally. All of these things are more, I'd say, indirect effects than a simple technical change that you can make on your pages to make the site suddenly internationally relevant. A question about backlinks. I saw one e-commerce site that got a lot of unnatural backlinks from two websites, something like 400,000 links. Is that black hat? So the question is, why are they still ranking? Yeah, I think we touched upon this briefly as well, in that we ignore a lot of links when it comes to things that are obviously unnatural. So what might be happening here is that we just ignore these backlinks, and the website is just ranking based on other signals that we have for that website.
So from my point of view, it's not reasonable to just remove websites from search completely just because they have a bunch of bad backlinks. Like the case before, it might be that you have a legitimate business and someone else is building all of these really spammy backlinks to your website. And it doesn't make sense for us to count that against the website, because we don't know what's behind those things. But what we can do is say, well, we recognize these are unnatural, kind of weird links, and we just ignore them. So we focus on the other signals that we have for this website. And potentially that's what's happening here, in that we see a lot of spammy backlinks coming from two websites, but we can ignore those, and we can focus on the rest. And that website might still be ranking number one because we still just have so many other good signals about this website. A follow-up question to the links showing up in Search Console: many private blog network builders make attempts to hide links from crawlers such as Ahrefs or link analysis software. Can PBN builders use techniques to not show in Search Console? Sure. I mean, if you're building a private blog network and you block Googlebot with robots.txt, then we won't be able to see those links either, and we won't show those in Search Console. But that's probably not what these people are trying to do, because they're trying to build these PBNs to get some kind of unnatural advantage. So from that point of view, if it's not showing up in Search Console at all, and it's a relevant link, then probably we aren't taking it into account at all. And, I don't know, from my point of view, these types of private blog networks are potentially spammy setups that might work now to some extent, that might have some effect for some sites. But as we move forward with understanding links a lot better, this is really more of a liability than something valuable.
Because once we start ignoring those, suddenly your website has no good signals at all, and then it's like, why would we show that? And you spend all of this time building these PBNs when they could just disappear from one day to the next. So I don't know. That's definitely not something that I would recommend doing. That's generally against our Webmaster guidelines as well. So if you plan on being visible in Search for the long term, I would not go down that route. If we have no warning message in Search Console regarding unnatural links, do we still need to use the disavow tool? It's not so much about the warning message in Search Console. So you get the warning message there when the web spam team has actually detected unnatural links. But it's more about whether or not you want those links associated with your site or not. So that could be something like spammy links just coming to your site that you really don't want to be associated with. It could be something like a previous SEO who built links in a really shady way, where you suddenly realize, oh, this is something that's actually against the Webmaster guidelines. And you don't want to be associated with that. That's something you can also use the disavow tool for. So I wouldn't necessarily wait for a message to show up in Search Console, but rather kind of think about what you've been doing over the years and think about what you still want to be associated with. Our website uses UK English, but US English keywords rank much higher. Will our use of UK English penalize us in an attempt to rank? We have an international website for an international organization. I don't think that would be a problem. So from the general Google algorithm point of view, for the most part, we should be able to recognize that both of these words are kind of synonyms for each other, and that they essentially mean the same thing. So if you use one version on your website and that works for you, I would just stick with that.
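For reference, the disavow tool mentioned above takes a plain text file upload. A minimal sketch of the format, using made-up example domains rather than any real site:

```text
# Comment lines start with a hash.
# Disavow all links coming from an entire domain:
domain:spammy-seo-network.example.com

# Disavow links from one specific page:
http://shady-directory.example.com/links/page1.html
```

Each line is either a `domain:` entry or a full URL; the file as a whole is the list of links you no longer want associated with your site.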
So I don't think UK English on your website is any reason to demote a website in Search internationally. I would stick to that. If that's how you write your content, if that's where your company is based, if that's your writing style, that's perfectly fine. If I were to control the Google algorithm, once I had narrowed down the top 20 or 30 most relevant documents for a given keyword based on lots of factors, I'd put a lot of emphasis on user interaction. So it goes on pretty long. Does this resemble reality? It's, I don't know, this is something where probably there are some indirect effects, in that we test a lot of our algorithms when we put them out. So I believe we test pretty much everything. We do these kinds of tests all the time. There are thousands of tests that take place every year. And in our tests, we do look at how users respond to that, how users react when they see the different algorithms that we try out, different UI interfaces, which is essentially the same as what you could do if you just did a lot more A-B testing on your pages. So that's something where, at least indirectly, there is this kind of effect happening, where we test algorithms, and if users hate them, then we're not going to roll them out. Therefore, you might see this kind of effect there. But in general, since you mentioned so many of these technical terms with regards to what we could be using for ranking, I don't know, what I might recommend in your case is to look at our jobs site and see if there's something interesting there that interests you, where you could start optimizing our search engine as well, to kind of bring your ideas in there. For example, we're still looking for webmaster trends analysts to join our team here in Zurich. If that's something that interests you, take a look at those listings and submit your CV there.
And then you can see a little bit more what's happening on Google's side with regards to ranking and talk to the quality engineers and kind of bring your ideas in there as well. Internal duplicate... Yes, go for it. Sorry to interrupt. Sometimes we face an issue. For example, our client wants to rank one page for three different keywords. For example, one of our clients is a heater and barbecue store. So on the homepage, he wants to add, like, heater store in Sydney, barbecue store in Sydney, these two keywords in the title tag. So now my question is, which one is better: if we add these two exact keywords in the title tag, or if we add something like heater and barbecue store in Sydney? Which one would be better? I would write naturally, whatever you think makes natural sense. Especially for titles, titles are shown in the search results. And if your title is just a collection of keywords, then users are going to look at it and say, what does this mean? But if a title is something that's readable, that people see in the search results and say, oh, this is what I was looking for, then obviously you get that click. And it's not just the ranking. It's kind of people go to your website and they see you're offering those kinds of services and products, and then they convert in the end. So that's something where I would try to write naturally. And having multiple keywords or phrases that you're targeting on a specific page is completely normal. That's something that happens, especially when you're talking about the homepage. Hi, John. OK. I have a few quick technical questions. Would it be OK to ask them one by one, or should I just riddle you with questions? Actually, there are four questions, so not many really. Let me see. So there are a bunch of questions still submitted, but they're really long. So go for it. Maybe we can just shift over to some shorter things now. Yeah, thanks. The first one is regarding using rel canonical along with the robots meta tag.
I was able to Google and find that it's not really recommended to use noindex with rel canonical. But I was never able to find anything about using index,follow along with rel canonical. What would be your position in this case? We ignore the index and follow robots meta tags, because that's the default behavior. So, OK. It doesn't really make much sense, because it's kind of confusing, sending two different signals. That's what I thought. Index and follow are the normal, kind of the default, robots meta tag signals. So that's not sending something different, essentially, than not having that on there. So you can put it there or not. It doesn't really matter for us. But it's really just the rel canonical that you're going after at that point, right? Exactly. OK, thanks. The second one is related to faceted navigation. I've noticed that on some of our websites, there is a discrepancy between URLs and the actual situation. So, for example, the URL would say page zero, and you would actually be on page one. And then the URL would say page one, but you would be on page two. So is that something that Google would be looking into or not? Doesn't really matter. I don't think we look at it like that. So a number in the URL doesn't necessarily mean the paginated page. OK, good. About the hreflang tag for Norwegian: I've looked, and there are three standards for Norwegian, no, nb for Norsk Bokmål, and nn for Nynorsk. And I Googled somewhere that Pierre Far recommended using no-NO, Norwegian Norway. So my concern here: on most of our sites, we use no-NO. But on one, we actually use nb-NO, which is kind of against the recommendation that I've found. Now, the problem is that if we try to switch back to no-NO, we're afraid that somewhere in certain sections of the website, we might kind of forget. So it might be, like, both no-NO and then nb-NO. So is it a problem if we just leave it on nb-NO? That's nb-NO with the language first, then the country.
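To make the codes in that question concrete: hreflang values put the language first (nb for Bokmål, nn for Nynorsk, no for generic Norwegian), optionally followed by a region code like NO. A small sketch, with example.com URLs as placeholders, that generates the alternate link markup a multilingual site would put in its pages:

```python
# Hypothetical mapping of hreflang codes to page URLs.
# "nb-NO" = Norwegian Bokmål in Norway; language comes first, region second.
ALTERNATES = {
    "nb-NO": "https://www.example.com/no/",
    "en-GB": "https://www.example.com/uk/",
    "en-US": "https://www.example.com/us/",
}

def hreflang_links(alternates):
    """Build the <link rel="alternate"> tags for the page <head>."""
    return [
        '<link rel="alternate" hreflang="%s" href="%s" />' % (code, url)
        for code, url in sorted(alternates.items())
    ]

for tag in hreflang_links(ALTERNATES):
    print(tag)
```

Generating the annotations from one table like this, instead of hand-editing each template, is one way to avoid the "we might forget a section" problem the questioner describes.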
Do you have multiple languages on there? Yes, it's like we have 10 or 15 languages. It's not just one. So we have different countries. I suspect that would be fine. So one way you can double-check this is to use Google Search and do a language restrict to just that language that you're looking at. So I think you can still do that by looking at the Tools link on the side, where you have the advanced search options, where you can say just in this language. And you can see, like, do your pages show up or do they not show up? If they don't show up, then obviously we think they're in a slightly different language than the language that you said. OK, great. And the last question, about robots.txt. Obviously, this is an e-commerce store. And we have certain pages of the site that we don't want to have crawled. But what happens if Googlebot comes and finds a resource that we don't want to be crawled? Does it at that point leave the site, or does it just go elsewhere? We essentially try to crawl the website following whatever links we have there. So it's not the case that we kind of keep track of where we've crawled through and stop when we come to a barrier. We kind of spread out into all of the URLs that we find. And if we can't go in this direction, we'll just continue going in the other direction. So having individual pages blocked by robots.txt is perfectly fine. OK, OK, great. Thanks. All right. So let me just ask a question, if you don't mind. Sure, go for it. Regarding JavaScript, I'm following you, of course, in the JavaScript Sites in Search Working Group, the Google forum group. But one question: are there any announcements or news regarding big changes in how Google handles JavaScript, especially client-side rendering? I see a lot of small improvements on a daily basis. But is there any big step that we are going to see?
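The crawling behavior described for the robots.txt question just above can be sanity-checked locally with Python's standard urllib.robotparser; the rules and URLs here are hypothetical stand-ins for a real store:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for an e-commerce site that blocks the
# checkout flow but leaves product pages crawlable.
ROBOTS_TXT = """\
User-agent: *
Disallow: /checkout/
Disallow: /cart
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# The blocked resource is simply skipped; the crawler doesn't "leave the site".
print(parser.can_fetch("Googlebot", "https://shop.example.com/checkout/step1"))
# Crawling continues through allowed URLs such as product pages.
print(parser.can_fetch("Googlebot", "https://shop.example.com/products/heater-1"))
```

This mirrors John's point: a disallowed URL is a closed door, not a dead end, and the rest of the site keeps getting crawled through whatever other links exist.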
I think the next big step that should be announced fairly soon is that we're dropping the AJAX crawling scheme, and that we'll focus on just rendering the pages instead of using the escaped-fragment URLs. I think that's the next bigger step. And there are a bunch of smaller things that are happening as well, where maybe some additional tools are coming out, or some improvements with the rendering on our side are happening. These are smaller things, where I suspect at the moment we don't have anything particular to announce. But we'll definitely, I guess, do a bit more on that, maybe towards the end of the year or early next year, to really explain a bit better what you could be doing on your site with regards to client-side rendering. If you use these technologies to build your website very quickly, what does that mean for Google? What options might be available to you? What route should you take? One question that comes up a lot: I see that in the Fetch as Google tool in Search Console, I hear developers telling me, hey, it shows up, it renders. We see the picture in there. And then I see other questions in the forum that say, well, it renders fine, but still Google doesn't index the page and maybe doesn't get all the links. How do you cope with those discrepancies, where it shows up as rendered in Fetch as Google, and it still doesn't work at all? I think that's tricky. So there are two aspects where that sometimes happens. One is, when we run across an HTML page, we'll generally try to index it as HTML first. And then, in a second step, we render it. So that's something that in the future will be much closer together, pretty much at the same time. But at the moment, we might still index the HTML version first, and then we index the rendered version of that. So you see that kind of difference there, at least over time. If you're pushing out pages very quickly, then maybe that's a bit more visible. Sorry, but what if it's isomorphic rendering?
That means we try to have two different sets. One is server-side rendering. One is client-side rendering. So if those versions are not exactly the same, that might be a problem, right? If you're indexing the HTML version first, and then you're looking at the JavaScript version. Well, if you're using isomorphic JavaScript, you would be serving the rendered HTML to us already. So that's something where you essentially bypass Google's need to render the page. You already give us the full HTML that the page would have when it's loaded. Well, it's almost the same, right? It's never exactly the same, right? Yeah, I wouldn't worry about those differences. So if it's essentially a rendered version of your content that's kind of the same as the final rendered version, that's pretty much OK for us. It's different when the content is very different. But usually, the case where we might index the HTML and then later the rendered version is more if you have a single-page app setup, where you have one HTML file that you serve for all pages and it doesn't have any content at all. And then just through JavaScript rendering, we actually pick up the content. That's where you tend to see more of a difference. The other thing: if somebody comes to you today and asks, we can develop a website in normal HTML, with just a normal CMS, or we can do it on a React.js basis, what would you suggest today? I would look at the ceiling and think for a long time. I think some aspects really work pretty well with regards to JavaScript-based websites. But for other things, you also have to realize that a lot of the tools just aren't at the same level at the moment. So things like, if you want to crawl your website and pick up all of the titles so that you can put them in a spreadsheet, that's really hard with a JavaScript-based website, because you have to render those things as well.
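The title-crawling task John mentions, pulling every page's title into a spreadsheet, is straightforward for server-rendered HTML, which is exactly why client-side rendering makes it harder. A sketch using only the standard library; the inline HTML snippets stand in for fetched pages, and the /spa entry imitates an empty single-page-app shell:

```python
import csv
import io
from html.parser import HTMLParser

class TitleExtractor(HTMLParser):
    """Collect the text inside the first <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title" and not self.title:
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def extract_title(html):
    parser = TitleExtractor()
    parser.feed(html)
    return parser.title.strip()

# In a real crawl these would be fetched responses; here they are inline.
pages = {
    "/": "<html><head><title>Heater and Barbecue Store in Sydney</title></head></html>",
    "/spa": "<html><head><title></title></head><body>App shell only</body></html>",
}

# Write URL/title pairs the way you might for a spreadsheet export.
buffer = io.StringIO()
writer = csv.writer(buffer)
for url, html in pages.items():
    writer.writerow([url, extract_title(html)])
print(buffer.getvalue())
```

Note the /spa row: a JavaScript app shell yields an empty title from the raw HTML, so a crawler like this would have to render the page first, which is the tooling gap being discussed.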
And sometimes the work that you put into developing a site is so much more than the work that you take for maintenance and for the SEO side that you just give that more weight. And you say, I can move my business a lot faster if I use these modern frameworks. And I am going to take into account that maybe the SEO side is going to struggle with this. On the other hand, maybe you'll say search is the critical factor for our business. We have to make sure that it absolutely works perfectly well. Then you might say, OK, I'm going to put more focus on the search side and the tools and the maintenance type of things for search. And we'll have to take some, I don't know, some cuts with regards to development and say, maybe we have to figure out how to render the pages on our side. Maybe we have to find some other approach there to handle those pages. So that's kind of what I would look at there. I don't think there's one simple solution for that at the moment. I suspect towards maybe mid or end of next year, it'll be a little bit easier, in that more and more tools support rendering. And at that point, maybe it's a no-brainer to say, OK, we just do everything with React and it'll just work. All right, thanks. Hey, John. Hi. Actually, I have a question. I have a page where I'm comparing different types of products. For each product, I'm mentioning some description, but in a drop-down. If you click there, then you will see that content. How does Google see these pages? If we have the content when the page loads, then we can use that for indexing. Then that's pretty good. If the content needs to load separately when you click on that drop-down, like you have some JavaScript that pulls the content from the server, then we probably wouldn't see that. OK. And how will you rank those pages?
At the moment, what would probably happen is we recognize that this content isn't visible by default, and we wouldn't use it for things like the snippet on a page, but we would use it for ranking. So if someone is explicitly looking for that content, we can show that page. Because we know the content is on there, even though the user might have to click around to actually find it. But we know this is a match for that content. OK, then what would be the best practice to rank those pages? I would follow just the normal advice, just making these pages good. What I would do, though, in a case like this, is test it with your users. So a simple way that you can do this is just to do A-B testing on your own website. The more complicated way to do this would be maybe to do a user study: invite a group of users over to your office, and let them try to complete tasks on your website. OK, thank you, John. All right. John, I have one quick question, if we have some time. OK, go for it. All right, so we have an e-commerce website, and we are actually competing with very generic, or you can say very commercial, keywords, basically. And one of the keywords that I'm competing for is, let's say, dresses, for which I'm doing my Google search. So I'm seeing three pages of listings come from the US and the UK. And I'm an India-based website. And out of these 30 listings, only two listings are from India. It's something that, like, I mean, somebody looking for dresses in India, basically, they might search for dresses, or maybe they might search for longer forms of queries. But at times, like, I mean, we are not even competing within India itself. But other websites, which are coming from other regions, are actually taking all these positions. So I mean, how do we ensure that we at least have some sort of visibility on the first page, or maybe the second page, instead of having everything from the UK and US?
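As an aside on the A-B testing John suggested a little earlier: variant assignment is usually deterministic per visitor, so the same user always sees the same version across page loads. A hashing sketch, with made-up visitor IDs and a hypothetical experiment name:

```python
import hashlib

def ab_bucket(visitor_id, experiment="dropdown-test", variants=2):
    """Deterministically map a visitor to a variant bucket (0..variants-1)."""
    key = f"{experiment}:{visitor_id}".encode()
    digest = hashlib.sha256(key).hexdigest()
    return int(digest, 16) % variants

# The same visitor always lands in the same bucket across page loads,
# so measured differences come from the variants, not from flapping.
print(ab_bucket("visitor-123"))
print(ab_bucket("visitor-123"))
```

Keying the hash on the experiment name as well as the visitor ID means a new experiment reshuffles users independently of earlier tests.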
So, but these are people in the UK or the US that are searching? Is that correct? I don't know. I mean, the thing is that when I'm searching for dresses, the first listing I'm seeing is from India. But apart from that, everything is from either the US or the UK. And most of the time, they are not even, I mean, providing all these products to Indians, basically. I mean, they are not able to buy from these websites. Yeah, I think that's always challenging. I don't see a simple approach to ensuring that your business is always ranking for those kinds of queries. It's sometimes tricky with international businesses: if they're really strong international businesses and they're active in a lot of different countries, but they're not active in your country, then it's hard for us to recognize and say, well, this is a really good result, but not here. And this is something that I see in Switzerland as well, in that a lot of German companies rank here in Switzerland. Then you go to their website and they say, oh, we don't deliver to Switzerland, because you're too small. We don't care about Switzerland. So that's something where I don't see a simple answer to that, other than to just keep working on your website to make it stronger and stronger and to promote it in ways where you can be visible in the search results. So that's something where I don't think there's a simple trick for doing that, a simple meta tag you can just add to your pages to make them suddenly appear relevant there. Oh, I understand. But the thing is, I mean, it's not just about my website. It's something that a lot of Indian websites, which have a very good presence in terms of popularity in other ways, I mean, they are still not able to get on this particular first page, or maybe, let's say, on the second page. So that's something questionable.
I mean, if you're talking about it user experience-wise, I mean, people might be looking for, they want to buy something, and they are not able to do it because they're not getting enough relevant results. So that's something I'm just asking about. Yeah, I don't have a simple answer to that. I know the teams are working on improving, in particular, ranking in individual countries. So that's something I expect to get better over time. But it's a very tricky problem, in that sometimes international content is very relevant in individual countries, and sometimes local content is a lot more relevant in individual countries. So I don't think there's a clear tweak where we can just say, oh, for this type of query, we'll just tweak it up and show more local content. So that's something where I would expect to see changes more in the long term. And in the meantime, if it's your business that's associated with this, then that's something you kind of have to deal with, like with any other ranking challenge, where you think about: what can you do to improve your website with regards to those queries? Or what can you do with your website to target queries where you don't see this problem, where really big companies are involved as well? OK, thank you. All right, so I need to take a break here. It's been great having you all here, lots of good questions, lots of good discussions. And we'll probably see each other again in one of the future Hangouts. I'll set those up. So if you're looking out for the next date, I'll try to get some in. I might be able to get one in next week as well, together with some of the top contributors that we're meeting. But otherwise, I wish you all a great weekend in the meantime. And hopefully, we'll see each other again in the future. Thanks, bye. Bye, everyone. Bye, John.