All right, welcome, everyone, to today's Google Webmaster Central Office Hours Hangouts on Air. My name is John Mueller. I'm a webmaster trends analyst here at Google in Switzerland. And part of what we do are these office hours hangouts where webmasters can join in and ask us any questions that might be on their mind with regards to web search and their websites. There are a bunch of questions submitted already. We can go through some of those. But if any of you have any questions to start us off with, feel free to jump on in now. Nothing particular on your mind yet? OK. Hi, John. OK. All right, go for it. Do you have different algorithms for the desktop version and the mobile version for ranking purposes, or the same? Well, we try to use most of the same factors across both of those. But there are some differences when it comes to desktop and to mobile. So you will see some changes in the rankings. And you will see differences with regards to the types of elements that we show in the search results. Because sometimes we think it really makes sense to maybe show more local information on mobile because maybe you're on the road. Sometimes it makes sense to show less. So the amount of these different elements that we show in search, they can vary as well. Any blog suggestion where I can go through these kinds of things? I don't think we list our ranking factors anywhere. So I don't have anything specific to point out there. OK, any recommendation to follow for that one? For ranking purposes on mobile? For ranking, I would just continue to work on making a great website, making sure that it works well for users, that it's easily crawlable, and that the content is easily findable by Google. So it's not that you need to do anything kind of magical on mobile there. OK, thank you. All right. Hi. Hello, John. Hi. Pratik here from Ahmedabad, India. We've normally talked on Twitter. You're probably aware of Pratik Joshi, right? I'm not quite sure what you mean.
Yeah, my question is that, hello. Yes. Yeah, my question is that normally we read that a landing page should be no more than three clicks away from the home page. So is it mandatory to keep a landing page no more than three clicks away from the home page? No, no, it's not a requirement. It's a good practice, I think, because it makes it easy for users to find your content within your website and easy for us to crawl to that content, but it's definitely not a requirement. OK, another question is, link building is still an important ranking signal, right? I don't think you could really say it like that. Yeah, I agree. Because I have seen many websites having zero backlinks that are ranking better in Google with good quality content. Yeah, that's certainly possible, yeah. Yeah, thank you for that. Hello, John. Hi. Hello, my one question is my website is ranking on the first page in Google.co.do team, but not ranking in the top 100 in the US, on Google.com, for the keyword software development company. In April 2017, on Google.com it was in the top three on the first page, but now it's out of the top 100. Is there anything we need to take care of? I think that's completely normal. So these are for different countries, and just because it's ranking well in one country doesn't mean it will rank well in other countries. So that's kind of expected. Sometimes that also has to do with the competition, and that in some countries, the competition is just a lot stronger. And you can't easily say, for example, I would be ranking well in Switzerland, but that doesn't automatically mean that I should be ranking well in the US. So that's something that's completely normal from our point of view, to see different rankings across different countries. All right, let me run through some of the questions that were submitted. And as always, if you have any questions or comments along the way, feel free to jump on in. Let me mute some of you. There's a little bit of background noise.
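The "three clicks from the home page" practice discussed above is easy to check mechanically: a breadth-first search over your internal link graph gives each page's minimum click depth from the home page. A minimal Python sketch, using a purely hypothetical site structure:

```python
from collections import deque

def click_depths(home, links):
    """Breadth-first search over an internal-link graph.

    `links` maps each URL to the URLs it links to; the depth of a
    page is the minimum number of clicks needed to reach it from
    `home`. Pages not reachable from the home page get no depth.
    """
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical site structure for illustration:
site = {
    "/": ["/category", "/about"],
    "/category": ["/category/page-1", "/category/page-2"],
    "/category/page-1": ["/category/page-1/detail"],
}
print(click_depths("/", site))  # e.g. "/category/page-1/detail" is 3 clicks deep
```

Pages that come out at a large depth, or are missing from the result entirely, are worth a look. Not because depth is a hard limit, as John says, but because deep or orphaned content is harder for users and crawlers to find.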
Feel free to unmute if there's anything specific you want to add to that. All right, we're a clothing e-commerce site, and for stock reasons, we need to split out a number of color variations into kind of separate URLs. We'd like to copy the product reviews over from the master URL to the different color URLs as well. Is this duplicate content? Or is this tolerated because the reviews are valid? OK, so essentially, to this question about copying content across different URLs for the same product, that's something we would essentially see as duplicate content. But it's not the case that we would say we would demote the website because of that. We'd recognize this block of text is the same across different URLs on the website. And if someone is just looking for that block of text, we would just pick one of those and show just that one. So that's not necessarily a bad thing. With regards to the reviews, I suspect that might be a little bit trickier. I don't know for sure how we would handle that from the structured data side, if that would be OK. I assume that that wouldn't be too much of a problem, though. So what I would do there, if you really want to be sure, is to go to the Webmaster Help Forum and post in the structured data section there to get a confirmation. Feel free to send me the link to that thread, and I can pass it on to the team here to get a better answer for that. The other part of the question is why is Search Console reporting over 40,000 indexed pages, whereas when I do a site query, it shows a more realistic figure of 8,000 pages? So I think this just has to do with the way different things are counted.
A site query from our point of view is an artificial query, and we optimize that more for speed rather than for accuracy, especially the "about" count on top. So I would not use the "about" count in a site query for any kind of diagnostic purposes. What I would use instead is the number in Search Console, and ideally the indexed count per sitemap file. So in the sitemap section, you get a count of the URLs that you submitted in the sitemap file. How many of those are actually indexed? Because the difference between those different kinds of numbers is, with the sitemap file, you're looking at the URLs that you really care about. So these are the ones that you think are important, and of those, it makes sense to let you know how many of those we actually picked up for indexing. Whereas if you just look at the URLs overall, then maybe we'll run off into some kind of infinite filter or some kind of pattern on your website and crawl and index a bunch of different URLs. But those are not necessarily URLs that you actually care about. So that's why I would focus on the sitemap's count. And of course, for that, you need to have a pretty good sitemap file, which is a good practice anyway. We just migrated to HTTPS, but our developer created a new property in Search Console as opposed to adding another property within the HTTP version. Is this OK? Yes. That's actually the normal way to do it. I don't think you can add a separate property within an existing property. So unfortunately, in Search Console, it can look a little bit messy if you have a lot of different variations of your site listed there, but that's the normal way it actually is. Hey, John. Hi. Hi, could I ask you a quick question? Sure. We're in the process of developing a new website. And the guys, the technology team here, are thinking of developing it in React JS.
So I just wanted to get your recommendations and whether or not actually we should maybe choose a different technology over React from an indexation perspective. It's tricky to say. So I think if you're using a JavaScript framework like React, then one of the things you kind of need to watch out for is that Google needs to be able to render those pages to be able to index the content there. Without rendering, we would usually, I mean, depending on how you set this up, we would just see the blank kind of HTML framework page and not the actual content there. So that means that some things are going to be a little bit trickier in the sense that we need to be able to render those pages. You can, of course, test that with Search Console, with the fetch and render tool to make sure that it's actually working. You probably also need to think about what happens outside of the Google ecosystem. So if you're interacting a lot with Facebook or Twitter or social media or other search engines, then that's something you might also need to take into account and think about what you might need to do specifically for them. And finally, one thing that's also worth keeping in mind is that pretty much all of the SEO tools, they focus on the individual URLs and on an HTML basis. They don't render their pages. So if you need to use some of the existing SEO tools to kind of analyze your website and figure out where the actual problems are, that's something where you need to make sure that these tools support a rendered view as well, which might make it a little bit trickier. So from my point of view, what you're kind of balancing is kind of ease of development and usability with ease of doing things for SEO. And sometimes it makes sense to focus on kind of fast development and usability and say, well, we realize SEO is going to be a little bit trickier and we can live with that. 
We have enough other ways of getting traffic to the site that we don't need to be 100% reliant on that specifically. Maybe that makes sense. Other times it might be that you're saying, well, actually, search is a really critical aspect of our site. And so far, 90% of our traffic has come through search. And we really need to make sure that everything around search is perfect. And in a case like that, maybe it makes sense to think about other ways that you can handle this, which might involve pre-rendering the pages on your site to make sure that search engines and other kind of bots already have a static HTML version that they can crawl and index from there. So yeah, I guess there's no yes or no answer. I think the whole ecosystem is moving towards being able to process these kind of rendered pages, because lots of people are using these frameworks now. But it's still fairly early days. And do you think long term this is the approach to take, or is there an alternative solution that we should think about? I think in the long term, more and more services are going to support rendered content directly. That's kind of the way that the web works in that things evolve over time, and then other services kind of adapt to the current situation. So that's definitely going to happen. In the meantime, there are some things that you can probably do as a stopgap measure, which could be to pre-render the content on your site and just serve that to crawlers and search engines, for example. OK, great, thanks. All right. Let's see, one question around merging sites and the change of address tool. So I think they're merging two different e-commerce sites, and the change of address tool in Search Console doesn't seem to work for that, because it tests for different kinds of redirects. From my point of view, that's kind of working as expected. 
The change of address tool is specifically for moving from one domain to another domain and kind of taking everything as it is on your old site and putting that on the new site. So kind of replacing the address, the domain name, within your website, and keeping everything else the same. So that's what this tool is essentially made for. So for merging sites, it's not really suited for that, because you're taking two sites and you're turning that into one site, which could be one of these two, or it could be even a separate site that you're kind of turning that into. And from our side, merging and splitting sites are things that generally are a lot trickier to process, because we have to essentially reprocess the whole final website to figure out what is the actual context of the individual pages on here, because suddenly everything is kind of interconnected in a new way. And so that's something where you will see fluctuations for some amount of time until everything settles down, a lot longer than if you just move from one domain to another one. So this is something that will take quite a bit of time to settle down. And the change of address tool wouldn't be suited to try to resolve that. OK, let me see. There are a bunch of questions in the chat as well. Wow. Is it preferable to use dynamically generated content for optimizing pages for search queries? So I'm not quite sure what you mean with optimizing pages for search queries, but in general, dynamically generated content is perfectly fine. If that's something you generate on your server or if you generate that with JavaScript, that's perfectly fine. A lot of websites use different ways of generating content dynamically, kind of beyond just using the static HTML pages that used to be commonplace. Let's see. Some of those we're kind of looking into. You're always suggesting that content is king and to concentrate on quality. Then how are Pinterest and YouTube ranking on the first page?
I think that's kind of making an assumption that Pinterest and YouTube are not quality content and not actually good. I think a lot of people would disagree with you there. There are lots of people that use these services and they're pretty happy with them. So I think to some extent it makes sense to actually show them in the search results. OK, some examples for a question from previously. In general, if you have example URLs, it's probably best to post them maybe in a Webmaster Help Forum thread because that's something I can pick up separately. Looking at examples live is really kind of tricky. So I'll just copy these onto the side. But in general, I'd recommend trying to put these into maybe a forum thread and getting advice from other people as well. De-localizing a domain in Search Console, would this affect the local rankings? For example, taking a .com from UK to global. That can affect the local rankings in the sense that if we were relying on geotargeting to show this page a bit higher for local search results, then that would no longer be the case. So specifically with geotargeting, what happens is you tell us that your site is specifically for users in one specific country. And on our side, when we can recognize that users in that country are searching and that they're searching for something that they'd like to have locally, then we can take your site and promote it subtly in the search results to show it a little bit higher. So if you remove that geotargeting aspect, then for users in that country searching for something local, maybe we wouldn't show it as highly. But for the most part, I suspect if you're saying you're going from one country to global, then probably that's kind of compensated by everything else that you're doing globally. So I suspect you would see a subtle change in rankings, specifically in that country that you used to geotarget. But I imagine overall, it wouldn't be that much of an issue. All right.
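The geotargeting behavior John describes, a subtle, conditional promotion that goes away when you de-localize, can be sketched as a toy scoring rule. Everything here, including the boost value, is an illustrative assumption, not anything Google has published:

```python
def adjust_score(base_score, page_geotarget, user_country, query_is_local):
    """Toy model of the geotargeting effect described above: a page
    geotargeted to the searcher's country gets a subtle promotion for
    local-intent queries; removing the geotarget removes the promotion.
    The 1.25 multiplier is an arbitrary illustrative value, not a real
    Google weight."""
    if page_geotarget is not None and page_geotarget == user_country and query_is_local:
        return base_score * 1.25
    return base_score

# A UK-targeted page, for a UK user with local intent, gets the boost...
boosted = adjust_score(10, "GB", "GB", True)
# ...while the same page "de-localized" (no geotarget) does not.
plain = adjust_score(10, None, "GB", True)
```

The point of the sketch is the shape of the rule: the promotion only applies when the geotarget, the user's country, and local intent all line up, which is why removing the geotarget mainly affects rankings in the formerly targeted country.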
Let me see what other questions we have here. Internal links. I think we talked about this. Should the hreflang on a mobile site point to the canonical pages or the m. pages? So specifically for mobile pages, if you're using m. URLs, so separate mobile URLs, and you want to use the hreflang tag to link different versions of those pages. So for example, you have one mobile page for Germany, one mobile page for Austria, and they're both in German. So you want to use the hreflang link between those pages. Then the hreflang link should be between the same versions of the pages. So the desktop page's hreflang tag should point to the other desktop page, and the mobile URL's hreflang tag should point to the other mobile URL. So that's something that has subtly changed over time. But that's specifically for, I think, the mobile first indexing and for web search in general. That kind of makes sense because you're telling us this is the equivalent version for that specific language page. Of course, all of this is irrelevant if you use responsive design or if you use dynamic serving, where you're using the same URLs for desktop and mobile. Those are kind of the recommended approaches anyway from our point of view. I think a lot of the quirks around mobile first indexing essentially go away once you use responsive design, because then you have the same content on your desktop and mobile page. You don't have to worry about manually updating that. And all of the links are always the same because you have the same URL for desktop and mobile. So that's one thing where if you're unsure about a redesign maybe this year or next year, I guess, then I would definitely make sure that you're looking into responsive design as something kind of as a baseline setup. We're seeing some strange and unrelated sitelinks. Is this something I should report or give feedback on? Or would the unrelated sitelinks eventually be changed due to low interaction? I think it's useful to have this kind of information.
So if you want to send them to me, that's perfectly fine. In general, we do adjust our algorithms for these different elements all the time. And it's sometimes possible that you see something kind of weird as a sitelink and then it goes away. The important thing to keep in mind is that sitelinks are related to the query and to the user. So if you're doing something like a site query and you see something weird in a sitelink, then that wouldn't necessarily be representative. But if you're looking at normal queries that normal users are doing and you see a sitelink, then that's probably more representative of what users would actually see. What do you think about SEO university challenges that try to rank on top of Google for imaginary terms? I don't know specifically about the SEO university challenges. We sometimes do see these kinds of SEO contests for imaginary terms. From my point of view, that's kind of, I don't know, it feels like a lot of time wasting because it's really kind of irrelevant with regards to normal content, normal URLs, normal websites. But people do weird things to spend their time. I think it's a bad example because it's not related to reality. Exactly, yeah, I mean, but people do weird things. So it's something where I think, as long as it doesn't affect the normal search results, it's less of an issue for us. But it's not really relevant in the sense that whoever wins there is a master of SEO and knows how to make any website rank number one. Often, that's not the case. Sorry? It may change over time from one hour to another. If two teams are competing for the top position, it can change from one hour to another. Yeah. And I think it's totally irrelevant. And it doesn't bring any knowledge. It's far better to try to minimize bounce rate than to do challenges like that. Yeah, I mean, I don't know. Like, I think one CMS or someone had an SEO contest like that recently.
And as far as I recall, the top site was essentially a website that was reporting on the SEO challenge. So it's not even like a normal participant in the whole thing. But it's like, I don't know. People do weird things. And from my point of view, for the most part, if that doesn't cause any issues with the rest of our search results, it's fine. Thank you. Sure. A question about redirects, particularly 302 and 307. OK, you said that for a 302, search engines tend to index the content under the original URL, since it's unsure whether it will always redirect to the new URL. Does that mean it indexes the content, and the original content is completely ignored? I'm not sure I quite understand the question. But in general, the theoretical difference between a 301 and a 302 is kind of which URL we use to show in the search results. It's the same content, because with a redirect, there is only one version of the content, like the final landing page. The end result of the redirect is kind of the only one that has the content. So it's mostly a matter of which URL we show in the search results for that piece of content. So with a 301, you're saying the old URL is no longer relevant and the new one is kind of the full replacement. It's a permanent redirect. So what we'll do is use the new URL primarily. That's kind of simplified. There are lots of other things that flow into that. But for the most part, that's the case. Like you say the old one is irrelevant, so we take that one away and we focus on the new one. With a 302 redirect, you're telling us this redirect is temporary. So you're saying that it might change. At the moment, this URL is pointing at this one here, and that's the content that we pick. But it might be that tomorrow it actually points at a different URL, and we'd be able to index that different URL's content.
So what would happen there is we would take the content of the final landing page, but we would try to index it under the URL of the initial URL, kind of the one that is redirecting. So in a case like that, the same thing happens from a crawling point of view in that we crawl, we see the redirect, we crawl the final destination, and we index the content of the final destination. But we try to show it in the search results under the URL of the redirecting one, so kind of the original URL that we saw. That's essentially the main difference there. With regards to PageRank and all of our signals, all of that stays the same. It's the same content, the same signals. It's just a matter of which of these URLs we actually show in the search results. So with that in mind, it's kind of a very subtle difference. And another thing that kind of makes this trickier is when a 302 redirect stays in place for a really long time, it's more like a permanent redirect. It's more the situation where, from our point of view, it looks like maybe you used the wrong type of redirect, and maybe you meant to use a permanent redirect instead, so maybe we'll shift over to the other URL. And there are some other factors that also come into play there when it comes to picking the right URL to show in the search results. So that's kind of everything that we consider under canonicalization. So different factors like a rel canonical on the page, they help us to pick the right URL, internal linking within your website, the sitemap file if you have anything there. All of these things kind of add up. And then we say, well, everything points at this URL, so we'll pick that one for indexing. Or what might happen, or what happens in a lot of cases, is we have several signals that say this is the right URL, and some signals that say this other one is the right URL. And we may need to make a judgment call.
And sometimes we'll say, well, we'll take this one because there's more strong signals here, or we'll take this one because we think that's a cleaner one or other strong signals there. So all of that kind of comes into play. So with that in mind, it's usually less of an issue of SEO value and crawl budget with regards to redirects and more matter of just which of these URLs you want to have shown in the search results. And if you do have a preference, which one you prefer, make sure that everything kind of aligns with your preference. And when you think about it that way, you also don't need to worry about things like 307 redirects because essentially, that's just another type of redirect. It's not anything kind of unique where you could add another spin and say, well, in this case, Google will index my content under one of these two URLs. It's always a matter of which one of these two URLs will use for indexing. Does Google consider the backlinking between the main domain and a subdomain that serves as a service of the main website as a spammy behavior? No. So if you have different parts of your website and they're on different subdomains, that's perfectly fine. That's totally up to you. And the way people link across these different subdomains is really up to you. I guess one of the tricky aspects there is that we try to figure out what belongs to a website and to treat that more as a single website. And sometimes things on separate subdomains are like a single website. And sometimes they're more like separate websites. For example, on Blogger, all of the subdomains are essentially completely separate websites. They're not related to each other. On the other hand, other websites might have different subdomains, and they just use them for different parts of the same thing. So maybe for different country versions, maybe for different language versions, all of that is completely normal. 
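The canonicalization process described a bit earlier, several signals each voting for a URL with a judgment call when they disagree, can be sketched as a simple tally. The signal names and the equal weighting here are assumptions for illustration; the real system weighs signals in ways that aren't published:

```python
def pick_canonical(signals):
    """Tally canonicalization hints per URL and pick the URL most
    signals agree on. `signals` is a list of (signal_name, url)
    pairs, e.g. rel-canonical, internal links, sitemap entries,
    redirect targets. Each signal counts equally here, which is an
    illustrative simplification: the real system weighs them.
    """
    votes = {}
    for _signal, url in signals:
        votes[url] = votes.get(url, 0) + 1
    # Break ties deterministically by URL so the choice is stable.
    return max(sorted(votes), key=lambda u: votes[u])

# Hypothetical signals: three point at /a, one (a lingering
# temporary redirect) points at /b, so /a wins the tally.
signals = [
    ("rel_canonical", "https://example.com/a"),
    ("internal_links", "https://example.com/a"),
    ("sitemap", "https://example.com/a"),
    ("302_redirect_source", "https://example.com/b"),
]
print(pick_canonical(signals))
```

This is why the practical advice above is to make everything align: when rel canonical, internal links, sitemap, and redirects all point at the same URL, there's no judgment call left for Google to make.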
How does the Google team handle complaints about websites that build their content using copy and paste policies? We do take that into account. So if you use the spam report form, that is something that is valuable to us. However, for a large part of these issues, our algorithms tend to take care of them automatically. So websites that are purely based on copied content from various sources, those are pretty trivial for our algorithms to pick out and to show less frequently in search. If a website offers adult-related products as well as products for all age groups, does this mean none of their products can achieve rich snippets, or only those that fall under the adult-related categories? Or in other words, are rich snippets handled on a per-page basis, or is the entire domain flagged as adult-related because a part of the assortment is for adults? In general, we try to be as granular as possible with regards to different kinds of classifications of websites to figure out which parts are, for example, adult-related and which parts aren't adult-related. Sometimes that's easier to do. Sometimes that's a lot harder to do. So what can sometimes happen is, if we can't cleanly separate out the adult-related content and we see that it's clearly a large part of the website, then it might be that we will say, well, the whole website is kind of adult-related, or we can't really tell which parts are clearly adult-related. And that might be something where we'd say, well, it makes sense to kind of use SafeSearch as a way of showing this to the appropriate audience. Other times we can recognize different parts of a website as being clearly adult-related and other parts as being clearly for a general audience. That can be something that you can help us with, in the sense that maybe you can split things up into a subdirectory or a subdomain to make it easier for us to recognize which parts of the website kind of fall into which bucket.
Another thing you could do is maybe even just use separate domains to really clearly tell everyone, including Google, that these are actually kind of clearly separated parts of the same business. So those are all some options that you can do there. In general, it's not something that you can rely on our algorithms to always get right on a per-page basis. So if you have a lot of varied content and you have two or three adult-related pages in there as well, you can't rely on Google to always just pick those two pages out and say, well, we need to treat these differently. Or similarly, if you have a large amount of adult content and you have a handful of pages that are clearly not adult-related, then you can't completely rely on us to be able to pick out those individual pages and say, well, all of this is adult-related, but these two pages are not adult-related, so we'll try to treat them differently. So that's something that's kind of tricky at times. But again, the easier you can make it, the more likely we'll be able to take that into account. Hello, John. Hi. Hi. I have a question about content duplication. OK. Tell me. It's for cases when this duplication is really, really hard to avoid. So let me explain. We are a health services comparison website. And we have, for example, pages for dental clinics in Dublin offering dental bridges, and another one for dentists in Dublin offering fillings and dental implants. And the list goes on and on. So you can imagine that for the majority of those pages, the content that will be presented, in terms of the clinics that will be listed, will be fairly similar. And the same, I think, holds true if you look at it from the location dimension. So again, you can imagine that for dentists in Dublin, the content will be similar to dentists in County Dublin, and so on.
So we do want to provide to the users the exact page that they want in terms of the location they're looking for and the treatment that they're looking for. At the same time, we are conscious that this causes some kind of content duplication. So the question is, is this type of duplication something to worry about? For the most part, it should be fine. I think the tricky part that you need to be careful about is more around doorway pages in the sense that if all of these pages end up with the same business, then that can look a lot like a doorway page. But just focusing on the content duplication part, that's something that, for the most part, is fine. What will happen there is we'll index all of these pages separately, because from a kind of holistic point of view, these pages are unique. They have unique content on them. They might have chunks of text on them, which are duplicated. But on their own, these pages are unique. So we'll index them separately. And in the search results, when someone is searching for something generic and we don't know which of these pages are the best ones, we'll pick one of these pages. And show that to the user. And filter out the other variations of that page. So for example, if someone in Ireland is just looking for dental bridges and you have a bunch of different pages for different kind of clinics that offer this service, then probably we'll pick one of those pages and show those in the search results and filter out the other ones. But essentially, the idea there is that this is a good representative of the content from your website, and that's the one that we would show to users. On the other hand, if someone is specifically looking for, let's say, dental bridges in Dublin, then we'd be able to show the appropriate clinic that you have on your website that matches that a little bit better. So we know dental bridges is something that you have a lot on your website. And Dublin is something that's unique to this specific page. 
So we'd be able to pull that out and to show that to the user like that. So from a pure content duplication point of view, that's not really something I'd totally worry about. I think it makes sense to have unique content as much as possible on these pages. But it's not going to sink the whole website. If you don't do that, we don't penalize a website for having this kind of duplicate content. Kind of going back to the first thing, though, with regards to doorway pages, that is something I definitely look into to make sure that you're not running into that. So in particular, if this is all going to the same clinic and you're creating all of these different landing pages that are essentially just funneling everyone to the same clinic, then that could be seen as a doorway page or a set of doorway pages on our side. And it could happen that the web spam team looks at that and says, this is not OK. You're just trying to rank for all of these different variations of the keywords. And the pages themselves are essentially all the same. And they might go there and say, we need to take a manual action and remove all of these pages from search. So that's one thing to watch out for in the sense that if they are all going to the same clinic, then probably it makes sense to create some kind of a summary page instead. Whereas if these are going to different businesses, then of course, that's kind of a different situation. It's not a doorway page situation. That was very helpful. Thanks, John. Sure. OK, let's see. Long question submitted here. I know Google tries everything possible to understand what content is the most important on a page and what content isn't as important, partially based on backlinks. However, for small businesses, that often means forcing content on your home page, because that's the only place you can get backlinks. Would it ever be possible to allow webmasters to directly tell Google which pages are important on a website? 
I think it goes into some examples there. For the most part, I think the assumption that we would only show pages that have direct external backlinks is incorrect, in the sense that we do try to use links to some extent when it comes to ranking, but it's not the most critical factor. We do look at the primary content on a page, and we do use internal linking to kind of spread the signal that we get from those links out to other pages within the website. So just because everyone is linking to your home page, which, like you mentioned, happens a lot with small businesses, doesn't necessarily mean that you need to put all of your content on your home page as well. I would definitely go ahead and structure your website in a way that makes sense. Have a clean structure within your website. Have clean internal linking. Maybe don't separate things out too far. But essentially, having a well-structured website, a well-made website, should just work in cases like this. Obviously, especially for a small business, it's sometimes a bit tricky because you don't have a lot of practice putting together websites like this. So what I would recommend doing is maybe going to one of the Webmaster Help forums. It could be ours. It could be any of the others as well, to get advice from peers with regards to what makes sense to put on a website, how broad to make your website, or how focused you should make those individual pages. And maybe to get some ideas for things that you can test, try out, and track, and based on that, work on your website a little bit. So it's definitely not the case that we would only show the home page of a small business in the search results. We definitely show a lot of lower-level pages as well. We run an e-commerce fashion portal. We're currently planning on using site links generated using our search module to target only one keyword per page.
We've decided so because it's difficult to target many keywords on a single page. All of the keywords would be related to a single product type, but a separate page would be generated with a desired title, description, and H1. Hard to say specifically what that would be like. I don't know. I'd probably need to see some examples. So what I would do in a case like that is also to check out the Webmaster Help Forum and to give them some examples of what specifically you're trying to do there. In general, it sounds like you're making essentially product pages for these individual items, and that's perfectly fine. On the other hand, if you're doing something like automatically generating pages based on just a list of keywords that you're finding, then that seems like a bad idea, because in general, these pages tend to be really low quality. When people land on those pages, they don't really find what they're looking for, and that's something you probably want to avoid doing. So depending on which direction you're trying to head there, I would definitely get some advice from peers to get their take on what makes sense or what doesn't make sense. On mobile, we're giving one link to the user, and if they click, then the information is shown because of the small screen. The content, as such, is available in the DOM, but I'm getting a content mismatch for that. Is there any article in the Webmaster blog that I can go through for this? So content mismatch is something that is specific to AMP pages at the moment, in the sense that we want the AMP pages to be equivalent to the normal pages that people view on your website. So if someone is looking at a page on your website in one format, it should be pretty much equivalent in content and functionality in other variations of that page.
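As a rough illustration of that "equivalent content" requirement, here is a small sketch. The function names, HTML snippets, and comparison method are my own invention for illustration, not an official Google check; the point is simply that a teaser-style AMP page carries much less of the canonical page's wording than a true equivalent would.

```python
# Hypothetical sketch: a crude content-parity check between an AMP page and
# its canonical page. This is NOT how Google's AMP comparison works; it just
# illustrates the idea that both versions should carry the same content.
import re

def visible_text(html: str) -> str:
    """Strip script/style blocks and tags, then collapse whitespace."""
    text = re.sub(r"<(script|style)[^>]*>.*?</\1>", " ", html, flags=re.S | re.I)
    text = re.sub(r"<[^>]+>", " ", text)
    return re.sub(r"\s+", " ", text).strip()

def content_parity(amp_html: str, canonical_html: str) -> float:
    """Fraction of the canonical page's words that also appear on the AMP page."""
    amp_words = set(visible_text(amp_html).lower().split())
    canonical_words = visible_text(canonical_html).lower().split()
    if not canonical_words:
        return 1.0
    return sum(w in amp_words for w in canonical_words) / len(canonical_words)

full_page = "<html><body><h1>Dental bridges</h1><p>Full article text explaining the procedure.</p></body></html>"
amp_teaser = "<html><body><h1>Dental bridges</h1><p>Click here to read the full article.</p></body></html>"

print(content_parity(full_page, full_page))   # identical pages score 1.0
print(content_parity(amp_teaser, full_page))  # a teaser scores noticeably lower
```

The actual advice in the answer above is simpler than any metric: make the AMP page a full equivalent of the page it represents rather than a teaser that links out to the real content.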
So if you have something like a simplified AMP page where you just have a few lines of text and then a link at the bottom saying click here to see the full content, then from our point of view, that's a really bad user experience, because users go to the AMP page to get the content quickly. And if you're saying, well, I'll serve them a teaser for the actual content very quickly, but for the real content, they have to go somewhere else, then that's not really in the spirit of a fast mobile page. It's more like you're just leading people along and not actually giving them what they're looking for. So for that, I'd really make sure that when people open the AMP page, it's really equivalent to the actual full page that they would otherwise open, be that on desktop or on mobile. So that would be my advice there. I have a question on Google for Jobs. We have a bunch of clients that added structured data, and their name is displayed somewhat incorrectly. What I would do there is also post in the Webmaster Help Forum. I know, especially for the jobs side, people are actively looking for these types of questions and trying to help find solutions for them. So if you can post in the forum, ideally with some example URLs and some example queries so that we can see this problem as well, then the team can take a look at that and see what they can do to help resolve that issue. So a while ago, probably a year ago, we had a discussion about our .com domain. I'm based in the UK, and it had a legacy .co.uk domain that was starting to rank, although it was redirected to the .com domain. More recently, what we've noticed is the same .com domain is now starting to get visibility in Australia and South Africa and other English-speaking countries. So we see this as kind of a recurring problem with geolocalization of the .com domain, and whether we should migrate essentially back to the .co.uk to try and sort this out from your side to make sure that we're not...
I didn't get that last part. Can you repeat that again? So the question is whether we should move to the .co.uk domain to ensure that we localize visibility, all our search results, to .co.uk, and we don't essentially have our .co.uk URLs ranking in Australia and South Africa. Ideally, that should just work. So I wouldn't migrate back. I think that would probably just confuse things a little bit more. But if you can send me some example queries or some example URLs, I can try to take a look at that as well. I've been talking with some of the folks that are working on the internationalization side a bit, and examples like those would be really helpful to point out and say, this is really not working the way that it's expected. And especially if you're seeing that .co.uk ranking in Australia instead of a generic .com or an Australian domain, then that seems like something that shouldn't be happening. Okay, I'll send you some examples. Great, thank you. All right, perfect. Okay, looks like we still have a bunch of questions left, but time is running kind of short, so I thought I'd just open up for any more questions from you all. What else is on your mind? I have one more question. Okay. So I think it was last year, from a Google source, it was publicly mentioned that the three most important things for rankings are RankBrain, content, and links. Is that still valid? Do you have any comment on this? I don't think it makes sense to have a top three ranking signals list, primarily because that changes over time, and that changes from query to query, from day to day, and from user to user, essentially. There's a lot of personalization involved. It's something where you can't say this is always the top ranking factor.
I think the external ranking factor lists are kind of interesting to look at, but they're not something that you should use as a prioritization list with regards to making a website. On RankBrain, can we assume that its importance is going to grow as we move on? I think a lot of people make assumptions around RankBrain that aren't necessarily true. So from that point of view, it's really hard to say what will happen with this big chunk of things that people put under the heading of RankBrain. A lot of that falls into, for example, understanding the query better. So if you type something into Google, sometimes it's a query that we've never seen before. Sometimes there are subtle things there where we can't just match the keywords one to one, and we need to understand what it is that you're actually looking for. And especially if you look at the newer generation of people who are using the internet, they're not going to Google and saying keyword, keyword, keyword. They're asking a question. They're asking, like, what happens when I do multiplication, or something like that, when actually they're trying to look at very specific pages. So this kind of understanding of the query is something where we use a lot of machine learning, and that's something that people sometimes put under the RankBrain heading. People sometimes just put artificial intelligence in general under RankBrain, where I think we do use a lot of machine learning across different parts of search. And not all of it can really be quantified in the sense that it's like, here are links, and here is content, and here is machine learning. It's a technology that you can use pretty much everywhere. So, yeah, I guess that in short just means I don't really have an answer for, can we expect RankBrain to grow in importance?
I don't think that's really a question that has a specific answer that we can give there. Thanks, John. All right, more questions from any of you? What else is on your mind? So quiet today. Okay, that's fine too. Let me double-check the questions that were submitted and see if there are any that we've missed out on. Our site was ranking on the first page for the keyword fridges and freezers in Australia. In the last two weeks, we moved all the way down to page five. Technical changes haven't been made to the site. The only modification was that we added more category landing text to rank for various other fridges. I don't understand what happened here. So I'd probably need to take a look at the site to see a little bit more of what is actually going on there. But in general, our ranking algorithms change all the time, and the way our algorithms look at sites also changes over time. So just because a site was ranking number one for a long time doesn't necessarily mean that it'll always be ranking number one. That's something that can change over time. The modification that you mentioned, that you put more category landing text on the page, might also be something that's playing a role there. What I see a lot with e-commerce sites is that they take a category page that's actually pretty good, and they stuff a whole bunch of text on the bottom that's essentially just roughly related to that content, which ends up bigger than the Wikipedia page on that topic. And from our point of view, when we look at things like that, our algorithms quickly back off and say, whoa, it looks like someone is just trying to use keyword stuffing to include a bunch of unrelated content on the same page. And then our algorithms might be a bit more critical and cautious with regards to the content that we find on this page. So that's one thing to watch out for.
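To make the keyword-stuffing point concrete, here is a toy sketch. The metric, the sample strings, and any notion of a "suspicious" level are my own illustration, not a published Google signal; it just shows how a block of appended category text that repeats one keyword skews a page's word distribution compared to natural writing.

```python
# Illustrative only: a naive keyword-density measure. Appended "category
# landing text" that repeats one keyword over and over pushes this number
# far above what natural writing produces.
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Share of the words in `text` that are exactly `keyword`."""
    words = text.lower().split()
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

natural = "We compare fridges and freezers so you can pick the right model for your kitchen"
stuffed = "fridges cheap fridges best fridges buy fridges fridges online"

print(round(keyword_density(natural, "fridges"), 2))  # 0.07
print(round(keyword_density(stuffed, "fridges"), 2))  # 0.56
```

A real quality system is far more nuanced than a single density number, but the contrast between the two values mirrors the "whoa, back off" reaction described above.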
I think it's good to provide more context for the things that you have on your website, but be reasonable and think about what users would actually use, and focus on that kind of content. So for example, if the bottom of these pages is just a collection of keywords and a collection of sentences where those keywords are artificially used, then probably users aren't going to scroll to the bottom, read all of that tiny text, and actually use that content in a useful way. And then probably search engines are also going to back off and say, well, this page is doing some crazy stuff here. We don't really know how much we can trust the content on the page. I have a question about PageSpeed Insights. It's telling me to set an expiry date or maximum age for static resources for the Google Analytics code. Why does Google not follow its own rules? Yeah, good question. So I think there are two aspects there that play in with regards to these tools. On the one hand, we don't special-case our own content. So if you use different tools on your pages, we count those the same as any other tools that you use on the pages. So if you're using Analytics, or if you're using AdSense on pages, and these scripts are slowing your pages down because of the way you've added them to those pages, then we call that out and say, well, these are slowing things down. It's not the case that we would say, well, it's slowing things down, but it's from Google, therefore we'll just ignore it. We really want to make sure that these pages actually work quickly. So we will call out issues that we find, even if they come from Google products. The other thing that sometimes plays a role here is that we sometimes do things in a very, I don't know, how can I say it?
Smart way, in the sense that we try to figure out who the user is and serve them more personalized content for the same URL. So specifically around things like scripts that are used fairly frequently, that's something where our servers can sometimes serve this content in a way that makes a little bit more sense, which also means that for testing tools, it might be served in a way that's more generic than it actually needs to be. So it might be that what the PageSpeed tool is looking at here is not actually what a normal user would see when they access these pages. For that, what you could do is use Chrome, for example, or any other browser, when you load those pages, and see how it's served with an actual browser as the user agent. And sometimes you'll see that there's a subtle difference with regards to things like caching headers or compression, and maybe you'll see that it actually is served in a fairly reasonable way. And finally, the last thing to keep in mind is that when it comes to PageSpeed and the various testing tools, they focus on a very specific set of factors, which are not necessarily always the things that are critical for a page overall. So for most pages, these might be really good factors to look at, but you still need to understand that the tool is essentially calling out one specific issue that might not necessarily have an effect on the overall page's loading time. We will still call it out because we think it's worthwhile looking at, but you need to be a smart consumer of those reports and understand what a finding actually means with regards to the changes that you could make, or the changes that are a good use of your time. So those are all things that come into play there. All right, I think we're pretty much over time. I hope this has been useful. I hope you've had some of your questions answered here. And if not, I'll be setting up more Hangouts as well.
I plan on doing more of these into next year as well. We hope to have more of these in different languages too. But if there are any general changes that you'd like to see made to this format, please also let us know. So if you're saying, well, these kinds of live Hangouts don't really work well, or I'd rather have a smaller set of people in the live audience, or a bigger set of people in the live audience, or a different way of submitting questions, or different kinds of questions, or just different kinds of content, then all of these are really useful suggestions for us, and I'd love to hear them from you. So let me know what you think and how these go. Feel free to add comments, send your feedback my way, and we'll see how, or if, we need to subtly change the format going forward in the next year. All right, so with that, let's take a break here. Thank you all for joining, and hope to see some of you again next time as well. Bye, everyone.