OK, good morning, everyone, and welcome to today's Google Webmaster Central Office Hours Hangout. My name is John Mueller. I'm a webmaster trends analyst here at Google, currently visiting the London office, so a bit of a change of scenery here. As always, we're here to answer your webmaster and web search related questions, specifically around Google web search, of course. Is there anything on your mind before I get started with the submitted questions? Does anyone have something to bring up from the start? You're all very quiet today. That's fine. Sometimes Fridays are a bit quiet.

OK, so a bunch of questions were submitted already, so I'll just run through some of those. As always, if there's anything on your mind in between, or anything that you'd like to get clarified, feel free to jump on in, and we can touch upon that, too.

OK, so we have, I guess, a fairly long two-part question to start with. How does Google treat multiple search results from a single domain when all pages provide unique and valuable content? So this is something that comes up fairly regularly. Sometimes with a positive angle: my site is showing up multiple times. Sometimes with a negative angle: how come someone else's site is showing up multiple times for the same query? From our point of view, there is no hard and fast answer for this. Sometimes it makes sense to show multiple results from the same site. Sometimes it doesn't make that much sense, and we try to limit ourselves to maybe one or two results from the same site. So it's not the case that there's a simple line that we draw and say, OK, this is the maximum number of results that we show from one site for any search results page, or, based on these specific factors, we will show more content from that site. It's really something that we try to do based on the query, based on what we think the user is looking for. If we think that the user would welcome multiple results from the same site, then we'll try to show those.

All right. Let me just mute you. There seems to be a bit of echo in the back.

Can adding filtering or attribute options to category pages on e-commerce sites have a positive SEO effect, given that it improves the user experience? So from our point of view, we generally don't focus that strongly on user experience. It's not that we have a user experience ranking factor or anything like that; that's not the case. It can definitely improve the user experience, and through that, indirectly, it could of course have a positive effect. If users go to your site and they're happy with what they see and they recommend it to other people, then that's something that kind of multiplies by itself and brings your site value: on the one hand, from the users going there directly; on the other hand, from the recommendations that we can pick up, the links that we can find to the site, from other signals maybe that we can pick up and say, this is really a good site, we should be showing it more in the search results. So it's not the case that there's a one-to-one relationship between adding filtering to category pages and having better rankings. But there can be an indirect effect, in that you're improving your website, assuming this does improve your website, and then users pick up on that, and from the recommendations that we see, we can try to show your site a little bit better in the search results.
We have an affiliate site, and your quality guidelines mention that we can do product comparisons that add value. If we were to do such a thing, should we show the external link to the competitor's product we're comparing against, along with link references to where we got the information, or is Google clever enough to work it out from our content? How should we implement, essentially, an affiliate site with a product comparison? So from our point of view, it's not the case that we would say product comparisons are exactly what you should be doing. Rather, if you have an affiliate-based website, you need to, like any other website, make sure that you're providing sufficient, unique, and compelling value to the web. So find ways to improve your website, to make it easier for users to do something on your website, to make sure that when we see your website, we understand this is something unique, something important that we should be showing in the search results. All of that adds up. And there's no simple recipe for that, where you can just download a piece of code and suddenly you have a valuable website. You really need to find that value on your own. That's something that sometimes takes a bit of time. Sometimes it's something that you already know, because you know that area; you know where things are still missing on the web, where you can provide a unique spin on that topic. All of those are ideas that you could work on.

Is schema markup used in any of the ranking algorithms at all? So this is a tricky one, in the sense that the underlying question is really: should I be marking up my pages at all with any structured data markup? On our side, there are maybe, I'd say, three main ways that we use this kind of structured data markup. First, there are the traditional rich snippets, which aren't a ranking factor: the search results that we show for your individual pages rank in the same place, but they're marked up in a nicer way. Maybe you have those review stars; maybe you have additional pricing and availability information shown in the snippet. Our idea there is that this provides additional context for your site, so that if users feel that your result is the one they want to look at, based on the additional information that they have, then maybe they'll click on it a little bit more. So that's something where you wouldn't see a direct ranking effect. Second, some of these structured data types are used to show content within a carousel, which we show either on top of the search results or somewhere within the search results. Your content is, on the one hand, ranking as it normally would in the normal search results, but also visible in this carousel, maybe on top. You could call that a change in ranking, because you're more visible at the top of the page. But of course, you're in this carousel with other sites that are similarly shown there, so it's hard to say that it's really a ranking factor, because your normal result is still ranking in its normal place. And finally, there's the aspect that a lot of this structured data markup is also used by our algorithms to better understand your pages. This isn't something where there's a direct one-to-one ranking factor involved; it's more that if we can better understand your pages, we can try to show them in more appropriate ways in the search results. So by being able to rank your pages in the right place at the right time, that can help too.
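As a concrete illustration of the markup being discussed, here's a minimal sketch of schema.org Product markup in JSON-LD, with hypothetical values. This is the kind of markup that makes a snippet eligible for review stars and pricing information, not a ranking boost:

```html
<!-- Minimal, hypothetical schema.org Product markup in JSON-LD. This can make
     a result eligible for rich snippets (stars, price, availability); it does
     not change where the result ranks. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.4",
    "reviewCount": "89"
  },
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "EUR",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```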
All right. If I understand correctly, Google is very good at dealing with spammy links coming to your website, so we don't necessarily have to disavow all the time to keep on top of things. If Google can take care of things on our behalf, then isn't the disavow tool somewhat redundant? Yes and no. For a large part, we can deal with these on our own. But from talking with webmasters, I know you all like to take control of things as well, when you're not sure if Google is actually doing what you expect it to do. So the disavow tool is a great way for you to say: I found this weird problem with spammy links to my site, maybe something a previous SEO did for my website, and I just want to make sure that Google really, really, really doesn't take these into account. Then you can use the disavow tool, and you can be fairly sure that they're taken out of the picture. So it's something we provide to give you a bit more opportunity to clean things up.

All right, it looks like there's a bunch of stuff happening in the chat. Let me just run through that in the meantime. And if there's anything else in between from your side, let me know.

Does Google ignore the robots.txt file if it returns a 30x result code? So basically, if the robots.txt file redirects: we have that mentioned in our documentation. We do follow those redirects for the robots.txt file.

I have an e-commerce site on WooCommerce with multi-variant products. Every variant has a different price, so how do we use schema for this? How do we tag multiple prices? I don't know. That's something you'd probably have to look up with your platform provider, to see if there are some options there.

Why is the change of address tool required when Googlebot can follow a 301 redirect? Does it save us work, or does it make the migration a bit faster? Yes, it does help us to understand the migration a little bit better. In particular, we can use it to recognize that you really, really want to move from this domain to another domain, and we can try to package pretty much everything from the old domain and just forward it to the new domain. Otherwise, with a 301 redirect, we just see individual pages moving, but we don't know for sure that the whole website is actually supposed to be moving to the other domain.
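As a rough sketch of the 301 side of such a move, assuming an Apache server with mod_rewrite enabled and hypothetical domain names:

```apache
# Hypothetical site-wide 301 redirect from old-domain.example to
# new-domain.example, preserving the path of each request. Pair this with the
# change of address tool in Search Console so Google understands that the
# whole site is moving, not just individual pages.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.example$ [NC]
RewriteRule ^(.*)$ https://new-domain.example/$1 [R=301,L]
```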
So I have 300,000 pages with 404 status, generated by error. It's an e-commerce site. I'm unsure about 301s and 302s; how do I convey to the crawler that these pages are no longer live? So in general, 404 pages are perfectly fine. You don't need to redirect them, and you don't need to hide those error pages. Basically, just let those pages return 404.

We're a website builder startup with 45,000 pages. We have created 35,000 websites on different subdomains. All of them have "powered by" badges linking back to our website. We set those badges to nofollow at the beginning, but recently I set them to dofollow. Now Google has penalized us, and we're getting about four times less traffic daily. What's the best way to recover? Disavow? Should we leave the links nofollow or dofollow? So in general, if these are links that are specific to your brand, then that's less of a problem. If you're doing something with keyword-rich anchor text, then that would be a little bit more of an issue. With regards to being penalized: if you see a manual action in Search Console, you can use that to try to understand what specifically we're flagging there. You can file a reconsideration request after you've cleaned up anything that you think needs to be cleaned up, and you'll get a confirmation that we've picked it up and processed it properly, or maybe that there's still a problem that you need to work on. If you don't have a manual action in Search Console, then my guess is that your website is just ranking the way it normally would. We do see these kinds of changes over time, where some websites go up in rankings and other websites go down. That's a normal part of the web, and usually it's more a matter of the overall quality of the website itself, not so much of the setup that you have with your clients' websites and your own website. With regards to disavowing those badge links or not: if these are really brand-specific links, like "powered by, I don't know, John Mueller" linking to johnmueller.com, then that's essentially fine. That's not something you'd need to explicitly disavow.

All right, let me run through some more of the submissions, and then I'll get back to all the activity in the chat here.

Can a poor user experience on a mobile responsive site affect the rankings of the website's desktop version? So again, this is not something we really take into account on a one-to-one basis when it comes to SEO and rankings. However, especially with the move to the mobile-first index, we would be using the mobile version of your page as the basis for ranking both the mobile and the desktop pages. So it's not so much a matter of poor user experience; but if you have missing content, if you don't include all of the images, don't include all of the text, or if the internal linking is bad on the mobile version, then that's something that will affect the ranking of the desktop version. That's not the case at the moment; right now, we use the desktop version for everything. But as we see more and more users primarily accessing the web on mobile, that's something that we think is worth changing.

How important is it that we keep our link profile clean? We're struggling to find the time to keep our disavow files up to date. So as I mentioned before, we do work very hard to automatically handle links as we find them and to process them appropriately. If you're not actively going out and building artificial links to your website, then for the most part, you don't have to keep up with your disavow file every day. If you're seeing that random people are building lots and lots of really weird links to your website, then I would maybe look at your disavow file every couple of weeks, so that you can be sure that you have the right domains in there. But otherwise, I wouldn't focus on that too much. It's much better to focus your energy on something that moves your website forward and improves your website overall.

Does organization schema (name, logo, sameAs) need to be on every page of a website, or just on the home page? It's my understanding that we primarily need this on the home page and not on every page.
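A minimal sketch of that organization markup for a home page, with hypothetical values:

```html
<!-- Hypothetical schema.org Organization markup. Per the answer above, this
     primarily belongs on the home page, not on every page of the site. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/logo.png",
  "sameAs": [
    "https://www.facebook.com/examplecompany",
    "https://twitter.com/examplecompany"
  ]
}
</script>
```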
We're seeing some incorrect information populating in a featured snippet. Specifically, if you search for our product name plus "price", you get a result from an order form with over 50 products listed, but with the wrong product name and price in the snippet. We've submitted feedback on this issue for over a year. Is there anything we can do? Should we look at redesigning the page and the content? So I guess redesigning the page and the content is definitely one option. If you want, you can also send me the details, specifically the query that you're seeing and the full featured snippet that you're seeing, and I can forward that on to the team. But if you essentially have a list with a lot of products, and that's the page that we're using to rank your site, then it sounds like you could also be doing this slightly differently, in the sense that maybe you could make individual product landing pages, so that we actually have a better page to send users to when they're interested in your product.

How can I handle quickly expiring content when I can't update the sitemap file daily? So ideally, updating the sitemap file automatically is what you should be aiming for, especially if you have a setup where content comes and goes fairly quickly. Then I would definitely make sure that you have some kind of automated system to generate these sitemap files. A sitemap file shouldn't be something you manually create, upload, and put on your server; it should be something that is automatically created and automatically submitted to Google and other search engines, so that the content can really, really quickly be crawled and indexed. Otherwise, if you're updating the sitemap file maybe once a week or once a month, then by the time you submit it and we recognize, oh, these things have changed, it's already a month later. So that doesn't bring that much value. It's still useful, I think, but it's not as good as really having an automated system to generate and submit your sitemap file. And with regards to expiring content: if the old content is still listed in your sitemap file, that's not a problem at all. If we recognize a new last modification date in the sitemap file, and we go off and crawl that page and see that it returns a 404, then the only thing that will happen is we'll process that 404 and drop that URL from our index, which might be what you're actually trying to do, namely to get the old content out a little bit faster.
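As a rough sketch of that kind of automation, with a hypothetical data source: regenerate the sitemap from current data and ping Google's documented sitemap endpoint, rather than editing the file by hand:

```python
# Minimal sketch of automated sitemap generation (hypothetical page data).
# The point is that the file is rebuilt and resubmitted whenever content
# changes, not edited by hand once a month.
from datetime import date
from urllib.parse import quote
from urllib.request import urlopen

pages = [  # in practice, pulled from your CMS or database
    {"url": "https://www.example.com/products/widget", "lastmod": date(2017, 6, 2)},
    {"url": "https://www.example.com/products/gadget", "lastmod": date(2017, 6, 9)},
]

entries = "\n".join(
    "  <url><loc>{}</loc><lastmod>{}</lastmod></url>".format(
        p["url"], p["lastmod"].isoformat())
    for p in pages
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + entries + "\n</urlset>\n"
)

with open("sitemap.xml", "w") as f:
    f.write(sitemap)

# Notify Google that the sitemap changed, via the documented ping endpoint.
urlopen("https://www.google.com/ping?sitemap="
        + quote("https://www.example.com/sitemap.xml"))
```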
Gary was talking at SMX about engagement metrics being used to see if an algorithm is working. Is there a reliable and systematic way to measure engagement on a site, and how do you do that? So this is kind of tricky. On the one hand, I'm not exactly sure what Gary mentioned at SMX, so I don't know if I can provide full context there. But in general, when we test algorithms, when we test changes in our search results pages, we do that the way a site might do A/B testing, essentially. We try to figure out which of these versions tends to result in users being happier, tends to result in users getting to the page that they're actually looking for a little bit faster. Sometimes that involves subtle changes in the design of the page. Sometimes that involves subtle ranking changes; sometimes bigger ranking changes. All of these are aspects that we try to test, and we do tons and tons of tests all the time. So pretty much every time you search, you're in at least one or two experiments where we're trying something new to see how users respond to it. And the metrics that we look at there are kind of unique to the search results pages. It's not something that I think you can apply across the board to every website. It's something that you really need to work on, to try to figure out what works best for your website. I don't think it makes sense to take what Google is measuring in search results pages and apply that to an e-commerce site, for example. Those are very different sites and very different kinds of models, where it's really hard to determine which users are happy and which ones aren't. And even on our side, this is something that we have to refine all the time. We have to work to figure out which model works best and gives us the information that we're looking for. Because sometimes things change; sometimes the search results are kind of weird, and we pick up weird metrics, and we need to understand what those metrics actually mean. So there isn't one absolute answer here; rather, you really need to understand your users best and react to that.

People are obsessed with page speed, thinking it increases rankings, even to the point of removing quality components from their desktop and mobile pages to get a higher speed score. What's up with that? So I think if you're going so far as to remove important content from your website, in the hope that it ranks better because it's faster, you're probably doing it wrong. Essentially, if there's nothing on your pages, then that's a really fast loading page too, but it's not going to rank for anything, because we don't have any content that we can rank there. So you really need to find the right balance: content on your pages that we can use for ranking, functional content that people want to find, that they're happy to use, that they're happy to recommend, and at the same time a page that's built in a way that really works fast for users. We've noticed that when things work really fast, when the web is really fast, users tend to use it more, so this is in your own interest, of course. But if you remove content from your website, you might have a really fast loading page, and users will just go somewhere else, because they can't find what they're actually looking for, which is not what you're trying to do. So that's one of those aspects where you need to find that balance for yourself as well.
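One way to lean toward that balance, as a small sketch and not something prescribed in the answer above: keep the content, but stop non-critical resources from blocking the initial render instead of removing features outright (file names hypothetical):

```html
<!-- Hypothetical example: the widgets stay on the page, but their scripts no
     longer block rendering. "defer" runs after parsing; "async" runs as soon
     as the script is fetched. -->
<script src="/js/reviews-widget.js" defer></script>
<script src="/js/analytics.js" async></script>
```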
Rankings for a search query seem to differ from the actual results on Google Search. So I assume this is specifically about the Search Analytics information in Search Console, where the ranking information is different from what you might see in Search. And that's often the case. What you see when you search is not exactly what everyone else sees in Search. We have a lot of aspects around personalization and geotargeting that flow into the search results as well, where one user might see a site ranking fairly high, and another might see, for the same query, the site ranking much lower. That's completely normal, and from our point of view, that's not something that would be wrong. Rather, in Search Analytics we try to reflect what users actually saw in the search results. Search Analytics is not an artificial ranking indicator where we say, well, theoretically your site would be ranking here. And it's not a one-time ranking indicator, like scraping the Google search results from one data center and seeing where a site ranks. Rather, we reflect what users actually saw. And I think that's much more useful, because that is really what we're seeing, and not just what theoretically might be possible.

How should we handle a migration from an m-dot site to a responsive site? This is something where, at the moment, it's fairly easy to do, in the sense that you just redirect from the m-dot version to the main responsive version. And if you're just redirecting your mobile pages, then that shouldn't affect Search at all, in the sense that for indexing, we use your desktop pages anyway. So we wouldn't see anything moving with the canonical pages, the desktop pages; rather, you're moving the associated mobile page, while the canonical stays the same. So doing a redirect there is absolutely the right thing, and it should be fairly straightforward and fairly easy to do. With the mobile-first index, when that comes, obviously that will be a bit different, because then you'd be moving your canonical page to a different URL. So if you plan on doing a change from an m-dot site to a responsive site, it's probably a lot easier to do that now than to wait until the mobile-first index has come out, because with mobile-first, you'd be moving the canonical pages, which means you're doing a site migration, which means you'll probably see a lot more fluctuation as that gets reprocessed.

We've been running into crawl errors, 500s, for URLs that appear to have no issues with them at all. When we navigate to the problematic pages, there are no errors on our end. We also had a warning come through on our sitemap a couple of days ago for a URL that exists and is working properly. What could be causing these issues, and how can we fix them? So I took a quick look at some of those URLs, and it really was the case that we were seeing server errors for those URLs. It's not that we were making these errors up and showing them to you to keep you busy; when we were crawling, we really did see a bunch of server errors on these URLs. Sometimes, when you check later, your server is doing fine and everything is working well again, so this can be really hard to reproduce. But what I'd recommend is double-checking your server logs to see what exactly was happening on those days. Maybe there's something on the server end that needs to be worked on; maybe it was just a temporary glitch where someone was working on the data center and, I don't know, the server was overloaded, or whatever. So on the one hand, maybe it's already fixed; on the other hand, if you're really not sure what it is, I would try to dig for details so that you can avoid it in the future.
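For that log digging, a quick sketch, assuming a standard Apache/nginx combined access log where the status code is the ninth whitespace-separated field (the log path is hypothetical):

```sh
# Hypothetical: list Googlebot requests that received a 5xx response,
# showing the timestamp, the requested path, and the status code.
grep "Googlebot" /var/log/nginx/access.log | awk '$9 ~ /^5/ {print $4, $7, $9}'
```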
Same domain, but a different kind of website on it in the past: how does Google calculate the link value? So I assume this means you take one domain and put a completely different website on it at some later point. What does Google do with that? We do try to recognize these types of transitions, where a website significantly changes, and that's something we do try to take into account in our search algorithms. So you can't necessarily just take an old domain and say, oh, this was working really well in the past, let me put a completely unrelated website on here, and it'll rank just as well. That's not how it would work.

If I geotarget my English directory, /en, to the UK in Search Console, will I lose presence in the US? No, not necessarily. Geotargeting in Search Console is used to determine where we can rank a website a little bit higher when users are searching for something local. So it's not necessarily the case that it would rank lower in the US. But if someone in the UK is searching for something local, which we might be able to recognize, and your site is geotargeted to the UK and matches that, then we can say, oh, this is a really good result for this user, because they're searching for something in the UK and this website says it's in the UK. So we'll promote it a slight bit based on geotargeting. On the other hand, if someone is just searching generically and not looking for something specifically local, then we probably wouldn't use that much anyway.

Let's see. My website was ranking in the search results for a keyword in 38th position, then improved to 23rd position, and now it's nowhere in the search results. What caused my page to drop out of the search results? Really hard to say. Search results are extremely dynamic: our algorithms change, the rest of the web changes, users change over time. So from our point of view, it's kind of normal that you would see fluctuations in the search results. And especially if you're looking at something ranking on page three or four, it will probably fluctuate quite a bit, because all of these other sites and other pages are also competing for those spots. So that's something where I would say this is kind of normal. You might see some things go up over time; you might see some things go down. It's really a matter of focusing on your website and making sure that it's the best possible website of its kind. So don't aim for position 20 or something like that and say, I just want to be just as good as everyone else on page three. Really aim to be the best of its kind, and to be able to say: what I provide on my site is actually much better than the first search results here. Over time, that's something that Google will try to reflect. When we see issues around these types of ranking changes, we can go back to the ranking engineers and say, hey, look, this page here is clearly much better than everything else on the first page of the search results; we're doing our users a disservice by not ranking it first. On the other hand, if we go to the engineers and say, hey, this page is clearly just as good as all of the other results on page three, we should be ranking it on page three as well, then they'll come back and say, well, if it's just as good as the other ones, why should we change anything? We already have those other ones there. So that's something where, obviously, it's not easy, but finding ways to make sure that your website is really a step above everything else is what I'd aim for.

How many links with exact keywords are usually necessary for a spam report to be acted on? I don't think we have any absolute numbers there, and I don't think that's something that the web spam team looks at on an absolute basis. Rather, when they see spam reports and they see that something affects enough users that it's worthwhile to make a manual change, that's where they'll take manual action.

I set a filter in Search Console for a URL with a date range. Search Console is showing a total of four clicks, but only two clicks for this query.
What's up with that? Usually, this is a matter of us filtering out individual queries that we think are so rare that, on the one hand, it's not useful to show them in Search Console, and on the other hand, there might be privacy reasons for us to filter them out. So especially if you're looking at smaller numbers like that, that's definitely a possibility. We have a bit of information about that in the Help Center for Search Console.

I have a software blog that includes download links. Should I create new posts for every new version, or update the old posts? The posts would be almost identical except for the version. So I'd personally recommend just updating the old content if you're making changes to these things. One thing I'd watch out for, with regards to a blog that links to downloads, is that you're really providing something useful and compelling there, and not just linking to illegal downloads of existing software. So really make sure you're providing something that makes sense to show separately in the search results, and that you're providing enough significant value of your own that it doesn't just come across as: I'm linking to 5,000 torrents on my blog, and I call it a blog, but actually it's just a list of links to all of these illegal download sites. That's probably not what you'd want anyway.

Let's see. Some people are having trouble joining, but I think we're probably full. There's room for 10 people here in the Hangout, and otherwise you won't be able to get through unless someone else drops out.

If a page is noindexed for one crawl, or returns a 404 for one crawl, and is then indexed again, will it recover to the same ranking it had previously? Maybe, maybe not. So you're in the situation where you remove a page from your website and you wonder: if I put it back, will it start ranking again? It's possible that we can pick it up again and use those existing signals again. It might also be that we think, oh, this is actually a significantly different page, and we need to treat it like a new page, rather than the old page dropping out and jumping back in again.

In Search Console, is there a way to filter out impressions or clicks at the IP level, because I do these searches myself? No, that's not possible in Search Console, and I don't think that we would provide such a tool either. I think that's something that you'd need to do in Analytics itself.

For our desktop pages, we have corresponding mobile pages. Why does Search Console give us the option to filter by mobile device? It's hard to say. We show that filter in Search Analytics when we see that mobile users have searched for this content and we've shown it in the search results. So that's not something that we base purely on the current setup. It might be that, somewhere along the way, these desktop pages were visible for mobile users as well, which might be a sign that your setup with the mobile pages and the desktop pages isn't completely perfect.

My site's not getting any traffic from Google. It's properly indexed. What could be the problem? I guess that's kind of tricky to say without actually looking into the details. So what I would recommend doing is posting in the Webmaster Help Forum, giving some insight into what you're trying to do and what you're trying to provide there.
I'm kind of worried, because the domain is something like "total health aids", that you're providing some kind of health and medical information where maybe you're actually not the right person to provide that kind of information. I don't know. But it's something where I would definitely try to get advice from peers to see: are you actually providing something useful there? Is there maybe a technical reason for it not being indexed properly? Or maybe there's just a quality reason, where we look at this content and we say, I don't know if we actually want to show this content in the search results for these types of queries. Getting some tips from peers is usually the best way to move forward there.

We have different niche services under one domain. Is that a problem? No, that's not a problem. Sometimes that's just the way it works.

My law firm's website was built on Joomla with native multilingual content. The website's content is in Greek, English, and Russian, served under one domain with /en for English and /ru for Russian. I added a JSON-LD script with schema markup to the home page, with basic information about the website; however, the script was added in English. What's the best way to add JSON-LD schema tags for a local business on a multilingual site built with the above structure? So I would probably look into putting it under the individual subdirectories as well. If you have one clear subdirectory for one language version and one clear subdirectory for another, then I would add that information there separately. I don't know where specifically we would surface that in the search results, so I can't guarantee that we will actually use that information one-to-one in Search later on. But adding this type of markup is fairly easy, so I would just give it a try and see what happens over time.
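A minimal sketch of that per-directory markup for the /en/ version, with hypothetical values; the Russian and Greek directories would carry the same block with their own translated name and text fields, while stable values like the telephone number and address stay consistent:

```html
<!-- Hypothetical schema.org LocalBusiness markup for the English home page
     (/en/); each language subdirectory would carry its own version. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Law Firm",
  "url": "https://www.example.com/en/",
  "telephone": "+30-210-000-0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Athens",
    "addressCountry": "GR"
  }
}
</script>
```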
Major indexing issue: our recipes are indexed, but not in the carousel. What could be the problem here? So I took the sample URL and sent it off to the team yesterday to double-check, and they say everything is working fine. The important part to keep in mind is that these carousels involve normal organic ranking as well. So it's not a matter of us picking up the markup and saying, OK, we will rank this right on top in the carousel; even within the carousel, we do have to think about ranking. Just by adding that markup, you wouldn't automatically be shown in the carousel. You'd be eligible to be shown if you have the markup in the right way, but you wouldn't be guaranteed to be shown in the carousels.

We have one site that targets different countries. Each country is assigned its own ccTLD. For some technical reasons, we can't implement hreflang. Recently, we've discovered that a search result for a query shows a title from one country and a link to another: on Google.at, we get a title from the Swiss shop and a link to the Austrian shop. What's up with that? So hreflang is definitely a good idea there. If you can find a way to implement it, that would probably be a really good approach, especially if this is really equivalent content across the different country versions. On the other hand, if you see this kind of mix between country versions, then quite often that's a sign that these pages are just so similar that we index them as one version. So if the Swiss version of the page is essentially the same as the Austrian version, with maybe small tweaks and a slightly different title, then we may choose to index just one version of the page and swap out the titles. That makes it a little bit easier for us to show your site in ranking, but it also means that you might see this kind of mix between the two country versions. So what I would recommend doing is, first, making sure that these pages are really as unique as possible: that you're really providing something unique to the Swiss users compared to what you're providing to the Austrian users and the German users. And then, if at all possible, I think the hreflang tag would be really useful there. But the main issue here is probably just that we think these pages are too similar.
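For reference, a minimal hreflang sketch for the Swiss and Austrian pages discussed, with hypothetical URLs; each country version carries both annotations in its head, referencing itself and its alternate:

```html
<!-- Hypothetical hreflang annotations. Both pages list the full set of
     alternates, including themselves, so Google can show the right country
     version with its own title and URL. -->
<link rel="alternate" hreflang="de-CH" href="https://www.example.ch/produkt" />
<link rel="alternate" hreflang="de-AT" href="https://www.example.at/produkt" />
```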
I posted in last week's Hangout about the issues we've been seeing with a domain change we did, where the new domain has a history of spammy adult content, and we've requested via the Help Center to be removed from the SafeSearch list. The response was that this is being processed. How long does this take? Our new domain has a much higher domain authority than we ever had with our old domain, but we're not getting as much traffic. What's up with this? So in general, this is something that does take a bit of time to be reprocessed. It's not the case that you'd see a change jump from one day to the next; sometimes these submissions take a bit of time to be processed by the team, and even after they've been processed, it can take a bit of time for the change to bubble up in Search as well. The other thing to keep in mind is that just because a domain has much higher metrics in third-party tools doesn't mean that you will get anything out of that when you move to it. Just by moving to a domain with a much higher domain authority score, in this case, you won't necessarily see any change in Search, because Google does not use domain authority. That's a metric created by a third-party site, which might be really useful for understanding different domains and websites, but it's not something that we would use. So moving to another domain in the hope that your website will suddenly rank a lot higher, just because that domain has a history, doesn't work that way. It's really a matter of us seeing your site and ranking your site as it is. If you move to a different domain, we'll try to rank that other domain the way we would have ranked your existing site. So you can't just go out and buy a domain that you think is really strong, move to it, and expect your website to suddenly rank a lot higher. That's definitely not the case.

In Search Console, there's a menu of links to your site. I see many links to PDF files that are on my website, but when double-checking, it looks like these links are not actually to my website. What's up with that? In general, this shouldn't be a problem. It's something that might bubble up like this in Search Console, which is mostly a reporting issue. On the other hand, if you want to send me the details of which site you're actually looking at, I'm happy to double-check with the team to see where this data is coming from.

We're moving from HTTP to HTTPS, and in doing so, we're updating our site to be mobile responsive and much faster. However, we're doing this in waves, moving some content in individual steps; we can't move everything to HTTPS yet. Some content will be duplicated, some not. How much of a problem is this going to create? So in general, you have to do it the way that works for you. It's not the case that we'd say you need to do it like this and only like this. If you plan on moving to responsive, I think that's a fantastic move, something that's always worthwhile doing. If you plan on moving to HTTPS, that's also a great move. Sometimes you can do it all in one go; sometimes you have to do it in steps. The one thing I'd mention here is that especially if you're doing bigger URL changes within your website, like going from subdomains to subdirectories and moving things back and forth a little bit, these are essentially bigger changes where we have to reprocess and re-index everything. So you will definitely see some fluctuations in Search as we reprocess the new structure of your content. It's not that the final state would be worse, but you will see fluctuations over time as we pick up the new configuration, those redirects, maybe rel canonical changes, and have to understand the new site structure compared to the old one: which signals we can forward, which ones we have to learn anew. So if you're doing this in waves, just keep in mind that every time you do a bigger wave like this, you will probably see some fluctuations in Search. If you're sure that the final state is the right one for your site, then sometimes you have to take that into account and say, OK, I'll bite the bullet and live with these fluctuations during this time; I'll try to get those waves through as quickly as possible, so that we can get to the more stable situation a little bit faster. It's just something to keep in mind, especially when you're looking at your metrics during these types of moves: you need to understand that this is a temporary state that takes a while to settle down again.

Is having the main keyword in both the H1 and an H2 considered keyword stuffing? Not necessarily. Keyword stuffing is more a matter of us finding really significant reuse of the same keywords over and over again. And usually what happens is our algorithms will pick that up and say, oh, this keyword is so common on this page, maybe it's not really that important after all, and we'll try to ignore it, to make it a little bit easier for us to process the rest of the page.

All right, let me see if I'm missing anything from the chat. Wow, lots of things there.

Does Google consider canonical pages' content as duplicate content? Not really. So if you have two URLs, and one has a rel canonical pointing to the other one, then for indexing, we will focus on the one that you specify as the canonical. There are some tricky aspects there, in that we use more than just the rel canonical to determine which of these pages to pick. But if these are identical pages, and you're telling us which one you prefer, then we can understand this is a set of pages, and we will just focus on one of those pages for the content. Even if we index both of these pages, it's more a matter of us picking the right one to show in the search results, and not a matter of us demoting or penalizing a site for having duplicate content like this.
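As a one-line refresher on the markup in question, with hypothetical URLs: the duplicate URL points at the version you'd prefer to have indexed:

```html
<!-- Hypothetical: placed in the head of a duplicate URL such as
     https://www.example.com/shoes?sessionid=123, pointing at the preferred
     version. Per the answer above, this is a signal Google weighs alongside
     others, not a strict directive. -->
<link rel="canonical" href="https://www.example.com/shoes" />
```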
Do paid link submission tools actually work? I contacted some websites for guest posting, and they replied with $10 for each link. So on the one hand, it sounds like you're trying to get guest posts made for your site; that's probably not a good idea. We did a recent blog post about that. On the other hand, the spam report forms definitely work, and the spam team is really happy to get those kinds of reports as well.

If I geotarget my English default directory /en: I think we already looked at that one.

Google is sending a good amount of traffic to our desktop version, but not to the mobile version. We have a normal mobile layout, no issues in Search Console; it's crawlable. We also have an AMP version for most of the pages, but we still have very low traffic. What could be the issue here? Is it a user experience issue? It's hard to say. For the most part, we do have the same rankings on desktop and mobile, but we try to adjust the mobile ranking to make sure that mobile users have a good experience in the search results, primarily because on mobile you're much more limited: you have a slower connection, and you really need to have a good experience there. So that might be something that you're looking at. On the other hand, since you mention you also have an AMP version, it might just be that you're tracking those numbers slightly incorrectly. Specifically, if you're looking at Analytics and you're not tracking the AMP analytics in the same place, then it might be that you're only looking at the mobile analytics and not even realizing that the AMP pages are actually getting significant visibility, too.
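A rough sketch of what shared AMP tracking can look like, using AMP's amp-analytics component; the property ID is hypothetical, and the current syntax should be checked against the AMP documentation:

```html
<!-- In the AMP page's head: load the amp-analytics component. -->
<script async custom-element="amp-analytics"
        src="https://cdn.ampproject.org/v0/amp-analytics-0.1.js"></script>

<!-- In the body: send AMP pageviews to the same (hypothetical) analytics
     property as the canonical pages, so AMP traffic shows up alongside the
     regular mobile traffic instead of going unnoticed. -->
<amp-analytics type="googleanalytics">
  <script type="application/json">
  {
    "vars": { "account": "UA-12345-6" },
    "triggers": {
      "trackPageview": { "on": "visible", "request": "pageview" }
    }
  }
  </script>
</amp-analytics>
```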
It's difficult to control thin and common content on e-commerce sites; specifically, in product descriptions, more or less the same words are used everywhere. Should those pages be noindex, follow? Yeah, I think this is probably a tricky thing to do, but it's something that we're really looking for. When we look at websites, we really want to see that they're providing something unique and compelling, something that adds value to the rest of the web. So if everyone else is using exactly the same product descriptions, and your pages are only these product descriptions, then we'll look at them and say, well, we already have this content here; why do we need the same thing yet again? So having good product descriptions makes sense, and so does finding ways to improve those pages overall, even with the same product description. Maybe you have reviews from existing clients that you can pull in; maybe you can encourage clients, when they buy something, to leave reviews in a good way. Maybe there are other things you can do to add value to these pages, so that it's not just the same text: there is this piece of text, which is the same, but the page itself is so unique and valuable that it's really worthwhile to rank it properly.

I implemented HTTPS three weeks ago. My pages are indexed when I do a site: query, but I can't see the cached pages when I do a cache: query. Why is that? That can happen. The cached pages are not necessarily a reflection of what we actually index; that's something that happens separately, on the side. So sometimes it takes a bit of time for the cached pages to catch up, and sometimes it's just a matter of the normal search results, where we don't show a cached page for every page that's out there.

Would you recommend using a CDN on a multilingual website, or do you think that hreflang is enough? So these are completely different things. A CDN is a way of serving your content, which could help make sure that your content loads really fast for users worldwide. hreflang is a way of telling us that you have the same or equivalent content for different users. They're simply different things. With hreflang, you could say: this is the German version, this is the French version; or this is the version for Germany, this one is for Switzerland. With a CDN, you're basically saying: this is my website, there are lots of pages here, they might be for different countries and different users, and the CDN takes care of serving all of that in a really fast way.

Various internal links with various anchors from blog posts on the same domain go to category and subcategory pages. Is that OK, or should we stop? That's fine. I don't see a problem with that. If that works for your users, if you're getting people to the right place within your website, that's a perfectly fine strategy. No problem there.

All right, let me just double-check what else we have from the questions. If there's anything on your mind here, live in the Hangout, feel free to jump in as well.

There's one question here about hreflang, where basically the same URL is used for different language versions, or different country versions of the same language. That's perfectly fine. You could say: this is the version for Germany, Austria, and Switzerland, and this is the version for all other German-speaking users, for example. That's perfectly fine; totally up to you. You can also include the x-default, where you say: this version is maybe for Germany and Austria, this one here is separate for Switzerland, and my x-default is also the version for Germany and Austria. That's totally up to you.

I work with clients whose pages are provided by different partners under different directories, with regards to the sitemap index file. If I place my sitemap index file under /search, with a directory of those links, is that acceptable? Or if I have the sitemap file under /shop, can it only contain URLs for /shop or /catalog? Yeah, so with regards to the scope of a sitemap file, or the URLs that you can have in a sitemap file, we have that fairly well documented in the Help Center. On the one hand, if you just anonymously submit a sitemap file to us, the URLs will have to be under the place where the sitemap file is hosted. So if you submit it to us under /shop, the URLs will have to be under /shop. However, if you submit it to us within Search Console, or within the robots.txt file, then we understand that you placed this file here, but you own the whole domain, and therefore we can apply the sitemap file to the whole domain. So depending on how you submit the sitemap file, it's slightly different. If you use robots.txt, then obviously that will just work.
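A small sketch of that robots.txt route, with hypothetical paths; because the sitemap is referenced from robots.txt, its URLs can cover the whole domain, regardless of which directory hosts the file:

```
# Hypothetical robots.txt at https://www.example.com/robots.txt
User-agent: *
Allow: /

# The sitemap index can live under /search/ and still list URLs from /shop/,
# /catalog/, and so on, because it's submitted via robots.txt.
Sitemap: https://www.example.com/search/sitemap-index.xml
```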
I have different subdomains for the same service for different countries. Now, one of those country subdomains has slowly moved to a different subdomain. So I guess this is a case where you move your content from one subdomain to a different subdomain. That's something that can happen; that happens with some sites. From our point of view, it's not a problem by default, but it is something where we have to process the change and re-index those pages, and it just takes a bit of time for us to actually deal with all of that. So it's not that you should never do this kind of move within your website, but keep in mind that any type of migration within your website will take a bit of time to be reprocessed and re-indexed.

All right. I think we actually got through pretty much all of them. Oh, let me see, one more. The Android app Search Console data shows 7,000 clicks at the query level and a slightly different number at the page level for a given date, but the Analytics report shows something very different. What's up with that? So I'm not actually sure what the current state of the app indexing data in Search Console is. What I would recommend doing is contacting the Firebase team, which currently handles app indexing within Google. They have a contact form where you can submit your queries directly, and they can take a look to see what the details are: whether there's something in the way that you're tracking those numbers that's slightly inconsistent, or maybe even something on our side that's slightly inconsistent and that we need to improve. So I'd really go and contact the Firebase team for that type of issue.

All right, you're a quiet bunch today, but that's fine too. It's early in the morning here. If there's anything else on your mind, feel free to speak up. Otherwise, we can take a break here for the weekend. That's fine too. OK, so in that case, let's take a break here. Thank you all for joining. Thanks for all of the questions that were submitted. I hope you found this useful, and maybe we'll see each other again in one of the future Hangouts. Thanks again, everyone, and have a great time. Bye.