All right. Welcome, everyone, to today's Google Webmaster Central Office Hours Hangout. My name is John Mueller. I'm a webmaster trends analyst here at Google in Switzerland, and part of what we do are these office hours hangouts, where webmasters and publishers in general can jump in and ask us questions around search and their website. And we try to help clarify anything that might have come up since the last hangout or otherwise has been on people's minds. As always, if any of you are kind of new to these hangouts and want to get started with the first question, feel free to jump in now.

I see some new questions. John, can I ask one question? All right. I think you submitted it as well, so we'll get to some more. Actually, one was high priority for me, so this is why I just wanted to clarify it. Has there been any change in the last two weeks where, if I just type a www page name into Google search, it shows both the www and the m-dot version?

I don't think there's any generic change around that. That sounds more like we don't understand the connection between the mobile and the desktop page. So maybe they're not linked well. Maybe that's something you could look into. Because usually, if we understand that connection, then on the mobile page, we would have the canonical to the desktop page, and we would just show the desktop page on desktop and the mobile page on mobile. And if we're showing both of those, then somehow we're not seeing that connection between those two pages.

OK, I'm checking those three things: the alternate tags, the redirects, and robots.txt. Thanks. All right. Perfect.
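To make the linking John describes concrete, here is a minimal sketch, assuming a hypothetical site that serves separate desktop and mobile URLs at www.example.com and m.example.com:

```html
<!-- On the desktop page: http://www.example.com/page -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="http://m.example.com/page">

<!-- On the mobile page: http://m.example.com/page -->
<link rel="canonical" href="http://www.example.com/page">
```

With that pair of annotations in place, together with user-agent based redirects between the two versions, Google can fold the signals together and show the right URL on each device.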
All right. Any of you other folks who are kind of new to these Hangouts who would like to ask a first question before we get started? Otherwise, you're welcome to jump in with your questions along the way. We'll definitely have time for more live questions from all of you towards the end of the office hours.

All right. Let me just grab some of the questions that were submitted, and we'll see how far we can make it.

Glenn asks if there's been any change with regards to the disavow file, and says there's some confusion out there as to whether or not it should be used, or when it should be used. So essentially, nothing has changed there for quite some time. The disavow file definitely makes sense if you have a manual action that's based on link issues and you can't clean those links up. So ideally, you would go and try to clean those links up, to either remove them or have a nofollow placed on them. If you can't do that, then the disavow file is a great way to do that, and it can help with the resolution of the manual action. So after you've taken care of all of those problematic links pointing to your site, you can submit a reconsideration request, and the web spam team will take the disavow file into account and lift the manual action if that's appropriate.

With regards to sites that don't have a manual action for link issues, we do try to take those links out of the equation automatically when we can recognize them. In general, that's something we're pretty good at; we have quite a bit of practice doing that, so most of the time we can get it fairly right. If you're unsure as to whether or not Google is actually taking those links out of the equation, then the disavow file is a great way to get peace of mind and to say, well, I'm sure these won't get taken into account by any of Google's algorithms. And that way, you're absolutely certain that you're not associated with those links to your site that you can't remove or change.

So in particular, if someone is pasting links on a spammy site, or maybe a previous SEO went off and ran some script to place links in a bunch of forums, and you can't clean that up for whatever reason, and you don't have a manual action but you want to make sure it doesn't even get that far, then the disavow file is a great way to just preemptively say, well, I know about these issues, I don't want to lose any sleep over them, I'm just going to disavow them and get them taken out of the equation.

So what do you do if you're using popular software out there, and I'm not asking you to answer on behalf of them, but they're saying, oh, you have like 300 toxic links coming to your site this week, do something about it. How do we take control of that? Or don't take control, don't do anything, right?

It's totally up to you. I mean, if you're looking at that and you're like, well, these are all random gibberish things that any cheap algorithm will be able to figure out, then maybe you can just ignore it. On the other hand, if you're really worried about it, because it's bubbling up some things that might look problematic if someone manually looked at them, for example, then that might be something you'd put into a disavow file.

So the frequency is really up to you; it depends on your website. Most websites never really have to worry about this. I think the large majority of all websites out there don't have a disavow file. They don't know what this is for. They don't need one, because there are spammy, junky links out there all the time and we take care of those. So for the most part, it's not that you need to go out on a daily basis and look at all the links to your site and enumerate them manually one by one and go through, yes, no, yes, no. It's really just the case that if you run across an issue where you see some larger scheme put in place that you really don't want to be associated with, then that's something you could put in a disavow file.
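For reference, a disavow file is just a plain text file uploaded through the disavow links tool in Search Console; a minimal sketch, with hypothetical domains and URLs, looks like this:

```text
# Links from a link scheme we could not get removed
# (lines starting with # are comments)
domain:spammy-link-network.example
domain:cheap-links.example

# A single URL can also be disavowed on its own
http://forum.example/thread?id=123
```

The `domain:` prefix disavows every link from that host, while a bare URL disavows just that one page.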
All right: I have a client, and in his link profile, OK, we're getting more link questions, in his link profile there are links from forums, where new forum members are starting healthy discussions with links in context, in natural form, linking to my client's site with the URLs only. Should I be worried about these links, because they're from new members who don't have a reputation and lack posts without links?

So first of all, it sounds like you've looked into this link issue a little more than just casually noticing it. And it feels like maybe someone on your side is involved with these new forum members starting healthy discussions with links in content, when they've never posted on these forums before. That sounds kind of iffy to me, just reading this question. So if you're somehow involved with these new forum members starting healthy discussions, then that might be something to back off on. If it's someone else within your company that's organizing this, then I'd recommend they back off on that as well. Because if you're essentially going to these forums and just dropping links there, then that's not something that the forum appreciates. So that's just a side note from this question: it sounds a bit iffy.

It might not be iffy. It might be something where you're really surprised that people just love your stuff, and suddenly they're talking about your stuff on forums, which can happen. So in general, just because a link is from a forum doesn't necessarily make it a bad link. That's something to keep in mind. This can be a normal recommendation for a website, so maybe that's perfectly OK. But still, at least the way that this question is phrased, I'd try to be extra honest with yourself and really think about where these people might be coming from, and if that's something that your website is somehow involved in, then maybe back off on that.

But even if these forums all use nofollow, it's not an issue, right? I mean, Google doesn't have anything against forums or anything like that.

No, we definitely don't have anything against forums. Forums, I think, are a great place to discuss things, a great place to recommend things. Even if they're nofollow, though, it might be that if you're going out and dropping those links into a forum, the forum probably doesn't appreciate it. So that's the thing to also keep in mind. It's not just Google that's out there looking at these forums. They have a community as well that is active in those forums, and if someone sends a bunch of people over there that just go off and start new accounts and drop links into forum threads, then that's probably not appreciated by those forums. Exactly.

About three months ago, our site suffered a drop in rankings. We've taken steps to rectify this by adding better content and improving the user experience, and as a result, our user engagement metrics have improved, and our domain name is trending on Google Trends. So that sounds pretty good. How long does it typically take for improvements in the quality of a site to be reflected in Google rankings?

So, there is no fixed timeframe from making changes on your website to that being reflected in Google search. We take a lot of different signals into account when we look at a site and try to figure out where its ranking should be. So just improving some things on a site doesn't necessarily mean that the whole site will suddenly jump up in rankings. Usually what happens is that there are a lot of indirect effects in play: if users really start loving your site, then they'll start recommending it, and we can pick up those recommendations in the form of links and take that into account with regards to search. And that's a process that takes a certain period of time to work. Even if people come to your site and immediately link to it because they think it's such an awesome site, it still takes a bit of time for everything to be reprocessed and re-understood, for our systems to understand that actually the site is not as bad as we thought initially; look at all of these people who are really big fans of the site, maybe we should adjust our rankings appropriately. So I think this is something where I'd see a timeframe of maybe up to a year or so, where it just takes a period of time for everything to be reprocessed, re-indexed, and all of these signals to be recalculated, to take a look at the new overall picture of a website.

John, is that on a per-page basis or a per-site basis?

It really depends. For some things, we can pick things up on a per-site basis, and other things we look at on a per-page basis.
It's something where we try to take the appropriate signals into account. So it's not just one or the other, kind of black and white.

There's a question in the chat: PageSpeed has an enormous impact on our rankings at the moment. We're seeing ranking losses with every performance issue we've had in the last couple of weeks. Would you say Google is using PageSpeed data from Analytics as a ranking factor, or user data like an increase in bounces due to bad performance?

No, we don't use Analytics data at all. With regards to speed, we primarily differentiate between sites that are really, really slow and everything that's kind of normally fast. So if you're seeing speed differences in the loading time of a page on the order of a couple of hundred milliseconds, or a second more or less, then that wouldn't be reflected in search. That's not something that our algorithms currently take into account. I could see that maybe playing a bigger role in the future at some point, but certainly at the moment we're not taking it into account that much.

And is this the same for mobile and desktop?

Yeah, at the moment, we're pretty much doing it the same across desktop and mobile. Obviously, some of the factors are subtly different on desktop and mobile, because we have different data to work with, so the ranking wouldn't be the same in general. But I believe with regards to speed, we're doing it the same on desktop and mobile at the moment. I could imagine us changing this maybe with the mobile-first indexing or something like that, but that's not currently the case.

And sorry, one follow-up. Is this also related to crawling? Is it the same there, mobile and desktop, not ranking-wise, but crawling-related?

With regards to speed particularly, or how do you mean? Just regarding speed issues. With regards to speed, OK. It's tricky, how can I say? So the speed that applies for crawling is essentially the speed of us requesting an HTML file from your server. The raw request for the HTML page from the server is what can affect the overall crawl rate of the website. If we see that a website is really slow to crawl, because all of the requests that we send take a long time to come back, then we'll generally back off with our crawl rate. So we won't request as many pages per day from that website as before, just because we want to make sure that our crawling is not the reason the website is slowed down. So that's one subtle effect that comes into play there.

The other thing that can come into play is rendering the pages. If it's really complicated for us to render pages, then there's a chance that we won't be able to render them completely. In particular, if you have a JavaScript framework and everything is pulled in with JavaScript, and your JavaScript requires hundreds of include files to even get processed, then there's a chance that we won't be able to get through all of those hundreds of include files to actually process the rendered view. So if you have a static HTML page, that doesn't matter at all. But if you have a JavaScript-based page and you have a lot of include files that make it really slow to process in the browser, then that also makes it hard for us to process those pages on Googlebot's side, because, from a practical point of view, we can't just request 500 different include files just to be able to render one page.
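To make the include-file point concrete, here is a minimal sketch, with hypothetical file names, of the difference between a page that needs many separate fetches to render and one that bundles its scripts:

```html
<!-- Hard to render: every <script> is a separate fetch,
     and the page stays blank until all of them arrive. -->
<script src="/js/framework-core.js"></script>
<script src="/js/router.js"></script>
<script src="/js/templates.js"></script>
<!-- ... dozens more includes ... -->

<!-- Easier to render: one pre-built bundle, ideally with
     the main content already present in the HTML itself. -->
<script src="/js/app.bundle.js" defer></script>
```

Fewer requests per page makes it more likely that rendering completes, and it leaves more of the crawl budget for actual pages.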
Is it true there's a limit to how often you can use Fetch and Render? Like if you fetch too many times, it says temporarily unavailable to crawl and so on, and you guys say, hey, come back in 24 hours.

Yeah, that's documented in the Help Center. I don't know what the limit there is, whether it's something per day or per week or per month. I'm not sure.

John, one last thing: the crawl data in Search Console, it's only about the fetch, not the render, right?

The crawl stats in Search Console are only about the fetches, yes. But the thing there is that they include all of the fetches for content on your site. So that also includes things like the AdWords landing page checks, and things like images and embedded JavaScript files. All of those are also in the number of pages that are fetched there.

All right, then there's this really long question from Javier, which is kind of tricky to go through in a live hangout like this. I really recommend trying to follow up on that in the Webmaster forum; it looks like you've started a thread there already. So it sounds like you moved from subdirectories to a subdomain, and you're currently not showing up in Google News. That's something specifically with regards to Google News and the Google News One box that we show, and that's something you'd need to take up with the Google News publisher folks. So maybe do a post in the Google News publisher forum instead of the generic Webmaster forum. And there is, I believe, a Google News Publisher Center where you can submit changes like that and make sure that it's updated in our records, because Google News does keep track of which sites we know about and include in Google News. But I'm not a Google News expert, so I don't really have the details of what specifically you might need to watch out for. That's why I'd recommend going to the Google News publisher forum. And in general, short questions work a lot better for these types of hangouts; they're easier to understand and process.

What percentage of automatically translated pages is acceptable to send for indexing? Does Google Translate provide translations with enough quality for indexing?

So the first one is zero, and the second one is no. In the sense that, from our point of view, automatically translated pages are automatically generated content, and that's not something we recommend making available for indexing. So if you're using Google Translate, then I would recommend using it as a tool to help you create manually translated pages, but not as something that you would use one-to-one for content on your website. I think a lot of you have used Google Translate for various things, and it's getting better and better. I think it's really good at times, but sometimes it's still the case that you don't really understand what the actual meaning is, and that's probably not how you want to present your content on the web. You'd kind of be saying, well, we have local content in your language, but maybe it doesn't make any sense, or we don't really speak that language, so we can't tell whether it makes sense or not. But really, if you want to provide translated content, then make sure that it's actually translated content, not something that's automatically generated.
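As a hedged sketch following on from that advice: if you do keep raw machine-translated pages accessible to users while proper translations are being written, one way to keep them out of the index, consistent with what John describes, is a robots meta tag on those pages:

```html
<!-- On an automatically translated page you don't
     want indexed; remove once a person has reviewed
     and properly translated the content. -->
<meta name="robots" content="noindex">
```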
My site rarely goes up in search, and when it does, it just goes back down. Today it dropped nine places, though the two sites that were above it dropped the same number of places as well. In September, I took out direct requests for links, and in October, I improved the titles and headings of 1,200 forum posts. Should I just noindex my forum completely? What's up with that?

It's hard to give one answer for why things go up and why things go down. We improve our algorithms all the time, and the web is very dynamic. In particular, when it comes to forums, sometimes there's a lot more content out there, sometimes we get a lot more signals for some specific sites, and sometimes we don't get a lot of signals for other sites. So it's not something I would say is normal for a site to stay at the same ranking forever. In general, if you're seeing your site fluctuate wildly, that's more a sign that you're kind of on the edge with regards to our algorithms: sometimes they're thinking, oh, this is actually pretty good, and sometimes they're like, maybe it's not as good as I thought. So that's a hint that it wouldn't take a lot more to push it in one direction or the other.

But at the same time, these are things that don't happen from one day to the next. So if you make bigger changes across your website in September and October, I wouldn't expect those changes to be reflected in rankings a couple of weeks later. I'd look at it more as something that you're doing for the long run, where maybe over the next half year, maybe over the next year or so, you'll start to see those changes.

And with regards to noindexing the forum completely: from my point of view, for most forums, that would be way over the top, because there's often a lot of really useful content in these forums. But it really depends on your website. Some forums are filled with spam where nobody's been paying attention. I suspect that's not the case with your website, because you're working on it so actively. But I can't just blindly say noindex your forum, or don't noindex your forum. That really depends on what you've collected over the years.

Was there any indexing-related issue recently? Some web pages are out of the rankings for their main keywords, and less related pages are appearing.

There are always changes in search. We make, I think, thousands of changes every year that we push out to search. So to some extent, it's normal to see changes. And if you're looking at it in a very granular way, then you might see changes that look very, very big, but from our point of view, from a bigger-picture point of view, maybe they're not that big overall.

A question about an H1 tag. The website uses an H1 on the home page which appears letter by letter, as if someone is typing the phrase, kind of like an old-school banner type of thing. Is that a problem? Is that a good practice or a bad practice?

So if the HTML for that page actually has a normal H1 tag in place with the full content, then we would probably just pick that up and use that full H1 tag as the heading on the page. A really simple way to check for that is to search for that text explicitly in search: if the page comes up, then we're picking up that heading. On the other hand, if you're using something like JavaScript and you're actually swapping out the characters individually, then it might be that we're just seeing individual characters come and go, and we index one state of a rendered view there, which just has a handful of those characters in it. In a case like that, you wouldn't expect the site to show up when someone explicitly searches for that phrase.
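As an illustration, not from the hangout itself: a typing effect can be built so the full phrase stays in the markup at all times, which is the indexable variant John describes. A minimal sketch, with a hypothetical slogan:

```html
<!-- The complete phrase is always in the HTML; the
     typewriter effect only animates the element's width,
     so nothing is ever removed from the DOM. -->
<style>
  h1.typing {
    overflow: hidden;
    white-space: nowrap;
    display: inline-block;
    width: 0;
    animation: type 3s steps(30, end) forwards;
  }
  @keyframes type { to { width: 100%; } }
</style>
<h1 class="typing">Quality widgets, delivered fast</h1>
```

Because the text is never removed from the markup, a rendered snapshot taken at any moment still contains the complete heading.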
So that's something you can try out there. I don't know if this is actually great from a usability point of view, probably not so much. But sometimes these kinds of flashy effects are part of a company's identity, and it doesn't make sense to just get rid of them blindly. So I would double-check to make sure that we can actually pick it up. If we can pick it up, then think about the usability aspect. If we can't pick it up, then think about whether or not it's actually something important for your site. Maybe you don't really care about that phrase; maybe it's just a slogan that doesn't have that much to do with your actual content.

Why would my home page rank above a dedicated page for a specific service?

That's a good question. I think that's something that can sometimes happen and isn't necessarily a bad thing. In particular, it can happen if we find similar content on the home page. A common scenario we often see is that you have a blog, and you have an excerpt or the full content of a blog post available on the blog home page as well. In cases like that, if someone is searching for that content, we could find it on the home page, and we could find it on the blog post page itself, and it's sometimes unclear to us which of these pages is actually the best one to show to users. And then we might show both of the pages in the search results.

So maybe something similar is happening here, in that you have a lot of good information on your home page, and you also have a lot of good information on that dedicated page, and if someone is explicitly searching for, say, a title that you have on both of these pages, maybe we'll show both of those pages in search. What you can do here is not so much tell us that you really prefer to have the other page ranking instead of your home page, but rather guide users more towards those detailed pages as well. So if you're seeing that people are going to your home page and you really want them to go to specific content within your website, then maybe, from a UI point of view, make it a bit easier for them to actually find that content directly on your home page and go to those pages directly. That's not necessarily something that will immediately have an effect on your rankings. But over time, if users go to your detailed pages directly and they recommend those detailed pages directly, then that's something we could pick up as an indirect signal.

Are you guys ever going to do away with the Google Sandbox?

The Google Sandbox? I don't know what you mean. Well, I mean, new businesses wonder why they're on the seventh page, and you need to explain, you need to give them an hour-and-a-half lecture on why, and so on. So I was just wondering.

That seems like a good service to provide as a consultant. I mean, in general, it's hard to say that there's anything we can really change there. Because especially if a website is new, we just don't have a lot of information about it, and we have to make a rough judgment call and say, well, we think we'll start it here in the search results. Sometimes that's fairly high, sometimes that's a bit lower. And over time, as we collect all of those signals and process them, we can make those adjustments.
So I don't think it would ever be the case that we'd be able to say, well, this business belongs in exactly this place in the search results, and that's the perfect placement from the beginning, without us actually knowing any of the details around that business. So I think there's always going to be this effect of: we start somewhere, and then it shuffles up or down depending on what the reality turns out to be.

One business owner said, hey, I've been around since 1973, so why should I go on probation now? Well, then it wouldn't be a new website. No, they never had a website under that name and product line before. Well, then we don't know about it. If they never had a website, we wouldn't have any signals about it. We don't go around asking people on the street, hey, have you heard about this new website that we just ran into? So yeah, I don't see that really changing.

Let's see, a question from the chat: is it OK for a site not to use H2s, and to organize its content using only the main H1, with all subsections using H3 only?

You don't need to use headings. You can use headings to make it clear which parts belong together, but we're not that picky with regards to headings. Some sites have multiple H1s on a page and organize content like that. Some have a clearer structure with H1, H2, H3. Some just go from H1 directly to H4. We try to pick up the content as we find it. So some do it more theoretically correct, and some do it more organically, and we have to live with both of those versions.

What about not using any headings at all?

That works too. I mean, it's a bit harder for us to actually understand the content if we don't have any way to organize pieces of it, but it's possible. Or using bold, or using CSS styles to tell us, well, this is actually something important on the page. All of that helps.

And is this from an indexing perspective, or also ranking?

From understanding the content, and from there, essentially, that flows into ranking. So especially when it comes to understanding which chunks belong together, it's really important for us to have a block, or something we understand belongs together. A really common example: you have an image, and below it you have a paragraph of text. Then it's easier for us to understand that this image probably belongs to that paragraph of text, and we can pull out some information from that text and apply it to the image for image search, for example. Whereas if you just have one long pile of text and a random image that's placed somewhere else, then it's really hard for us to say, well, what's the connection between these individual items on the page?
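To illustrate the grouping idea with a hypothetical snippet: headings and simple block structure make it clear which pieces belong together, including which text describes which image. Skipping heading levels, as in the question, is fine:

```html
<h1>Hiking in the Alps</h1>

<h3>Packing list</h3>
<p>What to bring for a one-day hike...</p>

<h3>The trail</h3>
<!-- The image and the text that describes it are grouped,
     so the caption can be associated with the image. -->
<figure>
  <img src="/images/matterhorn-trail.jpg"
       alt="Trail below the Matterhorn">
  <figcaption>The trail below the Matterhorn in early summer.</figcaption>
</figure>
<p>The route starts in the village and climbs steadily...</p>
```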
All right, another question from the chat. We see a lot of URLs blocked by robots.txt, but no impact on indexing. Is this something to be concerned about? Or is Search Console just telling us where Googlebot is disallowed, for example in a booking funnel, and that would be OK? And are there ways to check which URLs are blocked?

Yes, in Search Console you have, I think it's the blocked resources report, I'm not sure of the exact name, where you can see which URLs have blocked content on them. The idea there is not so much that the landing pages themselves are blocked, but that there's some embedded content on those pages which is blocked by robots.txt. If those pieces of embedded content are bringing additional content into the page, then we don't know about it, and we can't use it for ranking. So that's where this comes into play. It's not so much that we'd say this is a bad thing and you need to let us crawl everything. It's more that we don't know whether it's a bad thing or not, because we can't look at those URLs, so we can't tell if we're missing something or if we have everything.

In particular, if you have tracking pixels there that are blocked by robots.txt, that's something we don't really care about; there's nothing we would miss by not crawling a tracking pixel. On the other hand, if you have a JavaScript API that's pulling in content from your server and displaying it on the page, and that's blocked by robots.txt, then that would be content we would otherwise miss. So what we're trying to do there is tell you that we don't really know what we're missing. You have to look at this list and use your judgment to tell us whether everything is fine or not.

So John, in the index status report, is that not only pages? Are those basically any URLs that are blocked, so blocked CSS files would show up in there, for example, or image files or anything like that? You mean the new report, right? No, the standard one, the index status, where you can see if there are blocks; I don't know the exact terminology, blocked pages or blocked URLs, I'm not sure.

So I think that's just the one with the graph, right? OK. That would be blocked landing pages, so that wouldn't be specific to the embedded content, because embedded content is not something we index on its own. Sorry, let me try to mute you. Yeah, so that would be specific to normal landing pages that we try to index, so nothing to do with CSS or images. But what if it's a feed or something like that? Does it have to be an HTML file? It has to be linked in a way that we assume it's an HTML file. OK.
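As a hypothetical robots.txt illustrating the distinction John draws: blocking a tracking pixel is harmless, but blocking the endpoints or scripts that supply visible page content hides that content from Googlebot:

```text
# robots.txt (hypothetical example)
User-agent: *
# Fine to block: a tracking pixel adds no page content
Disallow: /pixel.gif

# Risky to block: this endpoint returns content the page
# displays, so rendering would fail for Googlebot
# Disallow: /api/products

# Keep scripts and styles crawlable so rendering works
Allow: /js/
Allow: /css/
```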
Let me run through some of the other questions. My blog traffic dropped in half in the past two months, and now even the Google index status report says it dropped in half. I checked everything, backlinks; nothing seems to be a problem. What might be the issue there?

So if index status is actually going down, then that sounds like a technical problem, not a quality problem. It wouldn't be connected to the content you have on your site or the links to your site; it sounds like something where we're actually not able to crawl your pages in a way that lets us use them for indexing. I would really focus on technical issues there and see if there's anything weird happening with your pages. Use Fetch as Google in Search Console to double-check that we can crawl those pages. Use Fetch and Render to make sure that we can actually see the full content of those pages. And if you can't find anything there, I might go to the Webmaster Help Forum to get advice from peers as well. Maybe there are some things that you're missing, that you're not noticing. Maybe there's a weird hack on your website that is redirecting Googlebot. All of these things might be an option there. But especially if you're seeing the index status report go down, that's a really strong sign that something technical is keeping these URLs out of indexing. And if we drop them from indexing, obviously we can't show them in search, and we can't send traffic to those pages. So I'd really focus on the technical side there.

Using React.js, we create links that only become available on user interaction, in order to filter products on the product pages, say by price range or delivery type. Bots can only see the root URL in the source code, without URL parameters. Is this considered cloaking or not?

So I'm not sure how you're differentiating between bots and normal users; that's essentially the aspect we would look at when it comes to cloaking. In general, cloaking is really a big problem for us when it comes to web spam issues, where we see completely different content when we crawl a page. If you're doing things like this, including a URL parameter or not, then usually that would be less of an issue. However, it does make debugging a lot harder. If you're not sure what Googlebot is actually seeing, if it's not the same thing you see in the browser, then that makes it really hard for you to debug issues like whether a page is being noindexed, or whether other issues are happening with these pages. That's the tricky part. So what I'd recommend is not relying on JavaScript to block indexing, but instead making sure that you use all of the normal techniques for faceted navigation, even if you're using JavaScript to create that faceted navigation. We have a lot of information on faceted navigation in the Help Center, so I'd take a look at that.
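For reference, the usual faceted-navigation techniques work the same whether the links are plain HTML or generated with JavaScript. A minimal hypothetical sketch of the common options:

```html
<!-- On a filtered view, e.g. /shoes?color=blue&price=under-50 -->

<!-- Option 1: point duplicate filter combinations at the
     canonical unfiltered page -->
<link rel="canonical" href="https://www.example.com/shoes">

<!-- Option 2: keep low-value combinations out of the index
     entirely with a robots meta tag instead:
     <meta name="robots" content="noindex">            -->

<!-- Keep crawlers from following low-value filter links -->
<a href="/shoes?color=blue&price=under-50" rel="nofollow">
  Blue shoes under $50
</a>
```

Which combination fits depends on how many filter permutations the site exposes and which of them deserve to rank on their own.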
Is it possible that Google shows a different answer box for a search depending on where the search is made, even within the same city? So if I search for Starbucks Madrid, it shows me the nearest Starbucks phone number.

Yes. So this isn't really an answer box; we call them, I think, one boxes. What you're seeing there are probably just the local search results from the map listings. And obviously, if we have an idea that you're searching for something very local, and we have your location, for example if you're using a mobile phone or a laptop whose location we understand, then we try to show local search results, because that's what you're looking for. So that would essentially be normal for us to do. Usually, this wouldn't apply to the normal organic search results. So if you're just moving around within a city, I don't think we would show different rankings or featured snippets or anything like that. You would probably still see differences, just because we run experiments all the time, and there's a lot of personalization that goes into normal organic search. But you generally wouldn't see us changing the ranking significantly just because you're one street down the road.

For the mobile version, I see almost 100% AMP results ranking for all queries. What if I had a whole website in AMP, rather than just having an AMP version of the article pages? Would that improve the user experience?

I don't know about the user experience, but you can definitely make your whole website AMP. I know of a couple of websites that are fully AMP; ampproject.org, for example, is a website that is completely written in AMP. It doesn't even have a traditional HTML version; it's all essentially AMP HTML. There are a bunch of other sites that have used the AMP framework as a way of building a normal responsive website. So you can definitely do that; it's totally up to you. Really, from our point of view, AMP is shown on a per-URL basis. So on mobile, if we have an AMP equivalent of a specific URL, we'll try to show that. If we don't have an AMP equivalent, we'll just show the existing mobile or desktop URL, whatever we have available.

If you redirect Googlebot between mobile and desktop pages based on user agent, do you still need rel-alternate tags on the desktop pages and canonical tags on the mobile pages?

Yes, yes, you do. We can try to recognize a lot of these patterns, but the clearer you make the connection between those pages, the more likely we'll actually follow your lead and understand what you're trying to do. So I'd still recommend using the appropriate markup there, as documented on the developers site.

Does it make sense to redirect old AMP pages to new AMP pages?

Yes. The AMP cache will try to update the AMP pages as well, and if the old AMP pages just return 404, then we'll probably drop those old pages out of the AMP cache instead of understanding that they actually moved to a new URL. So I would make sure to also redirect old AMP URLs to new AMP URLs, and the same thing goes for images. That's something that's often forgotten when you migrate a website from one URL to another, or change the website's URL structure: images need to be redirected as well, so that we can keep them in place in image search. Otherwise, we lose the old image and drop it from the image search results, and images generally take a lot longer to be refreshed, so it can take quite a while for us to pick up the new image at the new URL. So anything on your website whose URLs you're changing, I'd recommend setting up redirects for.

I'd like to know if I need to add JSON-LD markup on all pages of my website or just the home page, for example the organization or the web page markup.

It really depends on the type of markup. If it's organization markup, I believe that's something you can just put on the home page, because that's where we pick it up from. Other types of markup need to be on the individual pages. So if you have recipes, then obviously the recipe markup needs to be on the recipe pages, and you might not have recipe markup on the home page. So it really depends on the type of markup.
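As an illustration, organization markup on a home page is a small JSON-LD block like this, with all values hypothetical:

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "name": "Example Widgets Ltd",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/logo.png"
}
</script>
```

Page-specific types, such as Recipe or Article, go in an equivalent block on the individual pages they describe.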
About sitemaps: a month ago, I submitted a sitemap, and all 17 pages were crawled. But when I submitted it yesterday, it showed only one page indexed. What might be the problem, that not all pages are indexed?

So, two things. On the one hand, a sitemap file doesn't guarantee indexing, so we might choose not to index everything that's listed in a sitemap file; that can always happen. The other thing, which is more common, especially with smaller sites, is that we calculate the sitemap's indexed count on an exact basis, per URL that you specify. So if your sitemap file has a URL that's subtly different from the one we actually use for indexing, then we wouldn't count it as being indexed. I'm just going to guess here: if you have 17 pages and one of them is shown as indexed, probably that one page is your home page, because it's hard to get that URL wrong. But maybe the other URLs in your sitemap file are different from the ones that are linked within your website. That could be things like www versus non-www, maybe a trailing slash or no trailing slash, maybe .html at the end or not. All of these are differences to us, and if we index the URL in a different form, then they don't count. So what you can do is double-check your sitemap file and compare it to what you actually have linked within your website.

If someone wanted to start an online store, would you recommend they use their name brand as the domain, or is it better to buy a domain that contains some keywords?

I'd recommend, if you're planning on being online for the long run, be yourself. Use your name instead of just a bunch of keywords. There are two aspects there. On the one hand, if you have your name brand, then people who already know you can search for you and find you directly in the search results. On the other hand, at some point you might want to expand or subtly change your focus, and then suddenly you're stuck with a domain name built around the keywords you focused on when you first put your website out there. Especially when you're first starting online, you probably have very different ideas than you'll have five or ten years down the road. So you might start off with, I don't know, cheap blue shoes, and then suddenly you're like, oh, actually, I want to sell sportswear, and you have cheapblueshoes.com as your website, selling something subtly different that doesn't even match. So you have a brand, and it doesn't match your domain name, and it doesn't match what you're actually doing; you're kind of in this weird situation. So I'd recommend using your name, your brand, or whatever you've built up so far, as your domain name instead.

Do RSS feeds help with SEO?

Hard to say. We use RSS feeds as a way of finding new content, in the same way that we use sitemap files. So if you have an RSS feed and you don't have a sitemap file, then that can definitely help us pick up your content. But it's not the case that we'd say, well, this site has an RSS feed, therefore it's a much better website, we'll rank it higher. It's more a matter of discovering the content.

We're rebranding from A to B. Is it advisable to redirect B.com visitors to A.com before rebranding? We'll probably launch B.com next month.

So I guess the question is whether you should redirect people before you actually make the change. I would just do it all at once. I don't think you would get any additional value from redirecting people from the new domain to the old one, and then redirecting them back to the new one at some later point. So I don't think you gain much from doing that.
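When the rebranding does happen, the move is normally done with permanent redirects from the old host to the new one, covering pages, images, and any AMP URLs alike, per the earlier advice about redirecting everything whose URL changes. A minimal sketch for Apache, assuming hypothetical domains a.example and b.example:

```apache
# .htaccess on the old domain (a.example):
# 301-redirect every URL, including images and AMP pages,
# to the same path on the new domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?a\.example$ [NC]
RewriteRule ^(.*)$ https://b.example/$1 [R=301,L]
```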
I'd like to know if Google looks at WordPress tag pages as important for SEO. I see photographers using tags excessively.

We treat tag pages on a site essentially the same as any other kind of page on a site. And if a tag page is the only place linking to a piece of content, then we can pick it up that way, but for most pages we have enough other links. So the tag pages help us crawl through the rest of the website, but it's not the case that they're critical for the actual ranking of the content within the website.

All right, what else can I help you all with? I need to head out a couple of minutes early, so we don't have that much time for more questions, but maybe I can get one or two more in.

I have a quick one in the chat, if you can address it fast. OK, a Search Console question about the old HTTP property. So you've moved to HTTPS, and you're wondering what's up with the data still showing for HTTP. I think for the most part you can just ignore that. If you have redirects set up, then that data will subtly shift over, but it's not something you really need to watch out for. Even after two years, and even if it shows new content? I don't think so. I mean, the only thing I would worry about is if you're seeing crawl errors there, which would be a sign that the redirects aren't visible everywhere. But if the redirects are there, then that content will just be found that way and move over to the HTTPS property over time, even if it's been a while. Cool, thanks a lot.

I want to ask something regarding the recent, can I? All right, go for it. Yeah, regarding the recent changes where, when we search, Google gives a simplified result. For example, when I search, when is the Diwali festival, it gives a date. So most of those people will not come to my website; I'd say almost 50% will leave without visiting the website from which Google has taken the result. And I want to know whether, in that case, the webmaster is losing something. Of course, the user gains, his time is saved, but the one who is creating the website is losing.

I think that's always tricky. If you have content that can be simplified down to one date, then that's not a lot of content, and that could be visible in a snippet or a title as well. So that's something where I don't really know what the best approach would be. My recommendation would be to make sure that you have significant value on your site, so that users still go to your website for queries that are relevant to your site, and not just for ones like, when is this date? Because that's something they can see in a variety of places; it's not unique to your website. But it's a tricky situation. I mean, you're creating content, and suddenly we show the one word, and then maybe you need to think about what you can provide that people want to find, beyond just one word.

In most cases, Google just shows the exact thing, and maybe the content is more than the date, but by showing that... Yeah. I mean, it's tricky, because I think if users really want to see that content, they'll be able to get to it anyway. And if you can tell that they don't actually care about the rest of the content, then maybe it's worth thinking about where you can provide content that users do actually want to read in full. So it's maybe worth rethinking the strategy of your content, or what you've been focusing on there.

Make a video about the date and everything else. Do a video about it, and you'll stand out even more. That might be an option as well, yeah. But I mean, it really depends on what people want.
If they just want the date, then they're not going to watch a video just to see that date mentioned. Yeah, I mean, you guys are always looking for different things, unique things, and so on. If I make a video of what happens in Toronto at 10:57 AM, then maybe people will be interested in it, like, what's going on? Yep, maybe. I mean, it really depends on what you have, what kind of content you're providing, and what people are looking for. But it's certainly an idea. I mean, talk to the people who've been active on your website.

All right, I need to head out, so I need to close out a little bit early. I saw Hugo just added a gigantic question; I'd recommend posting that in the Webmaster Help Forum and checking what peers have to say about it there. I'd have to read through it a bit more in detail to actually understand everything. But otherwise, I'll set up the next batch of Hangouts probably this week, and we can follow up there with more questions from you all.

All right, thanks, everyone, for coming by. Thanks for asking so many questions, and I hope to see you all again in the future. Bye. Bye, John. Bye, John.