All right, welcome everyone to today's Webmaster Central Office Hours Hangout. My name is John Mueller. I am a Webmaster Trends Analyst here at Google in Zurich, Switzerland. And part of what we do are the Office Hours Hangouts, where folks can join in and ask any kind of webmaster or web-search-related questions that they might have. YouTube live Hangouts are going away August 1st. So we'll be switching to a slightly different model. I think we'll try to do it with the Hangouts Meet setup, which is kind of tied into Google Calendar. And I'll try one of those on Friday to see how it works. On the positive side, there'll be more room for people to join in live. On the downside, we can't live stream them. So I'll record those and put them on YouTube afterwards. But I think the general setup, with you all submitting questions and me trying to find answers for you, will remain. As always, if any of you want to get started with the first question, feel free to jump on in. If not, that's fine, too. We could talk about updates if you want. Updates. I don't have any update news. I saw lots of blogging and tweeting on updates. So I don't know what specifically is happening there. I don't know. We'll see. I haven't chatted with Danny about that. So not quite sure. OK. But I mean, we always do updates. So I think it's more a matter of, is this one of those core updates? And which month will it be named after? They named this one; I think it was Webmaster World that called it the Maverick update. The Maverick update. For the new movie, I think. But that's not even out yet. That's kind of like naming an update after something that hasn't even happened yet. That's pretty crazy. It's like calling it the Christmas update now. I don't know. That wouldn't make sense. I don't know. I can't blame you. I can always blame you. Don't worry. I think it's a "blame Gary" situation. It's better that way. OK, we can blame Gary. OK. All right.
Otherwise, I'll just get started with what was submitted. And we'll see where we can go from here. Can I set a page URL like /best-leather-jackets when the title and the H1 of my page is something like "best leather jackets that will keep you warm in the winter"? Will it harm the rankings of that page if those exact queries are in the URL? I want to set it like this because I might update the product count and add more details to the page in the future. I don't want to change the URL every time I update that page. Will Google consider this to be unnatural keyword stuffing? Users won't mind. So I think Google shouldn't demote that page based on those exact queries, right? That's a great question. From our point of view, you can call that page, or use a URL, whatever you want. Some people use text like you're suggesting there. Some people use numbers. It's essentially up to you. The keywords in the URL are something that we do look at slightly, but it's a very, very, very tiny factor. So it's not something I would really worry about too much. I think your general goal of keeping the URL the same is great. That's generally something I would aim to do. Every time you change a URL, you have to set up redirects, and you kind of run into that situation where you have to keep maintaining those redirects essentially forever. You have to track the links going to the old URL and the new URL and update the links going to the old URL. It's always a bit of a hassle. So any time you can look a little bit into the future, think about what you want to have on a page, and make sure that that URL still works, from my point of view, that's fantastic. That's especially the case when you're looking at the overall URL structure of a site. So this is just one blog post or article. If you set up your website in general to kind of have a structure that works for you in the long run, then that makes it a lot easier to keep those URLs.
So if, for example, you just have an e-commerce site now and you plan on perhaps setting up a blog or, I don't know, product reviews or something like that in the future, then maybe it makes sense to have that e-commerce section in a sub-directory from the start, or somewhere slightly separated, so that you can have multiple sections of the site over time as you kind of grow. So instead of having the situation where you set up the e-commerce site in the root of the home page, and a year later you have to set up redirects from all of those URLs to something slightly different, you're already, from the start, in a place where you can expand your site in the directions that you think might make sense over time, without having to revamp everything else on the website. So from that point of view, any time you can look ahead and pick a URL that works for you in the long run, that usually makes things a little bit easier. That also goes for domain names. So if, for example, for your website you're thinking, well, I want to write about leather jackets because I think leather jackets are really cool and maybe I have some expertise on that topic, but at the same time you think, well, maybe in the future I want to write about jogging shoes or something else, then it probably makes sense to pick a domain name that's a little bit generic, maybe something that you can brand in various ways. So once you do kind of get to the point where you're saying, well, I've written everything about leather jackets that I can possibly write about, now is the time to start expanding into shoes, then you don't have to think about, well, do I need to get a new domain name? How do I set up redirects from the old one to the new one? What do I have to watch out for? What kind of technical issues might I run across when I try to do that? So any time you can think ahead and pick URLs that stay for a longer time, I think that's effort that's well spent.
You mentioned that Google's algorithms have become more critical in the health and medical niche over the past year, and it's pretty clear that's the case after seeing the volatility across many sites in that space. If a site doesn't entirely focus on health/medical and only has some older content related to health/medical topics, is it best if they just remove that content from the site? For example, maybe they just aren't qualified to write about certain health topics, but they did so in the past. Would Google's algorithms be less critical of that site over time if that Your Money or Your Life content was removed? I think that's kind of tricky. It's one of those topics that we've talked about a little bit with the ranking and the indexing teams over time, and it kind of boils down to: you know that some of the content on your site is not fantastic, and you're kind of wondering what you should do about that. Usually the two options, or maybe three options, that you can think about there are, on the one hand, improving that content. That's something that the ranking team is always kind of encouraging us to tell you about. So from their point of view, if you know that your content is low quality in some regards, then find a way to improve that content, because you obviously put that content out there for a reason. And if you want to keep that content out there, then find ways to improve it so that you don't have to kind of think about this situation of "will Google recognize that my content is bad?" Because if your content isn't bad, then there's nothing bad to recognize there, right? The other alternative is essentially just to remove that content. That's what Glen is suggesting here, that you put this content out there, and over time you realize that maybe it's not something that you want to stand for. And you decide to just remove that content. That's obviously an option as well.
Again, talking with the ranking team, usually they tell us, well, people shouldn't just remove the content if they can improve it. So I'd take that with a grain of salt, if you will. I think sometimes a website just grows so much over time that it's unreasonable or unfeasible to kind of go through and improve all of the low-quality content that you know about. So that might be something where you say, I really just don't want to focus on this niche anymore. I'll remove that content and move forward with the rest of my site in the direction that I've chosen to go. So that's another alternative. And a third alternative is kind of something in between, where you say, well, actually, I want to have that content on my website. I just don't want to have it findable in Google. And for that, you would just put a noindex on those pages. I think that's a reasonable alternative if you have a really good reason for that content to be on your website, and you need it there for users when they're interacting with your website. For example, if you feel that your category pages are really not something that should stand on their own because they just have snippets of information, they're not really that exciting, then maybe you'd put noindex on those so that users can still go there if they click around on your website, but they're not indexed by default. And therefore, Google won't take those into account when evaluating your website overall. So those are essentially the three options that you have there. And from my point of view, I think you can pick whichever of those options makes the most sense for you. From our ranking team's point of view, obviously, they like to have high-quality content. So they'd like to encourage you to improve the quality of your content if you can. I think, for practical reasons, sometimes that just isn't that easy to do. So pick whichever one works best for you.
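The noindex option described here is just a robots meta tag in the page's head. As a rough sketch (the page markup below is made up purely for illustration), a small check built on Python's standard-library HTML parser can confirm which pages carry it:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.directives.append((attrs.get("content") or "").lower())

def is_noindexed(html: str) -> bool:
    """True if the page opts out of indexing via a robots meta tag."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)

# A category page kept for users but excluded from the index:
page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(is_noindexed(page))  # True
```

The "noindex, follow" combination matches the scenario in the answer: users can still reach the page by clicking around, and its links can still be followed, but the page itself is not indexed.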
From our point of view, with any of those options, when we reprocess your website over time, we'll see that the low-quality content is actually gone. And we can focus on the better-quality content and evaluate your website based on that. John, I've actually got a quick question relating to that, if it's all right to ask. All right, go for it. I actually know a company, loosely, that works in CBD oil. And they're looking to obviously build out their content strategy, but they actually know another brand that got hit by an update. They're quite nervous around it. I mean, they're wanting to write about it, but they're not medical experts themselves. And whoever wrote it wouldn't be qualified. I mean, do you think it's a bit of a risk, them going into it, or do you reckon it's a necessity? Which way do you see it? Because obviously, I know they are quite apprehensive about getting hit by anything and then losing their sales pages. I don't know. I don't know how they would see themselves in this regard. It's something where we don't use things like the quality rater guidelines as a one-to-one ranking factor, where we say exactly like this, and then we'll rank it exactly like that. But rather, that's something that our quality raters look at when we give them, essentially, search results to compare, to see which of these search results is better, which direction our algorithms should evolve over time. So I would certainly take the quality rater guidelines and use those as a guide for situations like that. But obviously, there's no absolute answer there. And it's something where, if you're already saying, they know a lot about this, but they're not experts, then that is already kind of a shaky foundation, right?
So that's something where maybe it makes sense for them to work together with some experts and to put together some real expert content, where it's clear to anyone who's reading those pages that, actually, this information is correct and it's trustworthy. It's something that an expert has written or an expert has reviewed. And accordingly, it's essentially something that anyone can take and forward to their friends and say, this is kind of something to watch out for, or something important that you should read on this topic. I think it's always tricky with these kinds of situations, because it's not like a technical thing where you either do it right or you do it wrong. So it's hard for me to give an answer like, yes, you should do it like this, or no, you should do it like that. Thank you. All right. Long question. We created our website in 2013. Our SI was above 10 in March. I'm not sure what SI is here. We then switched to a different domain name to connect the Austrian page together with our German and Swiss pages. After climbing up to an SI of six (again, I'm not sure what SI is), we were kicked back down to two within a few weeks. So, a few questions. Why does a transfer of trust from the old to the new domain take so long? I really don't know what you're looking at with regards to transfer of trust. So, in general, when you do a site migration, when you move from one domain name to another, that's mostly a technical change. And the important part for us is really that we can map, ideally, everything one-to-one from the old domain name to the new one, so that we can forward essentially all the signals from the old domain to the new domain. So in particular, if you have a site structure on the old domain, make sure you have the same site structure on the new domain. So it's not just a matter of URLs from the old domain somehow redirecting to the new domain, but really mapping one-to-one from the old domain to the new domain.
So when you have it set up like that and you use something like the Change of Address tool in Search Console, which checks a few of those factors too, then usually the site migration should go fairly smoothly and fairly quickly. I've seen some cases where people are saying, well, within two, three days, everything was back to kind of the same situation afterwards. So that's, from a technical point of view, a site migration and some of the things you should watch out for. We have a ton of detailed information in our Help Center on specific items to also watch out for. So I'd double-check that to make sure you have all of that covered. Then there's another aspect that you mentioned here: that you now have a German domain and a Swiss domain. I'm not sure if you had that before; maybe that's something that you also changed with this move. Assuming you added these with the site migration, then you're in the situation where you don't have a one-to-one mapping anymore. And any time you take one domain and you split it up into multiple domains (so in this case, maybe you had one global domain, and you split it up into the Austrian, German, and Swiss domains), then that's something that does take quite a bit of time to be processed, because we have to essentially reprocess all of these domains individually and understand how URLs on each of these domains should be evaluated and how they should be shown in search. So that sometimes takes a bit of time. That's another thing to watch out for. A third thing to watch out for (I think you're hitting all of the complicated topics) is if you have Austrian, German, and Swiss content, and this content is essentially the same content just on different domains, then what will likely happen is we will pick one of those URLs for each kind of set of URLs that we have there and choose that as a canonical.
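The one-to-one mapping idea can be sketched in a few lines. This is only an illustration with invented hostnames, not a real migration tool; the point is that every old URL should redirect to the same path on the new host, with no two old URLs collapsing into one target:

```python
from urllib.parse import urlparse

def build_migration_map(paths, old_host, new_host):
    """One-to-one redirect map that keeps the URL structure identical
    and only swaps the hostname, as recommended for site moves."""
    return {f"https://{old_host}{p}": f"https://{new_host}{p}" for p in paths}

def is_one_to_one(redirect_map):
    """Check that every old URL maps to a distinct new URL with the same path."""
    targets = list(redirect_map.values())
    if len(set(targets)) != len(targets):
        return False  # two old URLs collapse into one target
    return all(urlparse(old).path == urlparse(new).path
               for old, new in redirect_map.items())

paths = ["/", "/jackets/leather/", "/blog/care-tips/"]
mapping = build_migration_map(paths, "old-example.com", "new-example.com")
print(is_one_to_one(mapping))  # True
```

A map where several old URLs all redirect to the home page, or where paths change during the move, would fail this check, and that is exactly the kind of migration that tends to take longer to process.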
So with the hreflang, assuming you use the hreflang markup for these pages, we will still swap out the URLs depending on the location. But we'll pick one of them as a canonical URL and that's the one where we'll pull the information from for indexing. So from a practical point of view, what that means is in the search results, we'll try to show the appropriate local versions. However, in Search Console, where we show information by the canonical URL, both for the indexing information as well as the search performance information, we will show it by canonical URL. So it'll look a bit skewed there, assuming you have that situation that pages across the German, Austrian, and Swiss version are essentially the same content just for different countries. Then again, in Search Console, you would have a really tough time understanding which of these country versions is ranking which way, which makes it a little bit trickier. And then finally, from the domain name, it sounds like you do have a business model that's focused on some financial industry content here where I have seen a lot of reports in the past couple of months where sites were seeing fluctuations based on the way that our ranking algorithms have reevaluated the relevance of these individual sites. So that could be flowing in there as well. So you have a bunch of things which are kind of happening at the same time. And assuming you moved your domain in March and you split it up into these three country versions and your domain is in that financial industry, then you have a few things that could be making your life a little bit trickier where you might want to, from a first point of view, make sure that everything technically is OK with the site migration, especially if you're splitting things up across multiple country versions and double check that all of that is OK, then double check, of course, that the metrics that you're tracking are actually useful metrics for your website. 
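For reference, here is roughly what the hreflang annotations for a page duplicated across the three country domains could look like; the hostnames and paths are invented for the example. Note that every version of the page should carry the full set of alternates, including a link to itself:

```python
def hreflang_links(versions, default_url):
    """Render <link rel="alternate" hreflang=...> tags for a set of
    country/language versions of the same page."""
    tags = [f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
            for lang, url in versions.items()]
    # x-default tells search engines which version to use for other locations.
    tags.append(f'<link rel="alternate" hreflang="x-default" href="{default_url}" />')
    return "\n".join(tags)

versions = {
    "de-DE": "https://example.de/jacken/",
    "de-AT": "https://example.at/jacken/",
    "de-CH": "https://example.ch/jacken/",
}
print(hreflang_links(versions, "https://example.de/jacken/"))
```

With this in place, the URL swap described above can happen: one URL is picked as canonical for indexing, but the locally appropriate URL is shown per country.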
So in particular, if we are picking a canonical across these different versions for some of those pages, how are you tracking that? How are you tracking the traffic from search? Is that something that you're tracking overall, or are you looking at it per country, where maybe it's hard to compare the previous situation to the current situation? And then thirdly, make sure that, past all of these technical details, you're really focusing on the right elements on your website itself: that you're providing content in a way that is useful and relevant for users, and that you're covering, essentially, some of the things that we've been talking about with some of the newer updates. So I think you kind of go into the updates themselves as well. Last year, there was a Google update for Your Money or Your Life. Is that the reason for the new finance pages to rank behind the old pages? Again, these kinds of general updates (we call them core algorithm updates) don't focus on things where we say you're doing things wrong and you need to fix them. Rather, we essentially try to reevaluate how we determine the relevance of a website overall. And that's not something that is based on kind of the age of the domain. So if you move from one domain name to another, that doesn't mean that your website will never have a chance to compete with the old websites. If you've done that domain move properly, then you're essentially in the same state as you were before. So that's something where it might be that either you did something from a technical point of view slightly complicated or incorrectly, or it might be that just about at the same time there were these algorithm changes that also affected your site in that way. Should I prevent the old domain from being indexed with robots.txt? No. Definitely not. If you have an old domain and you set up redirects to the new domain, then... (I'm not going to put it on the table. I don't know. Somewhere. That's all right.)
If you set up a redirect from the old domain to a new domain, then you should not block that with robots.txt. We should continue to see those redirects. Searching for site:olddomain, why does the old domain appear in the search results while the title and description match the new domain? That's something where our algorithms are trying to be helpful, and it's confusing in this particular case. So our algorithms are seeing that you're searching for the old domain name, and they know the old domain name is associated with that site that we know about. So we'll try to show it to you. If you look at the cached page, you'll see that this is actually the new domain name. So that's a little bit confusing as a webmaster, and it's kind of meant to help users who are searching for the old domain and don't know that it has actually moved on. For the query "posters" in Sweden, I can see that two of the domains have been given a carousel on mobile. Does this work for category pages for products? Should we mark up item lists with the products for category pages to increase their chances of getting a carousel? Or are there other ways of using structured data that can help us get into the carousel? So the carousels that we show in search are essentially algorithmic, and it's something where, once we've been able to pick up the content from the structured data on a page, we'll be able to show that. But that doesn't mean that we will always show that. So perhaps you're seeing these carousels for these sites for that query at the moment. Maybe for other queries, we're showing the carousel from your site. So that's something where you can't really force that type of rich result just by using structured data.
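For reference, the ItemList markup for a category page is fairly small. This is an illustrative sketch with invented URLs; valid markup makes a carousel possible, but, as the answer says, whether one is shown is still decided algorithmically:

```python
import json

def item_list_jsonld(product_urls):
    """Build schema.org ItemList structured data for a category page,
    as JSON-LD that can be placed in a <script type="application/ld+json"> tag."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "ItemList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "url": url}
            for i, url in enumerate(product_urls, start=1)
        ],
    }, indent=2)

markup = item_list_jsonld([
    "https://example.se/posters/abstract/",
    "https://example.se/posters/nature/",
])
print(markup)
```

Each listed URL should point to a page that itself carries the detailed markup for that item.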
I would recommend double-checking the developer's guide for the structured data types that you're interested in, to make sure that the setup that you're looking at is really compliant with our policies, so that in the long run it'll be something that works for your site and not something that you have to suddenly fix if it's not compliant with our policies. Our website is scoring low on PageSpeed Insights for pages with embedded YouTube videos. Without the videos, the score is 100. Which has better SEO benefit: pages with embedded YouTube videos, or a top score that's really fast on pages without the videos? That's hard to say. So I think, first of all, I just need to make sure that it's clear that the PageSpeed Insights tool and, in general, our speed tracking for pages is not something that would exclude other Google products and services. So if you're using some product from Google or some service from Google to add content to your pages and that's slowing your pages down, we will show that in these scores. It's not something where we'll say, well, they're using YouTube and YouTube is fine, so therefore we will just kind of ignore that in our scoring. If we see that you're embedding something, regardless of whether it's from Google or not, that's slowing your pages down, that should be reflected in the score. And that would be the way that we would take that into account when it comes to ranking things for the mobile speed ranking change. So, that said, this particular situation with the YouTube videos is one that I hear every now and then. There are different techniques that you can use to embed YouTube videos on a page. Some of them are a lot faster than others. So that's something where I would double-check to see how you embed these videos on your page, and if there are ways to embed them that don't necessarily slow things down for your users. And kind of think about what it is that you're trying to achieve with those YouTube videos.
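One of the lighter embedding techniques is to defer loading the iframe itself, for example with the browser's native loading="lazy" attribute (a click-to-load thumbnail facade goes even further and avoids fetching the player until the user interacts). A minimal sketch, with a placeholder video ID:

```python
def lazy_youtube_embed(video_id, width=560, height=315):
    """Render a YouTube iframe with the browser's native lazy loading,
    so the embed isn't fetched until it scrolls near the viewport."""
    return (
        f'<iframe width="{width}" height="{height}" loading="lazy" '
        f'src="https://www.youtube.com/embed/{video_id}" '
        f'title="YouTube video" frameborder="0" allowfullscreen></iframe>'
    )

print(lazy_youtube_embed("dQw4w9WgXcQ"))
```

For videos above the fold this won't help, since they load immediately anyway; there, the facade approach (a static thumbnail that is swapped for the real iframe on click) is the usual trick.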
If it's really the situation where you're saying, oh, we don't care if these YouTube videos are on these pages or not, we just want to rank well, then maybe that's a sign that those YouTube videos aren't really critical to your content. So maybe that's also a hint in the direction of, well, maybe you should focus more on the textual content than on the videos. On the other hand, if you're saying, well, these YouTube videos are critical for our pages, they're critical for our users to get a full picture of the product or the services or whatever we're talking about on these pages, then I would recommend finding a way to embed those videos that works well for users and doesn't cause a lot of latency when the pages are loaded. And there are a lot of tricks around making sure that they load better. I believe there's also a team working on finding ways to make that a little bit faster by default. Then we have a giant question on the topic of coupons. The content users are looking for is simple and terse. There are only a few ways to rephrase "save $5 with this code". On the surface, all results look alike. The difference in quality seems to be the number of coupons and whether these actually work, which is difficult for Google to observe. It seems that excluding results by detecting duplicate content doesn't work well at all. The results look very similar. This can be exploited on a very large scale by companies placing the same content, trivially transformed, on multiple domains. One tactic I saw predominantly in non-English countries was to add a big blob of text below the actual coupons. This could be to make the overall page look more unique to Google. Finally, the question: is adding a long piece of text below the actual short content, even if users ignore it, something that currently could help a site to rank better?
So I think, first of all, I totally appreciate all of your research into these kinds of coupon sites, these white label sites that you found and flagged. It is something that we're looking at internally as well, to figure out how we can best handle these. So that's definitely useful. With regards to the general question of, should I add a big blob of text to the bottom of my content to make it a little bit more unique: that's something that we don't only see in non-English languages; that happens across the board. It's really common, for example, on e-commerce sites on category pages, where if a site is, say, too lazy or not that invested in creating category pages that are actually really good, then we'll often see all of the products for that category on top and then, on the bottom in really small font, two or three Wikipedia articles' worth of text that essentially focuses on the keywords for that category page. And from our point of view, that's something that generally doesn't work that well, because it confuses us fairly easily. Because on the one hand, in the case of an e-commerce site with categories, you're bringing the e-commerce categories on top with all of the e-commerce content, and then you're adding this essentially Wikipedia-style article full of informational content on the bottom. And the goals of those pieces of content are very different. And that makes it kind of tricky for us to understand what this page is actually about. Is this something that we should show to people who are interested in buying something? Or is this a page that we should show to people who are interested in researching a topic? What is the goal of this page? So that, from my point of view, is something that probably doesn't do that much good for a site.
In general, for e-commerce sites in particular, my recommendation would be to make sure that you have actually useful category content, so that it's clear both to users and to search engines what this page is about. I can kind of understand the situation where you can't create content for every category page that you have. But at the same time, if you're just copying and pasting content like this, or writing this big blob of text that's not meant for users to read, then that doesn't really do your site that much good either. So my general recommendation would be more to drop those kinds of unnecessary blobs of text on the bottom. I would, however, still make sure that you do have enough useful content on that page. So if all of the content on the page is kind of these two-, three-word snippets, like links to products within a category page or links to coupons on a coupon page like this, then there is not really that much useful information to rank this page for. So instead of going from the situation where there's not a lot of useful content to the situation where we're confused because of all of the content on this page, I would aim for a situation where it's actually very clear what this content is about and what the value of the content is. So write reasonable category content rather than just a blob of text. The other problem with adding a giant blob of text is that frequently it goes in the direction of keyword stuffing. So even in situations where we're not completely confused as to the intent of the page, we quickly run into a situation of, oh, this keyword is mentioned two, three, four hundred times on this page; maybe we should be careful about how we rank this page for that keyword. So keyword stuffing is one of those things which you probably want to avoid.
So that's something that I tend to see more on these pages where you're adding textual content just for the sake of having textual content, where maybe you heard some guide somewhere saying, well, you need to have at least 500 words on a page, and our category page doesn't have a lot of text, therefore we'll put 500 words of text on this page. If you go down that route, then quickly you're in the area of "we should write SEO text for these pages", which means artificially including your keywords so many extra times. And then, even if search engines don't get confused about the intent of the page, they'll be in the situation where they don't know if they should really trust this page for these keywords, because it's obvious that it's doing keyword stuffing. So I realize that doesn't answer your question in a way where I say, yes or no, you should do this. Personally, I would tend to avoid going down this path and instead create reasonable content rather than just a giant blob of text. OK. Hi, John. Hi. Yeah, John. I have a related question on this content topic, since we are talking about this content part. So my question here is, let's say, as Google says, we are just focusing on user experience. OK. So let's say I have a website for gold rates. OK. For gold rates, we can't add much content. Of course, we can add some, but since it's just a gold rate, I can just put what the gold rate is today in some country, with two, three lines of text. So is it possible to rank in Google? Sure. I mean, it's possible to rank in Google with any kind of content. I think if you're going out there with the intent of, I want to write a minimal amount of content and rank number one for kind of competitive queries, then that's probably not going to work out so well. But writing a lot of good content and writing really useful content for users, I think that's something that over time will work out for most websites.
So the thing is, we can't compete in the SERPs with a smaller amount of content, right? So that will be the conclusion here. Can you repeat that? So you mean to say that the conclusion is, if you just have a minimal amount of content, we can't be placed in the SERPs, right? There is no word limit for content to be shown in search. So it's not that you have to have a minimum of so many words on a page in order to be ranked. It's even so that if a page is blocked by robots.txt, it can still rank, despite us not having any content from that page at all. So it's not that there's any minimum word length that you need to aim for in order to be competitive. Obviously, depending on the topic, depending on what you want to talk about for that topic, you might need to write more, you might need to write less. But that really depends on the topic. It really depends on the users and how you're fulfilling that user need. And sometimes less content is more useful. Sometimes more detailed content is useful. Sometimes a mix across a website is useful, where you have kind of introductory content and then more detailed content, which is longer. So it's not that we have any specific word count that you should aim for. Yeah, so that is also based on user experience. I think user experience is a good way to look at it, looking at maybe user studies that you do with your particular users. Assuming you're the expert in this field, you'll know a little bit more about which topics need more explaining and which topics need less explaining. So that's kind of where I would say you should know where you need to have more content or where you need to have less. Yeah, can I butt in a little bit? So John, if you were a search algorithm, how would you, and what specific metrics would you use to, figure out a good user experience on a website? Um, I don't know. I'd probably have to think about that a bit to see what would work well for me.
I mean, it's something where, if you have an overview of the whole web, or a large part of the web, and you see which type of content is reasonable for individual topics, then that's something where you could potentially infer from that: for this particular topic, we need to cover these subtopics, we need to have this much information, or more images or fewer images on a page. That's something that perhaps you could look at like that. I'm sure our algorithms are quite a bit more complicated than that, though. OK, anything new and exciting coming from the Google Search Console team that you want to add? New and exciting from the Search Console team. They're working on a lot of stuff recently, so I think we'll see some more things happening. I know they need to migrate a lot of the old features to the new Search Console, so that's definitely one thing that's also happening over time. But they're pretty busy. So I don't have anything in particular at the moment to preannounce. OK, all right, thank you. Yeah, John, I have a question on URL structure. So basically, we are following a folder-level structure. Of course, some people go with naked URLs, and some people go with folder levels. As I'm seeing with US sites, most of the sites are following the folder-level structure. But in India, if you look, not everyone is following the folder-level structure. People in India are putting naked URLs right after the domain. Of course, there is a difference in priority and authority between the two. If you just put a naked URL, that authority is good, actually. But again, when it comes to the folder level, Google can crawl it properly. So that is also there. So which is better to follow for ranking, a folder-level structure or naked URLs? For ranking, we don't care. So whatever works for you. 
Kind of similar to the first question, I would personally go in the direction of folder levels, just so that you have more room to expand over time. It's something where, if you have a clean folder structure, you can add multiple categories without having to re-juggle all of the existing URLs. So if you see big sites like Amazon, they are not following that much of a folder-level structure. But still, they are ranking, and of course they are on top. So that was a question for us. Yeah, I mean, like I said, there's no ranking advantage. It's really more a matter of how you deal with maintenance and how you change things over time. There's no ranking advantage to having everything in one file name versus in multiple folders. There's also no effect where, if you put content below five levels of folders, it will rank worse than if you put content directly at the root of the URL. There's no effect like that. OK, but that was a big concern for us, because sometimes it was ranking, sometimes it was not ranking, and big sites like Amazon are following a naked URL structure, while some big sites are following the proper folder-level structure. Yeah, I wouldn't call it the proper folder-level structure. I mean, if it works for a big site like Amazon, then I'm sure they put a lot of thought into it, and for them, that's the proper structure. But it's totally up to you in the end. So that's one thing where some sites like to do it in folders, and some sites like to put everything together. I personally like folders because it makes maintenance a little bit easier. But ultimately, it's totally up to you. So these kinds of URL structures will never affect any Google algorithms, right, even in the future? I have no idea what will happen in the future. If I could tell you what the future of the algorithms was, then, I don't know, I would probably be doing something different. Yeah, because recently we got affected. 
And last year, we got affected by the Medic update so much that even our category pages went down. So that's the reason we wondered whether folder level is the proper one and a naked URL is not. From my point of view, I don't see that changing in the future. These are just different URL structures that some sites have. Some sites use parameters. Some sites use folders with file names. Everyone does it slightly differently. The important part for us is that we can take that one URL that you have, we can crawl it, and we can index it with that URL and pick up the content. How you determine which URL to use is ultimately up to you. The only thing I would watch out for is that it should be, I think, less than 1,000 characters, and you'd probably have to work really hard to make URLs that long. Yeah, yeah, sure, sure. Thanks, John. Sure. Let's see. Let me run through some of the other questions. And I think there are some questions in the chat as well. My UX designers asked whether they can move the H1 to the footer, same with the internal links. Does this have any impact on SEO performance? From an SEO point of view, I don't think that makes a big difference. It makes it hard for us to understand what the heading is for, so it's something that probably doesn't make that much sense, and from an accessibility point of view, that sounds wrong as well. But if you have a strong reason to put an H1 tag into the footer, then that's perfectly fine. With HTML5, of course, different elements on a page can have H1 tags as well. So maybe they're just saying, well, we have the article with an H1 tag, and the footer with an H1 tag as well. That's perfectly fine. A website has millions of pages indexed by Google that drive mainly irrelevant traffic with low engagement and a high bounce rate. How might this affect the ranking performance of its small percentage of high-quality pages, if at all? 
So I think your suspicion is probably right here, in that if you're already seeing that a website has a lot of bad content, then you're in that situation that we talked about in the beginning, namely: what do you do if you have a lot of bad content? In general, our algorithms do look at a site overall. So if we see that a large part of the relevant content on the site is low quality or bad content, then we'll have trouble trusting the newer content, or the rest of the content, on the site. So once you're in that situation where you're recognizing that there's a lot of bad content on the site, then you have those three options that we talked about, namely improving the content, removing that content completely, or just noindexing that content so that Google doesn't see it. Those are the directions I would head there. It's really hard to imagine what kind of situation you're in where you have millions of pages with low-quality content and a small percentage of high-quality pages. That seems like something where you've built a website on a shaky foundation, where it's probably worth thinking about how you ended up here, which direction you want to go, and how you avoid running into that situation again. For a little over a year now, I haven't been able to get our sitemaps viewed in Search Console. When I submit them, they immediately go to Couldn't Fetch, and I don't see any requests in our logs. Any thoughts? So I would recommend starting a thread in our Webmaster Help Forum with the URL of your site and the sitemap file. And you can send me the URL of your thread, and I can double-check with the team to see what exactly is happening there. We're a large publisher with a big international mobile audience. 
We're wondering if our US and UK rankings are affected by the percentage of users internationally with slower page loads, specifically FID and Time to Interactive type metrics. We do look at multiple things when it comes to speed, especially on mobile. On the one hand, there are the lab tests that we do with things like PageSpeed Insights. On the other hand, we also look at things like the Chrome User Experience Report data, which is more aggregated and tracked at a rough level. And that data is based on what users are actually seeing. So if a lot of your users are seeing really slow performance from your website, then that's something where it might be worth figuring out how you can improve. Sometimes it's as easy as adding a CDN, where the CDN might have endpoints in multiple countries, and then suddenly it's like you have a local presence in those countries. So that's what I would watch out for there. We have a Google News-approved news website. The content is primarily in English, and we're planning to also publish content in other languages, say Hindi. Will this affect Google Discover traffic, or the website getting unapproved from Google News? I don't know how Google News deals with this kind of situation. I know there are a lot of international news sites out there, so I assume this is something that should just work. With regards to Google Discover, it's also tricky to say how that works. But as far as I understand, it's based on normal search results. So if these pages are indexed normally for your website, then we can show them in Google Discover, regardless of the language of your website. Does Google ignore URL parameters that come after a hash? In other words, to Google, are /categories and /categories#some-attribute the same? Yes. In general, we ignore everything after the hash. There are two exceptions. 
One is the hash bang, the hash followed by an exclamation mark, which is what was used in the old AJAX crawling scheme, and which we separate out individually and treat as unique URLs. The other exception is for a very, very tiny number of sites where we've recognized that URLs with a hash lead to unique content. So it's not just jumping up and down within the page, but actually leading to unique content. And there we do sometimes index those URLs with the hash as well. But that's extremely rare, and it's not something I would rely on. So if you're using the hash to change the content of your page, I would assume that we will crawl and index the URL without the hash, for the most part. If you're using the hash to jump up and down within your page's content, then that's perfectly fine. We tend to ignore everything after the hash. So things like links to the site and the indexing will all be based on the non-hash URL. And if there are any links to the hashed URL, then we will fold those into the non-hash URL. All right, let me double-check what I missed here in the chat. Is there any limit to the amount of code or JavaScript that Google will render? We have a lot of client-side content rendered by JavaScript, but not all of it appears to render in the Inspect URL tool in Search Console. There's no absolute limit to the amount of JavaScript that we will process. There are a few quirks with the testing tools, though, that might be worth watching out for. In particular, in the testing tools we optimize for speed rather than completeness, and we optimize for fresh content rather than just using anything that we have cached. That means for the testing tool, if we recognize that your page loads a lot of JavaScript, and loads a lot of resources with that JavaScript, using APIs to pull in content, then at some point our systems might say: we're running out of time, the user is waiting, we need a result now. 
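Going back to the fragment question for a moment: the folding behavior described above, where links to hash URLs are credited to the non-hash URL, can be sketched with Python's standard library. This is an illustrative toy, not Google's actual canonicalization, and the URLs are made-up examples:

```python
from urllib.parse import urldefrag

# Three variants of the same page: a plain URL plus two fragment URLs
# that only jump around within the page.
urls = [
    "https://example.com/categories",
    "https://example.com/categories#some-attribute",
    "https://example.com/categories#section-2",
]

# Stripping the fragment folds all three into one canonical URL,
# mirroring how everything after the hash is ignored for indexing.
canonical = {urldefrag(u).url for u in urls}
print(canonical)  # {'https://example.com/categories'}
```

The hash-bang (`#!`) and the rare unique-content-behind-a-hash cases mentioned above are exactly the exceptions where this simple stripping would not match what Google does.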
And we will stop processing the JavaScript and show the user the content as it is. So from a practical point of view, we tend to time out fairly quickly in the testing tools. In indexing itself, we have a lot more time, because nobody is explicitly waiting for a result right away. So we will take more time to pull in all of that content, and we will try to aggressively cache JavaScript and server responses to make sure that we can get all of that into indexing as well. If you're seeing that it's still not being processed properly in the indexed version, and you know you have a lot of requests being made (if you look at the Network tab in Chrome's developer tools, say, and see hundreds and hundreds of resources being pulled in), then it might make sense to figure out ways to combine those resources so that fewer resources are required to render the page. OK, let's see. Two weeks ago, someone suggested alt text should include keywords that you're attempting to rank for. Say we are a site about roof tiles, and the picture is someone sitting on a park bench. For accessibility, a description like "person sitting on a bench in a park" makes sense, but the guest suggested it should be "person on a park bench thinking about roof tiles." To me, this is bad, as the photo doesn't suggest this. I always thought Google would infer this from the image based on the content of the page it's embedded in. I think you're totally right. It is something where I would not artificially stuff keywords into the alt text. The thing with alt text is twofold. On the one hand, we use it as part of the HTML page, so it's something that could be seen as part of the page's content. You can, of course, just make that text visible on the page; it does the same thing. The other thing that we use the alt text for is to better understand the image. 
And we try to better understand the image for image search. So if you have an image of someone sitting on a park bench, then the question is: what would you expect that to rank for in image search, and in what way would that be useful for your website? In particular, if someone is searching for a person sitting on a park bench, and we were to show that image because that's what it is, how would that be useful for your website? In that general flow, this person searching for someone sitting on a park bench comes to your website, sees that image, and thinks, oh, that's a nice image. But they're really not interested in roofing tiles. So what is the purpose of this image with regards to image search on your website? And if it's really completely unrelated, then I don't think you get a lot of value from the traffic of someone searching for a person sitting on a park bench. So it's not something that you would need to artificially tweak to get that ranking or visibility in image search. It's also something where, if you were to add roofing tiles to that alt text, and someone were to search for roofing tiles and see this image of a person sitting on a park bench, then they're probably not going to go to your website, because they're searching for roofing tiles, they see an image thumbnail of someone sitting on a park bench, and that doesn't sound like what their intent is. So I guess what I'm saying is: if you're using images in a descriptive way, then I would certainly focus on using alt text that works well for accessibility reasons, so that a user who is accessing that page without those images being visible still has enough context around the content of that page. 
And on the other hand, if you're using images in a way that's critical for your site's content, where you really want people to go from that image in image search to your content, then obviously make sure that we understand that the image is actually about your content. So if we have an image about roofing tiles, and you have an alt text that includes roofing tiles, that works both for accessibility reasons and for image search. Those are, I think, the two things that are worth differentiating. A lot of SEOs tend to focus purely on "I need to get all of this content into image search." And from a practical point of view, you can do that, but I don't think you get a lot of value back out of it. If you just blindly put images into image search with your keywords attached, nobody is going to go to your website after seeing a thumbnail of a person sitting on a park bench if they want information about roofing tiles. And similarly, the people who are looking for an image of someone sitting on a park bench are probably not going to go to your website and say, oh yeah, roofing tiles, I've wanted to buy those for a long time, I need to buy some now. That doesn't work either. So think about what you're really trying to do with image search. I think there's a lot of potential there, and the SEOs who are thinking a little bit more strategically about the user journey, how users might search for content, come to your website, and interact with your content, are probably the ones who will be able to pull this off in a way that doesn't just focus on the mass of images in image search, but rather focuses on getting the right images in the right place with the right context, so that users can find them, go to your website, and do whatever it is that you'd like them to do there. Can Googlebot read the Facebook comments on my web page? I don't know. 
So that probably depends on how the embed is done on your web page. In general, I'd recommend using the Inspect URL tool to see how your web page loads. You might even make a test page with just that embed for the Facebook comments setup, so that you can test exactly that setup while being able to tweak different settings or try different things out. And based on that, you'll be able to figure out how to get your comments indexed, or maybe you'll even decide that you don't want your comments indexed. That's fine, too. All right. I think that's pretty much it. I'm sure more questions have dropped in in the meantime that we're kind of out of time to handle. Yeah, it looks like a bunch of stuff has happened as well. So if there's something critical in those new questions that you already added, feel free to copy them over to the next Hangout, and we can take a look at them there. You're also welcome, of course, to go to our Webmaster Help Forum. There are lots of fantastic product experts active there who can help you figure things out, or help discuss the different options that you might want to look at. They've seen a lot of interesting, confusing, and weird websites over the years, and can generally give you some tips on which directions would be feasible or reasonable to go in. If there's something else, of course, feel free to reach out to us on Twitter. And if there's a short question that you'd like to have answered in video form, we're going to continue doing the Ask Google Webmasters video series. We recorded a bunch of episodes recently, so hopefully those will be out soon. But we'll continue to make more episodes, so feel free to keep using that hashtag as well. All right, so as always, thank you all for your many questions, feedback, and suggestions. And on Friday, we'll be doing the next one with a slightly different layout, so the links will be different. 
It might be a little bit confusing in the beginning. Hopefully it works out. We'll probably have to try this a little bit to see what the optimal setup there is. But I'm sure we'll find a way to make that happen. Algorithm changes don't only affect websites, they also affect how we do these hangouts. All right, so with that, let's take a break here. And I wish you all a great week until Friday. Bye, everyone.