 All right. Welcome, everyone, to today's Webmaster Central Office Hours Hangouts. My name is John Mueller. I'm a Webmaster Trends Analyst at Google in Switzerland. And part of what we do are these Office Hour Hangouts, where people can join in and ask any questions related to their websites and web search. And we can try to find some answers. Looks like people are still jumping in, which is cool. There are a bunch of questions that were submitted already on YouTube, so we can go through some of those. But as always, if any of you want to get started with the first question, you're welcome to jump in now. Hi, John. Hi. How are you? Pretty good. How are you? I'm good. John, I have two questions. So one of the questions is about one of our clients. We are trying to build a new website for them. Now, when we did some competitor research, we found that some of our competitors build their websites a bit differently. For example, they create a separate page for every single type of search term. For example, for garden shade Sydney, they create one page. Then for garden shade sale Sydney, they create another page. And those pages actually rank on Google. And each page has a lot of content. Now, I don't know whether it is the best practice to do or not, because if we try to create pages in this way, the website will have more than 5,000 pages. So what is the best practice? What can we do? Yeah. So in general, if you're just creating pages for variations of keywords and you're guiding people all to the same content, then we would consider those doorway pages. And that would generally be something that the web spam team and the search quality team wouldn't be very fond of. So that's something I would try to avoid. On the other hand, if you really have unique content that you can bring for individual variations of keywords, then that might be something you can do. In general, my recommendation is to try to limit the number of pages that you have on a website to focus more on quality and to make those pages that you do have much stronger. So instead of taking a website that's ranking OK and diluting it by creating 10 times or 20 times the amount of pages out of the same content, make those pages that you do have much stronger instead so that they rank better. So that's the balance where I would tend to go more towards concentrating the value on those pages, making them stronger so that they can rank better for the more competitive queries. And if you want to expand from that, then I think that's generally fine. I just wouldn't go to the extreme of saying, I will create pages for all variations of keywords, because probably you're just making it so that a page that ranks kind of in the middle of the pack, I guess, would suddenly, because it's so diluted, just not rank at all in the search results. So you might appear in the search results for some of those keywords, but nobody's searching for those long tail variations that often. So it's not really that valuable. And the last question is, this is also for the same client. So they actually sell garden shades. So when we search garden shade on Google, we have noticed that the first 15 websites are e-commerce websites, except ours. So is this something Google decides, OK, for this keyword, they will only give importance to e-commerce websites? Because my client's website is a little bit different. What they do is, if people see their product, they can use the create-a-quote option. They contact them and they give them a quote. They don't give the price on their website. 
So it is a little bit different from all the e-commerce websites. So is this something Google always decides, OK, for this keyword, the e-commerce websites will get the most benefit? Is there something like this? It's not so much that we would say, for individual keywords, we would prefer e-commerce websites, but we do try to understand what the intent behind the query is. And if we can tell that the intent is something where the user wants more information, then maybe we'll show more informational pages. If we can tell that the intent is more commercial, more transactional, that they want to buy one of these things, then maybe we'll show commercial pages more. But I think the difference that you're looking at is more that both of these types of pages are commercial pages. It's just one that lets you buy it right away, and the other one has maybe a more detailed process to buy the product, which, from my point of view, those could be equally valuable. It's really hard for me to say, but I would assume our algorithms wouldn't differentiate between those two. They would say, these are both e-commerce websites. One is just you call them, or you send them an email or fill out a form to get to buy the product. And the other one, you go and click the button to add it to your cart to buy the product. But they're both e-commerce websites. So if someone is looking for something with a commercial intent, then both of those kinds of sites would be relevant to show. Thank you. I think the bigger effect is probably more what happens afterwards when people come to your site, if they're able to convert well. That's something that's more, I guess, within the control of the site owner, and less something from the SEO side that they have to worry about. So you are saying that the algorithm decides based on the intent of the user. So if users prefer e-commerce websites for that keyword, that means you are trying to rank e-commerce websites for that keyword. It's not so much e-commerce websites, but we try to understand what the intent is. And if the intent is really to buy something, then we will try to show something where they can buy something. And a lot of times that intent can shift. For example, if a completely new kind of product comes out on the market, then if you search for that, initially, you want information. And then maybe a couple months later, when it's available broadly, then people want to buy it. So the same query can, at one point, be informational. And at some later time, be more transactional, more commercially oriented. Thank you. All right. Any other questions before we jump into the submitted ones? Sure, I just had one, actually. So I have a few clients in the same industry. And over time, they've actually started working with each other naturally. But as the webmaster of these multiple sites, and that network is starting to grow a bit more, is there anything in particular I need to do so that Google doesn't think I'm doing a PBN or something, because they're all closely related, for example, a hotel that sells the tours of the other company I work with? I think, in general, that's less of an issue. We're able to understand when sites work together, when sites belong to the same company, all of that. And that's generally less of an issue. If you're cross-linking between these kinds of sites, that sounds pretty reasonable. On the other hand, if it comes across as something that's clearly an advertisement, then that might be something where you just use a nofollow link. 
So if you have something that looks like a banner on top that says, you can book, I don't know, special tours here, and it looks like an advertisement, then I would just treat it like an advertisement. OK, yeah, we don't really use any banners like that, but we do have, say, like a ticket product page on the hotel's website. And that goes to the tour provider, for example. But it sounds like Google's good enough now. It's just like every forum I go to, people sound paranoid. Yeah, I think, for the most part, we can figure that out. And especially if it's something that's kind of on the edge like this, then I wouldn't worry about it too much, because if we figure out that this is something we need to ignore, then we will ignore it. It's not that we will go in and penalize these websites from a web spam point of view. It would be different if this were something that were done at a really large scale. It would be like if you were going to all the other tourism sites and you're essentially buying links from them and you're doing partnerships with individual sites. Then that would be something where, if the web spam team looks at it, they're like, the whole nature of this website is around unnatural links. Therefore, we will take action on all of these unnatural links. But if it's like this small part where you're just working together with other companies and the rest of your website is kind of ranking naturally, then that's not something I would spend too much time worrying about. OK, cool. Yeah, thanks a lot. That's good to know, because one of them is a hotel chain, and they need separate websites for each location. So yeah, I just wasn't sure how to handle that. Yeah, that works great. Cool. All right, let's go through some of the submitted questions. And as always, if there's something on your mind that's kind of related, feel free to jump on in. And we should have some time towards the end as well to catch up on other things. When we search a brand name, Google shows our terms and conditions type page in the sitelinks. Is there any way not to show it? So the simple answer is if you don't want this page to be shown in search, you can use a noindex on that page. That does remove the page completely from search, though. So that's kind of the extreme variation if you don't want it shown at all. But that's how you can do it. The other approach here might be to think about what you can do to make it clear that other pages within your website are more relevant. So to make it so that the terms and conditions page is still indexed, but to make it clear perhaps that maybe some related products are more relevant rather than the terms and conditions page. And the general approach when it comes to sitelinks is to have a clear hierarchy on the website so that we can clearly understand, like, this is the primary page. These are the top-level categories. These are pages associated with that category, those category pages. To have kind of a clear structure on your website, rather than just everything on the same level. So instead of focusing on kind of reducing crawl depth to make everything as flat as possible, which is sometimes something that people focus on, make sure that there's a really clear structure on your site, and that structure is mirrored across your whole site. So wherever we look at your site, we can understand, this is the home page. These are the top-level categories. These are the important parts of the site. And these are pages that are related to individual parts of the site. 
And by doing that, usually we can figure out which pages are more relevant to show as sitelinks. As recommended, we switched our breadcrumb markup from data-vocabulary to schema.org. The website in question is redirecting users from the root domain to the corresponding language folder. So domain.com redirects to /en or /de, depending on IP address. Search Console now shows an error under navigation for the root folder. This seems to happen because the crawler follows the redirect, in this case to /en, and then tries to use the data found on /en for the breadcrumb markup of the home page, which of course is not working. Shall we remove the breadcrumb markup from the first-level language folders, or do you know of other ways of getting rid of this error in Search Console? So I don't really know which error you're seeing in Search Console, so it's really hard to say what you should do next. In general, though, if you're talking about your home page, probably there's not a lot of breadcrumb markup that you need to put on your home page, because there's no trail leading to your home page. So from that point of view, that might be an easy solution. The other thing to keep in mind is that if you're always redirecting Googlebot's IP addresses to /en because we crawl from the US, then we might have trouble indexing your /de folder at all. So that's something to kind of watch out for. Usually my recommendation is more to use a banner and let people go to the right language version rather than to automatically redirect them, because by automatically redirecting, you're kind of running into situations like this across the whole site, where you link to one version and then it redirects to a different version. And then if you check it at home, like if you're based in Germany, you check it at home, it looks completely different than what Googlebot would see when it crawls from the US. So because of all of these things, usually I recommend just using a banner and showing that when you think a user is in the wrong location. We do a lot of Google Ads marketing, and we're running millions of ads. Usually it results in waves of requests from Google AdsBot. To have a stable user experience for customers, we dramatically scale our infrastructure when these waves come. Is there any way to nicely throttle AdsBot? Will it affect us negatively if we respond with 429? Just a second. Will it affect us negatively if we respond with 429 or 503 to some of AdsBot's requests? So in general, there are two approaches that you can do with regards to crawling activity overall. So AdsBot runs on the same infrastructure as Googlebot, so you can use the same tools as for normal Googlebot there. On the one hand, serving these kinds of temporary errors, like a 503, is a good way to automatically kind of throttle Googlebot and let it kind of crawl less. Another option is a setting in Search Console. The setting in Search Console for the crawl rate is something that we respect across all crawlers. And we prioritize crawls a little bit. So if you throttle things and you have a lot of kind of irrelevant requests, then probably what will happen is we will still do our important crawls first. And then we'll just do fewer of these less relevant requests. And generally speaking, that would probably include the AdsBot landing page checks that are done. So that's another option that you can do. 
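As a rough illustration of the 503 approach mentioned above, here is a minimal sketch, assuming a plain Python WSGI app; the load check, the threshold, and the Retry-After value are placeholders you would replace with your own signals, not a recommended configuration.

```python
# Minimal sketch: answer with a temporary 503 when the site is overloaded,
# so Googlebot and AdsBot automatically back off and retry later.
from wsgiref.simple_server import make_server

def server_is_overloaded():
    # Placeholder: plug in your own signal (CPU load, queue depth, error rate...).
    return False

def app(environ, start_response):
    if server_is_overloaded():
        # 503 (or 429) is treated as a temporary condition; crawling slows down
        # automatically over the following days.
        start_response("503 Service Unavailable",
                       [("Retry-After", "120"), ("Content-Type", "text/plain")])
        return [b"Temporarily unavailable, please retry later.\n"]
    start_response("200 OK", [("Content-Type", "text/html; charset=utf-8")])
    return [b"<html><body>Normal page content</body></html>"]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()
```

The same idea applies to 429 responses: the point is simply to answer with a temporary error rather than serving broken pages, and to remove the throttle again once the load spike is over.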
And finally, what you can also do, if you're seeing that things are consistently confusing and crazy with regards to crawling, and especially if you're seeing really high crawl rates that your site generally isn't able to deal with, or that cost a lot of money and all of these things: from the crawl stats help page in the Help Center, there's a link to a special form that you can fill out, which goes to the Googlebot team, where they can take a look and see if there's something on their side that they need to tune specifically for your site. All of these methods take a bit of time. So 503s are probably the fastest way to get Googlebot to crawl a little bit less frequently. But usually the reduction in crawl rate will happen on the next day, or the day after that. It happens automatically over time. The setting in Search Console also takes at least a day to be effective. And contacting the Googlebot team through the form, of course, depends a little bit on how things go on their side with regards to how quickly that can be picked up. So those are kind of the options there. I'm working in travel, and we're massively affected by COVID-19. We had to stop all marketing campaigns, and that resulted in no traffic from Googlebot. To reduce costs, we also scaled down infrastructure. A week ago, we started a campaign on two of the 20 domains, which resulted in huge traffic from Googlebot to all of our domains, and to the pages which are not currently advertised. The AdsBot also checks the same page on every domain every five seconds for four to five hours. So that last bit sounds kind of strange. I don't know why that would happen, but I also don't know how the AdsBot setup is. But generally, the advice here is kind of the same as before, in that you essentially limit the amount of crawling that Googlebot can do. You can do that in Search Console. That's probably the easiest way to do that ahead of time if you're kind of in a state of less crawling happening at the moment. So that's probably the direction that I would head there. During the course of the last four years, we migrated our website to a new domain five times. Wow, OK. After the last migration, our most important page has dropped in rankings for all its keywords. Other pages on the same website have recovered their rankings without problems. The important page was ranking in the top five for several years, and during the previous domain migrations it also recovered without problems. This indicates that the page is meeting the needs and the search queries of our target audience. The migration was technically done correctly. This ranking drop feels more like an algorithmic penalty. The website doesn't have any manual actions. We have a feeling that we might be affected by a page-level algorithmic penalty. Do you have any ideas or thoughts on what might have happened to our website? So overall, it's really hard if you do this many migrations over such a short period of time. So that makes it a lot harder to kind of come to a situation where you have clear expectations of what would be normal in a case like this with a website and what, I don't know, broke along the way. If you're doing this many migrations, oftentimes it's also a sign that something kind of shady is around the website. I really don't know your website. I don't know why you're doing so many migrations, but this is something that we really commonly see. 
For example, when a site is kind of in a spammy area and they do some things on one domain, and then suddenly that domain doesn't rank that well anymore, and then they move to the next domain and they kind of do the same things and it doesn't work that well anymore, and then they move to the next domain. So that's kind of something that personally I've seen happen quite a bit. I don't know if that's the case with your website. Maybe there are other reasons why you're doing so many migrations. But that's kind of the first thing where, if I came across something like this with regards to a website, I would be kind of cautious. In general, we have algorithms that do all kinds of things. It's not the case that we would have an algorithm that would try to manually penalize a single page on a website. But it's certainly possible that over time the algorithms will look at your website and rank it differently and kind of adjust its visibility over time. Whoops, now we have a coffee grinder in the other room. The fun parts of working at home. So from my point of view, I don't think that's something that is kind of tied to a penalty specific to a page. That's something where personally I would try to take a step back and think about what you can do overall to improve those pages. Can I just ask a follow-up question? Because this is my question. Oh, fantastic. Yeah. So the reason for the migration is not some kind of spam reasons or something like that. The reason is we don't really want to migrate our domain. But there has been a problem with the legislation in the country we have our domain in. And so we are forced to migrate our website so many times. And the website is generally doing very well. But when the government or the courts come in, there is some kind of problem we need to resolve this way, because there is no other way to resolve it. So we really don't know what we are supposed to do right now. We know that it's not ideal to have this many migrations. But since there is no other way, is there something we can do to improve it or to recover from these massive drops? Yeah, I think that's always a tricky situation. I'm happy to take a look if you want to post the URL that you're seeing these problems with, maybe in the chat. I can pass that on to the search quality team as well to double-check. But usually, if you're seeing this kind of an effect on a single page across a bigger website, then that sounds like something where the algorithms are just generally unsure about the website overall. And that might be a side effect of all these migrations, where we have to pass the signals from one domain to the next one. And at some point, it can happen that something along the way doesn't forward that well anymore. So if some of the redirects along the way don't work anymore, if some of the old domains don't work anymore, then suddenly we lose a little bit of context that we might have collected with a more persistent setup. But I'm happy to take a look and pass that on to the team. You can post it here. And I can check that out after the Hangout. Yeah, that would be great. It would be very helpful. Thank you very much. OK, cool. Thanks. I have a client in a highly competitive market where being in the map box can mean the difference between success and failure. How does the quality of the website affect the position in the map box? So I think the map box probably refers to the Google My Business listings. 
From our point of view, I don't really have a lot of insight into how Google My Business does the ranking within their local search results. I don't really have that much insight that I can give you there. But I do know there are special forums for Google My Business owners. I would double check there. It's very possible that some of the folks active in these forums have some tips and tricks and things that work well, things that don't work well. I doubt that there is a clear list of ranking. So many people. Yeah, I doubt that there is a clear list of ranking factors when it comes to Google My Business that the Google My Business team would be able to just get. So on March 23rd, we used the indexing API to submit 50,000 salary pages that didn't contain job posting data to Google. During the indexing API beta, we were told that we could use the API to submit salary pages for purposes of the estimated salary. On the same day, we saw our pages with job posting schema get de-indexed. You recently mentioned on Twitter that the indexing API is only for job posting and live stream structured data. For everything else, just use sitemaps. Could this be the root cause of our job posting schema pages getting de-indexed? Going forward, we will not use the indexing API for salary pages. Can you confirm that if this is the root cause of the job posting schema pages being de-indexed? Oh, that's awesome. So first of all, no, this wouldn't be the reason for existing pages to be dropped from the index. So when it comes to indexing API, essentially what happens is we have to crawl those pages to see what is on those pages, to see if the right structured data is on those pages. And if the right structured data is on those pages, then we can process those pages. Whereas if we look at those pages and don't find the right kind of structured data, then that's something where, essentially, we would just ignore what we just crawled. We wouldn't pass that on. We wouldn't use that for indexing. So in a case like this, where if you're submitting pages with the indexing API that don't have the right kind of structured data, we would just ignore that. It wouldn't negatively affect the other pages on your site that we already have indexed. So that's something where probably this kind of thing that you're seeing is completely unrelated. I have two sites that sell very similar products. But all the branding content and product descriptions are unique and written in a tone of the voice for that brand. However, the back end code and the website structure is the same for both sites. Would this cause duplication issues between the sites? It might. My feeling is that these, if you have two websites that are just slightly different, then that's not something where I would worry about that too much with regards to duplication, especially if they're focused on kind of a unique audience. So from that point of view, I don't really see a big problem here. I think it would be different if you have a lot of different websites that are essentially selling the same products. And that's something where obviously it makes it a little bit trickier for us to figure out which one of those pages to show. So again, going back to the multiple websites, if you have things that are essentially unique on their own, so if you have these two types of websites that are slightly targeting different things, I wouldn't worry about it. If you have a large number of them, then that's something that would probably be a bit different. 
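For readers unfamiliar with the Indexing API from the earlier salary-pages question, the notification itself is a small JSON POST. Here is a minimal sketch; the URL and access token are placeholders, a real call needs an OAuth 2.0 token for the https://www.googleapis.com/auth/indexing scope, and, as noted above, the API is only intended for pages with job posting or livestream structured data.

```python
# Sketch of an Indexing API notification for a single URL.
import json
import urllib.request

ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"
ACCESS_TOKEN = "ya29.placeholder-token"        # hypothetical OAuth 2.0 token
PAGE_URL = "https://example.com/jobs/12345"    # hypothetical job posting page

body = json.dumps({"url": PAGE_URL, "type": "URL_UPDATED"}).encode("utf-8")
request = urllib.request.Request(
    ENDPOINT,
    data=body,
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {ACCESS_TOKEN}",
    },
    method="POST",
)
with urllib.request.urlopen(request) as response:
    print(response.status, response.read().decode("utf-8"))
```

For everything else, a regular sitemap submission is the supported path, as mentioned in the question.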
Would it be considered abnormal for a relatively established site to no longer appear or receive impressions in Discover? Search Console is showing zero impressions and clicks after getting hundreds of clicks and thousands of impressions daily. This happened immediately after a rather large early November update that caused a stir, which would suggest a correlation, though not necessarily causation. So changes in visibility in Discover can happen over time. And we have lots of algorithms that essentially affect large parts of search, and that would include Discover. So it's very possible that a site might for some time get a lot of traffic and visibility in Discover, and over time our algorithms change and that visibility essentially goes away. Or it changes over time. It can also happen that your site currently doesn't have any traffic from Discover, and over time we recognize that it actually does make sense to show it a little bit differently like that. We're thinking of changing the main nav anchors by removing gender-specific terms and listing them as clean text under a dropdown for gender. For example, we currently have men's Citizen watches. This would change to Citizen watches, placed under a dropdown menu for men's watches. Would this affect the ranking of men's Citizen watches? Or can the crawler understand the sub-nav dropdown? So usually we can understand the context of pages fairly well. And if you're just moving things around from direct anchor text to more indirect anchor text, and you're kind of shuffling the queries around a little bit, then from my point of view, that seems like a reasonable thing to do. And it wouldn't necessarily be something where I'd assume something negative would come out of that. That said, any time you do restructure your website, it does mean that we have to relearn the setup of your website a bit. And that can result in some ranking changes. So that's something where my general recommendation for these kinds of things, if you're a little bit worried about that, would be to test it out and try it out on a small part of your website where you get significant traffic, but it's not a crazy amount of traffic. I'm running a website that deals in beauty care products for Thailand. After the last Google update, I noticed that the site ranking has dropped, and noticed that all informative review-based YouTube and Facebook profiles started to show up on the first page of search results for a lot of our search queries. My website was ranking for almost all of the product pages, category pages, and other pages. What's the issue? Is Google ignoring beauty-product-related e-commerce websites and prioritizing informative sites? If it is, how can we bring back the ranking, or can you help me take a look? So in general, these kinds of ranking changes can happen over time. That's not something where I would say Google's algorithms have decided to only start showing these kinds of pages, but rather we try to figure out which pages to show appropriately in search, and that can change over time. So that's something where it can certainly happen that a page, or a type of page, or a set of pages is ranking really well in search, and then over time, our algorithms try to figure out that maybe this set of pages isn't as relevant to these user queries as we initially thought, and we have to rethink that. 
Similarly, it can happen the other way around, that our algorithms think, well, this set of pages actually is much more relevant now than we thought it was before, and maybe we should show them a little bit more frequently, more visibly. We have the same database for two price comparison sites in two different countries, which we have shown in their native language. One site is extremely popular in its country, while the other is just a veteran in its respective country. Will it change anything if we implement hreflang? They're about 15 or 20-year-old domains. So I think from the age itself, I wouldn't see that as a factor here at all. With regards to just using hreflang to make one site more visible in the search results, that wouldn't work, primarily because what happens with hreflang is, if we show the other version of the site in the search results, then we will swap it out based on the hreflang. If we're showing neither of these sites in the search results at the moment, then there's nothing for us to swap out. So you can't use hreflang to make a site more popular in a country based on the popularity of a similar site or the same site in another country. So essentially, what you would need to do here is really work to improve the standing of that one website in that specific country. And it's not something where you can just use technical methods to kind of tell Google, like, hey, I'm really popular in one country. You should make me really popular in all countries. That doesn't really work. That's not something that users expect either. It's something where you really need to make sure that your website is really well received in that country on its own. I've been waiting for two months to hear back from a reconsideration request I submitted in February. Is this normal? How long should I wait? And when should I submit again? So sometimes reconsideration requests do take quite a bit of time. I think two months seems like a pretty long time. So I don't know if that's kind of still within the normal range, but it's still something where I'd say it can certainly happen, that it takes this amount of time to be reviewed. What I would do also in a case like this with the reconsideration request is to check in with the help forum to make sure that at least the site that you have, the way that you have it set up now, is as correct as possible with regards to the issues from the web spam side that came up. And oftentimes, the folks in the Webmaster Help forums, they have a lot of practice with all kinds of different sites. And they can help you to figure out how to make that actually work a little bit better. I have anchor text and backlinks from adult content sites. And some of my pages rank as high as position one for clearly adult content queries. As evidenced by the search results, is disavowing the adult content sites the best thing to do? I don't have any manual actions. You can definitely disavow these links if you want. In general, just because you have links from adult sites and just because you might be ranking for some of those queries doesn't mean that you would not be ranking for the normal queries on your website. So this is something where people often feel that this is kind of like negative SEO, because I rank for these queries I don't want to rank for. But at the same time, this wouldn't negatively be affecting the queries that you do want to rank for, which probably aren't these kinds of queries. 
So from that point of view, that's something where disavowing them probably makes sense. It kind of reduces the possibility that your site would rank for those queries. But just because it's ranking for one set of queries doesn't mean that it won't rank for other kinds of queries. It's not that we suddenly assume that your website is an adult website, too. I'm looking to get access to the 3D/AR for Google Search and Google Ads program. I have applied through the form, but I haven't heard back. I have nearly 1,000 products that we can upload for retailers who really need the extra help at this time. That's really cool. If you can send me a link to your site, I can double-check with the team. I know the team that works on structured data has been really busy with a lot of the coronavirus launches that have happened recently. So it might be that they're a little bit stretched thin in this regard. But I'm happy to take a look and see if there's something that we can do to speed things up. In general, maybe that's also something where we can just help the team with better documentation to set the expectations a little bit more clearly. Are there ever webmaster events that are in North American time zones? They always seem to happen at 3 AM for me. I do two of these during the week, one of them towards evening my time, which is more US time zone friendly, and the other one here on Friday mornings, which is more time zone friendly for the APAC folks. So depending on your location, one or the other is probably a good match. Is redirecting a root domain a bad practice? I found it's kind of common for multilingual sites, since every language has its own subfolder. They have one for en-US too. So they usually redirect the root folder to the en-US homepage. For example, IBM.com will redirect to IBM.com/us-en. Is that a bad practice? So how should we handle this? Also, IBM seems to put their root domain as x-default, even if it's a page that's being redirected. Is that a bad idea? So actually, the x-default setup for something like this is something that we have documented and recommended. So if you have a redirecting homepage, then that is generally a good match for people who aren't within the hreflang set that you currently have. So for example, if you have pages for English, French, and German, and so on, and someone with a Spanish browser comes to you, then it's usually a good idea for us to send them to your redirecting homepage. Because from there, you can best make that choice of where they should be guided to. So that x-default part is essentially something that we would recommend there. Redirecting homepages, in general, like I mentioned before, is something that personally I don't like that much, because it makes it a lot harder to diagnose issues. Technically, you can do it. You just need to keep in mind that if you're redirecting Googlebot from the language versions themselves as well, then it's possible that we would not see any of the non-English versions. So if you just redirect the root of the homepage to different language locations, like /de and /en, and people who go directly to any of these language versions are not redirected, then that's generally OK. The hard part is really if you redirect people from the German version to the English version because you think that the English version is a better match for them. Then it's very possible that Googlebot would have a lot of trouble finding and being able to index any of the non-English content. 
But that's kind of the second step there from the redirecting home page. What I usually recommend instead is to have a country picker either as a country picker page, which is also kind of tricky with regards to usability, or if you can recognize that a user is coming from a different location to show a banner on top and say, hey, it looks like you're coming from an English-speaking country. Here's a link to our English home page, or here's a link to the English version of this current page, and let the user make that decision. That usually makes it a lot easier on the one hand for users, on the other hand for search engines as well, because then we can crawl and index all of these different versions, and we don't kind of get sent in the wrong direction. Ooh, OK. Running towards the end of the Hangout, still a handful of questions left, but maybe I can switch over to questions from your side. If there is anything on your end, that's still pending. Hey, John. Hi. Yeah, I represent a pharma e-commerce brand in India. I had a couple of queries regarding that. So one of the queries is, we see a lot of discrepancy in location-specific keyword rankings. So we are an e-commerce brand. We operate across the country, but we see that for a generic keyword, which is not location-specific, we are ranking probably fourth in one city, but not in top 50 for a different city. Yeah, that can happen. So what are the issues or the causes or the solutions that you would recommend for this? I mean, usually that's a sign that our algorithms believe that users really want local information for those kind of queries in those specific locations. And that's something, if you don't have that kind of local information, that kind of local content, then that makes it possible for us to say, well, maybe we should show more local content instead of this kind of more general version of content. So from that point of view, it's not something where I would say you're doing something wrong, but rather it's like we, our algorithms believe that maybe your site isn't as relevant for those queries in those specific locations. And if you think that it is relevant for those queries and those locations, then that's something where you need to make sure that it's a lot clearer on your website that it is relevant for those queries and locations. OK, one specific thing that we have on the website is that our default location is a particular city. And for that particular city, the rankings are quite good. And that city is actually visible on the website, and that is indexed by Google as well. So do you think that this could be a problem wherein all the cities have all the users from different cities are exposed to this particular city? So should we remove the city and make it generic? Do you think would that solve the problem? That might help. So if there is one kind of city that you're already highlighting, and that's something where I could certainly imagine that that makes it a little bit trickier. So I know, for example, we had similar issues, or I don't know if there were issues. But I saw similar things with some classified sites in the US where they would show different content on their home page based on the location of the user. And they would try to do that on a city level. And that meant every time we crawled the home page, a lot of the content that they were presenting was based on San Francisco and Silicon Valley kind of area. And that essentially made it so that for us, the home page was very San Francisco-oriented. 
So even though they had content for all other locations, it was something where they were focusing so much on San Francisco with regards to the content that we thought, well, probably this is a website specific for San Francisco. So that's something where similarly, if you really focus on one city on your website, then it might be that we look at your website and say, well, this is really focused for that one city. Maybe it's not so relevant for other cities. And if you can reduce that focus or make it so that, at least by default, that city is not something that's standing in the foreground, then probably that would help to even things out a little bit. Yeah, got it. Thank you. Another question that I had was, so since we are in Pharma vertical, we are focusing a lot on EAT elements to add on the website. So would you say that adding those elements on top of the page compared to the bottom of the page, does it have an impact on the overall rankings that Google might consider? I wouldn't see that as something as a primary ranking factor. But for users, that probably does make a difference. That's something where I would primarily focus on this as usability as a conversion factor and less as an SEO element. All right. Thanks. One last question that I had was, I'm really sorry. This is the last one. So this is regarding duplication on the website. So we have multiple molecules, composition of the medicines. And we have similar content for multiple pages, which has the same constituents and the same elements in them. So say that duplication in these aspects would not cause an issue, or would you recommend that we should try to make it unique or add more elements so that the overall page gets a bit more unique than before? I don't know. I would try to look at it more from the point of view of what people are actually searching for. And I really don't know that area so much with regards to what they're actually searching for. If people are searching for very specific variations, then that might be something where it makes sense to include that on your website. But if you just have all of these variations and people are primarily searching for the category of molecules in this case, then maybe it doesn't make sense to include those smaller lower level pages. This is also something that you can probably test fairly well on your website, where you take one section of your site, where you're receiving enough traffic so that you have some reasonable numbers to compare, and then you make an AB test there, where you take a small set of those pages and you do them in one way and a small set of pages you do them in another way. And then you see what is the overall traffic to this set of pages versus that set of pages. Got it. Thank you so much. Thank you. Sure. That's it. All right. Any other questions from any of you all? Hi, John. Hi. Hi there. Yeah, just a question about duplication of code. So we have two sites. And they both target the same similar or pretty much the same products. But the branding for both sites is different. The content's different. The product descriptions are different. Everything's pretty much unique, other than the code. The back end and the site structures are the same. I just wanted to know if that would pause an issue between the sites in regards to ranking or duplication of code. Usually that would be OK. I mean, especially if you're targeting different types of users, then that's something where I wouldn't really worry about that. That seems reasonable. OK, that's good. 
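To make the section-level testing John suggests above a little more concrete, here is a minimal sketch of comparing aggregate clicks for two groups of pages, assuming you have exported Search Console performance data to a CSV with page and clicks columns; the file name and URL patterns are made up for illustration.

```python
# Rough sketch: compare total clicks for two sets of pages after a change.
import csv
from collections import defaultdict

GROUPS = {
    "variant_a": "/detailed-pages/",   # hypothetical URL pattern for one test group
    "variant_b": "/combined-pages/",   # hypothetical URL pattern for the other group
}

totals = defaultdict(int)
with open("search_console_pages.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        for group, pattern in GROUPS.items():
            if pattern in row["page"]:
                totals[group] += int(row["clicks"])

for group, clicks in sorted(totals.items()):
    print(f"{group}: {clicks} clicks")
```

Comparable time windows and comparable baseline traffic for the two groups matter more than the script itself; this only aggregates the numbers.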
One other question. We've got quite a large navigation for both the sites. And we target male and female with different products. We want to clean the navigation so you have the products listed underneath men's. So you have men's watches, men's jewelry. But instead, just have the product terms rather than gender specific with the anchor text. Would that have an impact in rankings for those pages? My guess is we would still figure that out. And probably it would be pretty neutral overall. But it could, like depending on how your site is structured now, it could be a pretty big change in the way that your navigation is set up, that your hierarchy of the site is set up. So that could be something that would have an effect there. My recommendation there would be to kind of try to test that, if at all possible. Like before, to take a significant part of your site where you get enough traffic that you can compare things and try that variation versus the current variation and see if that has any effect there. And my general feeling is probably the overall effect will be minimal. So that's, but it's really hard to say offhand. Sure. That's great. Cheers. Sure. All right. Hi, John. Hi. There you have it. Hi. This is one question I just thought I could get you inside. So we have two pages, one page for a particular keyword. We have the homepage ranking for it as well as the dedicated page ranking for the same keywords. OK, so what we have observed is that the competition that is ranking for that particular keywords have their homepage. And even though they do not have a dedicated page, we somehow seem to rank somewhere. So should we consider targeting our homepage for that particular keyword and redirecting that specific page to the homepage? That's something you can do. It's not something where I would say you would always want to do that. So it's fairly common that both a home page and maybe a category page or a product page rank for the same kind of query. And it can happen that one of these pages ranks more visibly than the other one. But essentially, that's something that you can control a little bit based on the amount of content that you have on your site, based on the hierarchy of your website. And that's something that you can control in a way where you can make the decision on what you would like to have happen there. So if you feel strongly that your homepage is the much better page for this kind of query, then make sure that it's really clear to all search engines when they come to your website that the homepage is the right page to show for that query. And similarly, if you want people to go to a product page or a category page instead, then make it clear that this is really the page that people should go to for that specific query. For example, if you have exactly the same content on your homepage or if you have a full set of the multiple product categories that you have on your homepage as well, then sometimes it's hard for us to tell which of these pages is the better one to show to users. So if this decision is essentially up to you, and if you choose to have one or the other pages to be shown visibly or you prefer to have it more visibly shown for individual queries, then you need to make that as clear as possible. Got it. So the environment, for example, the landscape in which that entire arrangement is there, should that not be a concern for us? 
Because what we observed was that the competitors, even though they do not have a dedicated page, their home page, one that seems to be of the highest authority or highest strength happens to rank for that. So should in the landscape be something that we need to consider for this? I wouldn't worry about what your competitors are doing, because maybe they're doing it wrong or maybe they don't want to have this happen and it's happening accidentally. I would really think more about what you want to have which one of those pages do you want to have ranked in the search results? And maybe it's your home page. Maybe you want everyone to recognize your brand, your overall company for this query. That's fine. That's your decision. Maybe you prefer to have people go to a product page where they can convert quicker. That's another option. But essentially, that's a decision that you can make. And that's one where you can focus on your metrics. Like what do you really want to have happen? Do you think people need to find out about your company? Maybe the home page is better? Do you think people would be interested in just going off and buying the product? Then maybe the product itself is better. Thanks, John. Thank you very much. Take care and stay safe. Thanks, you too. OK, let's maybe take a break here. I'll set up the next batch of Hangouts. And hopefully, they'll be a little bit less disruptive along the way. If you have any questions along the way, feel free to drop me a note on Twitter or to join us in the Webmaster Help forums as well. Thank you all for joining in. I wish you all a great weekend. And stay safe, stay healthy. Don't do anything crazy. Go outside, I guess. All right, see you, everyone. Bye.