All right, welcome, everyone, to today's Google Webmaster Central Office Hours Hangout. My name is John Mueller. I am a webmaster trends analyst at Google in Switzerland. Part of what we do are these office hours hangouts, where webmasters and publishers can join us and ask any questions that might come up with regards to their websites and web search. As always, if any of you want to get started with a question, feel free to jump on in now. Otherwise, we'll just go through the questions that were submitted.

Hello, hello. Hi. Hey, John, how are you? My name's Salán Onius. I think you probably remember us from battle.com. How are you? Good evening. Hi. How's everything going over there? Pretty good. How about you? Good, good. It's 2 AM here, you know. I'm in Mexico right now, so I had to wake up a little bit early in order to be with you guys. But I'm happy that I could make it.

And I just want to be quick, John. We've been talking to you for these past seven, eight months, and I just wanted to go over with you the problems that we've been going through, and maybe you could guide us and check out the problem that we're having. Mainly, it's that we had been working with subdirectories, and then we tried to move to subdomains for each country that we have. We have 14 country editions. Something was not done properly when we did that, so we went back to subdirectories again. But when we did that, we basically lost all the traffic. And we already found what the problem was when we did the subdomains — we did something wrong. So now that we're trying to revert it, obviously we have been losing all the traffic that I mentioned. And the reason we think that is happening is that the system Google has to detect the news for each country that we have doesn't really take us as a local news source for that country — it takes us as international. So that's the problem that we're having. I don't know if that makes sense.

So you moved from subdomains back to subdirectories now? Or are you back to subdomains again?

We're back to subdirectories. That happened with that change back in February. So right now we're back to what used to work for us, but what we did obviously harmed us. And right now it's clear to us that when you put Bible.com in any search query for any specific country, it doesn't show up as if we're part of that country. For example, if you go to Google.com and you type in New York Times, you're going to get the news. If you type in our name, it treats us as if we were an international news source, and it doesn't really crawl us because of what we did wrong with the subdomains. So basically, we wanted to ask you if we could send you an email so you could review the case. Because we know other media outlets that have the same system as us — that are international and working with subdirectories — are having the same issue. So I know it's not only us, and probably there's this little flaw that hasn't been detected, and it could be helpful. Because we know it's happening not only with us in sports, but with other international news outlets that base their operations in different countries.

Sure, I'm happy to take a look. But it feels like something that, from a web search point of view, shouldn't really play a big role.
So I don't know if there's something unique on the Google News side.

No, but it is not related to Google News, because right now our search results are showing up much better than they used to before. We fixed the website, and right now, even in search, they're perfectly fine. But when you look for news specifically, as the top news thing, it doesn't show up. And I could understand what some of the people mentioned to us in the forum, that it could be the quality of the content. But we know it's not that, because we have breaking news and exclusive news from countries where even BBC, ESPN, and all those big media outlets are using us as their main source. So we know that it's not the quality part, and we know it's something technical. And the World Cup is 30 days away. So if you could really help us — it's been such a tough time, and waking up to it, it's crazy.

Yeah, I've been following along and trying to guide stuff to the team here. So I'm happy to take a look.

I suspect that's the only thing, really — if you could just take a look. That's it.

Sure. OK. Yeah, feel free to send me the details. I can take a look at that. In general, if you have geo-targeting set up, then that's kind of what we need, and I assume that's what you set up for the site. So past that, at least from the web search point of view, we wouldn't really have anything like trying to recognize international versus local sites. But maybe there is something on the Google News side that I don't know about that needs to be done slightly differently. Happy to take a look.

Yes, how could we reach out to you? What is the easiest way?

Easiest is just by Google+. So just send me a private message on Google+.

All right, OK, John, I really, really appreciate it. You don't know how much I would appreciate it if you could take a look at this. And then I'll just keep listening to what you have to say to everyone else that just joined. Sure. Thank you so much.

All right. Anyone else who wants to get in with a question before we head off into the submitted ones?

Is it OK if I quote a message that I posted already? Sure. Great. So what happens when we need to change the CDN for images? What would be the best practice to handle that? There's no such thing as a rel link, I think, for images, and the sitemap doesn't give any clues.

So how do you mean, change CDN? Would you keep the same URLs and just...?

No, the URLs have to change, because the hosting provider changes. We don't have our own domain. So that would mean there are practically different images, but the content is the same. The pages would link to different images, but essentially they are the same pixels.

Yeah. So what you need to do is make sure that the images also redirect. So don't just change the embedded link from the web page — also make sure that the image URLs themselves redirect to the new URLs. With images, it's always a bit trickier when you do these kinds of site moves, because in general, we don't crawl images as frequently as we crawl web pages. So you really need to make sure that everything is lined up so that we can move over as quickly as possible.

But if we can keep both CDNs up for a transition period, would that be OK? So the old images would still be there and the new images would also be presented on the web pages. Would the ranking be hit by that? I mean, there's no way to correlate two images, right? One to the other.

Well, if you can redirect the old URLs to the new ones, that would be the ideal situation.
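As a rough sketch of that — redirecting the old image URLs to their new locations — a rule on the old image host might look something like this (Apache-style configuration; the host names and paths are just placeholders):

    # Hypothetical .htaccess rule on the old image host: permanently redirect every image to the new CDN
    RedirectMatch 301 ^/images/(.*)$ https://new-cdn.example.com/images/$1

That way Googlebot can follow the old image URLs directly to their new locations, rather than only discovering the new URLs through the updated pages.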
Keeping them both live would, I think, help as well. But like I said, with image search it just takes a bit longer for everything to update. When you're looking at just the web search part — if you're mainly focusing on the web search part — then I wouldn't worry about that, because we can rank these web pages regardless of whether or not we have the images associated with them. But specifically for image search, it does take a bit of time for us to move everything over. So what I might consider doing there is setting up maybe a subdomain on your main domain, and just transferring that from CDN to CDN if you plan on moving to another CDN over time. Because that way, the URLs for the images wouldn't change — it's just the hosting of those URLs that would change.

Yeah, definitely something we plan on doing when we have more resources to afford it. Thank you. OK, cool. Great.

All right. So let me jump in with some of the questions that were submitted — a whole bunch of stuff here. Let's see how far we can go.

We do A/B testing on our website. The pages are changed with JavaScript test variations. Currently, we only do that for humans and not for bots. The problem is we do this almost all the time.

So I guess in general, we want Googlebot to be treated more as a normal user. So if you could include Googlebot in these tests, I think that would be great. With regards to the way that you're doing A/B testing: if you're doing this purely with JavaScript, in some cases these JavaScript files are blocked by robots.txt, so we wouldn't be able to see the A/B test anyway. And that's generally fine as well. The important part is really that the primary content of the page is essentially the same. If you're testing something where the A version is a page about cars and the B version is a page about, I don't know, unicycles, that wouldn't really be the same kind of primary content, and it would be kind of confusing to users as well. But if the primary content is the same, then this kind of A/B test is also fine, even if Googlebot doesn't see the A/B test versions because it's using JavaScript, or the JavaScript is blocked by robots.txt from the provider, or whatever is involved there.

It goes on to mention: Google used to recommend a 302 redirect for A/B testing. Is that still necessary if we use the same URL? If you use the same URL, then you obviously don't need to redirect.

And wow, OK. Is there a difference between a subdomain and a subdirectory for Google? I did a video about this briefly a while back, and I would recommend taking a look at that. In general, we see these the same. I personally try to keep things together as much as possible. So if it's the same site, then try to put them up in the same site, essentially, and use subdomains where things are really slightly different. But there are lots of really strong opinions on this. From my point of view, this is something that could go either way. And if you have really strong reasons to go one way or the other, then obviously that might be what you'd want to watch out for. On the other hand, if you're like, well, I don't care either way, then I would just keep it within the same site.

Let's see — a question about canonical and hreflang tags. We have a page that's paginated with query parameters. Should we display hreflang tags on the paginated pages that point to other versions of this exact page? Let me ask a question. Oh, hold on a second. So let's see.
Should we display hreflang tags on paginated pages that point to other versions of this exact page with query parameters, or only display hreflang tags on the canonical versions? You can put hreflang tags on both of these. The important part is that the hreflang links should be between the canonical versions of the page. So the non-canonical version would generally have a canonical to the canonical page, but also the hreflang links to the other variations.

We'd also like to put them on AMP pages. In this case, should our hreflang tags on the AMP page point to other versions of the AMP page, or point to versions of the canonical URL? In general, the hreflang links should be between pages of the same format. So if you have separate AMP pages, then the hreflang links would be between those AMP pages. If you have separate mobile pages, the hreflang links would be between those mobile pages. So that's kind of what I'd watch out for.

All right. You had a quick question in between?

Oh, sorry. Yeah, I was out for a while. How are you? Pretty good. How are you? OK, thank you for giving me time to ask you a question. So I have a few questions, if you have the time. First of all, thank you for giving me the chance to talk. I have a website — OK, so I have a website that was having some issues, and I wrote a message on the last group call. You gave me some information, so I wanted to elaborate on that. There was an affiliate website I talked about, and you told me that those affiliate sites can mostly be downgraded because they don't have that kind of informational article, right? But our website has enough informational articles, and the site was growing steadily. And then the site's rankings dropped, and I don't know what we can do to get that back. Sorry about that — I'm not used to talking like this.

No problem.

So basically, let me just clear it up again. Basically, what I wanted to clear up here is this: what you told me is that most of those affiliate websites don't have the amount of informational articles or blog posts that can contribute something to readers, right? That's why those websites probably ranked, but when Google sees that they don't have those types of informational articles — they only have commercial content — their rank can drop, right? But what about a website that does provide value to its visitors? They provide enough informational content, plus they provide commercial content, and their rank was improving. And then they suddenly dropped, and you're just scratching your head for the last few months to figure out what went wrong. And you can't find it, because Google is not giving you any kind of penalty or message like they used to two or three years back. They'd give you a message in Search Console — yes, you have a backlink problem, you need to sort it out; you have a thin content problem, you can sort it out. But now, what can we do?

What I would do in a case like that, if you think that your site is pretty good, is post in the Webmaster Help Forum with the details that you have, to explain what kind of pages you have, what kind of content you have, what kind of queries you used to rank well for, and get feedback from other webmasters to better understand what you might need to watch out for. Because in general, our algorithms do change over time, and sometimes sites are very visible for a while and then they become less visible over time. Sometimes they become more visible over time. But these kinds of changes are kind of normal.
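To sketch the hreflang setup described above — hreflang annotations between the canonical versions, with a paginated, non-canonical URL simply pointing at its canonical — the markup might look roughly like this (the URLs and language codes are only placeholders):

    <!-- On https://example.com/widgets, the canonical English version -->
    <link rel="canonical" href="https://example.com/widgets">
    <link rel="alternate" hreflang="en" href="https://example.com/widgets">
    <link rel="alternate" hreflang="de" href="https://example.com/de/widgets">
    <!-- The German version would carry the same reciprocal set of hreflang annotations -->

    <!-- On a paginated variant like https://example.com/widgets?page=3 -->
    <link rel="canonical" href="https://example.com/widgets">

Whether the canonical of a paginated page should be page one or a View All page depends on the setup, as comes up again later in the hangout.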
Sometimes these changes are due to things that are easy to spot, especially for other people who've run other websites as well. And sometimes it can be really tricky, and sometimes maybe it's even something that we need to take a look at. But in all of these cases, it's usually very helpful to first get more feedback from peers, in the Webmaster Help Forum or somewhere like that.

Well, I tried to ask those questions on those forums without sharing my website and everything, but I did provide enough information, like I did earlier. But everyone gave me some kind of vague answer, not a clear answer. And I do have limits to my knowledge — if someone asks me to do a specific thing, I can do it within my knowledge, but no one is, you know, a jack of all trades. We need some information to fix something.

But for that, you really need to share your URL so that other people can take a look at this.

Can I do that, if it's OK with you, for you to take a look? Just to take a look, nothing else.

I'm happy to take a look, but usually I don't have much time to go into detail. So that's why I really recommend doing it in the forum, because the people there have a lot of experience seeing all kinds of different sites. And often they can take a look and say, oh, this is a problem, or this is still too thin an affiliate site, you need to do something to make it much better. Or maybe they look at it and say, this is a fantastic website, Google is doing something wrong.

Sorry to cut you short — I'm taking too long. But it's just a bit frustrating, because we spent a lot of time on this particular website, and we did try to do everything within the Google guidelines that have been provided. And other websites are doing well, but with this particular website, we don't know what is going wrong. We are trying everything. And the frustrating thing is that when we're at college or at university, we have a syllabus that we can follow. Here, we have the guidelines, we try to follow them, and now we don't know what we did wrong. It's kind of frustrating, and I hope you understand.

But I'd really try to get some feedback from peers on that. I think that's kind of the best approach there. There's no one trick to getting top-ranked in Google.

I understand. But we need to know something — we can't figure something out from nothing. Like with a map, you need to know something to figure out where you are. Sorry for taking so much of your time and everyone else's, and thank you for giving me the time. I will try what you suggested. Thank you. Have a great day.

Good luck.

There is another question, I think, in the chat, with regards to a company that develops and promotes a lot of different online stores in a narrow theme. Our sites all run on the same CMS and have the same URL structure and similar categories, because one catalog of goods is used. We make each site unique with useful content. We have three sites; Google indexed them, and after indexing, the traffic dropped. All sites are on the same IP address but belong to different people in different cities.

So I think they're seeing a loss in traffic in general. I think there are two aspects there. One is the same IP address part. That's really not a problem for us — it's really common for sites to be on the same IP address. That's kind of the way that the internet works. A lot of CDNs use the same IP addresses for different sites; that's also perfectly fine.
I think the bigger issue that you might be running into is that all of these sites are very similar. So from our point of view, our algorithms might look at that and say, this is kind of a collection of doorway sites, in that essentially they're being funneled towards the same products, and the content on the sites is probably very similar. Then from our point of view, what might happen is we pick one of these pages, index that, and kind of show that in the search results. That might be one variation that we could look at. In practice, that wouldn't be so problematic, I think, because one of these sites would be showing up in the search results. On the other hand, our algorithms might also be looking at this and saying, this is clearly someone trying to overdo things with a collection of doorway sites, and we will demote all of them in search.

So what I'd recommend doing here is really trying to take a step back and focusing on fewer sites, and making those really strong and really good and unique — so that they have unique content, unique products that they're selling — so that you don't have this collection of a lot of different sites that are doing essentially the same thing. That's kind of the direction I would take there. You mentioned that these are three sites. From my point of view, three sites is not a big issue, so I don't think we would see that as a collection of doorway sites. But it feels like you might be heading in that direction when you're saying, well, I have three, and then next week I have six, and then I have 100. And that would really be problematic. But in general, the fewer sites you have, the easier it is for you to make those really strong and really fantastic, and the more likely it is that we'd be able to show those in search a little bit more visibly.

All right, so some of the other submitted questions. Can Google interpret keywords with a backslash in them? For example, does pot/pans come up together or as separate items? I don't know — I didn't actually try this out, but you can try it out fairly quickly and see what happens. In some cases, we do take symbols as having a special meaning, in that we think this ties the two terms a bit together. And in some cases, we see symbols like this as essentially separators between different words, and we treat them separately. It might also be that we sometimes have this mix where we say, well, we're not 100% sure if this is a special character that ties these words together, or if this is actually more of a separator. So that's something where I would just try it out and see what comes up.

We received a manual penalty six weeks ago, after a domain move. We sent a reconsideration request five weeks ago and have had no response from Google. That sounds like a fairly long time. So if you want, you can send me maybe the link to your domain, or post it in the chat, and I can pass it on to the team to double-check. Usually, when we see a site send a lot of different reconsideration requests over and over again on the same issues, that might be something where the team would say, OK, we're just going to wait a little bit to see what your final state is going to be. But if this is a one-time issue that you ran into, then that feels like something that we should be able to process a little bit faster.

Hi, John. Hi. Yeah, it's me that submitted that question. Oh, fantastic. OK. Yeah, I'll post the link. OK. Yeah, I did. OK. Cool. I'll take a look at that later. Thanks.
Let's see. I'm focused on delivering the best content to our audience and look at engagement metrics to test whether or not I'm on the right track. I was wondering if you've noticed any trends, insights, or obscure metrics that are often-overlooked indicators of audience engagement. Ooh, wow. I don't know what I could recommend there. That's something where you probably need to look at your site specifically and see what works specifically on your site. For some sites, people read a lot of pages, and that's a good sign. For some sites, people get all the information they want from one page. So it's really something you'd have to look at individually on your website.

Our website has a network of affiliate partners, and we use a third-party platform to manage it. Our partners place our gadgets on their websites. Every gadget has a link that points to a third-party platform URL, but if you click it, the link does a 302 redirect to our website. The problem is these gadgets don't have a rel=nofollow, and we're worried that this could be hurting us from an SEO point of view. About 50% of the backlinks to our website are from affiliates.

I don't know — this sounds a bit tricky. I think if you're doing this kind of a redirect through a separate URL, one thing you can do, if you really feel that these links are problematic, is just use robots.txt on that redirecting URL. So if you have a link that goes to one URL, and from there it redirects to your website, you could just block that intermediate URL with robots.txt (see the sketch below) to make sure that search engines don't pass any signals to your final site. That might be an option here. In general, I'd be cautious when it comes to these widget or gadget links, because a lot of times there is abuse that's done with these kinds of links. So I don't know — it's kind of tricky to look into. What I might recommend doing here is also posting in the Webmaster Help Forum to get some tips from other people, to see what they would think of these widget links. If these are invisible on the page, if they're misleading, if they're branded links going to your website, or if they're keyword-type links — all of these aspects can play a small role with regards to whether or not it's a critical problem you need to watch out for, or if it's essentially something that just happens.

If a 404 error is returned for a page that doesn't exist, should I make it a 410? From our point of view, in the mid term and long term, a 404 is the same as a 410 for us. In both of these cases, we drop those URLs from our index, and we generally reduce crawling of those URLs a little bit so that we don't spend too much time crawling things that we know don't exist. The subtle difference here is that a 410 will sometimes fall out a little bit faster than a 404, but usually we're talking on the order of a couple of days or so. So if you're just removing content naturally, then it's perfectly fine to use either one. If you've already removed this content long ago, then it's already not indexed, so it doesn't matter for us if you use a 404 or a 410.

Is there a way to noindex pages for Google Search but still have them in Google Shopping? I don't know how Google Shopping handles this, so I don't know what they would pick up or require for pages being listed there.

For hreflang, we need to add it to desktop and mobile pages. But what about the other pages? I think we talked about this briefly.
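Going back to the affiliate widget question for a moment, blocking the intermediate redirecting URLs with robots.txt might look something like this (the path is purely illustrative, and the file would need to live on whichever host serves those redirecting URLs):

    # robots.txt on the host that serves the redirecting widget URLs (hypothetical path)
    User-agent: *
    Disallow: /redirect/

With the redirecting path disallowed, search engines wouldn't crawl those URLs, so the widget links shouldn't pass signals through to the final site.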
On the hreflang question: there's also a page in the developer documentation, on the AMP side, about hreflang, which has a nice diagram that shows which versions you would link between, and in which ways. So I'd take a look at that as well.

Is website coding responsible for bounce rate problems? I'm not quite sure what you mean here, or what you're doing there. It sounds like you're converting your HTML pages into PDFs, and they take a while to load, I guess. In general, that can be fine if this is providing extra value to your users and they're downloading them as PDFs — why not? For the most part, you can probably achieve something similar just with a responsive design, so that might be another option. But I'm not sure how website coding and bounce rate problems come into that. So it might be worth writing up a more detailed question, maybe in the Webmaster Help Forums, to get advice from other people as well.

Does Google have any plans to expand the number of keywords that we can search for with the new Keyword Planner? I don't know. The Keyword Planner is run by the AdWords team, and I don't know what their plans are there. It seems like they have been working on this in the past, so I wouldn't be surprised if they make more changes there, but I really have no insight into that.

We're having issues with our numbers. Pages are dropped from search results every now and then according to Search Console, and it's often determined that they are tracking issues. How often would you say Search Console or Analytics experiences a bug that affects tracking? And what types of things can happen on a site's pages that can impact tracking? I'm not quite sure how you mean tracking issues. We know the pages are not dropping off completely, as we're still getting conversions through organic search. So I'm really not quite sure what you mean in this regard. In general, the data in Search Console is fairly comprehensive. So if you're seeing that pages are not being indexed in Search Console, for example, then that would really mean that they're not being indexed. They can still get traffic in other ways, though.

I can elaborate on that. It's not my question, but I've seen it happen. In Google Analytics, suddenly there's zero traffic arriving from search, and Search Console shows that there were clicks on those days. I think it's like a rollover to the next day or the day before. There are these zero days, and then the two don't agree. That's what I've seen, so maybe he's asking about that. So it's not that the pages aren't indexed — it's that it's showing no traffic in Analytics, but Search Console is actually showing traffic. It's like a disagreement. Exactly.

OK. That seems like something that shouldn't happen, though. Does that happen with new data or also with older data?

I started to see that in the past three months, and we had six occurrences — like six days of zeros, where they did not agree. I just took it for granted that something was not rolling over correctly. Maybe it's a bug, but it started happening suddenly.

OK, cool. If you can send me some screenshots or some details, I'd love to take a look at that. That seems like something that we should be able to fix. Absolutely. Thank you. Cool.

All right. When you have pages where prices vary but everything else stays the same — like the same product from different merchants, via URL parameters — and we want only the preferred merchant to rank, should we use URL parameter exclusion or canonical?
We have millions of pages to be crawled, and this would enable better use of the crawl budget. So you can use both of these. If you have URL parameters where you can tell us, this parameter is something that you can ignore and we can just crawl one of them, then that's a great way to do that — that helps us understand which URLs we need to crawl. The canonical helps us understand which of these pages we want to have indexed. If you have the canonical on the non-preferred versions, we still need to crawl those non-preferred versions to find the link to the preferred version with the canonical tag, so that wouldn't change crawl budget too much. But the URL parameter tool would definitely help us there. You can also do both of these at the same time; that also works.

Then there's a question about a GDPR cookie modal and reduced SEO rankings. I think we talked about this in the last one, and also, I think, on Twitter or somewhere else. Essentially, if instead of the actual content you're displaying an interstitial — a legal interstitial like this — then chances are we would just index that legal interstitial. We wouldn't know that we can click on that interstitial and go somewhere else, so that would be problematic. On the other hand, if you're displaying this interstitial using JavaScript on top of the existing content, then for the most part we'd still be able to index the existing content, because we'd already have seen that on the page as well. So those are the two aspects to watch out for.

Hi, John. It's James. That's my question. Oh, awesome. And my concern is, because it's a full-screen layer that we are doing via JavaScript, that Google is going to treat the content behind it as hidden. And although it indexes that, I believe the content which is considered hidden gets ranked lower. And I don't want people to start searching for a cookie policy and having that rank higher than the actual content.

Yeah, so for the most part, we should be able to just get that right. What you could also do is use robots.txt to just block the JavaScript file that you're using to display that interstitial. That's also fine from our point of view — especially for these kinds of legal interstitials, which aren't like marketing interstitials or anything like that, using robots.txt for that is perfectly fine.

So use robots.txt rather than — because all of our JavaScript is minified — rather than having a detector in there which checks if it's Googlebot and just doesn't show the interstitial?

Good question. I don't know if we would treat that differently. I think for the most part that would work, but I don't know — you'd need to test the rendering of the page to see if that actually works with that kind of recognizing whether it's Googlebot or not. Because of the way that the pages are rendered, I don't know if we expose the Googlebot user agent in JavaScript in the same way that it would appear for a server-side kind of request. But if you test that and you see that your content is actually visible, that's fine.

That's great. Thank you very much. Sure.

Hi, John. I wasn't able to join earlier when you answered my question about the hreflang and the canonical tag. I just had a question, because you mentioned that you could show hreflang tags on the paginated pages. But should they be pointing to, say, reviews?page=3 in a different language, or should they be pointing to the canonical URLs? So the hreflang link needs to go to the other canonical version.
So if you're on page three of a series and the canonical version is page one, or if you have a View All page, then the hreflang link should go to that canonical version of the page. OK, OK, gotcha. All right, thank you. That's all I needed. Thanks very much.

I think in most cases with lists like this, you probably don't really need to use hreflang, because we'd still be able to pick the right page and show it in the right search results. So it's probably one of those situations where you could spend a lot of time working out a detailed, technically correct approach that actually has very minimal effect in practice. That's kind of what I'd watch out for there — if you're just trying to get everything right, but it actually doesn't have any big effect in the end. OK, sure, thanks.

All right. What would be the best practice with the image changes? We talked about that briefly.

Following the change of policy regarding dynamic rendering: we use a hybrid rendering approach, React-based. We never render structured data tags on the client side, because they're useless for end-user needs, but we do serve them in the initial render for crawlers and social media sites. Would it be a violation of the policy if our dynamic renderer only provided the structured data to crawlers and left the normal user rendering without it? From our point of view, you could do that. I'd be kind of cautious about just doing it as a policy, because for the most part, the structured data that you provide on pages is minimal additional work or minimal additional content, and serving it usually doesn't play a big role in the overall end-user speed. And sometimes maybe it does provide some value for users as well. So that's the kind of thing where, I don't know — personally, I'd just try to serve all of this directly to both versions, because it feels like there's not much to be gained by actually stripping this out of the user version of the content. And on the other hand, you're adding a lot of extra complexity to conditionally recognize and treat this content slightly differently. So I'd personally try to just serve everyone the same content. And if you're using something like dynamic rendering, or you're serving pre-rendered content to search engines or social media sites and serving the same content to users in a client-side rendered way, then that's something where I would just include all of the structured data in there normally as well (there's a small sketch of that below).

The image sitemap structure provides an optional image location property — how is that data being used by Google to serve images, and what format does it accept? I don't actually know. Good question. I assume we don't use this at the moment, because if the documentation is vague and it seems like something that's optional, then probably we wouldn't be using this at the moment for image search. But I'll double-check on that. Because if we have it documented somewhere and we don't use it at all, then maybe it would be good to add some kind of disclaimer there, at least, or to see if we can provide more information on what it actually does.

Then we have this one: our forum is not very active. It's old and it ranks well. Is it advisable to take top forum entries, add them to blog posts, redirect the forum entries to the relevant blog post articles with the same content, and then create internal links? Or just keep the forum and create a strategy around that? You can do that. You can do both of these, essentially.
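On the dynamic rendering point — serving the same structured data to users and crawlers — a small sketch of what that could look like, with JSON-LD embedded in the initial HTML that every user agent receives (all of the values here are placeholders):

    <!-- Included in the server-rendered HTML for all user agents, not only for crawlers -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "Example article title",
      "datePublished": "2018-05-14",
      "author": { "@type": "Person", "name": "Example Author" }
    }
    </script>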
With the forum question: essentially what you're doing in this case is moving content from one part of the website to another part. Whether that's a forum or a blog, and you move it to stable documentation or you move it to another blog post, it's essentially all the same thing. It's perfectly fine to take content from one part of the site and move it to another part of the site, even if it's a semantically very different part of the website, as long as it's essentially the same content. So from my point of view, I think that's a good approach. Sometimes you also get good input from things like forums, where you have user-generated content and people discuss topics, and then based on those discussions, you can create a blog post. In that case, it's not necessarily the situation that you move the content from the forum to a blog post — you're just creating additional content on your blog based on the content that you have in the forum. And that's fine as well.

Hi, John. Hi. Yeah. I have a game site, and our home page already ranks for the general query for the game. So the intent of the user is to find information and examples, or how to play the game. We also offer a way to play the game online on our site, something that a lot of the competitors don't offer. And my question is whether to put that link to play at the top of the content or at the bottom of the content. We realized that when we put it at the bottom, users spend an average of like seven minutes on the site, as opposed to at the top, where they only spend 10 seconds on the site on average. So which is better from a search point of view?

From an SEO point of view, it doesn't matter — that's totally up to you. I think if you're seeing significant differences in how users behave, then that's what I would use as a guide, and pick whatever works best for your users. So that's what I would focus on there. From an SEO point of view, both of these approaches are essentially fine. You're linking to more detailed content on your pages — that's, I think, always a good approach. OK, thank you.

Sorry, just a quick question, kind of following up on something a bit earlier around user engagement. I have a content website which provides very long-form detail about specific regions, with sub-pages breaking that down and providing more information about those regions. What would be good metrics to show that my readers are being engaged, for this kind of content website? I think you'd really need to look at what works best on your site. At least from a search point of view, that's not something that we would really take into account. So it's something you need to figure out on your side — what works best with your users, to recognize if they're doing the right thing, if they're happy with the content, if they're able to engage with the content. Those are things that are more, I think, unique to every website. Right, OK, thanks.

All right, well, I think we made it through all of the questions, and we're still not over time. Oh, wow, here's a new one. OK: we're in the home improvement business, and we believe we have great content for users. Content-wise, what do you think is better — creating a full and thorough guide for a subject, versus creating specific pages or articles per subject? I think you'd need to test this out. From an SEO point of view, I could see both of those working well, but you'd probably just want to try it out to see what works best with your users.
So I don't have any magic advice there for which direction you should head. Usually these kinds of things are fairly easy to test, and sometimes the results are really obvious, where you see, well, my users really love long-form content, or my users really prefer short-form content. And based on that, you can build out the rest of your site. But this is also something that I would continue testing over time — double-checking your assumptions over time to see whether they've changed.

All right, what else can I do for you all? What else is on your mind?

If it's OK — sorry, I just had another part to my question from before, the one that you read, that I want to ask. Our content describes physical items in the world — so, for example, a statue in a park or a piece of art somewhere. And we're wondering what the right signal is to give to the indexer so it knows that this piece of content is about a specific location, which is not a Google My Business kind of location. We can give Google's place ID for that location, but it's basically just coordinates on a map. And we have tons of this, and we think it gets more and more relevant as people use Google Lens, for example, or mobile searches. I wonder if there's a guideline for how to do that, aside from Google My Business.

I don't think we have any guidelines on that at the moment. There is something like a geo meta tag that you can add to pages; I believe we don't use that at all. There used to be something like a geo sitemap extension that you could use, and I believe we deprecated that maybe five or six years ago. So at the moment, I'm not really sure what the best approach there would be. What I'd recommend doing in a case like that — I think the type of content that you have there is probably something that not a lot of sites have, so it's probably something where we wouldn't necessarily have a standard approach for what you would need to do there. But what you could do is maybe look into the schema.org documentation and double-check to make sure that there's nothing that might match what you're trying to do there. Because we do take structured data into account, especially from schema.org — even when we don't use it explicitly in search, it does help us to better understand the page and the context of the page. So if there's something like, I don't know, a place mark or something in schema.org where you can specify, this is a specific location, and give the coordinates for that location, then maybe it makes sense to just populate that onto your pages as well (there's a rough sketch of that a bit further down). Thank you very much.

All right, Rob, I think you had a question. Do you know what my question is? I know what your question is. I passed it on to the team to double-check, but I haven't heard back yet. Did you get Mihai's email or tweet? I'm not sure what he sent. You asked me for examples when we were at Brighton; I passed that on to Mihai, and he said, oh, don't email John just yet, because I've already emailed some examples. So did you get those? I got those, yeah. OK, all right. That's all I have for you, then — and we're at a safe distance this time, so. Oh, well, OK, cool. OK.

All right, any other questions? Anything else on your mind?

Hi, John. I have one question about a new client — a very hard case. You must be strong to hear this. The previous agency made a network of over 150 blogs on blogspot.com with exact-match domain names, keywords.
There are several types of these blogs. Some of them have just one blog post, or two or three blog posts with poor-quality content — we will delete those. But there is also a real blog of the business owner with 50 or 60 blog posts, very helpful publications with high-quality content. And the problem is the last type: blogs with between five and 10 blog posts per subdomain, and some of those have quality content. So he now has 30 or 40 blog posts on different subdomains with quality content, and I don't know what we should do now.

OK. Well, it sounds like they're trying to do the right thing, so that's already a good first step. What I would try to do there is just combine all of the good content and put that into a single site, and make a really strong single site, rather than keeping it across all of these different other sites. I believe on Blogspot you can also set up redirects for individual URLs across the whole site, so that might be an approach to combine everything into one stronger version. But in the long run, I'd probably set up these redirects for maybe half a year or a year, and then afterwards just delete those blogs, so that you've really cleaned up all of the old stuff and you don't need to worry about maintaining them — they're all gone. Absolutely clear answer. Thank you so much. Sure.

I think someone else had a question. Yeah, I had an additional question, if that's OK. I'm not sure if this is an easy one to answer. I work for ITV, the broadcaster in the UK, and the name of the product on the site is ITV Hub. People Google ITV Hub, and what's coming up is our simulcast live page for the main ITV channel, rather than the home page. The home page itself gets a huge amount more internal links, external links, and traffic, but for some reason it appears second, and the main channel's page appears first. And I'm desperately trying to get those switched over, because it's confusing users and staff internally alike when they're Googling ITV Hub and ending up not on the home page.

OK. I think that's always a tricky situation, because there's no real way to say, I prefer to have my other page ranking instead of this one, other than the really brutal method of saying, well, I'll noindex one or put a rel=canonical on one — and then that other page doesn't rank at all. So that's kind of tricky. I don't know, let me see.

One of the things that I think might be causing the issue is that the URL for the simulcast page is itv.com/hub/itv. So as far as keyword matching goes for the URL, it is actually matching the keywords of ITV Hub. Yeah, maybe. That might be something that could be throwing us off there. So for the query ITV Hub, you'd prefer to have just ITV.com showing up? Yeah, please. I mean, I don't have a dial, so I can't just switch them around. But OK, that's interesting. OK, I need to see what I can figure out here. But it looks like when I do this query now, the home page itself is actually number four or so, so there's a lot of stuff that's above that already. Well, that's new, because they used to be positions one and two. Unless — are you searching in the UK? Oh, OK. Oh, yeah, of course. Because I have google.ch and it's in German, so maybe that throws things off as well. OK. I don't know what the best approach there is.
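Going back to the earlier question about content describing physical places: a rough sketch of the kind of schema.org markup mentioned above — a Place with geo coordinates, expressed as JSON-LD — might look like this (the name and coordinates are placeholders, and whether Google uses this for anything is not guaranteed):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Place",
      "name": "Example statue in the city park",
      "geo": {
        "@type": "GeoCoordinates",
        "latitude": 47.3769,
        "longitude": 8.5417
      }
    }
    </script>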
Back on the ITV Hub question: in general, for these kinds of things, the clearer you can make it which version is actually the right one, the more likely we'll be able to pick that up. But obviously, if you have this kind of mixed setup, and some of these pages actually include the keywords from the home page as well, then that can throw us off like this. One thing that you could do, especially if you're worried from a user experience point of view, is to put maybe a banner on top of the page that you don't want ranking and say, hey, if you're looking for the home page, this is where to go — or the company page, or however you want to call that. Then at least users will probably be able to find their way to your preferred version, even if they land on the wrong one. That might be something, at least for the meantime, to look into. But it's always tricky, because sometimes there are good reasons for the algorithms to pick the other page and to say, well, everyone is searching for this other page, so therefore we think it's probably the most important one here. But yeah — OK, I need to see if I can find something there. Cool. Thank you.

All right. So with that, let's take a break here. Thank you all for joining, and thanks for all of the questions. I hope this was useful, and I'll follow up on some of the things as well to see if there's something we can do to improve things on our side. All right. Have a great weekend, everyone, and see you next time. Thanks.