All right, welcome everyone to today's Webmaster Central office-hours hangout. My name is John Mueller. I'm a Webmaster Trends Analyst here at Google in Switzerland. Part of what we do are these office-hours hangouts, usually in meeting rooms, but today we seem to have run out of rooms, so I'm over here in the cafe. Let's hope it doesn't get too crazy here.

All right, I think you had a question about rebranding and moving sites. Yeah, that would do it. Yeah, so do you want me to ask the same question again? Sure.

OK, so the question is that we want to migrate a segment or a portion of our website, say 100 to 120 pages, to a new domain for rebranding purposes. Right now we're ranking well for these segments, but will we be able to rank well for these pages on the new domain? Or will it be treated by Google as an independent, separate business, a separate website?

You can definitely do it on a new domain as well. In general, what will happen if you split up your website into separate parts is that we'll have to look at those parts individually. So we'd have to re-evaluate those pages on a per-page basis. You can definitely rank like that, but it's not the case that you could be certain you'll rank exactly the same as before, because it will be a different situation. You'll have a different website setup. Yeah, I get it. So yeah, that's what I thought. Thank you. OK.

Hello, John. Hi. Yes. Sure. I have one question. Previously, when we used Google Search Console, there was an option for landing pages. When we put a URL into that landing page report, we could see which keywords it was ranking for. Is there any alternative to this now in the new Search Console?

In the new Search Console, seeing which queries rank for which landing pages, that's just the normal Search Analytics section. That should be no problem. Can we get the exact view of that landing page report, like in the previous version, in the new Search Console?
Which, so seeing which landing page would be shown there? OK, let me find out. Yeah, because essentially in the new Search Console, the Search Analytics feature is very similar to the Search Analytics feature of the old Search Console. So you should be able to see the landing pages, the queries, the clicks, impressions, and positions.

Actually, I'm talking about how now we can only see the websites that I submitted to Google. But previously there was an option where I could find out, for any website, the keywords that site was ranking for. No, that wasn't in Search Console as far as I'm aware. So maybe you were looking at those keywords in some other tool. I have one account for that. I will show you that. Because you really need to have a site verified so that we can show you the data in Search Console. So it wouldn't be the case that you would be able to see the clicks and impressions data for a third-party website that you don't have verified. But previously I could see that. It's a very nice feature. That shouldn't be in Search Console, and I don't think that would be coming to Search Console either.

Another thing, does it provide any guidelines about pagination and citation? Pagination, we have a lot of guidelines on pagination. So there's a Help Center article on pagination. Can you share the link? Sure. I mean, it should just be pagination, so let me see if I can pull it up. It's called Indicating Paginated Content to Google. Thank you for that link. Yeah. Oh, very helpful, yeah. Cool.

Hi. Hi, John. First of all, as always, excuse my English. Hope you can hear me well. Yes. OK. In the new Search Console, I saw the Links tab. I had a problem in the last few months. There was a clone site. Do you understand when I say clone site? So it was copied, or you moved it, or? No, a site completely cloned mine with the logo and links and whatever.
So yeah, it created many backlinks, because this clone site used all my relative links, all the menus, creating links. So actually, that site is my main source of links. Even if I disavow that domain, it still shows that I have 968,000 or so links. The second one is 69,000. So as you can see, it's 10 times the normal backlinks of the second site. OK. So I disavowed it, and probably it works, it should. But also, in the Links tab I see there is another table; in Italian it's labeled as the main anchor text, the anchors of the main links. So this is the question. This table has changed because of those backlinks. Can it affect my search ranking? Can it change? I mean, I know that if I search for that anchor, I see myself in the first position, instead of for the old, real anchors. Sorry for my English.

So I don't think that would be a problem. One of the confusing parts there, I guess, is the count of the number of links that we show. That's probably just a sign that this is a site-wide link from that website. So it feels like a lot of links, but it's actually just one website. So it's not really something problematic. Also, if you disavowed those links, we would not take them into account. That's pretty easy to do, just disavow by domain. What would not happen is that the links disappear because of the disavow. In the links tool, we show all of the links to your website, even the ones that we ignore. So if you've disavowed those links already, they would not disappear from the Search Console tool. So you basically look at those links in Search Console, take them the way they are, and ignore the ones that you know you've already disavowed. Also the anchors, right? Yeah, exactly. Thank you very much.

Hi, John. How are you doing? Hi. I'm here with Stefano, with whom I work.
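As an aside, the disavow-by-domain approach John describes is done with a plain text file uploaded through the Disavow links tool; a minimal sketch, with made-up domains standing in for the cloned site:

```text
# Lines starting with # are comments.
# Disavow every link from the cloned site in one line:
domain:cloned-site.example.com

# Individual URLs can also be listed one per line:
https://spam.example.org/page-with-bad-link.html
```

The `domain:` prefix covers all pages and subdomains of that host, which is why disavowing the whole domain is easier than listing each site-wide link.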
And Stefano, do you want to ask your question right now? Absolutely. And I'm sorry for being late. I could not find the link, so very, very sorry. I hope that I'm not stepping on anyone's toes right now. Anyway, thanks so much, guys.

I have a few questions regarding optimization for videos, streaming videos and on-demand videos in general. So here's the deal. I help manage a website that basically focuses on this kind of content, streaming videos and on-demand videos. What we've been doing in the last few years has been using markup language and trying to optimize as much as we could. The problem is that, on my end, whenever I try to crawl the website, using any tool from Semrush to Moz, anything apart from Search Console, I cannot really crawl the website because of the way in which it was built. It's the old server-side rendering versus client-side rendering conundrum that's been discussed by SEO managers for ages. But anyway, my question is: is there anything else, apart from the markup language and, again, trying as much as I can to check that everything else, like titles and content, has been optimized accordingly? Is there anything else I could do to make the videos and the streams more findable, and to earn organic traffic and organic rankings?

Also, and this is the last part of the question, another thing that I'm getting asked again and again is to do everything that I can to make sure that our videos appear in the featured snippets. And some of the content that we stream is basically sports events of long-tail teams that nobody else is really streaming. So I've been asked why. Sorry, this is my dog. Why are we not appearing in the featured snippets if we are the only ones streaming these events? Why are our highlights not appearing there when somebody is looking for that event, if we are the only ones streaming this kind of content?
So again, I'm very sorry for the very long question, but basically two parts. Part number one, is there anything else, apart from the markup, that we should do to optimize videos and streams? And part number two, is there anything that we could do besides this kind of optimization to push our presence into featured snippets?

OK. So I don't know for sure about the streaming side. I think that's something I haven't looked into very much. But in general, I think the first thing that you'd want to double-check is the indexing side, like whether we can actually index the pages and the content, the video content on the one hand, the streams on the other hand. It sounds like that's not really a problem, and that we can show these pages, these videos, in search. Is that correct? Yeah, that's correct. And as a matter of fact, we are actually ranking not badly for quite a lot of keywords on many, many of our pages, which, I mean, says a lot about how Google, many times, is more intelligent than our own code. So that's good.

OK. So I think the basic foundation is already set. That's good. So we can pick up the content, we can index the videos; that's the foundation there. For structured data in general, I think you'd want to differentiate between streams and recorded content. That's one thing that would be really important to make clear with regards to the structured data. When you're submitting videos, recorded content that's available at any time, that's something where you can use the normal video markup, or where you can also use video sitemaps to tell us more about those videos. For example, if the video content is only available in certain countries, you can tell us about that in the video sitemap. For streams, that's where I'm not really sure what the right approach is. I believe there's also the Indexing API that you might be able to use for livestream content.
I'm not 100% sure, but you might want to double-check that. But in any case, you really want to differentiate the recorded content from the livestream content, treat them differently, and make sure that they're marked up separately in their own ways. With regards to featured snippets, I don't think we show livestreams in the featured snippets. As far as I know, the video previews that we show, the video one-boxes, are more for recorded content. I don't think we would show streams there. But if you have it recorded, then that's something that would be eligible there.

Sorry to interrupt, yeah, that's exactly correct. And I'm sorry if I maybe didn't express myself clearly, but that's exactly it. So we would like to make sure that we do everything we can to get our own recorded videos into the featured snippets. But again, my question was: is there anything else besides the markup that we could do to make sure that happens, that we have a shot at being there?

You don't need to do anything special past that. If we can pick up the video content and show it in the video search results, like if you go into the video mode and search, if we can show the video thumbnail, that means we have all of the details that we need for the video content. The important parts, I guess, are that we can crawl and index the thumbnail. So if you have a thumbnail image, that it's not blocked by robots.txt. And that the video file that you link to is also indexable. But it sounds like those are things you already have.

Yeah. And sorry, can I just ask a very, very last one? When it comes to the featured snippets for sports events, there's also another part. Besides the actual highlights, on top of it there is usually a table that shows the results of the matches and all of that, which is part of the featured snippets.
My understanding has always been that that's also part of what you guys fetched from a page that you catalog as the best answer to the query. Is that the case? Or, in that case, is there some kind of, quote unquote, shortcut that big companies, like, for example, the NBA or whatever, use to make sure that their own pages are listed as the source for the results?

I'm not aware of any shortcut there. It's really just normal search ranking. For a lot of this type of content, if we can show it in a tabular form, then obviously having that content in a table in the HTML on the page helps a lot. That's it. So I suspect you already have that. If you have the information in a table in HTML, that's the right thing to do. It would be tricky, for example, if there were just separate divs that use CSS to align the individual cells into a table-type layout, because then we wouldn't be able to say that this line is something that we can show in search. But if it's a table in HTML, we should be able to pick that up and show it when we think that it makes sense. Fantastic, thanks so much.

John, I have a technical SEO question. It's something I hear from people. So you have a URL, and in the URL they have a parameter that tells you where you clicked from, which I don't recommend, but this is the website that has it. They have a parameter like, you came from section A or section B, in the URL. And they also have page equals one, two, three with a rel=next/rel=prev set. So they asked me if they should go ahead and keep the section parameter in the URL, and also keep using rel=next/rel=prev with the section parameter. I think it makes more sense to use rel=canonical. You can use canonical and rel=next/prev at the same time, right? So rel=canonical to the main URL without the section parameter, and then rel=next/rel=prev should just exclude the section parameter. Yeah.
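As a sketch of the setup being discussed, assuming a hypothetical paginated URL with a `section` tracking parameter and a `page` parameter (the domain and parameter names are made up for illustration):

```html
<!-- Hypothetical page being served: https://example.com/widgets?section=nav&page=2 -->
<head>
  <!-- Canonical points at the same page without the tracking parameter -->
  <link rel="canonical" href="https://example.com/widgets?page=2">
  <!-- Pagination links also omit the section parameter -->
  <link rel="prev" href="https://example.com/widgets?page=1">
  <link rel="next" href="https://example.com/widgets?page=3">
</head>
```

The idea is that every `section` variant of a given page number declares the same canonical, so the tracking variants collapse into one set of paginated URLs.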
I think any time you can simplify things so that you have fewer URLs that lead to the same content, that's the approach I'd take. OK. I would say my number one approach was to just remove the section parameter, but they said they can't do that for tracking purposes. Yeah. I mean, sometimes you have these things for tracking, so you have to deal with what you have. Sometimes what you can do is move it to a fragment instead of a normal query parameter, which usually means that we would drop it for indexing. Yeah, I suggested that also. But again, OK, cool. So that's the approach I recommend.

Now you can answer Glenn's questions that were submitted. Oh my gosh. OK, I should check. Refresh. OK, let me take a few questions, now that you bring it up, that were submitted, and then we can move on to other things live as well.

Since Google confirmed that none of the core updates lined up with any of the neural matching updates, is it safe to say that sites should really look to improve on quality and relevance over the long term when it comes to these broad core updates? I think that's safe to say regardless of any updates. It's always worth improving things. I have no idea what you mean by neural matching updates. It sounds like some machine learning thing that someone is trying to pull out and separate. We use machine learning in lots of parts of our infrastructure, so pulling that out as something specific is sometimes more artificial than really useful.

With the March 12th core algorithm update, there were many sites that saw positive movement that had dropped heavily during the previous update. Was there a softening of whatever rolled out in August? I don't know how this would relate to the updates in August. In general, when we make algorithm updates, we start from one state and work towards a new state. And sometimes we improve things where we recognize maybe the algorithm went a little bit too far.
And sometimes we improve things where we recognize the algorithm didn't go far enough. So these kinds of incremental changes, I think, are completely normal with algorithm updates in general.

ETag versus If-Modified-Since: which of these HTTP headers would you recommend using for crawl budget optimization, and why? So I think these are two separate things. If-Modified-Since is a header that we send with a request, and the ETag is something that's sent with a response. So, I mean, obviously there's a match there, in that If-Modified-Since is kind of date-based, and an ETag is kind of an ID-based system. In general, both of those work. We do try to use them when it comes to crawling, but we don't use them all the time. We realize a lot of sites either don't implement these or have implemented them incorrectly, so it's not something that completely changes the way that we crawl a website. It helps us a little bit, but for the most part, sites generally don't have to worry about that level of detail when it comes to crawling. When it comes to users, those kinds of optimizations can make a big difference, especially for users who come back regularly. They'll be able to reuse the CSS or the JavaScript and things like that.

In a recent Webmaster Central blog post about dates, Google said to be consistent in usage. I have a client who wants to show a Hijri date, I'm probably saying that wrong, as the publishing date on article pages. Is it OK to show that in the visible part and use the Gregorian date in structured data? Yes, you can do that. The different formats are less of a problem, and for structured data we definitely need to have the Gregorian date. You need to specify it with the ISO, what is it, 8601 format? I'm not completely sure. Whatever date format we specified there, you need to use that format to specify the date and time, ideally also with a time zone, so that we can sort that properly.
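As a side note, the ISO 8601 format being referred to can be produced with standard library tooling; a minimal Python sketch, where the publication date and the UTC+4 time zone are made up for illustration:

```python
from datetime import datetime, timezone, timedelta

# Hypothetical publication date in a UTC+4 time zone
tz = timezone(timedelta(hours=4))
published = datetime(2019, 4, 25, 9, 30, tzinfo=tz)

# ISO 8601 string with a time zone offset, the kind of value
# a structured-data date field expects
iso_date = published.isoformat()
print(iso_date)  # 2019-04-25T09:30:00+04:00
```

The visible page can then render that same moment in any local calendar or format, while the structured data keeps the machine-readable Gregorian ISO 8601 value.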
But if within the visual part of the page you want to use your local date format, that's generally perfectly fine. That's something that we should be able to understand as well.

We're a newish business that helps businesses work with temp recruitment agencies across the whole of the UK. When we started, we dynamically created pages for combinations of locations and job roles. And due to an error, we created 13 pages for each combination. OK, that's a lot. We're trying to remove all of these with a noindex meta tag, but they're still there. What can we do to get this fixed?

So in general, the noindex meta tag is the right approach here. It means that when we recrawl those pages, we will drop them from our index. That's perfect for something like this. What you can do, if you feel that you need to get these removed faster, is either use the URL removal tools, if you have a clean subdirectory that you can remove, or use a sitemap file to tell us that these URLs changed recently, with a new last-modified date, and then we'll go off and try to recrawl them a little bit faster. In general, though, especially if it's a larger website, sometimes it just takes weeks or months for us to recrawl everything. Especially if we went off and crawled and indexed some obscure facet of your website, those are probably URLs that we don't schedule for crawling that frequently. So that's something that can take maybe half a year or so to completely drop out. In practice, that shouldn't be a big problem, though, because we can recognize which URLs you want to focus on, and we'll focus mostly on those as well. So these old ones will stay indexed for a while until we can process them and drop them, but they shouldn't be affecting the indexing of the rest of your website.

I have a few clients that get their local knowledge panel triggered by non-branded terms. Do you know what signals may influence this? I have no idea.
So that kind of falls into the general local business, Google My Business results, and I don't really know what would be triggering that. Probably something with the listing or the website, but I'm just guessing, probably like how you would be guessing. I'd recommend going to the Google My Business Help Forum and checking out what the experts there are saying.

Is image search different from web search? If I block an image using a Googlebot-Image disallow, does that mean the default Googlebot will also not crawl the image, and it will not show in web search? So yes, it is different. Kind of like how you mentioned, we have different directives for the different bots. If you block an image for Googlebot-Image, then we won't show it in image search. And we also won't use it for web search within the universal results, because the universal results are built on the Google Images search results. So if that image is blocked, then we essentially won't use that image in general. It wouldn't be a problem for your normal web results. So it's not the case that when we have a web page and we index it for web search, we need to have all of these images available, because we do something fancy with the images. It's really just a matter of: we can't crawl these images, so we can't show them in image search. There's no positive or negative side effect of those images not being available for normal web search, for the normal kind of text-type searches.

We started ranking for unrelated ingredients. We don't sell, nor have content for, pharmaceutical drugs. We haven't been hacked, nor do we have incoming anchor text with those terms. Is this an issue, and if so, what things can I check? So my suspicion is that maybe you were hacked after all. I don't know your website, but usually this is not something that just randomly happens, that a website starts ranking for pharmaceutical terms.
Rather, maybe it was hacked in a way that is not completely obvious from your side. So what you could do is go into Search Console, and in the Search Analytics section, check out the pages that were ranking for these terms. Then use the URL Inspection tool, the live-fetch option there, to check what those pages look like when Googlebot looks at them. Sometimes you can see in the visual part, sometimes in the HTML, that actually someone added a bunch of pharmaceutical links to those pages. And that's a sign that maybe something was hacked on these pages. Sometimes it's also something that just lingers a little bit when a website was hacked in the past and you've cleaned up the hack. So those are the directions I would look in. I would not assume that a normal website just suddenly ranks for pharmaceutical terms without any reason at all. Usually there is something behind it.

Can I share my screen now? Sure. OK, thank you, sir. Can you see my screen? No. I think now you can. Oh, OK. I'm talking about this previous Keyword Planner; here, you see when I click on it. That's not very helpful. That's the AdWords tool. Yes, sir. I wanted to ask: in the new infrastructure, this option is not available. Is there any alternative to this? I don't know. So that's the AdWords Keyword Planner tool, and I don't know what the plans are from the AdWords side. That's not something that is related to Search Console. That's really purely from the Ads team. OK, sir, thank you.

All right. Oh my gosh, Barry posted a link from Danny on Twitter about neural matching. I have no idea what that is. You never heard of it? OK, interesting. It's clearly unrelated to ranking. It's just about how Google figures out queries. Well, yes, it's more of a query-side thing versus the ranking of pages. I understand, like RankBrain? I have no idea. I mean, that was September last year, so wow. Yeah, no, there were some rumors.
There were recently rumors around that Google adjusted neural matching, and that's what the core ranking update was about. And then Danny from Google responded that no, core updates have nothing to do with this, never have, so don't worry about it. OK, cool. You should be answering these questions instead of me. I mean it. We should do a show together.

All right, let's see. What else is here? When disavowing backlinks, besides disavowing the spammy backlink itself, do we need to disavow all the other URLs it is linked from? No. So if you're disavowing a problematic link, then that link won't be taken into account. If you think the whole website, all of the links to your website from someone else's website, are problematic, then you can disavow the whole domain. That's the easiest way to cover your bases there. But you don't need to follow the chain back to all the other links pointing to the page with the bad link. It's really just that one link that you need to disavow. And oftentimes, you don't really need to disavow things at all. If it's not the case that you have a manual action, or that you look at your links and say, well, Google is going to give me a manual action next week because this looks so bad, then usually you don't need to use the disavow tool at all.

How does Google treat backlinks from website-analysis sites or user profiles, I guess user-generated or automatically generated content sites? For the most part, we ignore those, because they link to everything and that's easy to recognize. So that's something that we essentially ignore.

Our business is about software development and DevOps services. We have separate pages for the services that we provide, and blog pages. The articles on the blog cover diverse topics, from how to hire a good team of programmers to how to beat procrastination and the role of women in business. I'm worried that such a broad spectrum of topics on the blog could dilute the overall topic of the website.
And as a result, it could be difficult to rank for the transactional keywords that sell our services. Is that a problem? No, I don't think that would be a problem. I think, vaguely, these are all related. The thing I would focus on more is whether people go to those pages and you get absolutely no value from those users; then maybe that's not a good match from a business point of view. So for example, if you're selling software development services and you have a blog where you're listing funny cat videos, then all of the people going to look at those funny cat videos are not going to be interested in the rest of your website. So essentially, those people are coming to your site, they enjoy the videos that you're linking to or posting, but you don't really get any value from those visitors. So why are you doing this? It doesn't make sense from a business point of view. From an SEO point of view, sure, maybe you'll get traffic for funny-cat-video queries, but that doesn't really help your website.

A question about rich snippets: our company and our offers are based in Switzerland. In our case, it's often the case that we're seeing websites from Germany with prices in euros. From my point of view, that's not very useful for users in Switzerland. Can we use structured data to show up instead of those other websites? As someone in Switzerland, I agree that sometimes seeing a lot of sites from Germany is not very useful. But essentially, if we don't have good local content, and if we can't tell that the user is looking for something from a local point of view, then it's hard for us to say that those pages should not be shown in the search results at all. So for example, if we can recognize that the user is really looking for something local and we have some local content, then we'll try to show that appropriately in the search results.
If we can't tell that the user is looking for something local, for example if they're looking for information on a specific topic, then maybe we should just show information on that topic, regardless of whether it's from Germany or from Switzerland or from some other country. So that's not something that you can clearly specify in structured data. What you can do is use the hreflang markup if you have multiple language and country versions. So if you have different versions of your site, you can specify that for us. You can also use the geotargeting feature in Search Console to tell us that your website is specifically targeting Germany or specifically targeting Switzerland, and we'll take that into account when we recognize that a user is looking for something local. Oh, wow, the sun here. The window shades are going up and down. Cool.

All right, one more question here. I see a disturbing amount of link networks and nefarious link building schemes being used by vaping companies. I reported these as suggested, but is there anything else that we can do? This is really frustrating. So reporting them in the, what is it, I think in Search Console, the spam report form, the link spam report form, that's a good place to go. That helps us to better understand that these are pages that we need to review from a web spam point of view. It's never really guaranteed that we drop those pages completely. In particular, when it comes to competitive areas, what we'll often see is that some sites do something really well, and they also do something really badly. And we try to take the overall picture and use that for ranking. So for example, it might be that one site uses keyword stuffing in a really terrible way, but actually their business is fantastic, and people really love going there. They love finding it in Search. We have lots of really good signals for that site.
So we might still show them at number one, even though we recognize that they're doing keyword stuffing. A lot of times, what will happen is that our algorithms recognize these kinds of bad states and try to ignore them. We do that specifically with regards to links, also keyword stuffing and some of the other techniques as well: if we can recognize that they're doing something really weird with links or with keyword stuffing, then we can ignore that and just focus on the good parts, where we have reasonable signals that we can use for ranking. So what could be happening here? I didn't look at these sites, and I don't know specifically what you've reported or what the query is, but what could be happening is that they're doing really terrible link building stuff, we ignore most of that, and they're doing some things fairly well on the side. And based on the things that they're doing fairly well, we're trying to rank them appropriately in the search results. It's kind of frustrating when you look at this and you're saying, I'm doing everything right, so why are these guys ranking above me? But on the other hand, we try to look at the bigger picture when it comes to search, to try to understand the relevance a little bit better. And it's something that sometimes works in your favor as well, because maybe you'll do something really terrible on your website, and Google's algorithms look at that and say, oh, that looks like a mistake; maybe we can just ignore that and focus on the good parts of the website, and not remove that website completely from search because they followed some bad advice that they found online. I think that's pretty much it for that question. Let's see what else there is.

Ooh, yeah, go for it. Hello. Sorry, I was the one that asked earlier on about removing lots of pages, but I just wanted to go into a little bit more detail about what happened and what we're doing. So what happened?
So let me just say very quickly what we are as a business. We help companies find recruitment agencies to work with. And broadly, what we have on our site is that it dynamically creates pages for combinations of jobs and locations. So for example, you might type in Programmer in Oxford, and it will show you a page about programmers in Oxford. So I had these dynamic pages created and I submitted them to the search engine. And when I first did this, I saw a real uptick in volume coming from search, because previously I'd been relying on paid clicks. And, as I said, then we made this mistake where every single combination had 13 different pages, because I was missing something. I've tried to get them out. What's happened is I've seen my impressions just go down over time to zero, and literally the only thing I'm coming up for is my brand name. And I did what was recommended, in that I went to the old version of Search Console and asked it to ignore these links for me. It still doesn't seem to have made any difference, and I still seem to have nothing coming in. And I'm kind of nervous about what you said, that it could take months for this to come through, because as a startup, and I know that's not Google's business, it's very hard for me to get traffic in at a reasonable rate. Is there anything else I could be doing?

Just one second, the second part of this question: the way I do my pages is that in my database I have a little bit written about a role and a little bit written about a town, and I combine them together on a page, say, Programmers in Oxford. So for a programmer, our page will say the job spec includes these things, and you should pay this amount of money for a programmer; and for Oxford, it's a bit about employment in Oxford. But of course, the piece about Oxford is repeated on every single job role, and the piece about programmers is repeated for every single location. Does that mean those pages will get de-ranked?
So I've packed a lot in there, but those are really separate issues. So I'd be kind of worried about the setup that you have there, because that sounds a lot like you're just taking a lot of chunks from the database and automatically generating combinations as pages. And it feels to me that a lot of these variations could be very thin, very low quality, and that maybe you don't have a lot of content for those variations. And you quickly run into a situation where you create thousands of pages with all of these combinations, and the actual amount of content there is very minimal. So what could happen on Google's side is we go off and crawl and index all of these pages, and we think, oh, there's so much stuff here, we can pick up all of this stuff. But in a second step, when we look at the details, we're like, actually, there's not a lot of really useful information here. And especially if you're also taking the jobs, maybe from existing feeds from other websites, then what we see is essentially just a mix of some city information, some role information, and job listings which we've already captured somewhere else. And we look at that overall and say, what's the additional value of having your page indexed in addition to the things we already have indexed about all of these combinations? And that's, I think, the tricky part there. That's where you need to jump in and say, actually, the way I have things combined is completely different from everything else, and there is a lot of value that I add through these pages, so that when people go to these pages, they clearly see that this is not just a combination of existing feeds, but actually there is something really fantastic here that nobody else has captured so far. I mean, just on that point, we don't have job feeds on the site. So this is where it's very subtle for people: we don't list jobs. It's a business-to-business site. It helps businesses find recruitment agencies.
So we don't have any extra content in that sense. And the reason I have these multiple pages, what I call the cartesian product, is AdWords. I bid on AdWords, I'll bid on the phrase, say, "laborer in Oxford", and I want to show the laborer-in-Oxford page, because I get a better score in AdWords for having a higher-quality landing page. That's why I do that join. So I show that page and I get a better score. It looks more relevant to what they want, because basically we tell them how many agencies can serve them for that job combination. And it's quite a thin page before we pass them on to our thing. John, can I give a story about that? Just some more context. Go for it. Probably well over 10 years ago, I was playing with exactly that: taking content descriptions and making like 15 variations of those, and then matching them to cities, with 15 variations for each city, and kind of pairing it all together. This was almost 15 years ago. I wanted to see how Google would react to it, and back then it was like, oh, cool stuff. And then I forgot about it. And then Matt Cutts came to me, and he was like, do you know the web spam team really wants to penalize this entire, not my website, but this specific website, for doing that? I'm like, oh, I forgot about that. So I think the spam team looks really poorly on that, on taking content and content and pairing them together, especially when it's very duplicative across the board. I could imagine for ads maybe that's slightly different, if you have ads landing pages, but from an indexing, from a quality point of view, that's something where I would expect that you would have a really tough time, unless you're really providing a lot of really strong value in addition to just those listings there.
How do I play that off, though? Because it's like: optimize for pay-per-click, which is what I survive off at the moment, or optimize for SEO. If I put noindex on those pages, which I can do, doesn't that mean that AdWords will just ignore them as well? So I can't speak for ads, but my understanding is you can block these from indexing and AdWords will still use them as landing pages. OK, so I could put noindex on the combined ones, and then I have just a set of pages for roles and a set of pages for locations and have those indexed, because that's unique in my thing. Yeah, I think what you really want to do, especially for search, is make sure that you have a significant amount of value there that's much more than what other people have for that kind of content. In particular, one approach I usually take is: if you bring this problem to me and you say that your site is not ranking well, and I take that problem to the search ranking engineers, and they say, well, actually, we have this combination already indexed five times, why should we add this other variation, and I don't have a good reason to tell them, like, instead of these five you should take this new one, then that's something that's really hard for me to justify. And that means that in the end, you'll be competing against a lot of established players in that scene, and you're kind of a new player in that scene, and you don't have significant unique value that you're adding. So from our algorithms' point of view, it's like, why should we spend time on this website that's not significantly changing the web for our users? OK, so that's quite dark. I just wanted to be direct. I mean, there are things you can obviously do to improve. There are technical things you could do to tweak things, but those change things, like, 10% up and down, when actually you need, like, 100% up and down to significantly make a dent there. So I mean, I don't know your website.
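To make the "block from indexing, keep as a landing page" idea concrete, here is a minimal sketch of what that combined page could look like. The URL, title, and page are hypothetical, not from the site discussed; the point is only the robots meta tag, which keeps a live, crawlable page out of Google's index.

```html
<!-- Hypothetical combined landing page, e.g. /jobs/programmer-in-oxford -->
<!DOCTYPE html>
<html>
<head>
  <title>Recruitment agencies for programmers in Oxford</title>
  <!-- Keeps this page out of Google's index while it stays live as a
       paid-traffic landing page. Note the page must NOT be blocked in
       robots.txt, or Googlebot will never see this tag. -->
  <meta name="robots" content="noindex">
</head>
<body>
  <!-- page content for paid-click visitors -->
</body>
</html>
```

The same directive can also be sent as an `X-Robots-Tag: noindex` HTTP response header, which is handy when templates are hard to change.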
So maybe there's something missing that I don't know. If I noindex those pages, is it likely that maybe I've been marked down by some filter, and it's going to take months for that to pull through, and there's no way to get around that? I don't see that that would be a problem. If you have new pages that are indexable and really good, then over time that will get picked up. It's not something that would change overnight, but it could be picked up over time. OK, OK. Thank you for taking the time. Sure. John, can I have a really quick one? Sure. It's a follow-up from a couple of Hangouts ago. I told you about a website that had a manual action for structured data markup. It was really old, and it didn't show any information about which URLs were affected or anything. The site is on Angular, and the structured data comes through server-side rendering, while the product information comes from client-side rendering. So we were wondering if that might be the reason why Google doesn't think that the information correlates there. I was wondering if you found out anything about that, or if there are any details. Yeah, I asked the team about that, and they said you still haven't fixed the problem. So let me look at the URL that they gave me there. It's like you have a lot of reviews in there, but they're not visible to the user; they're kind of hidden structured data. OK, gotcha. So that might be something. Maybe there is something just with the front end that's set up incorrectly. Maybe it's just legacy stuff that you had there. But that was what they were saying: you still need to fix this issue. It's not that something is stuck. OK, so there are no Angular issues or anything like that on Google's side. OK, awesome. Thanks a lot. Cool. All right, any last questions from any of you all? I have a question. OK. So this is relating to structure on our website. This is a magazine.
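The "hidden structured data" diagnosis above is about markup describing content the user never sees on the rendered page. As an illustration, assuming a hypothetical product page, review markup like this is only appropriate when the corresponding reviews are actually visible to visitors:

```html
<!-- Illustrative JSON-LD only; product name and figures are made up.
     If the 89 reviews summarized here are not visible anywhere on the
     rendered page (e.g. the client-side rendering drops them), this is
     exactly the kind of mismatch that can trigger a structured data
     manual action. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.4",
    "reviewCount": "89"
  }
}
</script>
```

With a split server-side/client-side rendering setup like the one described, it's worth checking the rendered DOM (for example with the URL Inspection tool) to confirm the visible content and the markup actually line up.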
They run this on a Drupal site, and I'm trying to figure out the best structure, since there are articles in every category, and those articles can be repeated across categories. But there are also landing pages for each category and subcategory. So I'm kind of trying to avoid such a big mess and trying to condense that. Do you have any advice for that? I guess in general, what you want to make sure is that the pages that you do have indexed for search are able to stand on their own. So if you have all of these different combinations of categories and landing pages and things like that, then I'd try to figure out which of those variations you think you want to keep for search, and noindex the other ones. All right, easy enough. Hopefully, yeah. I mean, easy to say; maybe the implementation is harder. Yeah, since this is Drupal, it just creates that automatically. But yeah, I mean, I think I can work around that. Cool. Thank you. All right, I need to head out a few minutes early. So thank you all for joining in. Thanks for all the questions that were submitted. And hopefully I'll see you all again in a normal meeting room next time. I guess Friday is the next one, and on Thursday, the German one. All right, thanks a lot, everyone. See you. Thanks, bye. Bye. Thank you.
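For the magazine setup, the advice above boils down to two standard HTML directives. The URLs below are hypothetical, just to show the shape; which variation counts as the "preferred" one is a judgment call for the site owner:

```html
<!-- Hypothetical examples for a Drupal magazine site. -->

<!-- On an auto-generated category variation you do NOT want indexed
     (say /category/travel?sort=oldest), keep it crawlable but unindexed: -->
<meta name="robots" content="noindex">

<!-- Where the same article appears under several categories, each copy
     can point at one preferred URL so the duplicates consolidate: -->
<link rel="canonical" href="https://example.com/articles/my-article">
```

Use one or the other per page rather than both together: noindex removes a page from search entirely, while rel=canonical keeps one preferred version indexed and folds the duplicates into it.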