Maybe. All right, welcome, everyone, to today's Google Search Central SEO Office Hours Hangout. My name is John Mueller. I'm a Search Advocate on the Search Relations team here at Google in Switzerland. And part of what we do are these Office Hours Hangouts, where people can join in and ask their questions around websites and web search. A bunch of questions were submitted already, and we have a bunch of people here live. So maybe we'll just start off with some live questions from one of you all.

Hey, John. So John, my question is regarding Offer schema, the price markup. My doubt is: if I mark up a price, and that price is valid only for three days, and my page is not recrawled, I end up showing a wrong price on the SERP. So how do I approach that?

I don't think you can really fix that, in the sense that you can't force Google to recrawl your pages immediately when you make price changes. But you can use the usual best practices to make it possible for Google to recognize that. So in particular, with the sitemap file, let us know that the page has changed so that we can pick it up. If it's an important page for you, then make sure that you link to it from important pages of your site as well. So if something is going on sale and you think it's an important sale, then make sure it's linked from your home page, for example, so that we can understand your perception of its importance. But there's no workaround to make it so that Google automatically picks up all of the changes within your website.

Cool, John. Thanks. So I would be marking that in the sitemap with the last updated date.

Yes.

Cool, cool. Thanks, John. So there is another question of mine: how important is the pixel font size? Probably one part of my page is in 14px, and the next part of my page is in around 12px. So is that going to affect my ranking?

I don't think so. Usually it shouldn't. The one thing I would watch out for is mobile friendliness.
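As a brief editorial aside on the price markup exchange above — a minimal sketch of schema.org Product/Offer markup with an explicit `priceValidUntil`, so the markup itself carries the date on which the sale price stops applying, even between crawls. The product name, price, and dates here are all hypothetical:

```python
import json

# Sketch of schema.org Product/Offer markup for a time-limited price.
# priceValidUntil states the last day the listed price applies, so the
# markup itself says when the price goes stale, even between crawls.
offer_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example running shoe",  # hypothetical product
    "offers": {
        "@type": "Offer",
        "price": "49.99",
        "priceCurrency": "USD",
        "priceValidUntil": "2021-03-31",  # hypothetical end of the 3-day sale
    },
}

# This string would go into a <script type="application/ld+json"> block.
json_ld = json.dumps(offer_markup, indent=2)
print(json_ld)
```

Pairing this with an updated `lastmod` entry in the sitemap, as discussed above, gives Google both a change hint and a machine-readable expiry.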
So whether your page can be viewed properly on a mobile device — mobile friendliness — is still a factor that we look at for ranking. But otherwise, it doesn't really matter. It's more something between you and your users, so that you present something that they can read, that they can understand, that's easy for them to process.

Yeah, cool. Thanks, John. Thanks.

Sure. All right, Pranit.

Yeah, hi, John. Good morning. This is the first time I'm joining your call. So I have a couple of questions. The first one being that we've been trying to get an answer for this from Google blogs and getting some conflicting information. We have pages with some FAQs, or content that is long enough on mobile views that the user needs to scroll. And we are using read more, read less kind of buttons that expand and collapse the view, controlling how much of the content the user is seeing. My question is: our page loads all the data at one time, so the read more is not loading anything extra, and read less is not taking anything away. If you look at the DOM, or on the network side, the page has all the questions, all the answers. Is that going to affect what Google sees, and how does it affect what Google reads on the page?

That's usually completely fine. So that's not something where I would worry about it from an indexing point of view, because if it's on your page when we load the page, we can index it even if it's not visible immediately. One place you might want to watch out for, since you mentioned FAQs: with structured data on your pages, specifically for rich results, we want to make sure that that information is visible. So especially with FAQ markup, where you have questions and answers, at least the questions that you mark up should be visible. It's fine if the answers are kind of collapsed, but at least the questions should be visible.

Sure. Yeah, we do use the FAQ markup in the structured data, and everything is there as well.
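A minimal sketch of the FAQPage markup under discussion — the questions and answers all live in the JSON-LD (and in the served HTML), even if the answers sit behind a read-more control in the rendered page. The question and answer text here are hypothetical:

```python
import json

# Sketch of FAQPage structured data. Per the discussion above, the marked-up
# question text should be visible on the page; the answer may be collapsed.
faq_markup = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "Is collapsed content indexed?",  # keep this visible
            "acceptedAnswer": {
                "@type": "Answer",
                # The answer can sit behind a "read more" toggle in the UI.
                "text": "Content served in the initial HTML can be indexed "
                        "even when it is hidden behind an expandable control.",
            },
        },
    ],
}

print(json.dumps(faq_markup, indent=2))
```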
And we do show the questions; we're just not sure if we need to show the answers. It makes for a bad user experience if users have to keep scrolling to get to the next one. So that's what we were trying to solve, while at the same time making sure that Google is reading our pages. The second question I have is around sitemaps. We don't get that much crawl, I think, and many of the pages that Google crawls are not listed in the sitemap. But when it comes to the sitemap, Google only crawls about 10% of the pages that we put in it. I know that the priority attribute is not something that Google uses anymore. Is there any other way that we can say: Google, if you're only crawling 10% of our pages, these are our most important pages?

To give a sense of importance within your website, you'd need to do that more with internal linking, less with purely the sitemap — making it easy for Google to recognize, when we crawl your site, which of the pages you care about the most. Usually we would start with your home page, depending on the website. But in many cases, we start with the home page, and things that are linked from the home page we would see as being more important than things which are, I don't know, five or six steps distant from the home page.

OK, thank you very much.

Sure. All right, Rob.

Yes, yes, I'm here. Is there still any kind of API access for Search Console which would give us more information on where the 404s are breaking? Because I understand it used to include that — the referring page. But our developer said he looked into it, and the access was closed nine months ago or something. Is that true? So we can't actually access the referring data anymore?

I don't think you can see that, yeah. So that's something — I think it's visible in the index coverage report, but that's not connected to the API at the moment. And I don't think the referring URLs are in there.
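Since the referring pages aren't exposed through the API, one workaround is the local crawl John mentions next: fetch pages yourself and record which page each broken link was found on. A rough sketch using only the standard library — the fetch callback and the URLs are placeholders you would supply:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags, resolved against a base URL."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))


def extract_links(html, base_url):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links


def crawl_for_404s(fetch, start_url):
    """Breadth-first crawl. fetch(url) -> (status_code, html) is supplied by
    the caller (e.g. wrapping urllib.request). Returns {broken_url: referrer},
    i.e. the referring-page information Search Console no longer exposes."""
    seen, queue, broken = set(), [(start_url, None)], {}
    while queue:
        url, referrer = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        status, html = fetch(url)
        if status == 404:
            broken[url] = referrer  # remember where the dead link came from
            continue
        for link in extract_links(html, url):
            queue.append((link, url))
    return broken
```

A real run would also restrict the crawl to your own hostname and respect robots.txt; dedicated crawling tools handle that for you, this is just the bookkeeping idea.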
I don't know if it's in the default downloads, but I mean, we don't have an API for that. So that's kind of the tricky part. I guess what I would do in cases like that is try to crawl your website with a local crawler and see what comes back as 404s and where they come from.

Oh, OK. The second question is about the new site we launched a year or two ago to replace the old one. There are some URLs still being reported in Webmaster Tools that are at least seven or eight years old, haven't been live for that long, and were 410 previously, a long time ago. Is there anything else we can do about that? Or is that perfectly normal? These can't have any external links pointing to them — there's no way you could be finding them externally. So is that still normal? Does Google have a memory of what you used to be and just keep trying?

Yeah. Yeah. I don't know — seven, eight years sounds like a really long time, that might be a bit of an anomaly. Well, I mean, especially if it's something that we saw in the past, then we'll try to re-crawl it every now and then. We'll tell you that, oh, this URL didn't work. And if you're like, well, it's not supposed to work — that's perfectly fine now.

So the site wasn't live — for five or six years, we just parked it. So if we just 410 again for six months and see what happens, it should finally get the point?

Yeah, I don't think you could guarantee that we would stop trying those URLs, because it's one of those things where we have them in our system, and we know that at some point they were kind of useful. So when we have time, we'll just retry them. It doesn't cause any problems. It's just that we retry them, and we show you a report and tell you, oh, we tried this and it didn't work. And you're probably like, wow.

Because I was just worried about the sheer number, or percentage, of 404s — because I was chatting to Mihai as well, and Mihai had me worried that 30% of the URLs are 404ing.
And that might start to look like only 70% of our site is actually fundamentally useful.

No, that's perfectly fine. That's completely natural, especially for a site that has a lot of churn. So if it's a classifieds site where you have listings that are valid for a month, then you expect those listings to drop out. And then, over the years, we just collect a ton of those URLs and try them again. And if they return 404 or 410, it's like, whatever. Perfectly fine.

So if, on a natural, ongoing basis, 30% of your URLs go out of date, that would look normal. But if you had a site where everything's new and fresh, but suddenly 30% of the pages were broken, would that look unusual?

I don't think that would even look unusual to us. It's not that we would see that as a quality signal or anything. The only time where I think 404s would start to look problematic for us is when the home page starts returning 404s — then that might be a situation where we go, oh, I don't know if the site is actually still up. But if parts of the site are 404, whatever, that's a technical thing. It doesn't matter. And usually what happens is we remember those URLs, but we keep them in more of a low-priority queue. And whenever we have time left — when we think, oh, we could still crawl, I don't know, 1,000 URLs from this site — we'll go and say, oh, well, how about we check these old ones, just to make sure that we're not missing anything?

OK. And there's no remove-from-index button anymore, is there?

No. I mean, there is no button like "remove from Google's memory." Because, I mean, they're probably not in the index anyway. It's more like, oh, we just think we'll try.

All right. OK, thanks.

Sure. Derek.

Hi, John. My question is regarding the country targeting feature in Search Console. So let's say I have a website, xyz.com, where the content on the home page is actually designed for users in the US. However, I also have a subfolder, xyz.com/uk.
That subfolder is actually meant for users in the UK. So in this case, can I use the country targeting feature in such a way that the home page is targeting users in the US, and the subfolder /uk is targeting users in the UK? The main reason for asking is that I'm not sure if country targeting at the home page level is going to be treated as a site-wide signal, causing the country targeting at the folder level to be ignored.

You can definitely do that. So we'll look at the most specific part and use that. If the rest of your site is for the US and you have a subdirectory for the UK or some other country, that's perfectly fine to set up like that. The important part is that it's a generic top-level domain — then you can apply country targeting. If it's a country-code top-level domain, then you wouldn't be able to do that.

OK. Thank you so much.

Sure.

John, would you also add hreflang on those subdirectories as a secondary signal? Or does it not matter if you geotarget it in Search Console?

Yeah, I mean, if there are equivalent pages, then using hreflang definitely makes sense, especially if it's same-language, different-country content.

All right, Stefan.

That pesky mute button. A question about breadcrumb structured data. The context is, we have a product detail page that does not include the product description in the URL. And so we were hoping to improve the user experience and click-through rate with breadcrumbs. So we've marked up and added the structured data, we've checked it in the Rich Results Test tool, and we've confirmed that at least a sample of URLs have been picked up by Google through the breadcrumb report in GSC, but we still haven't seen any change in the display URL in the SERPs. So I'm curious whether there are additional steps or requirements that we would need to fulfill.

Not necessarily. So I think in Search Console, there is a report for breadcrumb structured data.

There is.

Do you see your URLs listed there?
So, two observations there. Just before this call, I went and picked one sample that was in that report, retested it, and so forth. And it is there. I did see a very large drop-off in the number of valid pages, though. It was hovering, and I think it's dropped nearly — let's call it 70% — in the last month, which seems odd, because I don't see an uptick in errors or valid-with-warnings.

OK.

But anyway, that's a separate topic, I think.

So when it comes to structured data and rich results in general, we try to look at it on several different levels. And I don't know if that applies so much to breadcrumbs, but just to be complete: the first thing, from a technical point of view, is whether it's valid markup. It sounds like that's the case; you can test that with the testing tool. The second one is, does it comply with our policies? That's probably less of an issue for breadcrumbs because, I don't know, breadcrumbs are breadcrumbs — it's hard to do them in a bad way, I guess. And the last one is more of a general, usually site-wide signal about the quality of the site overall. Like, can we trust this website to provide something reasonable with structured data that we can show in the rich results? And usually, when everything from a technical point of view is set up correctly, and we've had enough time to process it for indexing, and it's still not shown, that's usually a sign that our quality algorithms around rich results in general are not 100% happy with your website. I don't know if that's the case here. One way to roughly estimate whether that's the case is to do a site: query for your website and see if the rich results are shown there. If the rich results are shown there but not for normal queries, then that's a pretty strong signal that, from a quality point of view, we're not completely happy with your site — or at least the structured data algorithms there aren't.
If in the site: query you also don't see those rich result types, then that's more of a sign that, for whatever reason, we haven't gotten around to processing them completely. So that's something you might want to double-check there. What I would otherwise do is, if you continue to see this over a longer period of time — you give it a couple of weeks and it still doesn't show up — I would definitely post in the help forum and get input from other people there, with the specific queries that you're using and the specific URLs that you're looking at.

OK, that sounds good. One quick follow-up to that is presentation. Are the links in the presentation layer important? So is there some rough requirement that you have to have link, caret, link, caret, and so forth?

No, not necessarily. Oh — maybe one other thing that's usually important for rich results is that we have to be able to see it on the page itself. So if these breadcrumbs are not visible at all on the page, that might be something where our algorithms are not sure if they're actually relevant for that page.

They are present and visible. Can I ask one additional question? It's related to SSR. Real quick: we're rebuilding our approach to SSR. Your step-brother system, Bing, has some pretty onerous requirements in terms of the size of the document, which we're trying to fit our solution to. One of the biggest drags, or biggest sources of page bloat, is our hydration code. So the question is: for SSR, would you suggest including hydration, particularly for user experiences that have carousels? Obviously, you're going to look at the viewport and try to expand from that. So would you suggest keeping hydration, at least for Googlebot's consumption, in SSR?

My guess is it doesn't matter so much. As long as the server-side rendered content is essentially what the user would see, the update from there to the full version is less something that we would care about from our side.
So I think the important part is that we have normal link elements on the page, that we can follow those links, and that we have the full content on the page. But the JavaScript framework behind all of that is usually something that's more useful for users than for Googlebot.

Sweet, thanks.

Sure. Let me run through some of the submitted questions, and then I'll get back to all of you who are raising hands. I'll try to extend it a little bit longer as well, so we have a bit more time to get to all of these questions. We probably won't make it through all of the submitted ones either, but we'll see how far we can go.

Let's say I want to improve the content on a page. I add as much relevant content as I can for the users. Does this mean that when I add relevant text to the page, Google automatically assumes that the page is better? Does it work like that? Is more text better in the eyes of Google? I'm trying to convince those in charge that it's not quite as simple as that.

Yeah, it's definitely not quite as simple as that. From our point of view, the number of words on a page is not a quality factor and not a ranking factor. So just blindly adding more and more text to a page doesn't make it better. It's a bit like if you want to present something to a client who's walking in: you can give them a one- or two-page brochure, or you can give them a giant book of information. In some cases, people will want a book with a lot of information, and in other cases, people want something short and sweet. And it's similar in search. If you have the information that you need for indexing — for making it so that users and Googlebot understand what this page is about and what you're trying to achieve with it — in a short version, then fine, keep the short version. You don't need to make it longer. Just blindly adding text to a page doesn't make it better.
Imagine there are noindexed pages with a parameter on a website, and then the same pages are indexed without the parameter, which users can find only in the Google search results, not on the website. Could that cause problems in the long run?

That shouldn't cause problems in the long run, in the sense that we index what we find, we show it to users in the search results, and they tend to go there. We have something that we can show them there, so that's essentially fine. I think it's more a matter of making it hard for you to understand what your users are doing on your own website. So in particular, if you're using analytics or any other kind of user tracking system, then if you have multiple sets of URLs that show essentially the same content, it's a lot harder for you to figure out which of these pages are actually useful to users, and in which ways. So having fewer pages that show this content is in general a good idea. For search, it wouldn't be problematic if you have multiple versions and you just show one set to users who come through the search results and another set when they click around on your site. In many cases, you'll also have another set of pages that are your paid-ads landing pages, which are also completely separate from the normal pages that are visible for crawling and indexing. So it's pretty common to have this kind of setup, but of course, it just makes it harder for you to track things. It's not an issue with regards to SEO or with regards to Google.

Let's say the Core Web Vitals values of my website are quite good compared to my competitors'. However, my traffic is much lower than my competitors'. How important is site traffic together with Core Web Vitals in the search results? In short, can a website with good Core Web Vitals beat a competitor website with millions of visitors in the search results?
So for Core Web Vitals, the traffic to your site is not important, as long as you reach the threshold where we have data for your website. If we don't know anything about your website, then obviously we don't know that maybe it's a really fast website. The data that we use in search is from the Chrome User Experience Report, which is aggregated from users who are opted into that kind of metrics collection. And that's essentially what we require — that's the baseline: we have data for your website, and we know that users are seeing a fast website. It doesn't matter if millions of users are seeing that, or just, I don't know, thousands of users. So the pure number of visitors to your site is not a factor when it comes to Core Web Vitals, and generally not a factor for ranking either. The other thing that I do need to mention here is that Core Web Vitals — the page experience signal — is at the moment not an active ranking signal. We announced that for May. That's one aspect. And the other thing is that relevance is still, by far, much more important. So just because your website is faster with regards to Core Web Vitals than some competitors' doesn't necessarily mean that, come May, you will jump to position number one in the search results. We still require relevance — it should make sense for us to show this site in the search results. Because, as you can imagine, a really fast website might be one that's completely empty, but that's not very useful for users. So it's useful to keep that in mind when it comes to Core Web Vitals. It is something that users notice. It is something that we will start using for ranking. But it's not going to change everything completely. It's not going to destroy your site and remove it from the index if you get it wrong.
It's not going to catapult you from page 10 to the number one position if you get it right.

I'm seeing some really strange issues with the "Page indexed without content" report in Search Console. Pages that clearly have content are showing up in the report and dropping in and out of the index — for example, the home page of a big site. I've seen a few other people talking about the same issue. Have you seen anything on your end?

So I had to double-check what this report, or this category of issues, actually was, which points at me not actually seeing a lot of reports about this. So that's one aspect here. That means to me that if you're seeing issues in this regard with your website, I would go to the Webmaster Help Forum and list your explicit URLs and the issues that you're seeing, maybe some screenshots, so that folks there can take a look. The folks there can also escalate that to Googlers to review in a bit more detail what is actually happening. The cases where I have seen this happen, from before we started showing it in Search Console, are usually cases where there's some kind of issue on either the CDN that you're using or the server itself. One example that I've seen in the past is that the CDN is somehow set up so that some requests go to a node that doesn't have the full content. And essentially, that node returns a 200 status code but a completely empty response. For us, we can kind of tell that this is probably not what you want to have indexed — or at least there's nothing here for us to index — and then that's something that we would put into a report like this. So if you're seeing this regularly, my guess is it's something around the CDN.

Sorry, John — but great, yeah, that was my question. It's really bizarre. I can see it happening for a couple of days, and then the URL will show up as normal. Everything looks fine in the live inspection tool. I checked the CDN. I checked our bot-blocking platform that we use.
Everything looks normal. So I'm really banging my head against the wall. I did post it to the Webmaster Help Forum, and I haven't heard back from anybody yet. I saw a couple of people on Twitter and Reddit talking about the same thing, so I'm wondering if it might be a similar bug to what we saw back in October.

If you can link to the forum thread here in the chat, I can pull it up afterwards and take a look.

That'd be great.

Yeah, the ones I've seen are sometimes really annoying to figure out, especially if it's a sporadic CDN thing where, when you test it, you don't really see it. But when Google's IPs test it, then sometimes we see it, sometimes we don't. And I can't always pinpoint exactly what is happening there, but usually when I look at the logs, I can see: oh, this file size is 10 kilobytes, 10 kilobytes, then zero, then 10 kilobytes again. That usually helps to pinpoint some of the times when it happens. But it's really tricky to figure out, especially afterwards — by the time you see it in the report, it's already kind of passed.

Yeah, yeah. OK. I'll post a link to the thread here. Thank you.

Cool. I've got a question about hreflang and the wrong version being picked up. I work for an international news website, and among our editions, we have three Spanish versions on different subdomains. These target Spanish speakers in Spain, Spanish speakers in Latin America, and Spanish speakers in the US, respectively. All have hreflang set up across potentially duplicate pages, and es.domain.com has hreflang for es and es-ES — so two versions. The other one, for Latin America, gets picked up and ranked for Spain. I'm seeing it particularly in news carousels with articles, but it also happens with featured snippets. Is there something else I can do to ensure the correct version is ranked in the correct country?
So I think the absolute short answer is no — you cannot guarantee that we will always show the right version in the right country. That's the somewhat awkward truth. You have to assume there will be situations where we don't get it right. And if you can recognize that a user is coming to the wrong version of your site, then I would show a banner and try to catch that on your site as well. This is also useful for all of those cases where people find other ways to get to your website — be it through social media or just following old links across the web — and end up on the wrong version. With a banner on your site, you don't cause any issue for indexing, and users can find the right version.

In general, for the kind of setup that you have there — a general Spanish version and a Spanish-for-Spain version in your hreflang setup — for Latin America, you would need to add tags for all of the countries that you're targeting on that page. So if you're targeting, I don't know, five or six countries in Latin America for Spanish, you need to add those versions in the hreflang set there too. It's fine to have them all point at the same page, but they all need to be there, because without the hreflang connection, we will assume that this is just a separate page in Spanish that we can also show. Then we don't know that it's actually part of that cluster of pages, where you have one version for Spain and one version for all of the Spanish-speaking countries that are not otherwise mentioned. With all of those country links in there, we can understand: oh, this one is actually the right one to show for these individual countries. It doesn't guarantee that we'll always get it right. And in particular with indexing, the timing of indexing can throw things off.
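The cluster described above can be generated mechanically. A sketch — the hostnames and the list of Latin American country codes are purely illustrative — where the point is that every targeted country gets its own hreflang entry, even when several entries point at the same URL:

```python
# Hypothetical edition URLs; substitute your real subdomains.
editions = {
    "es-ES": "https://es.example.com/",       # Spanish for Spain
    "es":    "https://espanol.example.com/",  # generic Spanish fallback
    "es-US": "https://us-es.example.com/",    # Spanish for the US
}

# Every Latin American country you target needs its own entry, even though
# they all resolve to the same Latin American edition. (Illustrative list,
# not exhaustive.)
for country in ["MX", "AR", "CO", "CL", "PE"]:
    editions[f"es-{country}"] = "https://latam.example.com/"

def hreflang_links(editions):
    """Render the <link rel="alternate"> set to place on *every* page
    of the cluster (each page referencing all alternates, including itself)."""
    return [
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for code, url in sorted(editions.items())
    ]

for line in hreflang_links(editions):
    print(line)
```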
With news content, sometimes we index one version a little bit faster than the other versions. If we don't have a version indexed yet, or processed properly for hreflang, it can happen that we show the wrong version there as well. So that's an aspect to keep in mind. One approach could be to swap the hreflang around a little bit: for Latin America, where you have a lot of countries in Spanish, you treat that one as the default Spanish version, and you have the Spanish-for-Spain version alongside it. Then you just have one country version, for Spain, and the default version for Spanish in general. And then you'd have fewer sets of hreflang links to maintain. But it depends on the way that you have your website set up.

I believe that when I run a test with Lighthouse in Chrome for a page, the test will use my connection, and therefore the location of the test is my exact location. But when I use PageSpeed Insights, the test will run on a server close to me — in this case, I'm in Brazil. Am I right? This information would be nice to have in the tests.

So I don't actually know where PageSpeed Insights accesses your site from. But in general, all of these lab tests are predictions of what users might see. They're not meant to mirror exactly what users see, but rather to roughly predict what a user might see. And for that, the location is usually less critical. The bandwidth, or the time to connect from some other part of the world to your server, is something that usually plays a much smaller role with regards to the Core Web Vitals than other factors — like how you have your site laid out, how you provide images on a page, how large the files are, all of those things. So the location is less of an issue for PageSpeed Insights specifically. When it comes to search, what we use is the field data, which is based on what users actually see.
And for that we use, obviously, what users see — their location, their connections, their devices. And that usually mirrors a lot more closely what your normal audience would see, so you don't really have to worry about where those lab tests are coming from.

When our post ranks for long-tail keywords with less traffic volume, it's said that it should also start to get ranked for shorter keywords. How can I make that happen?

So yeah, I think everyone wants this as a quick way to rank for higher-volume queries. And I think the short answer is: there is no magic solution. There is no meta tag where you can say, I want to rank for both long-tail and short-tail queries. Rather, we have to understand, over time, that your page is actually a relevant and useful result for people who are searching for these things. So all of the usual SEO advice applies. There is no simple trick to go from long-tail traffic to short-tail traffic, and no setting that does that for you.

My pages, including the home page, now have a date in the search results, just like blog posts. Why is that? And how can those dates be removed? The website itself is built on WordPress.

So essentially, the short answer is: we found a date, we started to associate it with your pages, and we think it's sometimes useful to show to people in the search results. That's essentially why we're showing it as a part of the snippet for the page. There is no simple approach to not showing any date at all — there is no meta tag that you can use to say, don't show any date in the snippet. The one thing you can do is, of course, remove mentions of a date on your pages. If we don't find a mention of a date on a page, then that makes it less likely that we show a date. But if these are blog posts, and the blog posts have dates associated with them, then obviously you'll have those dates somewhere.
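Dates can also be specified in a machine-readable way with structured data. A sketch of Article markup with `datePublished` and `dateModified` — the values here are hypothetical, and the key point is that the date in the markup should match a date visible on the page itself:

```python
import json

# Sketch of Article markup carrying machine-readable dates. The dates here
# are hypothetical; whatever you emit should match the visible on-page date.
article_markup = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example blog post",  # hypothetical
    "datePublished": "2021-03-01T08:00:00+00:00",
    "dateModified": "2021-03-15T09:30:00+00:00",
}

print(json.dumps(article_markup, indent=2))
```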
In general, this is less of an issue, because if the date is based on the blog posts that you have on your page, then it's a relevant date, and it makes sense there. If we're showing a wrong date for individual pages, there's a Help Center article that goes into how we pick a date. In particular, that covers the structured data you can use on your pages to specify a date in a machine-readable way. With that structured data — if we can confirm it on the page, and see that this date is both on the page and in the structured data — it'll be much more likely that we actually use that specific date for your pages. The date itself is not a ranking factor, so it's not that you need to push something in there, or make it look like the date has changed. It's essentially just a part of the snippet that helps inform users of when this page has changed. Sometimes they like to see that a page has been stable for longer; sometimes they like to see that a page has been freshly updated. It really depends.

Why does Google add the name of my brand to my website title, before the defined title? I have my title set like this: "red shoes - my domain name". But Google rewrites the title to my domain name and then "red shoes". With and without JavaScript, the source code looks good, but Google still rewrites the title.

There's also no quick fix for this. Essentially, we sometimes rewrite titles to try to make them a little bit easier to understand for users. And sometimes that results in this weird swapping of the page title and the site title, because we try to recognize those separately. That can sometimes happen. In general, that's not something that causes any problem.
However, if you're running across situations where this kind of swapping or, in general, the rewriting of your title causes problems in that it's not understandable at all, then I would go to the Webmaster Help forums and present your case there. Show some screenshots of some queries and the pages that you have on your site, and see if there's something that others have that might give you some tips on what you can do to improve. In general, when we rewrite titles completely, so I think this is less your question, but when we rewrite titles completely, it's because our systems think that the title that's specified is really kind of an awkward or weird one that doesn't reflect well what the page is about. And that might not make it so interesting for users to actually click on and look at the page. So in particular, things like just a collection of keywords, or a lot of stuffed text, when you just put keyword stuffing into the title, or you just try to include as much as possible in the title that's not really relevant to the page, then that's something where algorithms might say, oh, this title is really weird. We need to figure out how we can make this better so that users actually want to go there. So that's something where sometimes the folks in the forum can point you at that. With regards to the length of the title, that's totally up to you. Sometimes a longer title is useful for our systems. Sometimes a shorter title is useful for our systems. It kind of depends on what you want to show to people in the search results. Oh, and maybe one other thing with regards to posting in the forum, specifically queries where you're seeing weird things happen: if you can find queries that are as general as possible, that makes it a lot easier for us to treat them appropriately.
So in particular, if you're searching for this really long, convoluted phrase where anyone who looks at that query can say, well, you're probably the only person in the world who would search like that, then that's something where our algorithms or our engineers might look at that and say, well, we don't really have time to fix this one query, because there may be five queries like this, which happen every month. And we can spend a lot of time trying to figure out a solution to this, but it's not really an effective use of our time. Whereas if you can come to us with a query that is really more or less generic, where you can look at it and see, well, there are probably thousands or millions of people searching like this every day, then our engineers are much more likely to take that and say, OK, we really need to figure out a solution here. Lots of people are seeing these terrible search results. We need to find a way to make that work a little bit better. And we'll take that and try to improve our algorithms over time. So we wouldn't go in there and say, we'll manually tweak the title that we show here, but rather we'll improve the algorithms that we use to create titles in general, and use this one situation as kind of a test case. Recently, in the Core Web Vitals feature in Search Console, there are big fluctuations: before the 17th of February, green URLs went down to zero in favor of yellow URLs. And then one month later, the situation reverses. Why does the Search Console report behave like that if I didn't change anything in the performance? It's hard to say without having a little bit more information, like which site you're talking about in particular. And it feels like something where also maybe folks in the forum might be able to jump in and provide some help there.
But I think what sometimes happens with these reports is that if a metric is right on the edge between green and yellow, then if for a certain period of time the measurements tend to be right below the edge, then everything will swap over to yellow. And it'll look like, oh, green is completely gone. Yellow is completely here. And when those metrics go up a little bit more, back into what we consider green, then it'll kind of swap back. And those small changes, they can always happen when you make measurements. And to me, this kind of fluctuation back and forth points more towards, well, the measurement is kind of on the edge. And the best way to improve this is to kind of try to provide a bigger push in the right direction, so that it's less just between those two buckets, but rather clearly on the green side. So that might be an approach that you could take there: try to figure out why it's almost yellow and not purely green, and see if there's something that you can push to make it so that it's really kind of strongly green. The tricky part, of course, is you don't see any more fine-grained information there. You just see bad, middle, and then good. So it's harder to tell, am I barely in the green or am I completely in the green? Especially with the field data, with the Chrome User Experience Report data, you see a little bit more of the exact numbers that users saw on aggregate. And with the lab tests, of course, you have a lot more fine-grained information on what exactly might be causing this, or what the numbers actually look like when you do kind of this artificial prediction of what users would see. So many questions. OK, maybe I'll just switch over to more questions from you all here. And I also have a bit more time afterwards, so we can hang around a little bit longer after the recording, should even more questions pop up. I'm just going down the order as I see the folks have raised their hands in the Hangout, which is a really useful feature now.
Now that I think about it. Amos? Yeah, hi, John. I have a question about the "Find results on" feature that we've been seeing mostly in the European markets. So I couldn't find any documentation on how to optimize for "Find results on". Is there anything available? I'm not aware of anything specific there. My understanding is that it's something that is mostly algorithmic, in that we try to recognize which sites to show there, and we kind of show that automatically. But I don't think there is any structured data or anything specific that you can do to kind of force that or encourage that. Yes, because I can see it's kind of mostly for listing sites. So it's a bit inconsistent. And some sites in our group are appearing and some aren't. So it's hard to understand whether we need to do anything or not. In addition, in Google Search Console, there's no way to filter for clicks coming through that. Is that something that's going to come? I'm not aware of anything coming there with regards to Search Console. My feeling is it's also something which is rare enough that probably the Search Console team wouldn't make a special report or kind of an attribute for that. And just the final question, is it going to be launched in other markets? So it's only in Europe that we can see it at the moment? I have no idea. No idea. It might also be that it's something that's more experimental, where we're just trying things out and seeing how that works out. But it's really hard to say. Yeah. Thanks. Not a lot of good answers there. Oh, well. OK. Bilal, I think, if I got your name right. Yeah, John. So I have a situation where I can see 84 valid pages in Search Console. When I check it through a site: query, it just pops up four URLs. And this situation has been persisting for the last four weeks. So I'm just trying to understand what could have gone wrong. I'm continuously checking all the parameters and basic things like crawl stats, robots.txt, sitemap, et cetera. Is that something you can help or advise with? Yeah.
It feels like something which might be a matter more for the forum to look at there. But I mean, the general situation, what can always happen, is that we have multiple data centers. And what you see in the search results might not be completely reflected in what you see in Search Console, because Search Console kind of tries to pick the overall view of your website. And it might be that in a lot of data centers, these pages are not indexed at the moment. So Search Console might say, oh, but I see them indexed. But you, when you try them out, you don't actually see them. So that's something that sometimes just settles down over time. That might be something that is happening there. Of course, if you're seeing very few pages indexed for your website, it might also just be a sign that our systems are not really sure of what they should be indexing from the website. And usually, especially with regards to smaller websites, it's something where we try to index as much as possible. So if you're just seeing four pages out of, say, I don't know, 20 pages indexed, then that, to me, suggests either a critical technical issue or a critical quality issue, where our systems are really, like, trying to get these things indexed, but either for technical reasons, which you would see in Search Console, or for quality reasons, we're not able to actually index them. Quality issues you probably wouldn't see in Search Console. You'd kind of have to deduce them, based on things technically being indexable, but our systems saying, well, actually, we don't want to have this indexed. So another question, related to an e-commerce website. It has actually been public for the last two months. And there is a situation going on where, if I check the website through a site: query, sometimes it pops up one page, sometimes four pages. And things like that are continuously uncertain. And I'm just trying to understand, what should I do in that case?
Because I've checked that the basic things are correct as well. But yesterday, there were two pages that were indexed. I checked through a site: query. And today, it was just one page. So not sure what is going on. Yeah, it feels very similar to the previous case. And yeah, I think if you're really sure that there are no technical issues involved, then I would try to figure out if there's something from a quality side that you can do to significantly make steps forward there. The other thing, of course, is, in particular with new websites, we need to have some understanding of the context of the website within the rest of the web. And if we just know that this website exists, that's not necessarily a sign that we'll go off and actually index it as much as possible. But rather, we need to have a confirmation that this is actually something really useful and worthwhile to show. So that's something that you could also work on, especially with a new website. It's similar to a new business: when you open up a new business, you don't just open up a new business, but rather you go to places where people are. You do advertising. You do other things to make it known to people so that they start coming to you. And then over time, you grow your client base, with people who are regulars and keep coming back, who recommend your business to other people. So that's the same kind of thing that you would need to do with an e-commerce site or any other site that you open up. You need to build up that critical mass first, and really have something fantastic, so that when they go away, they say, well, I need to recommend this to my friends. And that's not, or usually less, a technical factor. So it's less something where you just need to get some technical meta tags right and submit things in the right way; you really need to put your best foot forward. All right. So with this website, the last cached page is from December 11. So luckily, the home page is available today.
But if I check the cached variant, it is still from December 11. So I'm just trying to figure it out. We'll obviously discuss later. I can send you the link. But yeah, thanks for your insight. Yeah, I would definitely go to the help forum for something like that, because folks there can take a look at various things and give you a little bit of advice. Sometimes it's also, I don't know, the hard-to-take advice of, like, oh, your website really looks out of date, or something like that, where you sometimes need a little bit of a nudge to get past, like, oh, this is my baby, everything is perfect, and to understand, like, oh, actually, I need to improve some things. But that's always useful, I think, to hear sometimes. Thanks. All right, Robin. Yes, can you hear me? Yes. Oh, great. I wasn't sure my mic was active. So I would like to follow up a little bit on the page title answer that you gave before. You indicated that the length doesn't really matter, because, I mean, the conventional wisdom has been for many years that you want a title tag that is limited to the room within, like, the Google presentation, so it doesn't end with a dot, dot, dot. Because we are also building some new product coding for our product detail pages. And so we're setting it up as the product designation and then the product category, because the product designation itself doesn't say a lot about the product. But if you combine the two, you will get a good sense of what the product is and, more specifically, what it's named. And in some cases, our products are bundled together, which means that there are several product descriptions in a row, so to speak, which will extend the title to a lot more than 72 characters or something like that. So it will go beyond the allowed pixel width in the SERP. But if I understand you correctly, as long as we also have the same content well represented on the page as in the title tag, this is not really a problem.
Now, I mean, it's definitely not a problem in the sense that your pages will rank worse. In a case like this, for the title tag, it's less a matter of what we also find within the text body of a page, because the title can be different in that regard. It doesn't have to be the same as the heading on a page, for example. I guess the aspect here is that we do use the title tag in our rankings as well. So having the relevant information in the title makes sense. At the same time, the title is also something that we show to users. And sometimes you might choose to have a title that is very, I don't know, where you found that it's very useful for users, or where you found that users respond to it very well. And that's something where, if you have a specific title that you want to use and that you find works, use that. If you have a longer title that is automatically generated based on the product that is shown, then sometimes that title is just a bit longer. And dot, dot, dot at the end does not mean that we don't use the rest for ranking. It's just that we can't show everything there. The other part, I think, is sometimes we show shorter titles in search, and sometimes we show longer titles. I think sometimes we even show two-line, three-line titles. Not 100% sure if that's always the case. But it's definitely not the case that you need to target exactly 72 characters, and then that's the perfect title. It's kind of this mix: you need to provide information for us so that we can rank your pages properly. You ideally provide that in a way that is encouraging for the user, so that they understand what the page is about and it's interesting for them to click on it, because that's kind of the way that you get traffic from search. And at the same time, if you have a very large website, you need to do that in some way that you can automate for the pages that you can't manually create titles for.
So if it's an e-commerce site, you can't realistically create a title by hand for every one of a million products that you might have. So you kind of have this mix: some things you create manually, to really finely target a specific title. Other things you just leave automated, in a way that works optimally but at the same time saves you time. Yeah, that is what I thought. But yeah, thank you very much. Cool. OK, let me take a break here and pause the recording. I'll stick around a little bit longer, so if any of you have more questions and comments, we can definitely do that. And yeah, cool. OK, let me pause. And thank you all for joining. Thanks for watching so far. Thanks for all of the questions that were submitted. Always cool to see what is happening. All right.