All right, hi everyone, welcome to today's Webmaster Central office-hours hangout. My name is John Mueller, I'm a Webmaster Trends Analyst here at Google, and we have a bunch of guests here. Do you want to introduce yourselves briefly? Hi, I'm Kevin from Zurich, nice to meet you. My name is Marcus, also from Switzerland. I'm Jason, I'm from France. All right, do any of you online want to introduce yourselves as well? Yeah, I'm British but living in Poland. I'm Paul, and I work for a large medical provider; we operate in about nine countries, so we deal a lot with medical content. I'm Rob from the UK, and John broke my site a few years ago and won't fix it, and he's smiling. All right. The others are kind of quiet, but you're welcome to speak up in between. In any case, we have a bunch of questions that were submitted, but if any of you want to get started with a question, feel free to jump in. Can I ask a very general question? Okay, yeah. I saw Gary Illyes in Australia, and he was talking about images, and he was basically saying, I'm really interested in images and video now, and in optimizing for images and video; the ten blue links appear to interest him a lot less these days than they used to. Should we be looking at images and video much more? I mean, my vision, what I've seen, is that Google results are becoming much more multimedia; the ten blue links are disappearing slowly. I wrote an article about that decline and all of that; I'm completely obsessed by it. Images and video appear to be something that will appear across the board, whatever the context is. Is that where we should be going? I think images are underestimated, so that's kind of one thing. The other big thing is, a lot of people search visually. A lot of the newer social platforms are really focused on images; Instagram is something which is basically just images. So there's a lot of interest from people in finding visual content. I think the main thing at the moment,
when it comes to focusing on images, is this: I'd recommend not just treating images as a technical thing, like, I have a hundred images, I need to optimize them and get them into image search, but rather thinking about the user journey, how someone might search visually for things that you have on your website, and how you can be in the right place at the right time. So instead of just taking a hundred images from your website and adding some alt tags with keywords that match those images, think about where you would present this, how someone would search visually for your content, and how you can present your content at the right time. You mentioned context. How much do you take the context around the image, the actual text around it, into account? An enormous amount. Yeah. I mean, it's something where, when we show an image in image search and we think it's important for a query in image search, we want the whole landing page to match. So we prefer not to have a random image in the corner somewhere, but really a landing page where the user knows that when they click on that image, they'll find that image on the page, and they'll find more information about what they were searching for. So if you're searching for, I don't know, a type of luggage, you might search visually, you might say, well, this is the type I want, you might refine it visually, and then you go to that page and you want that piece of luggage there. You don't want it somewhere in the sidebar as a small thumbnail. Talking about shopping, you take a picture of something, "I want that"; Bing is apparently very good at that. Are you as good, or better? I don't know. I don't know how we would compare, but I think it's one of those things where it's easy from a technical SEO mindset to just say, I have so many images, I will optimize these images, and it's a lot harder to think about it in terms of how people might get
to this, and the context of it. Yeah. I think it's also worth considering that sometimes your content might not be something that people would search for visually. Like, I don't know, if you have machine parts, maybe people wouldn't search for those visually, unless they love the look of machines, or unless they're DIYers, people doing it themselves, and they know, I want this thing, I don't know what it's called, but it looks like this. You take a piece of your washing machine that you can't put back together again, take a picture, and say, I need that thing. That would be a reason to search visually. Yeah, that would be good. Next time I fail to put my washing machine back together, I'll do that. High expectations, John. Yeah. John, I would actually just back you up on this, because we use images very heavily in medical, and they work really well for us. If you present great images for Google image search, then you will start getting the traffic. I think people don't look at the images on their website enough. Okay, yeah, that's good. Cool. Sorry, another thing you were talking about... I don't know what I was talking about. You confirmed the context of the text around the images. Just thinking about it, I had a client who was keyword stuffing in the alt tag, and I said, actually, the alt tag is meant to describe the image, and they said, but then I don't get my keywords in. In fact, if you think about it, with the images in your article, for example a blog article, if you just read the alt text, you should be able to tell what the page is about, if you're using your images correctly. And if you've got an H1 or a heading alongside an image, each one of them is in a specific context within the page, and they all come together. From what I can see, ideally you would have alt tags that actually describe what the page is about. I think it depends on what you're trying to do, because
with the alt attribute you're supposed to describe the image, and sometimes the description of the image isn't really what your page is about. Like, you might have a picture of, I don't know, a beach, but your page is about vacation homes. Well, the example, sorry, was a company that sells surveillance equipment for old people's homes, and the images were, you know, an old couple on a sofa, not surveillance, and then they've got a security guard, and then they've got a camera. But if you say old people at home on the sofa, security guard, and camera, you've described the page. So the point is, you don't have to put "old people sitting on a sofa"; you put the alt tag as security camera, security surveillance, whatever it is. And I think, from my point of view, that is a nice way to look at alt tags. Yeah. Thank you, I'm very glad you agree with me, so I'll shut up. I'm doing the same: I describe the image, but put one, two, three keywords at the end, after the alt text, something like that. Fewer keywords inside; the most important thing is the description of the image, not keyword stuffing. You probably don't even need to put them in the alt attribute; if those keywords are on your page, that's enough for us to pick them up. I've got a little question, sorry. Okay, go for it. It's about a word not being on the page. I think that's really, really interesting, because we had a department that did flower delivery, and we were ranking number one for "cheap delivery", and the word cheap was not on the page. And I'm wondering, is it because we were cheap and people realized we were cheap, or because it was somewhere else on the site? Because it wasn't on the page at all. I would kind of point towards links, or just general context that might come into play. Maybe, I don't know, reviews, depending on how that's set up. If you're linked from, maybe, Yelp pages, and people write about you like that, like, it was really inexpensive or
whatever, then that might be something that we could pick up. Yeah. From my point of view, I was just hoping Google would deduce that we're cheap, and we were cheap. I don't think... yeah, putting it on the page would make sense, actually, if it's an important part. But I don't think we would look at the prices and say, oh, this is below average, we'll rank it for cheap. Yeah. I have a question; it's an old topic, but from a new perspective: multiple H1s. There was this rule, like, one H1 per document and so on, but now in the HTML5 specification you have these sections, like the header, the article, the aside, and per the specification each of them can contain its own H1. So is Google already following the HTML5 specification there, or will I confuse Googlebot? I think this is something that comes up a lot, people asking, can I use multiple H1 headings? And I think even before HTML5: use whatever you want. We have to deal with whatever's on the page. It's not that we would run an HTML validator and say, oh, invalid, there are two H1s, we will demote your website. We have to make do with what's on the web. With HTML5 it's even normal, I guess, to have multiple H1 headings, and we try to make do with that. It's not really a problem. Okay. I mean, we use the headings to try to understand which blocks of text belong together and to give more context to the blocks of text on a page. So whether you use H1 headings for that, or H2 headings, or H3 headings, all of that kind of works. It's not that we require you to have this strict theoretical structure on those pages. So do the HTML5 blocks then come into play, are they important, like the nav, the aside, the article? Not really. Oh, that was my favorite.
I mean, for semantic understanding of the page it makes sense, but for us, we basically make do with how websites are built, and a lot of people don't have this understanding of what semantically belongs together. So we recognize that this, I don't know, big HTML element here is the menu, or the sidebar; it's something that's shared across the site, and whether or not it's marked up with the right HTML tag is kind of irrelevant for us. We can recognize it's something that's shared across the site, so we can treat it accordingly. You're saying shared across the site; that means if you're consistent, it doesn't matter how rubbish your markup is. If it's consistent throughout your site, you're going to win, you're going to be all right. I don't know about winning, but it definitely helps, yeah. I think the important part, thinking about HTML5, is really that we can tell which parts of your page are primary content and which parts are what we call boilerplate, things that are more secondary. So when someone is searching and something is within the primary content of your page, we can really highlight your page, whereas if we can tell it's in the footer or somewhere in the sidebar, then maybe we don't need to rank your page that high for it, because you don't see it as that important.
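To make the multiple-H1 point concrete, here is a minimal sketch (hypothetical markup, stdlib-only Python) of the heading structure a parser would see when each HTML5 sectioning element carries its own H1; two H1s are perfectly parseable, and the headings still tell you which block of text is which:

```python
from html.parser import HTMLParser

class HeadingParser(HTMLParser):
    """Collects (level, text) pairs for h1-h6 in document order."""
    def __init__(self):
        super().__init__()
        self.headings = []
        self._current = None  # heading tag we are currently inside, if any

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self._current = tag
            self.headings.append([int(tag[1]), ""])

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current:
            self.headings[-1][1] += data.strip()

# Hypothetical page: one <h1> per sectioning element, which HTML5 allows.
page = """
<body>
  <article><h1>Visiting Zurich</h1><p>...</p></article>
  <aside><h1>Related trips</h1><p>...</p></aside>
</body>
"""
parser = HeadingParser()
parser.feed(page)
print(parser.headings)  # [[1, 'Visiting Zurich'], [1, 'Related trips']]
```

The grouping of text under each heading is what matters here, not whether every level appears exactly once.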
All right, wow, lots of people have jumped in, pretty cool. So let me grab a few questions that were submitted, and then we can move more towards organic discussion again. Let's see: can I stay on Blogger even though it has the date within the URL? Suppose I'm updating a page with fresh content and notifying via Search Console. You can definitely use Blogger, and you can definitely have a date in your URL; that's fine. If you update that content, then usually we can pick up the date within the content of the page as well and just pull it out like that, so it shouldn't be a problem. You might consider using something like, I think, the pages feature in Blogger, where you don't have the date within the URL; not sure if that's still in Blogger or if I'm mixing something up, but that might be an option as well. How can I block negative keyword traffic to my website? I don't want my website to appear for some terms; is there a way to solve that? No, not really. I think if you don't want your website to rank for certain keywords, then obviously maybe just don't put those keywords on those pages, but otherwise there's no little trick to block keywords from ranking for your pages. And most people usually have the other problem, where they want to rank for things, not that they don't want to rank for things. In the categories of an e-commerce site, where there are different products listed, is it possible to include the aggregate rating, or is that only for products of the same brand? So the aggregate rating structured data should only be for a single product; it shouldn't be across different products. So if you have a category listing page, you shouldn't use aggregate rating for that. Aggregate rating is really for when you have one product and there are multiple reviews of that product; then you can create an aggregate rating from that.
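To illustrate that aggregate-rating answer, here is a minimal sketch of what the markup can look like, using schema.org's Product and AggregateRating types; the product name and the numbers are invented for illustration:

```python
import json

# One product, many reviews, one aggregate value -- the case described
# above. This belongs on the individual product page, NOT on a category
# page that lists many different products.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Suitcase",  # hypothetical product
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.4",
        "reviewCount": "89",
    },
}

# The resulting JSON-LD would typically be embedded in a
# <script type="application/ld+json"> block on the product page.
print(json.dumps(product, indent=2))
```

The key point from the answer above: one AggregateRating per product, computed over that product's own reviews.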
Oh my gosh, someone is asking about subdomains versus subdirectories. It's been discussed how subdomains are sometimes treated as part of the main domain, sometimes not. My question is whether the same consideration may in some cases also apply to brand-name domains with different top-level domains. We have our site in nine different languages in the same Search Console account with the main domain; the only difference is the top-level domain. The question goes on a little bit, but essentially these are different domains, so that's not really something that we would tie together and say it's one site. What you might see in some of these cases is that we pull these different domains together in the sitelinks. When you search for your brand name, for example, we might pull in the other languages or other countries; that kind of makes sense, but it's not the same as a subdomain or a subdirectory. Okay. I don't know, are there any questions from any of you in between? Yeah, I have a quick question regarding indexing. We recently acquired a domain, bloom.com, and it had been redirecting to bloomcircle.com. The issue we're running into is that Google has indexed 4,000 bloomcircle.com URLs as bloom.com URLs, and we're a little apprehensive to get started on our work, because the content we're looking to put on the site is family-related, as opposed to beauty- and wellness-related. So what we've done is we've removed pages via Webmaster Tools, as well as 404ing the pages. From what I understand, it could take months for all the URLs to de-index, but I was wondering if there's any way to expedite that timeline, and any recommendations you'd have. Okay, so you're redirecting, or you just have a new domain? We have a domain, and that domain previously was redirecting to another domain.
Okay. So in cases like that, where it's really content that you don't want to have indexed anymore, I would just use the URL removal tools. The URL removal tools are limited, I think, to 90 or 180 days, but you can just renew that afterwards, and that gives you a little bit more time to fix things within your website, so to 404 those pages so that they drop out completely. The URL removal tool basically just hides them in the search results. So if your primary interest is really to make sure that if someone searches for your domain, they just find the content that you want them to find, then that would work. Okay, and the timeline for all those URLs to get de-indexed: would it still be a multiple-months timeline, or would it be shorter using the URL removal tool? So the URL removal tool is basically instant; it's something on the order of half a day, so that makes it really quick. And the reprocessing of the old URLs is something that depends on the URL; it can take multiple months for them to be reprocessed and drop out completely. But with the removal tool you have those 90 days, and then you can renew that a couple of times, so basically you can hide them until they're actually reprocessed. Okay, thank you. Hi John. Hi. We have a slight problem in India. We have about 15 IVF clinics in India, and they've actually started ranking for pornographic terms. This is quite new, and I'm wondering if there's anything we can do to stop this; we've only noticed in the last month that this has started to happen. Okay. I don't know, it's hard to say where that might be coming from. My initial guess is that maybe there are weird links to those sites. Yeah, that was my thought, that maybe our Indian colleagues have done something previously that might have made this occur. But the sites are probably about three years old, and we've only seen this trend appearing in the last month.
So, I mean, there are a few things that come together. On the one hand, if it's due to links, you can use the disavow tool. The other thing is that I suspect the actual users of your site wouldn't be using those terms anyway, so it's something... Yeah, of course, theoretically, you don't really want a medical brand associated with pornography terms. Yeah, I think it's something that might be more of a theoretical problem than an actual problem, but of course, when you search for it and you see it, or someone else searches for it and points it out, then that's still awkward. I would mostly focus on the links, to see if there are any issues there, and then use the disavow tool to have those dropped. It might also be worth double-checking that there's nothing on your site that's hacked, where maybe someone is dropping things in there as well, because that does happen. Yeah, we've had a look at the HTML and so on, to see if someone's dropping some of these keywords in there, but obviously with IVF and fertility there are some associated terms across those other searches that we don't want to be associated with. So, yeah, we were just after any ideas, but we'll go and have a look at the backlinks, which is one of the things we thought might be the case as well. Okay. Yeah, that's always kind of awkward. Hi John. Hi. Yeah, I have a similar question to the last question, the one asked by Belly Tila.
I have one website from India, an e-commerce website, and earlier we were ranking for generic terms, but now we are ranking for generic terms plus adult keywords. So for the last month we have been ranking for adult keywords alongside our generic keywords. Earlier there was no such problem, but now we are ranking for some adult keywords, so is there any way to block that kind of traffic? Because we have checked our website completely, link-wise; we have audited all links in different tools, checked everything in Webmaster Tools, and we have also checked the HTML of the website, each and every thing, to see if any adult keywords are present in our content. But our website is completely free of adult content, because we are dealing in jewelry and ornaments, so there is no adult keyword on our website, yet still we are ranking for some pornographic or adult terms. This is the issue we have been facing for the last month. Okay. It seems like really a problem, John. Yeah. Two of us on one call, and I don't know the other person either, so it's nothing to do with me. That's why I asked a similar question in the Google webmaster questions queue as well, because our clients are not happy; they are thinking, we are not ranking for our search terms, so why are we getting lots of clicks from adult keywords? And if we are ranking for adult keywords along with our generic terms, then those visitors give us a 100% bounce rate as well, so they are affecting our website's overall credibility, because our average bounce rate is also going up, so it is negative for us.
So, first of all, I wouldn't worry about the bounce rate with regard to SEO; that's not something that we would use. But maybe, if you two could copy some of the URLs from your sites and some of the queries that you're seeing into the chat here, then I can pull that out and share it with the team afterwards. Sure, sure. We can send you the search terms we are ranking for, and we can also send our website URL as well. Yeah, maybe just drop them here in the chat, and then I can pull them out after the hangout. Sure, sure. Okay, all right. Hi John. Hi John. Sorry, go ahead. Yeah, so I have a question. Previously I had a subdomain; let's say, for example, on xyz.com, I had created a subdomain. We built out the site, and we built backlinks as well. Then what we decided is, okay, the subdomain is not that good, so we moved to subfolders. What we did is, we created the subfolders with a year in the name, like, let's say, 2018-exams, like that. Just to interrupt very briefly, would it be possible to get closer to the microphone? It's really hard to hear. Yeah, can you hear me now? Yeah. Can I start from the beginning? Yeah, sure. So previously I had one subdomain; now I just have a subfolder. I thought that a subfolder is better than a subdomain, so we moved to the subfolder. What we did is, in the subfolder we created the pages on a year basis, like 2018, 2019, like that. But after we moved from the subdomain to the subfolder, none of the pages were ranking. Then what we decided is, again, we created the pages in a normal way; let's say, for exams, just "exams" we created. Previously we had, like, exams-2018, exams-2019; then we decided to just create "exams". So, let's say, www.xyz.com/exams, like that, we have created.
Now the problem is, none of the pages are ranking properly. Previously the rankings were there for the exams pages; even those are not ranking right now. So what can be done from our side to improve the rankings? Okay. So, first of all, moving from a subdomain to subdirectories should be perfectly fine. You can move within your website; just watch out for the redirects. That seems like a normal thing to do. What it seems like to me, more, is that these are just general ranking changes that are happening, either on Google's side or just across the web in general, and that's probably what you're seeing there. So it's less something that is due to your technical site structure, like subdirectories or subdomains, and more just us ranking your site differently. And that can happen for subdomains; it can happen for subdirectories as well. So we just redirected all the subdomains to the subdirectories right now, and the backlinks we built for the subdomains are of course also getting redirected. But still, I feel that none of the pages are ranking; let's say 90% of the pages are not on the first page of the SERP results. We have tried for the past couple of months, and still we face a negative impact there; even if we make changes on a web page, it just goes backwards, it's not coming forward in the SERP results. So do we have to build more backlinks for subdirectories compared to subdomains, or is there any other activity that we can do to improve it? No, you definitely don't need to build more backlinks for subdomains or subdirectories. Those are completely normal ways to build a website, so it's not something where you need to do anything special with regard to the links. But just generally, the way that you're approaching it, like, "should we have more backlinks for subdirectories than for subdomains", makes me kind of worry that maybe you're building links in a kind of weird way that our algorithms are picking up on.
So that's one thing I'd be kind of cautious about. But in general, to me, it sounds like these are just normal ranking changes, as they always happen. If you redirect from one URL to a different URL, subdomain or subdirectory, we forward all the signals that we have, so it's not that something would get lost or be less valuable with one setup or the other. And when it comes to the web pages: of course, when we move from subdomain to subdirectories, we have to change the internal links as well, right? So if we have old internal links pointing to the subdomain, we have to change them to the new internal links; they should not be the old ones. Yeah, definitely. If you change your URL structure, you should change it consistently across everything that you have: your sitemap files, your internal linking. If you have a rel canonical on these pages, all of that should match your new URL structure. Got it. Thanks, John. Sure. Hi, John, how's it going? Look, I just wanted to ask you a quick question. We operate an educational publishing website, and we've recently implemented lead-in metering, so to access the text content of our documents, they're actually behind a paywall. We've implemented that, and we've seen a weird thing happen, where the content we have that covers pharmaceutical terms or chemistry terms is being flagged as hacked content injection in Search Console. I've gone through all of them, and I'm basically certain that it's nothing to do with any content being injected; what it seems to be is that Google is picking up certain terms in there and thinking this is the pharma hack. So I'm not really sure what to do about this; I pressed the request-review button.
But is there a way that we can flag that this is legitimate content, without, like, removing these particular terms? Yeah, so the request-review button is definitely the right approach there. Sometimes we do pick up weird things as being hacked when they're actually not hacked, and I could imagine pharmaceuticals is one of those things that tends to show up on a lot of hacked sites, so maybe we're associating things there. Usually, with the request review, that's something that would get processed, and someone would look at it. If you want, you can also drop some of the URLs here in the chat, and I can double-check. Awesome, okay, I'll drop a couple of links. Thank you. Thanks. Ah, the link dropping in the chat: one place where you can drop links that doesn't give you any PageRank boost, but it's useful for me, and maybe it helps your site, who knows? Nandini, I think you had a question. We run an e-commerce business, and we are planning to launch the website in different languages. So the question is whether the URL has to change if you are providing different languages, or should just the content change? Good question. So for content in different languages, you need to have it on separate URLs. That's something where, if you automatically change the language of the content and keep the same URL, we will just index one version, and we won't be able to tell that you have multiple language versions of that content. It's something that used to be pretty popular, in that sites would try to do something fancy and automatically show the right version of the content, but in practice it means that when Googlebot crawls from the US, we only see the English content, and if you have content that's translated, it's like we never see it; we can't show it in the search results.
So if you do have international content in multiple languages, make sure that there are multiple URLs associated with it. You can use subdirectories, you can use parameters, you can use subdomains, however you want to split it up; it should just be one language per URL. And one more question, about duplicate content. We have two different pages: the product listing page, which has all the brands listed, and a separate brand listing page. On the brand listing page we have placed a rel canonical to the product listing page, but usually the brand pages are considered duplicate content. Is there any way that I can remove the duplicate content? I want it to be indexed and shown in the search results; is there any way that I can do that? You can block the whole page from being indexed, but it's not something where you can block a part of the page from being indexed. And usually, for this type of duplicate content, we can understand that this block of text within the page is duplicated across multiple pages, and that's fine for us; it's not a sign of a low-quality website, it's completely normal. What happens in practice is, if someone searches for something in that duplicated block of text, we'll know multiple pages on your site have that block of text, and we'll try to pick one of those pages and just show that. So it's not that we would demote your website for having this kind of duplicate content; it's just that we will try to filter it out and show just one of them in the search results. Thank you, that answers the question. Sure. It's similar to the language question: can you point out the dangers of redirecting according to IP? If somebody's in France, you redirect them systematically to the French version; if they're from the US, you redirect them to the English version. Can you set out the dangers of doing that from an SEO point of view?
Well, we would probably see the redirect to the English version, if you have one, and we would drop all of the other versions then. So it's kind of drastic. Well, I mean, you spend a lot of time making the different language versions, and if in the end they don't get indexed, it's kind of a shame. And the second question then is: having everything in subdirectories under the domain, and never having the root indexed, is that a problem? That's okay. Okay, because I've had a problem with Search Console getting the site validated, which implies to me that Google isn't fully comfortable with it. That might just be the verification, where we're looking for that meta tag, and we can't get it on the root of the site because there's a redirect. And the second thing is, I've had a couple of clients in France who do this, and systematically they don't get the rich sitelinks on brand searches. And as soon as we put the English version, or the French version, on the root domain, with the other ones in subdirectories, the rich sitelinks come back. Is that me imagining things, or is that a real problem? It shouldn't be the case. It's happened twice. So it's not obviously a big case. Yeah, if you could drop the links in the chat. I've got them right on the screen. Oh, very traditional. Yeah, so the rich search results shouldn't be tied to the structure of the site. Well, I can only think it was because Google was getting a little bit confused; I mean, it was the combination of the redirects plus the subdirectories with no root. I mean, I think that default-version setup is a bit weird. It's certainly a bit weird, but some sites do it, and it kind of works. Okay. Yeah. So if you're using hreflang, then you would specify the root as being the default version, because that's the one that does the redirect. But even without hreflang, that should just kind of work.
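To sketch the hreflang setup John describes, assuming hypothetical URLs: each language on its own URL, with the redirecting root marked as x-default so crawlers know which version to fall back to when no language matches.

```python
# Hypothetical language versions of one piece of content, one URL per
# language (domain and paths invented for illustration). "x-default" is
# the hreflang value for the version served when no language matches --
# here, the root URL that redirects by language.
versions = {
    "en": "https://example.com/en/luggage",
    "fr": "https://example.com/fr/bagages",
    "de": "https://example.com/de/gepaeck",
    "x-default": "https://example.com/",
}

# Each language version of the page should carry the full set of
# annotations, including a self-reference.
links = [
    f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
    for lang, url in versions.items()
]
for link in links:
    print(link)
```

The same annotations can also be expressed via sitemaps or HTTP headers; the essential part, per the discussion above, is one language per URL with the versions cross-referencing each other.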
Okay, back to the drawing board. Thank you. Yeah. Cool. Oh my gosh, so much text in the chat. I'll copy this out and take a look afterwards. Let me see if there's anything... well, there are still lots of questions that were submitted, so let me try to go through some of these fairly quickly. I have an enterprise client whose crawl recently dropped off. Thankfully, traffic and rankings have not been impacted, but we're working to restore Google's crawl. We've focused on improving page quality, reducing errors, improving page speed; nothing has seemed to work so far. Any advice on where to focus, and how to remedy Googlebot crawl issues? So I think that's a pretty complicated topic, and it's kind of hard to say what exactly I would focus on there. I think there are a few things you might want to start out with. First of all, make sure that you have access to your log files, so that you know what is actually being crawled, and ideally also the log files from earlier, so that you can compare what Googlebot was crawling before to what Googlebot is crawling now. It might be the case that Googlebot is crawling a lot less, but it's still crawling the important URLs of your website, and maybe that's perfectly fine. Then, from the crawl-rate point of view, we basically have two aspects that come into play on our side: on the one hand, how much we want to crawl from your website, and on the other hand, how much we feel we can crawl from the website. So in particular, if we think that your server is kind of slow, or if it responds with errors when we crawl, then we don't want to cause more problems, so we'll throttle our crawling. And similarly, if we can see that your server is fast, responding quickly, with no server errors showing up, then we will try to speed things up as much as we can, until we reach the limit of what we feel we want to crawl.
So those multiple aspects are kind of competing with each other a little bit there. But again, first I'd double-check to make sure that it's really a problem, because it's easy to spend a lot of time on these issues when it's not really a problem. And then maybe double-check the speed information from your server. For crawling, it's not so much the speed that you would see in a browser, so not PageSpeed Insights, but more the speed information in Search Console, the time that it takes to download a page. That's what we would use for crawling. And yeah, I think those are pretty much the aspects I would focus on there. Let's see: I had a domain that wasn't indexed on Bing due to it being on some kind of spam backlink blacklist. After manual review, they lifted the ban and it started indexing. So basically, if the domain was flagged on Bing, what's the chance it was flagged on Google? I don't know why you were flagged on Bing, so it's hard to say if that would affect Google. I mean, we definitely have different indexing systems and different web spam systems, so all of that should be pretty separate. Usually what happens on our side is, if there's a manual action from the web spam team, you'll see that in Search Console. So if there's something that's blocking your site, you'll see that, you can respond to it, you can do the reconsideration request, and then everything is essentially back to normal. So that's mostly what I would focus on there. Oh my gosh, so many questions left, but we're kind of running low on time. Maybe I can just take questions from you here, or you here, who didn't get a chance. About the Discover section: does Google differentiate between newspaper sites and blogs in Discover? Discover, so the kind of automatic feed? Yes. I don't think so. So it's not limited to news sites, it can be blog sites; I've seen forum entries there as well. Okay.
I think it's an interesting topic, because it's one of those things where, depending on the site, you can get a lot of traffic from it, but it's really hard to optimize for, because there's no keyword. You don't know what people will be interested in next week, so you kind of have to understand your niche and think ahead. That reminds me of a musician who was asking everyone, this is about Google Maps and Discover, how he could build up his audience, and I was thinking about it in the middle of the night. I was thinking, you've got this YouTube channel, and you get people to watch the YouTube channel. Would that then mean that he would pop up in Discover if he had a gig in that town? Or is that pushing things too far? I don't know. I don't think that would be directly related, but I don't know if YouTube videos are shown in Discover. Maybe they are. Sorry, the idea is to get people tagged as being interested in this particular musician, so that when he's playing a gig in their town, he would come up in their Google Maps feed down at the bottom, or in Discover, because Google knows that they're interested in that particular artist. I don't know. I mean, with a lot of these things, I could imagine that, theoretically, there's some machine learning pipeline trying to connect all of these things. Well, at three o'clock in the morning, that seemed to me incredibly reasonable. So James Kirby, if you're listening, we don't know. I don't know. I mean, it's one of those things where, theoretically, we can combine a lot of things, but I find that with a lot of the machine learning approaches, it gets creepy really quickly. So it's always important, I mean, I don't work on these things, I don't know how these things are handled, but I find it really useful to be able to follow back: how did you come up with this?
So as a user, you can look at that and say, well, this was due to me having looked for this musician's name a couple of times, and looked for concerts a couple of times; I didn't find anything, and now there's something here, and that kind of makes sense. So it's more likely to be a link between search behavior and Discover than between YouTube and Discover? I don't know. I don't know if you could say that. Well, I imagine it's also something where the activity on YouTube wouldn't be something that we'd be able to take into account for search. Maybe for policy reasons. Maybe it's just different silos within Google, but it's quite separate, I guess. I mean, I'm really intrigued by Discover. You must be too. Stuff that comes up in Discover, for me, is not necessarily things that I've been searching for. So I'm wondering where it's getting the information. It gives me Liverpool football scores, but I never search for that; I go and look at them on the Guardian. Okay. Or, you know, things like gadgets. So it's not search behavior, it's surfing behavior. Okay. Which is there. But then that's me enjoying the phone, when I think about it. I don't know. Sorry. I mean, I'm not sure. It depends. It depends. I'm just really curious about it. Yeah. I know some people use it quite a lot and other people don't use it at all. Some people adjust the settings quite a bit, more of this, more of that. Yeah. I'm curious to see how that works out. Hey John, I have a question. Okay. Will Google read the URL after the query parameter? Yes. Yes. So URL parameters, all of that, we do use. It's something that doesn't block crawling or indexing. So if you need to use URL parameters, that's perfectly fine.
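[Editor's note: the distinction drawn here, that query parameters after "?" are part of the URL Google processes, while the fragment after "#" is client-side only and dropped, can be illustrated with Python's standard URL parsing. The example URL is hypothetical.]

```python
from urllib.parse import urldefrag, urlparse, parse_qs

# A hypothetical URL with both query parameters and a fragment.
url = "https://example.com/page?color=blue&size=m#reviews"

# urldefrag splits off the fragment, mirroring what a crawler keeps vs. drops:
# the query string stays part of the URL, the "#reviews" part does not.
canonical, fragment = urldefrag(url)
print(canonical)  # https://example.com/page?color=blue&size=m
print(fragment)   # reviews

# The key/value pairs in the query string remain addressable content.
print(parse_qs(urlparse(url).query))  # {'color': ['blue'], 'size': ['m']}
```

This is why content that only changes behind a "#" fragment is invisible to indexing, while "?key=value" variants are treated as distinct crawlable URLs.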
If there are parts of the URL parameters that you want to have dropped, then I would use the URL parameter handling tool in Search Console, or at least make sure that you have the rel canonical set to drop those parts of the parameters. Even after the query parameter, will it read it? Yes. Okay. What do you mean, after the query parameters? Like, I have a domain, and after that I use query parameters so that I can have different content there. Yeah, so if it's with a question mark and then some key and value pairs, that's perfectly fine. If it's with a number sign, the hash symbol, then that's something that we would drop. Okay. Yes. By adding event schema markup, will it affect my site? Like, in terms of page size? What are you adding again? I missed it, sorry. By adding the event schema code to the website, will it increase my page size? If you're adding more code to the pages, then that increases the page size. Usually it doesn't affect the time for a browser that much if you just add some structured data, because it's not like a giant block of text. It's kind of reasonable in size, and it compresses well. So it does affect the size, but usually not so much, so to speak. And with regards to that tool, I suspect you'll move it over? So, I mean, that question comes up, I don't know, every day: this tool is only in the old Search Console, does that mean you're abandoning it? It doesn't mean that we're dropping it. We try to drop all of the things that we know we don't want to move over, and for the things that we have left at the moment, we're trying to find ways to move them to the new system so that we can continue to use them. But do you really consider the settings I make in the tool, or is it just for me? No, no, we really use that. We use it so much that sometimes the indexing team will contact us and tell us to contact websites and say, hey, you're doing it.
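[Editor's note: the point about structured data being small and compressing well can be made concrete. Below is a minimal JSON-LD Event block, with entirely hypothetical values, and a measurement of its raw and gzip-compressed size, the latter being what actually travels over the wire when the server compresses responses.]

```python
import gzip
import json

# A minimal, hypothetical JSON-LD Event block of the kind discussed above.
event = {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "Example Concert",
    "startDate": "2019-08-01T20:00",
    "location": {"@type": "Place", "name": "Example Hall"},
}

raw = json.dumps(event).encode("utf-8")
compressed = gzip.compress(raw)

# Raw size is a couple of hundred bytes; with gzip enabled on the server,
# the repetitive schema.org keys compress further on larger blocks.
print(f"raw: {len(raw)} bytes, gzipped: {len(compressed)} bytes")
```

For a tiny snippet like this the gzip header overhead dominates, but the takeaway stands: a structured-data block is a rounding error next to typical page weight.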
You're blocking things. That occasionally happens. It's really easy to use that tool and say, oh, Google should ignore this parameter, and then you don't realize that actually a big part of your website relies on that parameter and you set it wrong. So that's why we have such big warnings there, that you should really know what you're doing before you click around. Okay, let's take a break here. I'll copy all of the chat out and double-check the things that were left, before it all disappears. Thank you. Thank you all for joining in. Thanks for all of your questions, comments, and answers. Thank you for coming by in person. Thank you for having me here. And I'll set up the next batch of Hangouts; I might set some up for next week just to try out different setups, because apparently Hangouts on Air is going away on the 1st of August, which is not that far away. So we need to figure something out. All right. Thank you, everyone, and have a great weekend. Bye. Thanks, guys. Bye.