All right. Welcome, everyone, to today's Google SEO Office Hours Hangout. My name is John Mueller. I am a search advocate at Google in Switzerland. And part of what we do are these Hangouts where people can join in and ask their questions about their websites and web search. And we can try to find some answers for you. A bunch of things were submitted on YouTube already. So we can go through some of those. But we can also go through some of your questions first, if any of you want to step up and mention something that's been on your mind. Can I ask a web stories question then? All right. Do web stories follow the same sort of rules in the domain setup? So if we have that content living at /uk with a bunch of web stories, and then /us with a bunch of web stories, is that the best way to handle that? Or can you just have /web-stories and Google will work out what the best audience is? We're creating AMP pages specifically for those. So for geotargeting in particular? Yeah, because it'll be on a global site where we'll have US, UK, South Africa, Australia, et cetera. Would you have a central pool of those web stories? Or would you do them at the country domain level? I don't know. OK. I don't know. So I think web stories are only shown as kind of this unique UI thing in certain countries. I don't know if that's in the UK, but it might be the UK and the US, something like that. The Telegraph has it. So I assume, in fact, I think the Telegraph is one of the examples that they use within the. OK. So unless I'm going mad. OK. Well, then maybe you're seeing them, because we don't have them in Switzerland. So I'm just assuming what they will look like. But essentially, for all other countries, we show them as normal web pages. So they're normal AMP pages, and we will index them like normal web pages. So that means any geotargeting that you have there for those pages, depending on where they're located, that would apply there as well. 
But I don't know how that would apply for the situation when we show them as kind of that web story UI in Discover and in the search results. OK. But it couldn't hurt to follow a country-level rule, presumably, because if the country doesn't exist in web stories, let's say it's Australia or whatever, then it's not going to find it anyway. So it doesn't matter. Yeah. Yeah. Yeah. I don't know. It's also one of those things where, since it's a bit of a new UI, I could imagine that they're kind of watching out for how these web stories are used and how they perform. And if you have the same story for multiple countries, then it could be that it comes across as, oh, you're just republishing the same content multiple times. So that's kind of the unknown from my side, if it comes across as you're republishing things or if it's purely tied to geotargeting. Right. I mean, luckily for us, we can do geotargeting and content per country, but OK. As far as I know, I think some of the product experts experimented with this a bit. And I know somebody with, I think, a UK site created web stories that got displayed in US Discover results, because most of the traffic seemed to have come from US users, despite that being a UK website. So my guess is that perhaps we should just treat them like a normal blog post. Like if you have blog posts that are just in English and are not targeting a specific audience, you wouldn't necessarily make duplicates for each kind of country version. You just have that blog post and that's it. Or maybe your commercial site has different country or language versions, but you have one single blog and you keep that separate and just in one language or whatever is easier. And Google kind of picks it up and shows it how it makes sense for Google, basically. Right. I mean, ours is country-specific subfolders with country-specific content. This is for experienced GIFs, not the other site. So it would be specifically for that country. 
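The country-subfolder setup being discussed can be sketched with a small helper that decides which country a story URL is geotargeted to. This is a sketch under an assumed URL layout (/uk/web-stories/..., /us/web-stories/...); the folder names and country list are hypothetical, not anything Google prescribes.

```python
from typing import Optional
from urllib.parse import urlparse

# Hypothetical layout: example.com/<country>/web-stories/<story>
KNOWN_COUNTRIES = {"uk", "us", "za", "au"}

def story_country(url: str) -> Optional[str]:
    """Return the country subfolder a web story URL sits in,
    or None if the story lives in a central, untargeted pool."""
    segments = [s for s in urlparse(url).path.split("/") if s]
    if len(segments) >= 2 and segments[1] == "web-stories":
        country = segments[0].lower()
        if country in KNOWN_COUNTRIES:
            return country
    return None
```

A central pool like example.com/web-stories/my-story would return None here, matching the "let Google work out the audience" option.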
It's interesting to hear about the Telegraph. So the actual web stories do work; they're just not being shown in the web story UI in specific countries. But they themselves are still being indexed by Google as far as I know. OK. Yeah, I'll double check with the web stories folks on that. OK. I got a Core Web Vitals question for you, John. I think it makes sense. I just want to understand the rationale behind it a little bit more. So metrics like LCP, which is measured in time, so like two and a half seconds for good, that is dependent on internet connection, right? Like that weighs heavily into that metric, right? So better developed countries with faster internet are going to more likely have a better LCP than the same website perhaps where the audience is in a country with an underdeveloped internet connection. Is that accurate? That can happen, yeah. I mean, for speed, the internet connection is one thing, but the way that the pages are built up is not purely tied to the internet connection speed. So that's something where... superman can't fly that boot. Whoop. OK, doggy. OK. Let's hope this doesn't go too crazy. Sometimes we have, I don't know, these kind of Zoom bombers drop in and do crazy stuff. So we'll see. May I ask a very basic question, please? Sure. I am just so stuck. I am not as technical as all of you. I am using Google Sites, not Suites. And I made a website. And I've enabled Google Analytics to track the property, and I simply cannot get Google to crawl the site. And when I get into Google Search Console, I've auto-verified the property. When I do a URL inspection, I get a message that the URL is not in the property. I did wait a couple of days, but that's been stuck for two weeks. I just wonder what to do. My recommendation there would be maybe to post in the Webmaster Help Forum. So Mihai is one of the experts there. But sometimes this depends a little bit on the specifics of your case. 
So in particular with Google Sites, there is kind of a weird situation with the way that the URLs are structured, in the sense that it can be that you're testing a URL which is one of those URLs that we don't actually crawl, but which is something that you might see in the UI. So that's sometimes a bit, I don't know, quirky with Google Sites in particular. And kind of going to the Help Forum, posting your URL there, usually people will be able to take a look and say, oh, you should have been checking the other URL instead of that one, for example. Which site is that? Which webmaster forum did you recommend? It's the Google Search Central Help Forum. I can drop a link into the chat as well. That would be terrific. Thank you so much. Sure. Hi, John. Hi. This is Kiran. How are you? Hi. How are you doing? Yeah, I'm good. John, the thing is, actually I found a fault, a bias, in the Google Webmaster algorithm for the software category. Is it possible to discuss that personally? I've already spoken with your representative for India, the country manager. Regarding this, the reasons he gave did not satisfy me. Actually, Google is a company that should provide a solution for that. For my category, the reasons they gave me did not satisfy my business. Can we discuss this personally? Not in this chat room? Because I lost yearly, for example, $250 million in my company due to the... I think you can understand. Can we discuss this personally? Is it possible? Usually, we don't set up one-to-one meetings. So that's something which usually we don't do. If there's something where you're running into situations with regards to maybe the monetized side of Google, usually you can reach out to a, what are they called, account manager to discuss that there. But especially when it comes to search, it's not that we would set up a one-to-one meeting with any company. It's OK. 
How do I contact the account manager in India, John? I don't know. That's something where you'd have to go through whatever monetized channel that you're working with. If you're working with Google Ads, then you'd have to go through that, for example. Or if you're using AdSense, for example, you'd have to go through that. Right, right, OK. Then we can discuss it right now regarding this. For e-commerce platform business category people, the search results are very good, right? So there are no issues for the e-commerce business end, or news angles, or in blogs, for bloggers, like that. For software categories, software publisher companies, like file repositories and software media sites, like CNET, for that kind of category company. Actually, I launched a company called SNFileCoffee.com in 2014, and I started building with large enterprises to publish software for my country, India. So there is no representative in India to publish software in India, because software publishing is not an easy task. It's a very highly challenging field, right? For that, I started. But due to the Google bias engine, how Google is seeing the software category site, Google is seeing the category for them as worldwide, because anyone can be downloading throughout the world. Like if I publish the software on my domain, even the Japan people are also able to download the file, or United States people can download the file, right? Right, John? Right. Fine. But the demand is actually in my country. It's the second largest country downloading the software. So I started the company from India, but I'm not getting any revenue from here. So how are you going to deal with this, John? OK, so you're not seeing your site visible in the search results in India? Yes, you're right. Yeah, OK. 
My recommendation there might also be to go to the Webmaster Help Forum and post some specifics, in particular some searches that you see when you're looking from India, to kind of highlight the issue that you're seeing there. And the folks in the Search Central Forum, they can escalate these if there's something on Google's side that needs to be done. They can also give you tips with regards to what you might be doing, what you might do differently, with regards to geotargeting, with regards to kind of mapping the content to what the queries are looking for, those kinds of things. OK, yeah. Actually, they're giving us some of those tips to do that. Actually, they are recommending AdWords, but because I am a startup company, how many millions of dollars can I spend on AdWords? So is it possible to do that? I have no idea how AdWords works. So that's also something where it's like, sometimes when you're a new business, you do various things to get popularity and to kind of grow a little bit more well-known first. And that can include ads, but ads is not something that we do from the organic search side. So that's something you'd have to kind of figure out separately. OK, OK. OK, thank you. John, can we just touch back on my Core Web Vitals question again quickly? Thanks. So yeah, just about the internet connection affecting the speed. So basically what I'm trying to understand: I have a multinational site where the majority of the customers, or some portion of the customers, are in North America, where there's a better internet connection, and then a portion of the customers are in more underdeveloped countries with a slower internet connection. It's essentially the same architecture, different content, but the speed scores might be the same. But then because the internet connection is poorer in maybe South America, I have a worse LCP score there. So what can I do about that? Or what's the methodology behind that, I guess? 
In general, our idea with the score is to map where your users are actually using your site from. So if users are primarily in one location where they have bad internet connectivity, you're kind of competing with other sites who are also kind of focusing on those users. So it's not that you have kind of a disadvantage just because you happen to have users with a slow internet connection. It's more, well, for those users, those are the sites that are available, and maybe they will come to you as well. Maybe they'll go to other sites. So it's a level playing field, basically, is what you're saying, yeah? That's kind of the idea. So it's not just, well, this site focuses on users in the US, and they have really fast connections. Therefore, they have good scores, and we will rank them higher. It's more that, well, depending where your users are coming from, we will try to rank you like that. So would you look at the ranking on a per country basis? Or if it's a multinational site like that, the US section, would that have a different kind of benchmark or, I guess, be against a different audience than, I guess, the other sections? Could be. So there are a few things that kind of play into that. On the one hand, we have to have enough data from the core web vital side to understand these segments. And if we have data for those segments, then we can apply that. Whereas if we don't have a lot of data for your website, then maybe we'll just say, oh, the whole website has this one data point. And then it's kind of all bundled together in that one aggregated data point. But if we have more data, and you'll see that in Search Console, if there are multiple sections there with a number of URLs, you can see a little bit more like, well, there's data for the US part. There's data for the South American part, for the European part, all of those. And I section those different properties or those different subfolders out into different properties anyway. 
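The per-segment field data John describes can be illustrated with the published LCP thresholds (good at or under 2.5 seconds, poor above 4.0 seconds, assessed at the 75th percentile). The per-country sample values below are invented for illustration; only the thresholds and the p75 idea come from the Core Web Vitals definitions.

```python
import math

LCP_GOOD_S = 2.5   # "good" threshold for LCP, in seconds
LCP_POOR_S = 4.0   # above this is "poor"

def p75(values):
    """75th percentile, nearest-rank method (field data is assessed at p75)."""
    ordered = sorted(values)
    return ordered[math.ceil(0.75 * len(ordered)) - 1]

def rate_lcp(seconds):
    """Bucket an LCP value into the three standard ratings."""
    if seconds <= LCP_GOOD_S:
        return "good"
    if seconds <= LCP_POOR_S:
        return "needs improvement"
    return "poor"

# Hypothetical per-country LCP samples (seconds) for the same pages.
segments = {
    "us": [1.8, 2.1, 2.3, 2.4],
    "za": [2.9, 3.6, 4.4, 5.1],
}
ratings = {country: rate_lcp(p75(samples)) for country, samples in segments.items()}
```

The same pages can come out "good" in one segment and "poor" in another purely because of who visits from where, which is the situation the questioner describes.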
So I can use that data that way, then, I guess. Sure. Yeah, OK. And then just to further your point with one more thing: what if a site is not in the CrUX data set at all? How does it get ranked then? Yeah, I mean, we run into this problem with all kinds of metrics. So that's something where we kind of have to figure out how do we start the site off. And that's something which you see not specific to the speed side, but which you also see with regards to, I don't know, lots of other factors, like around the quality of the site overall, where some people will say, oh, there's this Google honeymoon period until Google has enough data from my site, or I'm in the Google sandbox because Google is ranking my site lower than it should be in the beginning until we have enough data. And that's kind of reflected in there. It's not so much that we treat it in a bad way or in a good way, but we have to make some assumptions. And over time, we should be able to collect data on that. OK, cool. That makes sense. Thanks so much. Sure. Hi, John. Hi. Hi, John. It's Deepak here. I'm from India. Cool. Hi. I'm doing a job at an MCC company. Actually, my question is regarding basic and technical website structure. Actually, I'm facing some issues with my website. There is an issue regarding crawling and indexing. It's the bigliving.co.uk website. I have changed the website structure many times. But still, I'm not able to get my website indexed in search engines. Some basic category pages are indexed, but other pages, like filter pages and internal subpages, are not indexed properly. So that's why I'm not able to get good results. And this site is what we can call an enterprise website. OK. Can you please guide me on what I have to do with this website or structure? Yeah. So it's hard to say without looking at the website in detail. So that's something where I might also suggest going to the forum to get some tips there. 
But in general, changing the website structure makes it a lot harder for us. So if you've changed the structure a few times already, then every time you change the structure, we have to understand a new structure. And that takes time. So finding a way to focus on one structure that you can keep for the long run, I think, is really important. So fewer changes and more focus on one clear structure. The other thing that I think is important is focusing on fewer pages as much as possible. So you mentioned filter pages, for example. And if you can avoid having all of these filter pages indexed, by setting them as noindex, for example, then we can focus a lot more on the actual detail pages. So that's something where, when you look at your website, maybe if you're seeing that Google is struggling with crawling and indexing, then think about which pages do you really, really care about? Which ones are really important for your business? And make it so that search engines can really focus on those pages. So limit the links to the other pages, limit the indexability of the other pages, so that we can really focus on the smaller set of pages that you find important. OK, sir. One more question, please. It's regarding the URL structure. Obviously, what we were talking about recently was site structure. But what do you think about URL structure? I mean root domain, category, then subcategory. After that, the product page. So is that a right URL structure? Yeah, I think that's perfectly fine. Like, URL structure, I don't have super strong opinions on that, because different sites make it in different ways. And usually if you have unique URLs, it just works. But that's a reasonable structure. That's also, like I said, something I wouldn't just change. So if you get advice from some other SEO and they say, oh, you should put dashes instead of subdirectories, for example, I wouldn't change that kind of thing, because it makes it much harder for us to understand your site again. 
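The advice about keeping filter pages out of the index could be implemented by emitting a robots meta tag whenever a URL carries filter parameters. A minimal sketch; the parameter names here are hypothetical, and a real shop would use its own list.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical filter parameters a shop might append to category URLs.
FILTER_PARAMS = {"color", "size", "price_min", "price_max", "sort"}

def robots_meta(url: str) -> str:
    """Return the robots meta tag a page at this URL should emit:
    noindex for filtered variants, index for the plain category page."""
    params = parse_qs(urlparse(url).query)
    if FILTER_PARAMS & params.keys():
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'
```

The "follow" part keeps link discovery working even on pages that are kept out of the index, which fits the "limit the indexability, focus crawling on the important pages" advice above.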
So instead, kind of pick one structure that you're kind of OK with and keep that for the long run. OK, sir. All right. And all right, sir. Thank you very much. I think I'm satisfied with your answer. Sure. I have a follow-up question. Sudhaj? I do, I do. OK. Yeah, so today I would like to know that, actually, our website is an e-learning website. And it's ranking well and it's getting good traffic. And actually, we want to apply the story branding thing to our website. So for that, we want to cut down the content on our service pages and on the home page. And we want to put more graphics in it. And let's say maybe we cut down the content up to 75%. What kind of impact will it show in the rankings, as well as the traffic? I don't know. It really depends. So if you're removing content, then obviously we can't rank you for that content. If you're removing things that are not important for your site, then that's perfectly fine. If you're removing something that is critical for those pages, then of course, those pages won't be visible for those queries. So yeah. So I would like to add that we will take into consideration the keywords, and we will put those keywords in the content. And I mean, we want to write from the customer's perspective. So because of that, we want to trim down the content and make it very crisp for the users. So yeah. So my recommendation there, if you're looking at trying to make a bigger change, like removing 75% of the content seems like a big change, I would take a handful of pages and test it, and make the changes there and see how they perform. If they perform OK, then maybe do the same work on the other pages. If they don't perform as well as you expect, then maybe try different things out and test that. OK. If they're not working well, if I restore the page again with the same content which I have deleted, will it rank again? Sure. Yeah. 
I mean, it obviously takes a little bit of time for indexing to catch up, but it will be like before. OK. Great. And one more question, the last question. Actually, on our blog, we upload one article each day. I mean, we publish one article each day. So what do you suggest is the best thing, I mean, in terms of content numbers? How many articles are we supposed to upload in a week or a month? That's up to you. Some sites make very few changes and few articles. Others make lots of articles. It really depends on how much content you have available that you can create and what people are looking for. OK. Great. And thank you so much. And I appreciate Google for conducting these kinds of webinars, and Merry Christmas. Thanks. You too. Martin, I think you had a question. Yeah, a follow-up question on the change of the internal structure of the domain. Let's say you really change a big thing, like the main navigation, for example. What kind of time frame would you recommend until all the signals have been reprocessed, so to say? I would assume you would see the large part of the reprocessing taking place within maybe two or three weeks. In general, we try to recrawl all pages within six months. So the long tail will take a bit of time to be updated. But the pages that we crawl the most, that we think are the most important for your site, they should be updated within a couple of weeks. So my guess is, at least after a month, you should see the effects of a change like that. All right, that's great. Thank you. Sure. Hi, John. Hi. So which method should we choose if we have products in a category for a few months? They are indexing very well. They rank very well. But one shiny day, the owner deletes them or removes them from the categories. And what method would be better for us to choose: noindex, canonical, or something, or to leave them like a blank page? So it's a product that you're removing from your site? Is that correct? 
Yeah, but the question is about the category, because the categories rank. And in the category there are very many products, but they finish one day. Oh, so you remove the whole category? Yeah, all products from the category. OK. And the category is empty for a few months. But after a few months, they will come back. I would just make it a 404. 404. Yeah, I wouldn't try to do anything too tricky there. I think if you set it noindex, the same thing will happen. If you try to set it canonical to a different category, we'll treat it as a soft 404. So from that point of view, I think a 404 is kind of the easiest approach that has the same effect. OK, and if we get the products a few months after that, we should create new categories? Sure, yeah. And if you have new internal linking pointing to those category pages, then we will pick it up. Yeah, of course. And thank you so much. And one more question about the last update. I was wondering about something after that date. Do you think it's reasonable to immediately take some action, or should we wait until it's over? Because sometimes webmasters try to change something while the update is rolling out, but I don't know if it's a very smart decision. What is your advice here? I mean, so I think the core update, I think that's finished rolling out. So from that point of view, it's finished anyway. But in general, if you have things that you can improve on your website, why would you wait? Especially if there's something bigger that you want to improve on your website, then there's always some update happening at some point. So I would just make those fixes and improvements, and keep track of them, of course, so that you know a little bit when what happened. But I would just make those changes. Yeah, thanks. And do you think that a website that lost its visibility can recover without SEO specialist intervention in the period between two updates? Sure. It's possible. Without anything? Like, without an SEO specialist? 
You can't pick up a magic wand and kind of make a website magically appear. It's something where, especially when it comes to the core updates, we're looking at the relevance of the website overall. And that's something where anyone can take a website and say, oh, I will rewrite the content and I will recreate the website. It doesn't need to be an SEO. I think there's a little bit of an effect there in that SEOs often know a little bit more of what's happening and what Google might be watching out for. But anyone can update a website and make it better. Yeah, thank you so much, John. Sure. Let me run through some of the submitted questions. And we'll definitely have a bit more time to chat as well afterwards, but just so that we've kind of run through those quickly. We experienced a severe ranking drop. Our domain is only visible for brand searches. All other rankings have nearly completely vanished, for three weeks now. On the other hand, we do not see any alert in Search Console. Is it possible that there is a bug in Search Console so that no alerts are shown, even if the domain is punished via a penalty? So it's always possible that there's a bug in Search Console. I mean, it's kind of like computer software. It's always possible there. However, I would be very surprised if there were a bug in particular with regards to the manual actions. So we call them manual actions. People externally tend to call them penalties. Manual actions are something that just happens all the time, and lots of sites see them. Lots of sites try to fix them. And that's something where, if there were a general bug with regards to how we show manual actions, I think lots of people would notice it and complain about that. May I ask a question? Sure. You already know the situation, but can you imagine a situation where the rankings vanished, and the reason for that, if it's not a penalty? Yeah. 
So I don't know exactly the number of weeks, but we recently had a core algorithm update. And that might map into that time frame that you're looking at there. And with the core algorithm updates, we try to improve the way that we understand relevance in the search results. And with that, it can happen that a website becomes suddenly less visible, and significantly less visible in some cases even, than it was before. And it's not that there's something that you're explicitly doing wrong, or an error or something on your website. It's more that our systems feel that this is the way that we should show websites in search now. And we have a blog post about the core algorithm updates. So I would go through that. So we already checked this. That was our first thought. It happened around the 24th or 25th of November. So we are pretty sure it's not the core update. OK, I mean, there are also lots of other changes that can happen in search. So it can certainly happen that our algorithms are picking something up on your site and reacting to that. If it's not an update, I can provide more information. So there was an incident. A link building network, an illegal fraud network, was pointing at 300 domains which were in my company's portfolio. So they were hosted on our own servers. And by linking from the link network to our domains, those domains got indexed. And they all had a redirect to our main company page. So is that a hint as well, which can help to clarify it? Usually something like that wouldn't be a problem for us. Because if it were a problem, we would flag that as a manual action. And you would see that in Search Console. And yeah. The servers, so it's a huge company, and the servers from my company, which is something like a daughter of the big company, they are hosted on the same server. So the traffic from the indexed URLs points at our domain on the same server. So that's why I asked whether it's not treated as spam. 
Or whether it is treated as spam because they are hosted on our own servers. So if they were foreign servers, not under our control, this spam attack would not have those consequences. Yeah. I don't know. My general feeling is we would still be able to deal with that normally. But it sounds like you have a very unique situation. So that's something where I would also maybe point at the help forum, maybe post there. You can also drop your URL here in the chat. And I can take a quick look afterwards. I can't make any promises with regards to what we can do there. But I'm happy to take a quick look to see that things are kind of working as expected. I just need to ask someone if I can put out what I want to. Sure. Sure. No problem. Thank you. OK. All right. Let me run through some of the other submitted questions so that we get a few of those covered. During the core update rollout, is it like the quality of the website is calculated from the overall site signals, and then this site quality score is propagated to every page gradually, page by page? Is it possible that some pages drop and some pages surge and the overall traffic to the domain remains the same? So when we try to understand the relevance of a website, on the one hand, we try to look at the bigger picture of the website. But we do also look at smaller parts of a website. So it can certainly happen that some things go up, some things go down. And on average, across the domain, you will see some change, or maybe it'll even out, even in kind of weird coincidental cases. So that's certainly possible, the way that you're seeing things there. And it's also that there are always a lot of different things that come out with regards to search. And some are a little bit more focused on the domain or on a bigger picture of the website. Some are focused more on smaller parts of a website. 
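A toy illustration of that aggregation point, using invented per-section traffic numbers: individual sections can move sharply while the domain-level total barely changes.

```python
# Hypothetical organic traffic per site section, before and after an update.
before = {"/blog/": 10_000, "/shop/": 6_000, "/docs/": 4_000}
after  = {"/blog/": 13_000, "/shop/": 3_500, "/docs/": 3_600}

def pct_change(old: float, new: float) -> float:
    """Percentage change from old to new, rounded to one decimal."""
    return round(100.0 * (new - old) / old, 1)

# Per-section swings are large...
section_changes = {s: pct_change(before[s], after[s]) for s in before}
# ...but the domain-wide total nets out to almost nothing.
total_change = pct_change(sum(before.values()), sum(after.values()))
```

With these made-up numbers, one section gains 30% and another loses over 40%, yet the domain total moves by well under one percent, which is exactly the "some parts up, some parts down, average unchanged" case described above.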
So even outside of a core update, you might see these shifts, with some parts of your site going up and some parts going down. When will you bring back the request indexing tool? I don't know. So we will see. I know that lots of people want it. And the Search Console team is pretty close to kind of getting that lined up again. I don't know what the timing there will be. There is some chance that we might still get it this year, though it's getting really tight. From a ranking standpoint, Core Web Vitals use real user data; I think we talked about this briefly before. In March, your big mobile update will come. If we are ranking well at the moment on mobile, can that change with the mobile update? For example, our internal linking is basically nonexistent on mobile. Our mobile navigation menu is blocked for the crawler currently. How much is our desktop page actually influencing the rankings on mobile? So I didn't take a look at your specific site. But if your site has already shifted over to mobile-first indexing, then you will not see any changes in March when we shift the rest of the web over, because we're already using the mobile version of your site for indexing there. And for the largest part, we have shifted over most websites. So that's something where, if we've shifted it over, you're not going to see any changes. If we haven't shifted your site over to mobile-first indexing, then that does mean that, on the one hand, our systems are not sure if your site will work well with mobile-first indexing. And if your site still doesn't work well with mobile-first indexing in March, then we will pick whatever mobile content you have and index that, which might be worse than your site is now with regards to indexing. So that's something where, depending on your case, you might see changes or you might not see changes. PageSpeed and Core Web Vitals question: does the origin summary include only indexed pages, or the whole domain experience? 
So I don't know, offhand, how the Chrome User Experience Report data is collected with regards to what goes into the origin summary or not. But there is fairly detailed documentation on the Chrome User Experience Report data. So I would double check that there. I still get alerts in Search Console from an old subdomain that I moved away from many years ago. I'm concerned that it's affecting my ranking with the current and active domain. Should I fix these errors? Probably not. So Search and Search Console, they keep track of things for a really, really long time. And it can happen that you move to a different domain and the old domain still exists. Maybe there's no hosting there anymore in the meantime. And if anything happens where we try to crawl a page and find errors, then we will flag that in Search Console. But that doesn't mean that there's any effect on your current site with regards to Search. So for the most part, if you've moved away many years ago and something gets flagged on the old domain, then you can just ignore it. I want to split a website into two websites. I want to keep some pages and posts on the current domain, but move other pages and posts to a separate website. It will keep the same branding, structure, basically everything. I just want to separate visitors and make the two websites easier for them. Except for 301 redirection for the moved pages, is there anything else I should keep in mind to do in order to maintain the website reputation and not lose the ranking of those migrated pages? So I think the redirects are really the primary thing that you should focus on. But any time that you split or you merge websites, the end result is really hard to determine ahead of time. And it can be that things are better. It can also be that things are worse overall. 
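For the split scenario, the redirects called the primary thing here could be driven by a simple old-path-to-new-URL mapping. A sketch with hypothetical domains and paths, not a prescription for any particular server setup.

```python
# Hypothetical mapping: path prefixes moving from the current domain
# to the new, split-off site.
MOVED_PREFIXES = {
    "/recipes/": "https://recipes.example.com/",
    "/guides/":  "https://recipes.example.com/guides/",
}

def redirect_target(path: str):
    """Return (301, new_url) for moved paths, or None to serve the page
    from the current domain as before."""
    for prefix, new_base in MOVED_PREFIXES.items():
        if path.startswith(prefix):
            return 301, new_base + path[len(prefix):]
    return None
```

A table like this keeps the redirects permanent (301) and one-to-one, so each moved page forwards its signals to exactly one page on the new site.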
So in particular, if you're taking a website and splitting it up into lots of small pieces, not just two, then it can happen that all of those pieces individually are just kind of OK from a quality point of view, but not good enough to compete in a competitive environment. So if you're active in a very competitive area and you split things off into smaller pieces, it might be harder for us to rank those, whereas if you combine them, it might be easier. On the other hand, if it's a less competitive area, then it's very possible that we'll just index all of those versions and show them appropriately in the search results. So it's really hard to determine ahead of time what the SEO effect will be. And because of that, I would recommend doing this kind of split or merge primarily when you have really, really strong business reasons to actually do it. If you really know your users are annoyed when you have both of these things on the same website, and maybe they convert worse because of that, then I'd say, well, maybe it makes sense to split these off. But if you're just trying to tweak things slightly and you don't expect a big effect on the user side, then my suspicion is that by splitting things off you will overall be in a slightly worse situation than before.

When will Search Console allow the disavow file to be submitted for domain-verified sites and apply to all four versions at the same time? We don't want to do it four times. I don't know. I think that's on the Search Console list of things to do. In particular, when they rolled out the new disavow feature, we did bring them that feedback, because it also came from folks like you. So I don't know what the timing there will be. The good part, I guess, is that for the most part, it's really rare that you have to submit a disavow file.
So if you do this in an extreme case once a month and you have to do it four times rather than once, it's not a ton of time saved. On the other hand, if you're doing this daily, you're probably focusing on things that don't have as strong an effect as you'd expect.

I'm starting my new website and I'm making it accessible for people with disabilities. My question is, how can I include animation and videos and infographics if I want to make it accessible? Second, does accessibility cause trouble when it comes to indexing? So I don't have a lot of expert advice or background knowledge with regards to accessibility. So it's hard for me to say what you should really focus on there. But there are definitely experts out there who can help you figure that out. With regards to indexing, usually I find that if you make a website more accessible, it actually becomes better for search. Because usually that means you don't just have one image that Google has to figure out the meaning of, but rather you have the image and an alt attribute for the image, and you give more details about the image. And all of that makes it easier for us to understand what your site is about. So with regards to crawling and indexing, making things more accessible usually has a positive effect rather than a negative one.

Should I separate my e-commerce site from my main website? What's the guideline, if any? Would it be better for SEO? You can do it either way. Some sites have a blog and an e-commerce site on the same domain. Some split that off into multiple domains. It's essentially possible for you to do that in any particular setup. So I would focus more on what works well for you, what works well for tracking and analytics purposes, and go from there.

I'm a novice who has managed my website this past year.
I'm really having difficulty understanding Search Console with regards to my accounts. It looks like everyone here is more expert. Is there a more appropriate Google forum? I would probably go to the Search Central Help Forum. I keep saying that today, I don't know. It seems like it's usually a good place to go for these general questions, like how do I verify my site, how do I get things started and set up with regards to search. There are lots of really smart and friendly people there. So I would tend to go there. And I'd post your details there, like the URLs that you have for your website, so that someone can take a look and say, oh, you need to put your meta tag here, or you need to do this in your WordPress, or whatever you're using.

How many articles are needed for Google Discover to find our site suitable? I don't think there's any limit with regards to Discover. However, the Discover content policies are a little bit different than web search, and what we focus on for Discover is a little bit different than just general web search. So I would check out the content guidelines for Discover and see how you can perhaps tweak your content to make it more suitable for Discover, and then keep working on that. I think Discover is a really fascinating thing because it shows your website to people who are not actively looking for it. But on the other hand, that also means it's really hard to create content specifically for Discover, because you don't know what people are not searching for where your site could be shown. So it's a bit of a tricky, tricky place. But I think it's really cool.

OK, more and more questions. Maybe I'll just switch back to questions from you all to kind of close things out. So I've dropped you the URL in the chat. OK, cool. I'll copy that out and take a look at it later on. Great, thank you.
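Going back to the earlier point that accessible markup, such as alt attributes on images, also helps crawling and indexing: here is a minimal sketch of one such check, finding images with no alt text, using only Python's standard-library HTML parser. The sample HTML is invented for illustration:

```python
# Illustrative sketch: flag <img> tags that are missing alt text, one
# small accessibility check that also tends to help search indexing.
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects the src of every <img> without a non-empty alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing = []  # src values of images lacking alt text

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "(no src)"))

checker = MissingAltChecker()
checker.feed('<img src="a.png" alt="A chart"><img src="b.png">')
# checker.missing now holds the images that need attention.
```

In this sample, only `b.png` would be flagged, since `a.png` already carries descriptive alt text. A real audit tool would of course cover far more than alt attributes, but the principle is the same: the extra detail that helps assistive technology is the same detail that helps a crawler understand the image.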
John, regarding Discover, first of all, is there anything new that the Discover team is likely to come out with any time soon in terms of more recommendations, or maybe more insights for people, especially people who had traffic from Discover and now they don't, maybe some insights into what they can do to improve? Yeah, I don't have any idea of timing. So that's always tricky, to come and say what will happen in the future. I really don't know what the timings are there. I do know the Discover team cares a lot about people who are trying to get content visible in Discover. And they'd like to make it a little bit easier to understand what is happening there. So it's certainly possible that some things will come out with a little bit more information, or some more tips on what you could be doing differently. But I don't know what the specific plans or timelines would be.

So, as I mentioned in one of the previous hangouts, we've been working with a publisher that was removed from Discover in May. And we kept making sure that the new content they publish is in line with the Discover policies. And they improved the content and the overall user experience as well. You mentioned that Google, including Discover, requires quite a bit of time to process all of these changes. Even if the content is indexed, it might still need some time until Google understands that, OK, this is a good enough quality website to be included in Discover again. And you mentioned, especially for large websites, that a month is a small amount of time. What would you think is an appropriate amount of time to wait? I don't know. I mean, it's tricky on the one hand with regards to bigger quality changes on a website. So that is really hard to say. And on top of that, I don't know in particular how Discover pulls all of that information together.
So for Search, I would guess a couple of months for a large site should give us time to understand it a little bit better. I don't know if Discover would do anything past that or would take longer. OK, but so definitely a month is too little to see a significant impact. So I'm guessing two, three, four months and beyond would be a more appropriate time frame? OK.

And I know you mentioned that with Discover, the new content, the fresh content, has a heavier weight in terms of how Discover calculates everything. So just to confirm, it's not really required to go back months or years and update that old content? I see it as something that's not particular to Discover itself, but more about how we understand websites. And the kind of website that you're talking about sounds like something where you're generally creating new content all the time. That means within your website, when we understand your site structure, we focus essentially on the newer content and on the main category sections of the website. Which means that overall, for indexing, we would automatically focus more on that newer content, because it's just very prominently linked within your website. You're giving it a lot of weight. So regardless of Discover, or just Search in general, if you're constantly creating new content, then that's where we will shift our focus over time.

So it's not the case that the old content might pull everything down, especially if the site has been online for five years or something like that? It's not like those four and a half years are pulling down the last six months of new, best content? I mean, it really depends on your website. And if you look at the traffic from Search and the analytics traffic, you can probably make guesses on which parts of your site are critical for your site.
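The idea of using traffic data to guess which parts of a site matter most can be sketched roughly as grouping URLs by their first path segment and summing clicks. The rows below are invented; in practice something similar might come from a Search Console performance export:

```python
# Illustrative sketch: estimate which site sections get the most
# clicks by grouping URLs on the first path segment. The data is
# made up for illustration only.
from collections import Counter
from urllib.parse import urlparse

rows = [
    ("https://example.com/news/story-1", 120),
    ("https://example.com/news/story-2", 80),
    ("https://example.com/reference/spec", 300),
]

clicks_by_section = Counter()
for url, clicks in rows:
    segments = urlparse(url).path.split("/")
    # segments[0] is always "" for an absolute path; segments[1] is
    # the top-level section, or "" for the homepage.
    section = segments[1] if len(segments) > 1 and segments[1] else "(root)"
    clicks_by_section[section] += clicks
```

With the invented rows above, the reference section outweighs the news section, which is the kind of signal that might suggest a reference area is critical for the site even on a news-focused property, along the lines of the example discussed next.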
So I could, for example, imagine a situation where you have a news-type website, where we focus on news, but you also have a reference section somewhere. And the reference section is really important because you're the only reference site for that kind of content. Then we would say, well, this reference part is really important, but there's also a news part, and we try to find a balance between those two parts. On the other hand, if it's purely a news website and we can purely focus on the new content, and that's really clear when we crawl and index your site, clear from all of the signals that we get, then obviously we'll focus on the news part.

OK, that's for Search. I'm guessing for Discover these reference sections might not be as relevant when calculating things? I mean, it kind of depends on what you're focusing on. So if you're looking at an overall quality issue with regards to your website, and you have this reference part that's really important for your website but is really low quality, then we will still balance that low-quality part against your newer, higher-quality news content and try to find some middle ground with regards to how we understand the quality of your website overall. But it really depends on your website. And it's not so trivial to just say, oh, we will take a look at all the Search Console traffic and impressions, kind of thing. We try to figure out what is important for the site overall. Right, right. OK, got it.

All right. Any more questions from anyone? John, I have a question. Sure. Actually, it's regarding website schemas. How important are they for a website? Are they really useful for increasing a website's visibility in a search engine, or do they help it rank higher? It doesn't change the ranking. It changes how we show the pages in the search results. So it's not that you will rank higher if you add structured data.
But if you do have structured data that we can use for rich result types, then we will perhaps show that in the search results. So for example, if you add FAQ markup to your pages and we understand that we can show that for your site, then we will show that in the search results, and maybe that will make your website more visible and attract more users to your site overall. But it doesn't change the ranking of those pages. OK, OK. Actually, I have implemented schemas on all my websites. And they are all visible in search engines. Cool.

And yeah, one more question. Sometimes a website performs well overall, right? All the keywords are in the top 10. And then after two weeks, the overall website drops, the keyword rankings drop. What could the issue be? I'm not sure how to put it, but I hope you understand. Yeah, it's really hard to say. So we have a document in the Help Center, specifically for, I think, "my site's ranking dropped" or "my page's visibility dropped," something like that. And it goes through a number of the issues that could be playing a role there, with regards to maybe technical issues that you can double-check, and also some tips with regards to overall quality. But in general, even if you have all of the technical things set up correctly, it can happen, and it's very normal, that things will go up in ranking and things will go down in ranking over time. So just because a page is no longer ranking as well as it was before doesn't necessarily mean that there's something wrong with the page. It might just be that maybe there are other pages that we should show in search. Maybe we just have trouble understanding the relevance of the page, and focusing the page more on the kind of content that you want it visible for could help. All of these things can play a bit of a role. Oh, OK, OK, sir. Actually, following all the Google guidelines, I have made a document. And recently, last month, I launched a new website. It's about hoverboards.
And can you imagine, after one and a half months, the overall website is ranking higher and the keywords are in the top 10? A fresh new website. Yeah. And I got good traffic, daily around 2,000 to 2,500. Yeah. That sounds good. Yeah. I mean, sometimes, you can't have everything. The overall website is optimized according to all the Google guidelines. Excellent. Excellent, really. Cool. Yeah, I mean, on the one hand, the technical things are certainly things to watch out for. But it's not guaranteed that you will be successful just because you follow the technical guidelines. So sometimes you also have to have a sense of what will be important, or what areas are maybe not covered so well at the moment that we can help cover. So it's like, yeah, cool.

All right, let me just pause the recording here. We can stick around a little bit longer if any of you want. And yeah, I think this is the last office hours hangout for this year. So thank you all for joining all of these hangouts. Thanks for all of these questions along the way. I hope you found them useful. We'll certainly be doing more of these next year as well. I think Martin is doing one for JavaScript next week, so it's not completely closed. I don't know how he does it, almost over the holidays. We'll see. But we'll definitely have more of these lined up next year. So I wish you all happy holidays and see you next year in that case. Merry Christmas. Merry Christmas. See you. Bye.