All right, welcome, everyone, to today's Webmaster Central Office Hours Hangout. My name is John Mueller. I am a webmaster trends analyst here at Google in Switzerland. And part of what we do are these Webmaster Office Hours, where webmasters and publishers can come together and ask us questions around web search. As always, if any of you would like to get started with a question, now's your chance.

Yes, I have a question. All right. I actually wrote it in the Google Plus thingy. But the question is regarding Search Console performance reports. I do a lot of click-through rate optimization. And some of the results that I see is that we get more clicks. The picture I posted in the Google Plus thread was where we had like a 100% increase in clicks, but the average position kept declining. So we went from, let's say, 100k clicks to 200k clicks, but we went from an average position of like 7 to 12. I'm wondering, how is average position really calculated? Are we suddenly ranking on a lot of long-tail keywords that we maybe rank badly on, and then that affects the average position? Or what's going on?

That could be happening, yeah. So if you're curious about where that's coming from, I would definitely try to drill down and get as many of the details as possible to figure out where that change is happening. The average position we show is the average top position of your site in the search results for those queries. So if you're seeing like 12, then that's probably on page 2 on average. But that's across all of the queries that we track there. So one thing that I know a lot of people do is they try to split up between branded and non-branded queries. And even within the non-branded queries, you can try to create some kind of groups around different kinds of queries and track the performance there. And that's where you're more likely to understand how the ranking changes over time as well. So that's kind of the direction I would go there. Obviously, clicks and ranking aren't really related, so it's not like if you increase your click-through rate, your ranking automatically improves.

OK. Well, thank you. And yeah, I will definitely look more into it. It's just a funny coincidence that you suddenly see more clicks at the worse position. But as you say, it's not directly related.

Yeah. Usually what I do is just try the different filters and see where that might be coming from. Sometimes it's something weird. Like in one country, you're ranking really well, and suddenly you're very visible there for a lot of queries, and in other countries, maybe you're not ranking so well. And then on average, that ends up somewhere where maybe you wouldn't expect it. We see that with the Webmaster Central blog, for example. Sometimes, for reasons totally unknown to me, it ranks for the query Google in some specific countries. And when that happens, for a couple of days we'll have tons of impressions and almost no clicks there, because who wants to go to the Webmaster Central blog if they're just looking for Google? So these kinds of things can really throw the numbers off if you're not careful and don't look at the details.

OK. Thank you.

All right. Any other questions before we head off into the submitted ones? No? OK. That's fine, too. We'll have time for more questions towards the end. So if there's something still on your mind, feel free to jump in then. Or if something comes up in between, with the questions that are asked and the answers that are given, feel free to jump in then as well.
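To make the branded/non-branded split John mentions concrete, here is a minimal sketch using the Search Console Search Analytics API. The site URL, date range, and brand term "acme" are placeholder assumptions, and OAuth setup is omitted:

```python
# Sketch: split Search Console performance data into branded vs.
# non-branded queries via the Search Analytics API (webmasters v3).
# Site URL, dates, and the brand term "acme" are placeholders.
from googleapiclient.discovery import build

def query_group(service, site_url, operator, expression):
    """Fetch clicks/impressions/position for queries that do or
    don't contain a term; operator is 'contains' or 'notContains'."""
    body = {
        "startDate": "2018-09-01",
        "endDate": "2018-09-30",
        "dimensions": ["query"],
        "dimensionFilterGroups": [{
            "filters": [{
                "dimension": "query",
                "operator": operator,
                "expression": expression,
            }]
        }],
        "rowLimit": 5000,
    }
    return service.searchanalytics().query(siteUrl=site_url, body=body).execute()

service = build("webmasters", "v3")  # assumes credentials are already configured
site = "https://www.example.com/"
branded = query_group(service, site, "contains", "acme")
non_branded = query_group(service, site, "notContains", "acme")
```

Each returned row carries its own clicks, impressions, and position, so averaging position within each group separately shows whether the decline is driven by the long tail rather than the core queries.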
All right. So there are a few questions around the ranking changes that we had in the last couple of months. I'll try to combine some of these together because essentially they're very similar. Our rankings for the last days have had a nice increase, though the monthly search volumes are getting less and less for the industry we're in. Do the Google algorithms vary depending on search volume and industry when deciding ranking position, or is everything treated in the same way regardless of how busy the search term is? If not, was there a core algorithm change on the 27th of September which could have caused this?

So, OK. I guess the first part is kind of unique. No, we don't have anything in our algorithms that would explicitly look at individual industries and say, this is what we would expect from this industry, and therefore we'll change our algorithm at this specific time for that industry. I think some of that happens naturally, as competition rises when something is coming up — then you'll see more competition, and maybe you'll also see more queries happen. For example, all of the Halloween-related things, I imagine, are kind of slowly ramping up now, at least in the US. And that's not because we have any specific Halloween algorithms that are trying to figure out what date Halloween is so that we can change the algorithms for then, but rather users are searching slightly differently for these things. We're picking up a lot of new content that's really cool that we'd like to show for these queries. And it becomes a little bit more competitive, and we have to adjust to, I don't know, pick up what is happening and show the appropriate results in the search results. And that's something that happens automatically. It's not something that we would manually fine-tune. It has to happen automatically, because lots of these events come and go, and sometimes we don't even know about them ahead of time. It just kind of happens. So it's not the case that we have any kind of industry or time-of-year or time-of-season algorithms that are expecting changes in specific ways.

With regards to the algorithm updates that recently happened — I think towards the end of September and the beginning of August, or somewhere around then — these are essentially normal search algorithm updates, as they always are. So that's not something where I would say we're looking at something very unique or very special. These are essentially changes as they always happen, where our algorithms try to adapt to what is happening in the ecosystem, to try to figure out what is relevant, and try to understand websites a little bit better and make sure that we're bringing relevant results to users. And sometimes when we look at these things over time, we kind of have to rethink what we show. And that's not a sign that the sites that are less visible are bad. It's essentially just a sign that things have subtly changed over time.

OK, here's a mobile-first indexing question. This is kind of unique. Our website moved over to the mobile-first index. However, I noticed we don't display our blog section on mobile, whereas we do on desktop. Could this affect rankings, as our blog section articles internally link to product pages and categories? So this feels like kind of a unique setup that you probably want to fix, just because maybe users on mobile want to see your blog pages as well.
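John's answer, continuing below, distinguishes "not mobile-friendly" from "not accessible on mobile at all." A quick first pass at checking the second case might look like this sketch; the URL and the user-agent string are illustrative, and since this doesn't render CSS or JavaScript, Google's mobile-friendly test remains the authoritative check:

```python
# Sketch: check whether a site section (e.g. the blog) is served to
# smartphone user agents at all. URL and UA string are placeholders.
import requests

GOOGLEBOT_SMARTPHONE = (
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 "
    "Mobile Safari/537.36 (compatible; Googlebot/2.1; "
    "+http://www.google.com/bot.html)"
)

resp = requests.get(
    "https://www.example.com/blog/",
    headers={"User-Agent": GOOGLEBOT_SMARTPHONE},
    timeout=10,
)
print(resp.status_code)              # 200 means the page is reachable on mobile
print("blog" in resp.text.lower())   # crude sanity check that content is served
```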
The other thing that I suspect might be happening here is that maybe you just don't have a mobile-friendly design for your blog. It's still accessible on mobile; it's just not mobile-friendly. And that's perfectly fine. The mobile-first indexing change is not related to the mobile-friendliness of the pages. So if a page is not mobile-friendly, we can still index it with mobile-first indexing, because not mobile-friendly essentially just means it's a desktop page, and on mobile we would see the desktop page — which, from our point of view, works similarly to a responsive design. And that works perfectly well for us, for mobile-first indexing as well. So if it's just the case that you don't have a mobile-friendly version of those pages, that's perfectly fine for this. But of course, users probably want to see a mobile-friendly version anyway, so that's probably worth fixing. On the other hand, if it's the case that users on mobile can't access those pages at all — like they're blocked when they try to access your blog — then that seems like more of a critical issue, because then we really wouldn't be able to index the blog content at all. So we wouldn't be able to show those blog pages in search, and we wouldn't be able to pick up any links from those blog pages to your normal content and show those in search. That would be more critical of an issue.

I noticed that our content from some of our branch pages has been copied and is showing on a business directory. Could this be harming the ranking of these pages? How does Google treat this, and how can we make sure that we rank for it as opposed to this directory? So I think, first of all, you would probably be best suited to know if this is affecting your visibility in search, because if you search for your content and you see the other site's content instead of yours, then that's kind of a sign. It's not the case that our algorithms are separately changing the rankings around behind your back when you're not looking. So that's something where I'd really recommend that you, as the expert for your content, for your types of content, would be able to judge that for your site. And you'll quickly see: if these other pages are ranking instead of your pages, then that might be something you need to work on. On the other hand, if your pages are ranking perfectly fine, then I wouldn't really worry about those other pages. There's a lot of copying that happens on the web, and that's not something that, for the most part, causes any problems.

We're adding a lot of new pages to our e-commerce site at the moment. When these pages get indexed, how long before the algorithms and Panda decide whether to score them positively or negatively? What we're trying to do is gauge whether these pages are good enough to rank or if we need to do more work on them. So there is no fixed time for processing and indexing a page. For the most part, it happens fairly quickly — within maybe a couple of minutes, maybe a little bit longer. But that's something that depends on us being able to crawl those pages quickly and us wanting to index those pages. That said, you wouldn't necessarily immediately see how these pages perform in search, and you wouldn't have an indicator of what Google thinks about the quality of your pages. So that, again, is kind of a situation where you're the expert — you're the expert for your content — and you would be best suited to know if your pages are good or if the pages need some more work. That's not something that I would outsource to Googlebot.
Googlebot does various things to look at your pages, but you are really the expert on this topic. You should know if these pages are good or if they're not good. So that's a situation where I'd say you need to figure that out and not wait for Google to make a judgment call.

We're an affiliate site, and for our branch pages we wanted to add value, so we're thinking about adding written directions to our stores as opposed to just having a map and Google directions links. Is that a waste of time? Do you think it could help with our rankings, given the content would be very localized? It's hard to say if this is actually something useful for your pages or not. It's probably something that you could test, though, where you could create this kind of content for some of your pages and try to see how users respond to it. And you wouldn't necessarily need to see how Google reacts to that, but rather: what do your actual users think about this? Is this something that they find useful? Is it something that they interact with? Is it something that you can tell they're looking at? All of these are signs that actually this might be useful content. And if it's useful content, then why not show it in the search results as well? And that's something that our algorithms will generally — probably more indirectly — pick up on as well, where they see, oh, people are recommending these pages a lot, therefore maybe we should show them a little bit more prominently in search. The other side is, of course, you might have extra content on these pages if you're providing extra details about these individual branches, which on its own might also be useful. That said, if you're just writing down text directions from one place to another place, then I don't know if that's something that people would actively be searching for. Would they be searching for this road that's halfway on the way to your branch? Probably not. And if they are searching for that specific road, your site is probably not the one that would give them the information that they're looking for. So these are all things that you could look at with regards to adding more information to your pages.

In late September, we took a big organic hit. We used to get 600 to 800 clicks a day, and now we're completely out of the index with little to no organic traffic. No warning was given. We checked our inbound links and detected link spam, which we have now disavowed, with no effect. How long before this takes effect? Another option is that the late September update hit our site hard and somehow ended up removing us completely from the index. So I don't know exactly which site this is, which makes it a little bit hard to double-check. What I would definitely do is post in the Webmaster Help Forum with some of the details — specific queries where you're seeing that your site used to show up, ideally fairly generic queries, so not one long string of text from your pages, saying this is where our site should be, but something kind of generic where your site was showing very prominently in the search results and is now not visible at all. Individual URLs that used to be visible in search but are not anymore — all of that can be very useful. The one thing that I would also try to differentiate is to figure out: were you removed from the index, or are you just ranking worse now? Because there are very big differences between not being indexed and just not ranking.
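Alongside the site: query John describes next, a programmatic first pass at the technical side of "not indexed vs. not ranking" might look like this sketch. The URL is a placeholder, and the noindex text check is deliberately crude:

```python
# Sketch: check common technical blockers for a URL — robots.txt,
# HTTP status, and noindex signals. The URL is a placeholder.
import requests
from urllib.parse import urljoin, urlparse
from urllib.robotparser import RobotFileParser

def diagnose(url, user_agent="Googlebot"):
    root = "{0.scheme}://{0.netloc}/".format(urlparse(url))
    rp = RobotFileParser(urljoin(root, "robots.txt"))
    rp.read()  # fetch and parse the live robots.txt
    if not rp.can_fetch(user_agent, url):
        return "blocked by robots.txt"
    resp = requests.get(url, headers={"User-Agent": user_agent}, timeout=10)
    if resp.status_code != 200:
        return "HTTP %d" % resp.status_code
    if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        return "noindex via X-Robots-Tag header"
    if "noindex" in resp.text.lower():  # crude hint of a robots meta tag
        return "possible noindex meta tag"
    return "no obvious technical blocker; likely a ranking question"

print(diagnose("https://www.example.com/some-page"))
```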
One way you can double-check what is indexed is to do a site: query for your website — so just site, and then a colon, and then your domain name, for example. That way, you can see which of your pages are still known to Google, which of these pages are still indexed. It's not a comprehensive list, and the number on top that we show is a very, very rough approximation, so I wouldn't focus too much on that. But you'll quickly see whether the relevant pages from your site are known to Google or not. If they're not known to Google, if they're not indexed at all, then that's usually more of a sign that there's a technical issue involved — that maybe something on the server isn't working well, maybe there's a noindex tag that you weren't aware of, maybe there's something with the robots.txt file that's not OK. And that's very different from a quality issue that might be affecting your website. So that's the differentiation that I'd try to figure out first, and the folks in the forum are very good at digging into these kinds of problems as well. So if you're unsure whether it's a technical issue or just a ranking issue — and I know a ranking issue isn't always simple to pin down either; sometimes these are really hard as well — being able to narrow it down a little bit makes a big difference and makes it a lot easier for you to figure out where you should be focusing your efforts. And again, folks in the forum, or in some of the other webmaster forums, are usually pretty experienced with these kinds of things and can help you to narrow that down a little bit.

I run a successful employment website which provides free career advice to young people in the UK. On the 27th of September, our website was hit hard by an update, which resulted in a devastating 30% decrease in traffic. We adhere to the webmaster guidelines, and the website wasn't affected by the August update or any previous updates. Could you please confirm what kind of update this was? So again, kind of going back to the previous questions: there's nothing specific around these updates where we would say we're focusing on a specific meta tag or a specific kind of content — where we'd say, you need to make sure to, I don't know, make your text readable or something like that. A lot of these updates are just essentially general search ranking relevance updates that we would always make. So it's not that there's anything specific that we could point out and say, this is exactly what you need to do. Rather, we're reevaluating how we look at websites overall and how we show them in the search results. And again, that's not a sign that you're doing something wrong. It's essentially just a reevaluation of what we think we should show for individual queries. And I realize this makes it kind of tricky as a webmaster that's affected by these kinds of changes. But essentially, it's more a matter of taking a step back and thinking about your website overall, and thinking about what you could do to really significantly improve things, so that with the next updates, when Google makes changes, you'll look better again. And this is something I've seen for some of the sites over the last couple of months, where they'll say, oh, one month everything is terrible, and then a month later it's like, yes, we made it — we got things back on track again. And these things can change over time.

And here's an essentially similar question: we work in the pharma chemistry industry.
We run an e-commerce site and were affected by the 27th of August update. So essentially the same thing here, where I don't have any specific advice where I could point and say, this is the meta tag that you're missing, this is what you need to do. It's really a matter of taking a step back and thinking about what you could do to improve things overall. And sometimes things like the Quality Rater guidelines can give you ideas on how to view your website — how Google looks at quality on a website, or questions you could ask yourself about quality. That's something that might be useful. All of these things are not always completely straightforward. But it's worth thinking about: what is Google thinking about when we try to find relevant search results and show them to users? And what could you do to make it much more obvious that your site is actually really in line with the overall goal of providing relevant results that are useful and helpful for people?

If you were to mark up product data, but you had marked up CSS and internal links with the product description, would Google ignore this? Or would Google penalize this incorrect use? In the structured data tester, it looks fine. I don't know — it's unclear to me exactly how you're setting this up, so I'm not quite sure what the best approach is there. But in general, if the structured data tester is saying that it's OK, then that sounds like you have things lined up properly.

My questions are getting stuck. OK, let's see. Somewhere else, I guess. OK: the same article is published on more than one website. Does Google consider this as one link, or one link from each web page? So what happens with regards to links is we try to find a canonical URL where the link originates from and a canonical URL where the link is pointing. And based on that, we try to figure out which of these links exist. So if we see these multiple pages with the same article as being one URL — if they're essentially de-duplicated and we think these are actually exactly the same thing — then we would probably see that as one link from that page to the external page. On the other hand, if these are just the same article posted on individual sites, then we would probably see those as individual links. That said, if you're just going off and posting the same article on a bunch of sites, having a number of links to your own website through this kind of article reposting doesn't necessarily help your website. Just because you have a lot of links pointing to a page doesn't mean that it'll automatically rank better. It's essentially something where we try to look at the bigger picture instead.

The HTTP version of a client's website is still receiving some impressions and clicks in Search Console after migrating to HTTPS about a year ago, mainly on the home page. The redirect is in place, and the HTTPS version in Search Console has the majority of the data. Any ideas on how that could happen? So, I don't know — I'd probably have to look at the details of what exactly is happening here. My first thought would be that maybe you have the HTTP version of your site linked in some places that Google is picking up on. For example, if in your Google My Business listing you have the HTTP version of your site listed, then it might be that we would track that as an impression for that HTTP URL.
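The question assumes the redirect is in place; a quick way to verify that, and to see each hop, might look like this sketch (the domain is a placeholder). John continues with other possible sources of stray HTTP impressions below:

```python
# Sketch: confirm the HTTP -> HTTPS redirect and print each hop.
# The domain is a placeholder.
import requests

resp = requests.get("http://www.example.com/", timeout=10)
for hop in resp.history:
    print(hop.status_code, hop.url)   # expect a 301 from the http:// URL...
print(resp.status_code, resp.url)     # ...ending at the https:// home page
```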
Another place this can come from: maybe in one of the Knowledge Graph panels, we have the HTTP version that we show sometimes, or maybe an image that is indexed from your home page that we sometimes show in Image Search — and for some reason, we still have the HTTP version associated with that. All of these are potential places where maybe that is happening. That said, I'd really expect the majority of your data to be on the HTTPS version there. And it's probably not worth digging too far into where this individual HTTP link is actually coming from. Sometimes you can get more information by digging into the facets in Search Analytics to see where exactly this might be ranking and how you might be able to reproduce it. A lot of times, the numbers are just so low that we filter them out, so we don't have that additional information there. And if the numbers are really low compared to the rest of your traffic, I wouldn't be worried about it. It's probably more of a curiosity to dig into rather than anything useful.

When using internal links on a website, how important are the anchors being used compared to the signals that they pass to the page in question? If I have a page about kitchen sinks and I use the keyword kitchen sinks all over, I would assume that this hits a point where it becomes kind of spammy. I try to vary these so that they make sense to the reader and give the right signals to Google. Yeah, so anchors on a page are seen as a part of the page as well — as a part of the textual content on a page. So obviously, if you keep repeating the same keywords over and over in the anchors on a page, that could also lead to those keywords being a little bit too prominent on the page itself. So we do take anchors into account when trying to figure out how a page is relevant within the context of a website. But we also take all these other signals into account as well. So I wouldn't worry about always having perfect anchors when it comes to internal links. That's something that usually we'd be able to figure out fairly well.

Automated websites republishing my content without permission are impacting my Search Console reports. My link profile is growing at a frightening pace in a negative way — it shows 7,000 new links in just one month. However, the sites never actually link. They use my image URL and then redirect to spam places. You can't get to my site from theirs, but it shows as a link to me. They generate hundreds of new sites a day. How do you stop something that multiplies so fast? There are some examples here; I can pass these on to the team. In general, I wouldn't worry about something like this. This is something that our algorithms have a lot of practice with. These kinds of auto-generated spammy sites are things we've seen tons and tons of over the years. And our algorithms are generally pretty good with them — we can still pick up those links, and we'll show them to you because we found them, but it doesn't mean that we give them any kind of weight. So I usually wouldn't worry about it too much there.

I have a question about average position — I think we looked at this already. Can AMP be applied to a trip site that has a widget, or is this technology just for content? I'm not sure what you mean by a trip site that has a widget, so that's kind of tricky. But in general, there are lots of ways that you can implement AMP. So I wouldn't worry too much about the general type of website that you have, but rather think about the type of content that you have on the website.
With AMP, you can use it for individual pages. You don't have to use it for the whole website. You can take a part of your website and put that on AMP; you can take individual pages and put them on AMP. Or you can say, I don't want to care about this AMP stuff at all, and just keep all your stuff on your normal website. That all works as well. AMP is a great way to make really fast pages, but there are other ways that you can make really fast pages as well. So that's not something that you always need to do. That said, AMP is expanding at a really quick rate — they're doing lots of really cool stuff. So if AMP isn't suitable for your website now, or if it wasn't suitable maybe a year ago when you last looked at it, maybe things are shifting in that direction regardless, and there are more opportunities for using your content within the context of an AMP page.

What is better — or what does Google understand better — an iframe widget or a JavaScript widget? I think it depends on what you're trying to do there. For the most part, we can pick up iframe content and JavaScript content and associate that with the page as well. However, in both of these cases, you're making it a little bit harder than usual for us to pull in that content and show it within the context of the primary page that's actually showing this content. So I'd say, from my point of view, both of these are useful options. It depends a little bit more on what you're actually trying to do there.

What does Google use the product description in schema for? I know it helps Google to understand the product more. However, is this used for any rich snippets? I don't know, offhand. It might be that some of this is used for product search as well; I don't know the details there. So if it's unclear in the documentation on the developers site, maybe start a thread in the Help Forum, and we can try to pick that up, talk with the folks who are creating this content, and see if there's something we can clarify there.

If a page has been noindexed for a long time, does Google treat these as soft 404s? And also, if too many URLs redirect to one page, does this get treated as a soft 404? And do a lot of soft 404s impact the quality of a site as per Google guidelines? So both of these cases could be seen as soft 404s. There are definitely situations where there are clear soft 404s. There are lots of 404 pages that just have a noindex on them — even the text says 404 on them, but they don't return 404 — and that's kind of a clear soft 404 page. There are also other pages that are more like category pages that have a noindex on them. And for the most part, we can crawl them and crawl those links anyway, and maybe they'll be soft 404s, maybe not. In the end, it doesn't really matter, because we're not indexing those pages. With regards to the other part of the question, which I think is probably more what you're worried about — do a lot of soft 404s impact the quality of a website — that's a clear no. Normal 404s or soft 404s don't negatively affect the quality of a website as we would see it. It's completely normal for a website to have a lot of 404 pages. For the most part, most URLs on a domain are 404 because they don't exist. And that's completely normal. It's not something that you need to hide. It's not a negative quality signal in any way. It's essentially a normal part of the web.
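John continues on soft 404s below. For reference, a rough sketch for spotting likely soft-404 candidates — pages that serve error-page content but still return HTTP 200 — could look like this; the URLs and the content heuristic are placeholders:

```python
# Sketch: distinguish a real 404 from a likely soft 404.
# URLs are placeholders; the text heuristic is deliberately crude.
import requests

def classify(url):
    resp = requests.get(url, timeout=10)
    if resp.status_code == 404:
        return "real 404"
    body = resp.text.lower()
    if resp.status_code == 200 and ("not found" in body or "404" in body):
        return "possible soft 404 (HTTP 200 with error-page content)"
    return "HTTP %d" % resp.status_code

for url in ["https://www.example.com/gone-product",
            "https://www.example.com/definitely-missing"]:
    print(url, "->", classify(url))
```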
Similarly with soft 404s: if we see a lot of soft 404s, we just don't crawl those URLs as frequently, and we don't index those URLs, because we think they're essentially 404 pages. And that's perfectly fine. So if the pages that are seen as soft 404s, or that return a normal 404 result, aren't meant to be shown in search, there's nothing you need to do about that. That's perfectly fine and working as expected — definitely not a negative quality signal.

I have a question about DKIM. When setting up DKIM for Gmail, what key should I reference? I have no idea. I remember setting this up for my domain a long time ago, and there was a Help Center article from the Gmail side about setting that up, but I don't remember any of the details. I would probably ask in the Gmail Help Forum to get some advice on that.

Does the ratio of noindex, follow pages have any impact on ranking? For example, if a site has 500 pages, and 400 are noindexed and 100 are indexed? No, that also has no impact on ranking. It's not a quality signal. If a website has a lot of pages that are noindexed, that can be perfectly normal. For example, maybe you have a website that has a lot of private content, or a lot of content that's behind a login, and you put all of that on noindex pages that show the login form — that's perfectly fine. A website can be fantastic with just the content that is indexable; the other content you just don't want to be indexable, and that's totally up to you. It wouldn't affect the relevance of the indexable content on your website.

A technical question regarding crawling and rendering: our main content is initially hidden by inline CSS. The setting is overwritten as soon as the external CSS is loaded. This construct was built in order to prevent a flash of unstyled content. Is this a problem, given that the content only becomes visible for Googlebot by loading the external styles and rendering the site? That's perfectly fine. So that's not something that you'd need to worry about there. You can double-check that Googlebot is able to see this content as visible content by using something like the mobile-friendly test or using Fetch and Render in Search Console. But this is a setup that a lot of sites use. I believe AMP does this as well — the AMP framework has something where the content is hidden by default, and as soon as it's finished loading, it turns it on, which helps to prevent the jumping around of the text, or maybe the flashing of the styles, as you have here. I don't know the details of that; it's just something I've seen in the AMP framework's default setup. And that's perfectly fine. If we can recognize that this content is visible once the page is loaded, that's perfectly fine. And if the textual content is at least visible within the HTML, then that's even better — we can pick it up as quickly as possible.

I migrated my website and accidentally had it blocked by robots.txt. I fixed this, but Google is still showing no information available for this page. It's been around eight days now, and I haven't seen any change. I did Fetch as Google for my robots.txt, but it's still not working. So usually this kind of thing gets picked up fairly quickly. However, depending on what you're doing there, it might be that it takes a little bit longer. Eight days is about the time where I'd say it probably should be OK, but sometimes things can align in a weird way and it takes a little bit longer.
Usually, we crawl things like the home page — the more important pages on a website — a little bit more frequently, so maybe every couple of days. So probably we should have been able to pick that up. If you're looking at lower-level pages within your website, then that can be normal. We don't crawl all pages with the same frequency: some we crawl every couple of days, some maybe every couple of weeks, some every couple of months. So if you're looking at a specific lower-level page on your website that we just crawled, it might take a couple of months before we pick that up and show it again. What you can do here is use the fetch and submit tools in Search Console for that URL — so not for the robots.txt file, but for the URL that you see as being blocked. By doing that, you can first of all confirm that the robots.txt file is OK: if Google is able to render that page and show it to you, that means we're able to crawl it, so things are OK now. And from there, you can use the submit to indexing button to let us know that we can pick this up again and reprocess it for indexing as quickly as possible. That's usually the fastest way to get this done. Again, you don't need to submit your robots.txt file in this situation. Submitting a URL essentially just tells us you would like to have this page indexed, and you probably don't need to have your robots.txt file indexed. That's something that we pick up and use automatically when we crawl your website normally.

Our site deals with commercial properties, and we're observing some very odd rankings — but only in relationship to one niche product landing page, and only for a specific territory, the UK. So we rank highly in all territories we monitor, including the UK; we rank even higher in all territories except the UK for the full product title. I notice this is getting complicated. Sorry about that. Oh, is that your question? It is, hi. OK, maybe you can elaborate in person. Sure, thank you.

So basically, across the whole web — all the different territories, in terms of Google.co.uk and Google.com, which are the two main locations that we're looking at — we have historically ranked very high for one of our number one selling products. I didn't want to write it in the message there, but specifically, it's Afan Valley Adventure Resort. So if you type Afan Valley into any of the local Googles, we'll be up there, quite rightfully, at about the fourth or fifth position; there are obviously more relevant sites that would rank higher than ours. If you type in Afan Valley Adventure Resort, in literally every result other than Google.co.uk, we rank even higher than for just Afan Valley, because it's obviously more specific to that particular development. But for some reason, literally in the last few weeks, in Google.co.uk, we are now nowhere to be seen whatsoever. And we're stumped. I can't see any signals from Search Console. The company doesn't have endless resources, but I'm now going through and doing a link audit. I can see that we possibly have been given some spammy links recently, within the last two months or so, from some generous soul. But none of those have anchor text saying anything about Adventure Resort that would make me think, well, that's what it is specifically, and that's why we've just seen ourselves eliminated for that particular search. So I didn't know whether or not you had any thoughts as to what it might be. I'm completely stumped. It's hard to say offhand.
So one thing I've sometimes seen in cases like this is when things head towards a keyword-stuffing situation, where the same keywords are used, I don't know, hundreds of times on the same page — then we might look at that page and say, I don't know how relevant this actually is; it seems like they're trying to overdo it in a completely unnatural way. So that's one situation that could be happening there. But I suspect that's probably not the case in your situation, because it's such a long keyword. If you did keyword stuffing with this long, keywordy phrase, then the whole page would be really hard to read. So what I'd do there, maybe, is if you can send me some queries where this is happening and some of your URLs where you saw this happening — you can copy them here into the chat, and I can copy them out afterwards — then I can double-check with the team to see if there's anything weird happening on our side. That would be absolutely awesome. Thank you so much.

Can I just add one minor thing onto that question? I haven't had to go through and do any link disavowing for some time, but this has kind of brought the issue to the forefront — just to go through and do a link audit, though generally I try to spend our time creating new content and not doing this sort of defensive work. Is it still required, when disavowing links, to try and contact the webmaster of the spammy site to get them to remove the link? Or can we just go ahead and put the domain into the disavow file and be done with it? I think in a case like this, I would just put them in the disavow file. And my guess is that something like this would not be related to the links, because if we're seeing problems with the links, then it'd probably be something more where the web spam team would contact you, saying there are problematic links to your site, and you need to take a look. It sounds like such a niche situation that you wouldn't be buying, I don't know, thousands of links, and our algorithms would have to do something crazy to try to neutralize that. So my guess is this is probably not related to those links. OK, thanks very much, John. I'll get those links over to you. OK, fantastic.

Hi, John. Cool. Yeah. Actually, my question is regarding the robots.txt file. I have already posted it here. My home page has been blocked for around 10 to 12 days. And I've done all the things: I checked my robots.txt file multiple times, and I submitted my home page for crawling. But I couldn't get any response from Google till now. My website is telegraphindia.com. Can you copy that into the chat, maybe? Then I can take a quick look from there. And this is the website that was accidentally blocked by robots.txt, is that correct? Actually, I have migrated around 1 million pages from old URLs to new URLs. OK. But at the time of the migration, my robots.txt was blocking crawling. I fixed it within three to four hours, but even now, my website is not being crawled, and it's showing no information is available. OK. If you can copy the link into the chat, I can definitely take a look there. Yeah, it's telegraphindia.com. I am trying to copy it. OK. Let me just double-check to see what I can see here. I probably need to take a look at the details of what exactly is happening there. All right, I'll take a look at that later on. And if you have a thread in the Webmaster Help Forum, maybe you can post a link to that thread in the Google Plus thread, and then I can follow up with you there. OK, OK, sure.
I will post it here in the comments, OK? All right. Were you asking something else? Yes — if I have a thread in the Webmaster Help Forum, then I will post it. All right, let's see. Some longer questions are still here.

A health website, also, I think, seeing changes around the August update. Again, this comes into the same situation as before, with the more general updates that we make around Search. So it's not the case that there is anything specific I'd be able to point at and say, this is what you need to do. It does sound like you're already looking into the Quality Rater guidelines and trying to get a better picture of the way that Google sees these kinds of websites, which I think means you're kind of on the right track there.

A three-month-old website: when I index the home page or other inner pages with Search Console, keywords appear in Google and rank on the fifth or sixth page, but they remain for only a day or so, and after that, they're nowhere in Google. Is that natural? Is there some other issue, maybe a speed issue, because the site takes 10 minutes to fully download? 10 seconds. 10 seconds? Big difference — 10 minutes versus 10 seconds to fully download. So this wouldn't be a speed issue. It sounds like our algorithms are just unsure about the quality of your website overall. So it might be that we index those pages because we run into them, or because you submit them to us, and after a while, we think, well, I don't know if it's really worthwhile indexing these specific pages. So what I would recommend doing there is really working on your website overall and making it the best it can possibly be. And don't aim for ranking on the fifth or sixth page. Try to make something where you can clearly rank number one by far — where it would be a clear bug in our search results if your content were not ranking for this kind of general query. And of course, with a three-month-old website, or if you're just starting out making websites, that's a lot of work. And I think that's kind of normal, because other people that are ranking for these kinds of queries have probably worked a really long time on their content. And you can't just jump in there and say, well, I spent a couple of months and made some pages, and they should be just as good as the other ones. So sometimes it takes quite a bit of time to actually make something that's really competitive with the other sites that are out there. And it can take a bit of time to get picked up as well — for our algorithms to really trust your site and say, yeah, this is an awesome website; I'm glad you made this; I'm glad we're able to show this in search. And that's kind of what you should aim for. And if the topic that your website is on is currently very competitive — so if you're aiming for something where there's already a ton of content out there, and you're like, I really love this topic, and there's lots of content out there, I'll get all of this traffic — then maybe it's a good idea to scale back a little bit and say, OK, I'll pick a topic that I know a lot about, but where there isn't a lot of content already, where it's easier for me to be competitive, where it's easier for me to really clearly be the number one authority on this topic. So that's what I would aim for there. Don't just try to be as good as other websites out there. Really try to be something unique and significantly better than the rest.
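For reference on the disavow question from a few minutes earlier: the file the disavow tool accepts is plain UTF-8 text, one entry per line. The domain and URL below are made-up examples:

```
# Lines starting with "#" are comments.
# Disavow every link from an entire domain:
domain:spammy-directory.example.com
# Or disavow links from a single page:
http://spam.example.net/links.html
```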
I have a site that has a widget that allows a user to buy bus tickets, with an internal funnel where you can choose the seat and payment method and see a confirmation screen after the purchase, all on the site. We changed that widget for an embedded one that just takes you out of the site in order to complete the purchase. Does this affect my site — I guess with regards to people actually going somewhere else to complete the purchase? No, that's completely fine. If you have something on your website that enables others to complete what they came to your website for, that's perfectly fine. The thing I would watch out for here is that you actually provide something unique and compelling on your website. Especially if you're in kind of an affiliate situation — where you're writing about a product, and people click on that product and go to a different website to actually make the purchase — then you really need to make sure that what you're providing on your site is significantly unique and compelling compared to the original source. Because otherwise, why would we show your site when we could be showing the original source instead? So really try to aim to be significantly better than everything else, and to really provide something unique and compelling that makes it worthwhile for us to show your site instead of all of the others that are out there.

Our main landing pages are virtually impossible to find in the search results, while Search Console isn't showing any significant issues crawling and indexing those main pages. This is an established brand website with a strong ranking history. It looks like the algorithm is penalizing, or applying a filter to, our primary landing pages in regards to the most recent update. We have no manual actions, and we don't engage in sketchy SEO practices. Is it possible that this update is completely removing pages from the index? So, kind of similar to the previous question: you really want to figure out, are these pages actually removed from the index, or is this just a matter of those pages not ranking as well as they used to? That's a really basic and significant difference, and it's something that you'd want to drill into and really figure out. Like, those pages that previously were ranking, which you can see in Search Console — are they still indexed? You can use the inspect URL tool in Search Console to figure that out, to see: does Google still think these pages exist? And if Google thinks they don't exist anymore, but they still do exist, try to figure out what the technical reason is why they dropped out of the index. And from there, you can either focus on technical things — to figure out what you need to do from a technical point of view to get these pages indexed again — or on quality things, with regards to what you might need to do with your website overall to significantly raise the quality bar and make your website even more relevant than it used to be, especially compared to everything else that's out there. So those are, I think, the different things that you'd want to be looking into. And usually, if a page is not indexed, it's very likely to be a technical issue, unless it's a very new website. If it's a very new website, then everything is still a little bit in flux, and it's unclear how we would be able to index all of this content.
But if it's something established that used to be indexed fairly well and suddenly isn't indexed at all, then usually that points more to a technical issue than to a quality issue.

All right. Ooh, wow. Lots of things in the chat. OK, I think these are mostly details with regard to previous questions, so that's fantastic. OK, I'll copy all of these out and make sure that I have them. I can't promise that I'll be able to get back to you on all of these things, but we do pass them on to the team here. So if there's something that the team needs to do with regard to these changes, then we'll be able to figure it out from there. Usually, if you have a thread in the Help Forum, it helps if you link to that — that'd be a way for us to get back to you and provide some more details if there's something unique that we can bring back to a discussion like that. All right, any other questions from any of you?

Yeah, hi, John. How are you? Hi. Am I there? Yes. OK, I have a question. I have a rather large website, an existing website, and we've redesigned it. And we want to go from the old design to the new design. I did this a few months ago — I came and talked to you about it with a different website — and one of the suggestions you made was to transition slowly, because we had kind of a bad experience on the last one. So the question now is, in terms of the transition: my intention was to do this kind of like an A/B test, where I'd use a service worker on, like, Cloudflare or one of these services, and then split a portion of the traffic — to begin with, a small portion, let's say 10% of the traffic goes to the new server and 90% goes to the old server. Now, the sites should be mirrored relatively well, the old version and the new version; the URLs should all be the same. The content might be slightly different between the two of them. So the question is, how do we handle that in terms of Google? And what happens if we're showing two URLs that are different? And how do we eventually make the switch fully over from one to the other?

So what are you changing on the site? Are you moving URLs, or just the content, the layout? Mostly we've tried to keep the layout, the UI, and the content basically the same. We're basically changing the whole back-end system. And with the new back end and the old back end, things don't overlap exactly one to one. So there are hopefully not too many differences, but there could be differences on the pages. And so the URLs would be the same, or? The URL structure is going to remain the same, yeah. OK. Yeah, I don't know. I think in general, that kind of situation should be totally unproblematic for Google, because it's essentially transparent. We don't see what is happening — we don't see the actual changes — because we look at the HTML that you serve to us. And if the HTML is essentially the same as before, then we're all happy. So from that point of view, I think for the most part, that's not something you really need to worry about. I think doing it gradually is definitely a good idea. What I would do there, though, is not do it in a percentage-of-traffic way, but rather in a distinct-part-of-your-website way, so that you can recognize when things break, if something goes wrong. Because if you're taking 10% of the traffic, then you never know which version Google has picked up from those URLs.
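The section-based split John elaborates on next — deterministic routing by URL path rather than a random traffic percentage — might look like this sketch, so that you always know which backend served a given URL to Googlebot. The backend hosts and path prefixes are invented placeholders:

```python
# Sketch: route whole URL sections to the new backend, rather than a
# random share of traffic. Hosts and prefixes are placeholders.
MIGRATED_PREFIXES = ("/blog/", "/category/kitchen/")  # sections moved so far

def pick_backend(path):
    """Deterministic routing: the same URL always hits the same backend."""
    if path.startswith(MIGRATED_PREFIXES):
        return "https://new-backend.internal"
    return "https://old-backend.internal"

assert pick_backend("/blog/post-1") == "https://new-backend.internal"
assert pick_backend("/products/sink") == "https://old-backend.internal"
```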
If, instead, you say, well, I'll start with, I don't know, the category pages, or the higher-level pages, or all products from A to C — however you can split that up — then it's a lot easier for you to say, OK, I can track this set of pages: on the one hand from Analytics, to see how users react to that, and on the other hand from Search Console, to see how Google reacts to that. So the clearer you can make those chunks, in a way that you can monitor them when it comes to the search changes, the more you'll be able to catch any issues that might pop up along the way.

OK, because originally I was thinking of doing it in a randomized fashion. But the problem in that case is also that Google one day would come to a URL and get shifted to the new version, then the next day would crawl it again and get shifted back to the old version, and then would be under the impression that it has changed twice in two days, or whatnot. I think the changing part is not a problem. It's really mostly a matter of you being able to monitor what is actually happening. When you see that a page drops in ranking, can you double-check to see if Google was seeing the new version or the old version? If it happens randomly, you never know — maybe today it's the old version, tomorrow it's the new version, and then the day after tomorrow, you look at the search results and you see a change. Which version caused this change? You don't know. That's why I'd try to take specific parts of your website, so that you can monitor that a little bit more directly. OK, but the switching back and forth — you don't think that's going to cause a problem if there are slight discrepancies within the pages? I don't see any problem with that, yeah. OK, perfect. Thank you. Cool.

All right, we're a little bit over time. So with that, I'd like to thank you all for joining in. It's been a good discussion — lots of good questions. Looks like there's some homework for me as well: some forum threads and a bunch of questions there in the chat. I'll try to follow up on that as much as possible. Otherwise, feel free to jump into the forum and discuss this with others as well, or ask those questions again on Friday when we meet again. All right, thanks everyone. And I wish you all a great afternoon or a great day. Bye.