All right, welcome, everyone, to today's Google Webmaster Central Office Hours Hangout. My name is John Mueller. I'm a webmaster trends analyst here at Google in Switzerland. And part of what we do are these office hours hangouts together with publishers, webmasters, SEOs from around the world, it looks like, to try to get some of the open questions answered. A bunch of questions were submitted already, so lots of stuff to go through. But as always, if any of you want to ask a first question to get started, feel free to jump on in.

John? Go ahead. OK, I've got something. Just a quick question, I don't know if you saw the screenshot that I messaged over to you or not, but there's an interesting result that we saw in a featured snippet for the query "car buying tips": the featured snippet text is being pulled from a different website, but within the featured snippet box is a logo from our website which, from what we can tell, is actually coming from our URL. And of course, we don't mind it being there, it's good exposure. But I just wanted to see if that could potentially indicate any kind of problem with something we're doing on the technical side or anything like that.

So we do sometimes take images in featured snippets from other websites than the text snippet, kind of to provide a mix of text and visual snippets, depending on what people are looking for. That's not really a sign of you doing anything particularly wrong. It's, I think, more a sign that you're doing kind of the right things, and we found this image, which we thought would be kind of suitable here. I don't know, in your case, if that image is really the best image for this kind of a topic. I think it's essentially just your logo that looks like a car, kind of something like that. Yeah, basically. I wouldn't complain in your situation because, well, it's bringing your brand out there. From a search quality point of view, I don't know if that would be the best fit there, but that's something that our algorithms have to figure out. And sometimes we're able to take both of these from the same site and show that to people, kind of as a teaser for what they could find. Sometimes it's from different sites. OK, thank you.

Can you hear me? Yes. OK, good morning, John. I have a question. If there's a site, it's kind of a Q&A site, but the answers are very succinct and the questions can be answered in a very short amount of time rather than going into depth, is that any kind of challenge, I guess, for the algorithms? I think that's an option for a website. I don't know if that's really an optimal approach for the longer term, though. In the extreme case, you have a short question and a short answer, and that's your primary content on all of these pages individually. I don't know if that would be a long-term strategy that I'd recommend, because it's very little content on these pages. But if you can expand on those answers and you have a lot of additional information there, I think that's a perfectly fine setup.

OK, and the second question I had is, oftentimes we deal with startups who kind of go back and forth on how to best allocate their resources. And so the question becomes, how much time do you spend on things like content marketing, things that provide a clear, direct, immediate value, versus items which are more on the branding side of things, long-term, maybe even loss leaders?
Email marketing, for example: a particular client wants to do an email marketing campaign, but they're going to be losing money on it. How do you kind of balance that out with startups? I know this maybe doesn't sound like a search question, and I know the usual answer is to do what you would do as a business and ignore us. But at the same time, startups have to run really lean, and they're not always sure how to approach things in that regard.

I don't think there is kind of one solution for all of them. It really kind of depends on, I guess, what you're trying to do, where you have issues, where you have potential, where you have strengths. All of that kind of comes together, and everyone will be a bit different. For some people, maybe with a startup, the website is the last thing that they do, and it's just like a business card, essentially, for the longest time. And maybe that's perfectly fine if they're able to focus all of their energy on everything else. And for other people, a web presence might be critical, might be something that they really need to focus on immediately, getting a lot of information out there. It really depends quite a bit on the business. True.

By the way, I don't know if you've been able to see my questions yet. I've sent you a couple in the collections as well, which have been disappearing, actually. So I'm not sure why that's happening, either. OK, I see the question. John, can you see my question this time? OK, good. Oh, OK. Yeah, so I have a proprietary ranking algorithm to try to figure out which questions we need to answer first. And sometimes it picks up really good questions. Sometimes it's not so great. And usually we don't get through all of the questions, so sometimes it's just not at the top of the list. I understand.

All right. Any other question before we get started? Yes, John. Yes, I hear someone, very vaguely. Hello, can you hear me? Yes. All right, so I would like to ask: basically, you've said that Google usually looks at a site quality-wise and then decides whether it will get the kind of rich results or anything like that. Is that also applicable to featured snippets? Does Google look at site quality and then, depending on that, show a site as a featured snippet, or, if a site isn't good enough quality-wise, will it not appear in those features?

I don't know. I assume there's some quality aspect there. But I think it's not really the same as the kind of quality understanding used for rich results. So it's certainly possible that a site is shown with rich results and not in the featured snippets, or the other way around. So I am pretty sure we look at some quality aspects there, but I'm not 100% sure. For the most part, the featured snippet is really just a bigger snippet to try to pull out relevant answers from the web to guide people to this content a little bit easier. So it's something where, obviously, quality plays into that, but I don't know if we'd use the same setup as with rich results.

All right. The reason I'm asking is that we recently started appearing in featured snippets, but we haven't changed anything new on our web pages. Maybe it's that we're ranking on the first page itself and Google is certainly showing our results. So I was just wondering if maybe the page now has some sort of good quality and that's why this is appearing. That was my thinking.
I mean, personally, without knowing the details, that sounds like you're on the right track. But like I said, generally speaking, most feature teams at Google try to understand quality in ways that make sense for their individual features. Sometimes that's a little bit more direct, like with the rich results. Sometimes that's a bit harder, because we obviously have to understand the content better for things like featured snippets. So it sounds like you're on the right track, but maybe there's still more work to be done if you're not seeing your rich results yet. All right, thank you.

All right, let me run through the questions that we have. As always, if you have any comments or questions along the way, feel free to speak up. And we'll see how far we go. It looks like a whole bunch of them were submitted, so I'm not sure if we can make it through all of them. We'll see.

A lot of our customers are older and not used to doing online reviews, so we collect them by having customers fill out a short form or questionnaire after we've delivered the service. Is it OK if we mark some of those up on the web pages? So from our guidelines around rich results, specifically around reviews and aggregate reviews, that would not be something that we'd like to see marked up, because it's essentially a form of testimonial that you're pulling out there. We don't know where these reviews are coming from. They're not coming directly from users. So it feels like maybe you're picking out all of the good reviews and you're using those to mark things up. So I would definitely continue collecting these reviews, because it's probably useful information for you and a way to figure out where you can improve things. You can also put these on your pages, of course. But please don't mark them up with review markup, because as far as I understand, this would be against our structured data guidelines.

John? Yes. Sorry. Yeah. I have some follow-up questions on that one. So basically, you're saying it's good to be posting all the reviews on the same page. Just wondering, with mobile devices, we have products where we have a lot of reviews. Is it a good idea to show all the reviews on that page, or should we split those reviews onto a new page? That's up to you. Those reviews have to be visible to users, so they have to be accessible somehow. If you put them on paginated pages or something like that, that's totally up to you. OK.

On our e-commerce site, we have a View All button which shows all products for a group of related categories. These View All pages have a canonical tag, which I think is disregarded because we also have a noindex, follow. Can you see any problems if we were to change this to noindex, nofollow to have link equity flow more to the other areas of the site? I think you can definitely do that. I'm pretty sure you would not see any change with regards to how we crawl, index, and rank your site, because for the most part, with a page that's already noindex, after some time we think, well, this page doesn't exist, and we kind of ignore it anyway. So that's something where, from our point of view, I doubt you would see any change if you made a noindex page noindex, nofollow, or if you just left it noindex alone. So my suspicion is you're spending time on things that have no visible effect there, and I would focus more on other parts of the website.
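To illustrate the tags under discussion, here's a minimal sketch (hypothetical markup; only the robots meta values differ):

```html
<!-- Current setup on the View All pages; a canonical tag placed alongside
     a noindex is likely disregarded, as noted above -->
<meta name="robots" content="noindex, follow">

<!-- Proposed change, which per the answer above shouldn't make any
     visible difference in practice -->
<meta name="robots" content="noindex, nofollow">
```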
You said there's a ranking boost for HTTPS. Would you get a higher ranking boost if your e-commerce site had an EV SSL certificate, as opposed to the other ones? No, we treat all certificates the same. They have to be valid certificates in a modern browser. But if they're valid certificates, they're all treated the same.

We've identified a ton of blog posts on our site, dating back to 2011-2013, which aren't very well written: keyword-heavy and poor quality. Given that there's a lot of content, am I correct in thinking that this is affecting our overall Panda score, which in turn affects every ranking page? And if so, would it be better to noindex all of these pages until I get around to rewriting each one? In general, we do look at the quality of a website overall. So if there's a lot of content on your website that's really low quality and bad, then that might be something that our algorithms are picking up on. How you deal with that low-quality content, if you've identified it, is really up to you. Some people tend to rewrite low-quality content, because at some point you thought this was going to be useful information, so you might as well turn it into something really useful. Some people tend to remove it completely, so that's an option as well. How you deal with that is essentially up to you. It's not something where I'd say you have to rewrite it, or you have to delete it, or you have to do this temporarily; that's totally up to you.

My question is in connection with faceted navigation. I would love to know your opinion on how to handle these URLs so that value is not lost, and how to avoid undesirable duplication from faceted navigation. Ideally, we'd like to return user-friendly results, but with a search-engine-friendly URL. Currently, we have noindex, follow in place, so that doesn't pass any value. What could we do there? So faceted navigation is always a challenge. I think this is something that's always kind of tricky, because it's very easy for search engines to just head off into the weeds and start crawling almost infinitely on a website. For the most part, we do understand how URLs work and how websites work. So if we recognize we're running into a faceted navigation area, we'll probably slow down crawling and say, OK, we've seen a bunch of stuff here, but this is really the primary content on the website. So for the most part, we try to figure that out ourselves.

You can help us, though. If you use URL parameters within your URLs, rather than just a path with kind of a long name with dashes or something like that, so parameters with a question mark, and then the equals sign, and the ampersand in between, then that's something that you can use the URL parameter handling tool for, to tell us which of these parameters are actually useful and which of these parameters are not so useful. And with that information, we can usually deal with things like faceted navigation fairly well. So that's kind of what I would aim for, especially if you can't make bigger changes on your website. There are lots of kind of hacky tricks that people sometimes do to keep Google from crawling into faceted navigation areas. For the most part, unless you're a gigantic website, that's something you don't need to do. We can deal with faceted navigation with the URL parameter handling tool fairly well. If you have a reasonably clean URL structure on your website, we can generally just deal with that automatically. So that's kind of where I would head: making sure you can use URL parameters on those pages, and then using the URL parameter handling tool to let us know which parts are good and which parts are less important.
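As a hypothetical illustration of the two URL styles being contrasted (the paths and parameter names here are made up):

```html
<!-- Parameter-based facet URLs: each facet is a name=value pair, so the
     URL parameter handling tool can be told which ones matter -->
<a href="/shoes?color=red&amp;size=9&amp;sort=price">Red shoes, size 9, by price</a>

<!-- Path-based facet URLs: no parameters for the tool to act on -->
<a href="/shoes/red/size-9/by-price/">Red shoes, size 9, by price</a>
```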
With the mobile-first index being rolled out, can this affect our desktop rankings if we have any issues with our responsive mobile templates or anything? So I guess the "or anything" is always an option: anything could break, and suddenly your rankings change. But let's exclude that for the moment and focus on the rest. If you have a responsive website, that means you're serving the same content to mobile users as to desktop users, so the same HTML. The CSS, kind of the way that the page is displayed on the screen, can differ across different devices; usually that's the case. So in a case like that, we see the same HTML on desktop and mobile, so for mobile-first indexing, nothing will change. Everything would essentially just continue working. So for the most part, when we switch a site that is really responsive over to mobile-first indexing, all of that should just continue to work, and you shouldn't really see any big change there. It's always possible that you see some short-term fluctuations and some subtle changes. But for the most part, it should essentially just be one-to-one, business as usual.

If you don't have a responsive site, so if you're serving content to mobile devices differently or using separate URLs, then we will shift to the mobile version that a mobile user would see. And if that version contains less information, then that would be something where we'd index less as well. So that could be text, if you have less text on the mobile version. If you have a worse internal linking structure on the mobile version, that could be an issue, because we wouldn't be able to crawl as easily. If you have missing structured data on the mobile version, that might be an issue that would cause problems. Images, videos, anything embedded on the mobile version: if that's a problem, that's something we try to watch out for as well, but something that you can fix as well. And finally, I guess this is something that is specific to m-dot sites: if you have a separate mobile hostname, we need to be able to crawl that mobile hostname at the speed that we would otherwise crawl your desktop site. So if your mobile hostname is currently on a different server or not on a really strong setup, then that would be something that might affect how we can crawl your website on mobile. And if we can't crawl it properly, then ranking is obviously a lot harder, too. But going back to your question here: responsive layout, responsive template, you shouldn't see any issues at all. Also, we mentioned in our blog post that we'll be emailing people through Search Console when we shift them over to the mobile-first index. So you can watch out for that email and double-check to make sure that nothing is actually changing in Search Console.

About 15% of our crawlable pages have a noindex, nofollow tag to keep duplicate content and other low-quality pages from being indexed. Could this affect the overall site quality, or does Google only consider the indexed pages when evaluating the quality of a site? Yes, we only look at the indexed pages when it comes to understanding the quality of a website. In general, though, one thing, maybe just taking a small step back here: you mentioned you're using this noindex as well for duplicate content. In general, I'd recommend using a rel canonical for duplicate content rather than a noindex. With a noindex, you're telling us this page should not be indexed at all. With a canonical, you're telling us this page is actually the same as this other page that I have. And that helps us, because then we can take all of the signals that we have for both of these pages and combine them into one. Whereas if you just have a noindex, or if you block it with robots.txt, then all of the signals that are associated with that page that's blocked or has a noindex on it are essentially lost; they're dropped. So if someone were to link to that page and you have it set to noindex, well, they're linking to nowhere. Whereas if you had a rel canonical, we would see that link going to that page, follow the rel canonical to the page you prefer to have indexed, and use that one for indexing.
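A minimal sketch of the two approaches on a duplicate page (the URL here is hypothetical):

```html
<!-- noindex: the duplicate is simply dropped, and signals such as links
     pointing at it are lost -->
<meta name="robots" content="noindex">

<!-- rel=canonical: signals for the duplicate are consolidated onto the
     preferred URL -->
<link rel="canonical" href="https://www.example.com/preferred-page">
```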
After what period of time, and at what percentage of users, does Google start to have issues with A/B tests? There are some areas of a website where traffic is low, so it takes longer to reach a good sample set; that can be up to six months or longer to reach a significant sample set. We don't have a fixed timeline on this. Essentially, what we want is that the long-term, stable situation on your site is such that we can trust that the content you're providing to us is actually the content that you're showing to users as well. So if you're doing an A/B test and you're doing subtle layout changes, with things like colors or shuffling things around slightly, for us that's absolutely no problem. Do that for the long run if you want; that's no problem at all. On the other hand, if you're doing changes where one version has a lot of textual content, and that's the version you show Google, and users, when they go there, essentially just see two or three words of content and maybe a big image, then that would be a bit awkward for us, right? Because people are searching, we're showing a snippet from your page and saying, hey, you can get this here. And they go to that page, and they don't really find that content. So that's something that we don't want to see for the long run. For short periods of A/B testing, fine, we can deal with that. For the long run, the content that you're showing Googlebot should really be equivalent to what users would see.

My question is with regards to gateway pages. Say I have a law firm with 20 locations in a single state. We have a location home page for each of these locations, with unique content for each. And then we have category pages for each category under those location pages, also with unique content. They talk about the same topics, yet they're slightly unique. You've mentioned this is against the guidelines. Can you expand on what we should do in this situation, and whether our situation would be considered gateway pages by Google? So the term we use is doorway pages, not gateway pages. I think you're probably thinking about the same thing, though. In general, for us, a doorway page is something where you're essentially offering the same content with a slight tweak, which could be something like swapping out a city name, while all of the rest of the page is exactly the same and you're leading people to exactly the same kind of conversion funnel in the end. In this case, it sounds like maybe you're doing something slightly different, in the sense that people are going to individual locations and they're not all funneling through the same thing in the end.
They're maybe going to specific lawyers, and are going to go and visit those lawyers in person at some point. And that seems less of an issue to me. So it's really a matter of making sure that you actually do have unique and compelling content on all of these pages, and not just subtly rewritten, reworded content that's there to make it look unique when actually it's all the same content everywhere. So that's what I would watch out for there. I think with 20 pages, you're probably coming towards the limit there, especially if this is 20 sets of pages and each of those 20 pages has a set of categories that are more or less the same across all of them. Probably you'd end up in a stronger situation in search if you were able to reduce that into fewer, stronger pages. So maybe keep those 20 location pages, and share the category information across all of those pages, where you have one shared set of category information and from there guide people to the individual locations. That's probably what I would aim for there.

John? Yes? We have a similar kind of situation here. We have products, let's say a product like a credit card, and a lot of the features there are very common. As one bank, we have several credit cards, and a lot of the features or benefits are the same. So we're presenting the features in a bullet-point kind of form, template-wise, and then we're just mentioning what the benefits are and what the advantages are. So it's going to be kind of the same across all of our credit cards. How would Google treat this? Is it considered duplicate content? Would Google devalue those pages as low quality, those kinds of things? I think you'd probably have to look at this on your end to figure out if this is really unique content that you're providing on all of these pages, or if it's essentially all the same thing. And if it's all the same thing, then probably you gain more value by reducing that. From my point of view, in a lot of cases where people tend to split things off into separate pages, they're often diluting the value of their content across a lot of different pages instead of concentrating it and making the pages that they do have much stronger. So it's essentially a question of strategy, of how you want to be active on the web. But oftentimes, if you dilute things, then it makes it even harder for you to compete with people who have really strong offerings in individual places. So that's something where you kind of have to balance those two sides: bringing specific information out there on a specific topic that you have, or bringing more general information out there that can be valid for a whole bigger set of individual topics. Thank you.

If my website has a site-wide link to, let's say, an Amazon product in the sidebar or footer, is that a problem? No, that's certainly not a problem. That's something that you can do. With regards to Amazon in general, often these are affiliate links. Affiliate links themselves are also no problem for us. If you have an affiliate-based website, the thing to watch out for is just that the rest of our guidelines still apply. So copying and pasting all of the content, putting that on your website, and calling it a website of your own is probably going to look kind of awkward to our algorithms.
Whereas if you have a really strong website with lots of unique and compelling content, and you have affiliate links to monetize that, that's perfectly fine. So it's not a matter of the affiliate links, but usually just a matter of the kind of cookie-cutter content that a lot of affiliate sites have. I know you didn't specifically ask about affiliates, but that's something that's often brought up with regards to Amazon.

Is a manual action the only penalty that a site can have? Sometimes you say overall site quality is also a signal that the site isn't good enough overall. How does that differ from an algorithmic penalty? So I guess the main differences here are that manual actions are things that team members from the webspam team do manually with regards to websites where they recognize that our algorithms aren't really handling something properly. So they have to take, like the name says, manual action, to try to isolate the issues that this site or this part of the site is causing, and to take action on that in one of several ways. And the other aspect is really just the normal algorithmic ranking, where we look at a whole bunch of different signals to try to figure out how we should show the site in search: where it might be relevant, for which kinds of queries, which types of users might be interested in this content, and we try to show it to users then.

Hey, John, just to follow on with your comment there about relevance to queries. I know that Google AdWords uses a different algorithm and everything. The other day, last week, we put just our URL into the keyword suggestion tool there, and the results that it gave us were, like, overwhelmingly related to one topic that's just a small piece of our website. And what I was trying to figure out is, since they're Google, you're Google, whatever: would that be indicative that the search algorithm could possibly be seeing our site in the same way? Because it would be an issue for us, because it was so focused on one thing that's not really the primary focus of our site. I don't know. I believe the AdWords side uses very different algorithms for that. So it's possible that, from crawling and indexing, there's some amount of crosstalk there. But for the most part, I believe they pull this data out completely separately, on their own. So what you see in the AdWords tools wouldn't necessarily be relevant for the search side as well.

Because the interesting thing is, about probably a year and a half ago, maybe even more, well, yeah, probably maybe five years ago, when we had our really bad problems with search, there was one term, related to jump-starting a car, that we were ranking extremely high on, which made no sense, because the site isn't about that. We just had a very small section. And when we did this URL search in the Keyword Planner, that term also came up pretty high on the list by relevance, and we were just putting our home page URL in. So we're just trying to look for clues. Is there possibly something that we're doing from a technical standpoint that is confusing the algorithms? I don't know. I have no idea how the AdWords tools pull out information there, so I really don't know. OK. Sounds like an interesting thing to dig into, but yeah. Is there anything you suggest in Search Console that could possibly give an indication of what relevant topics the algorithm is seeing overall? Not in Search Console. I think that would probably be more on the Ads side.
I believe they have a tool that pulls out the categories that they see your site being in. But then again, that's the way the Ads side looks at your site, so it's not necessarily something that would be done in the same way on the Search side. OK, thanks.

All right. Two external links to two different pages on the same domain from the same page: what happens with the link value passed to both of them? Two external links to two different pages from the same page. I don't know. So this kind of question goes into more theoretical areas, where from our side it's not that we have any clear definition where we would publicly say, if you have this specific setup with regards to links, then 0.7 goes here, 0.6 goes here, or anything like that. These kinds of things are signals that we pick up on for our algorithms, to try to figure out how these pages are connected and what the context of the individual pages is with regards to each other. So that's something where I don't think we'd have any kind of absolute number or answer to say how much value goes through this link and how much value goes through that specific link. That's more, I guess, something almost on a theoretical basis that isn't really that useful to site owners anyway. Having clean links between your pages helps us a lot, especially with regards to understanding the context of individual pages. But usually you don't need to focus on which percentage of your link value is passed through this link versus this other link here.

We're vetting a new client that currently has multiple websites. They're a lawyer with both family law and criminal defense. They're asking why they would be better served, organically or GMB-wise, by having a single site rather than a site for each area. It's hard to answer, but with the value split between two websites, what are your thoughts here? I think this kind of goes back to the other questions that we had before, where you're balancing having multiple pages for the same kind of content versus single pages, or in this case, two websites or one website. In general, I'd try to limit things to one website if at all possible. But essentially, with two sites or two pages on different topics, depending on your situation here, that feels like something that's really up to personal preference. When you start having a lot of different sites, or a lot of different pages on similar, related topics, that's where you'd want to try to rein things in and keep them from getting out of control. But if you just have two sites, and some people have two sites, it's really no big problem. So that's something where I wouldn't necessarily worry about this too much. I'd think about it more with regards to what the personal preferences are there and what the long-term plans are. Is the plan to go from these two sites to 20 sites? In that case I'd say, well, you probably want to have one really strong website instead. Or is the plan to have these two sites continue to run in parallel, because they have very different audiences or something else that's really different between these two things? Then maybe it makes sense to keep those two separate. So it's really up to you.

We were hit by the Google update in March 2017, and since then we've just kept losing traffic. We've tried to improve the quality of our site by nuking content, improving content, and cutting out ads, but we've been demoted with every update since. Since February, we're seeing traffic increase, and in March 2018, we recovered 10% of our traffic.
But still, some of our keywords are still not showing up. Even when we create new content, it doesn't rank for smaller queries. What could be the problem? I think this is something where it's almost impossible to say what the problem could be. What I would recommend doing here is getting more advice from other people who are kind of in a similar field as you, with regards to your website overall, the quality of your website, the content, and the services that you're providing on your website. So that's where I'd recommend still taking a step back and thinking: well, maybe you're on the right track with regards to quality, but maybe you haven't really hit that next bump to really significantly improve the quality of the website. Just cutting things out usually doesn't mean that the website will rank better. You're essentially just pulling out the weeds; it makes it possible for the good content to shine a little bit more, but just cutting things out doesn't make the website better on its own.

After the mobile-first index, for m-dot sites, how will Googlebot index the content if the webmaster redirects mobile users to a 404 page while the desktop version of the page is available? In this case, will Google show the desktop content in mobile searches, or what will happen? So we have a name for this, but I forgot what it's actually called. Essentially, you have a desktop site, and the mobile version doesn't have all of the individual pages. That's perfectly fine. For individual pages where we only have a desktop version and no mobile version, we'll take that desktop version and use that for indexing and ranking as well. The important part is that when users on a mobile device go to the desktop page, they don't get sent to a 404 page. That would be problematic, because Googlebot crawls with a mobile user agent now, with mobile-first indexing. So if Googlebot crawls those pages and gets redirected to a 404 page, then that's a 404, and we treat that as a 404. Whereas if Googlebot can crawl with a mobile device and access the desktop pages, and they work, and there's just no mobile version of that page, that's perfectly fine. So desktop-only sites continue to work with mobile-first indexing as well.

If we have a menu with submenu items that are only visible when the user hovers over them or clicks on them, will Googlebot see these items and crawl the pages linked? If the menu content and the links in the menu are loaded in the HTML, then yes, we'll be able to see that. On the other hand, if hovering means that the website does a call to the server and says, hey, what should I show here, and then the server returns just that menu content, then that would be something we would skip, because Googlebot doesn't know where to hover to see additional content. So as long as you're using something kind of in the traditional setup, using CSS to just make something visible or not visible, that should just work fine.
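A sketch of the distinction (hypothetical markup): in the first case, the submenu links are already in the HTML and are only revealed with CSS, so Googlebot can see them; in the second, the links don't exist until a script fetches them on hover, which Googlebot won't do.

```html
<!-- Links present in the HTML, shown or hidden purely with CSS: crawlable -->
<style>
  .menu li ul { display: none; }
  .menu li:hover ul { display: block; }
</style>
<ul class="menu">
  <li><a href="/products">Products</a>
    <ul>
      <li><a href="/products/widgets">Widgets</a></li>
      <li><a href="/products/gadgets">Gadgets</a></li>
    </ul>
  </li>
</ul>

<!-- Submenu fetched from the server only on hover: these links would be missed -->
<ul class="menu-dynamic">
  <li id="products-item" onmouseover="loadSubmenu()">Products</li>
</ul>
<script>
  function loadSubmenu() {
    fetch('/menu-items')  // hypothetical endpoint returning menu HTML
      .then(function (r) { return r.text(); })
      .then(function (html) {
        document.getElementById('products-item').innerHTML += html;
      });
  }
</script>
```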
What about if, because of responsive design, you have your navigation and the actual links coded twice, because of some layout issues and things like that? Does that cause any issue? That's fine. Yeah, that's fine. That's something we sometimes see on responsive sites: they have one set of navigation for desktop and one set of navigation for mobile. That should just work. I think it's a bit trickier if you have content duplicated like that as well, especially if it's not clearly separated. I've seen some responsive sites where the responsive version will swap out individual words in a sentence, because they have some really fancy software that does that for them. But for us, what happens there is we see kind of the full sentence with all of those word variations, and we think this is one sentence, whereas actually it's the same sentence phrased in different ways through the responsive design. But that seems to be really rare, so probably not something you'd run into there. When it comes to just the menu, the sidebars, things like that, that'll just work. OK, thanks.

What exactly does Fetch as Google do? If you fetch a 404, does Google see the 404 and act on it? If you fetch a 301, does it understand the page has moved? If you fetch a noindex page, does it read the noindex tag? What happens when you fetch versus when you fetch and submit to the index, particularly on pages with noindex tags? So plain Fetch as Google only requests the page; it doesn't do any additional processing. If you do Fetch and Render, we try to render that page with Googlebot. So that's the extent of the additional processing that's done there. You can fetch a 404 page to confirm that the 404 status code is returned. You can fetch a 301-redirecting page to see that the redirect is actually being picked up on. But nothing happens in our index based on just fetching pages. It's different if you fetch and submit. In those cases, we do try to process what we've seen and process that for our index. So if you submit a page that has a noindex on it, then that's something we'll take into account as quickly as possible when processing that page. If you submit something that's a 404, well, I don't think you can actually do that at the moment, because the tool will only let you submit something if it returns a normal page, so probably you can't actually submit that. But anything that you have changed on a page, or a new page that you've added and submitted, we'd be able to process. That's something that you see directly in Search Console when you go to Fetch as Google in the tool: if we've successfully fetched, then you have the link to submit that to our index. It doesn't guarantee that we'll be able to index it immediately. We do try to get there as quickly as possible, but for some sites or in some situations, we just can't get there as quickly as you'd like, and sometimes it still takes a couple of days for everything to be processed. For the most part, though, we try to speed that up as much as we can.

How will Googlebot, after the mobile-first index, treat display:none? In such a case, will the content be discounted only for mobile? So we've talked about this quite a bit. On mobile, if you're not showing content for design reasons, for example, which is really common on mobile with responsive design a lot of the time, that's still perfectly fine for us. That's something that we will treat as normal content.
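A minimal sketch of the kind of design-driven hiding being described (the class name is hypothetical):

```html
<style>
  /* Hidden on small screens purely for layout reasons; per the answer above,
     this is still treated as normal content under mobile-first indexing. */
  @media (max-width: 600px) {
    .extended-details { display: none; }
  }
</style>
<div class="extended-details">
  <p>Full specification table, shown only on larger screens.</p>
</div>
```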
How long is a piece of string? OK, let's see, it goes on. We moved the site from .co.uk to .org many weeks ago, and in Search Console it's still saying it's undergoing a move. It's only a small site as well; is that normal for a site move? I haven't personally noticed one take so long before. So I believe the setting in Search Console just sticks for a while. I don't know what the time period is there, something like half a year or three months or something like that. It's not the case that this signals that things are still in flux and we still haven't figured it out. It's more that, well, you've selected this option, therefore we'll keep it selected until you cancel it or until it expires naturally. So it's not a sign of anything being stuck in any particular way. In general, you'll see this fairly quickly if you look at your analytics data for the website: you'll probably see most of the traffic going to the version that you redirected to. There might be some individual URLs still indexed from the other version; sometimes it just takes a bit longer for us to reprocess everything on the older version. But for the most part, we'll probably have moved things over already. A lot of times, especially for smaller sites, we can process this kind of a move within a couple of days. But still, like I mentioned, the setting in Search Console kind of sticks and will remain active even if we've already moved everything over.

What are best practices for fixing search results after taking over a website for a charitable organization that has 301 (permanently) redirected the club's domain? So I guess they set up a redirect from the old site to a new site, and now they want to go back to the old site again. In general, this is something that might be a bit tricky on our side. So what I would recommend doing is redirecting back, at least, so that from the old site, from the site that you initially redirected to, we see a redirect back to your preferred version, so that any time we crawl the version that we found from the initial redirect, we can get back to the version that you actually do want to have indexed. That's the first step I would take there, like you would with any other site move. And then, from there, follow the general guidelines with regards to site moves. In particular, try to make sure that all of the external signals are aligned with where you want your site to be placed as well, so things like external links, internal links, sitemaps, canonicals on these pages: everything aligns with the URLs that you do want to have indexed. Sometimes this works fairly well; sometimes it's a bit tricky, depending on how long you've been redirecting to the other site. So I wish you luck, I guess. Any time you do a site move, it can be a bit tricky, and if you do a site move one way and then redo that and redirect back again, that makes it tricky as well. So sometimes you just need to be patient for everything to settle down properly, too.

We wanted to implement structured data for financial products, but most of these schemas are in pending status at schema.org. We would like to know if we can implement them anyway, and whether Google will be able to recognize this structured markup. So if this markup is currently not shown in search, then we wouldn't use it to show anything in search, which is, I guess, kind of obvious. If we don't use it for rich results, then it has a very low effect there. We do use structured data on websites to try to understand the context of pages a little bit better, kind of where we would show this, where it would be more relevant. But it's not the case that you'd see any immediate or general ranking boost just for having structured data on a website. You're always welcome to implement structured data, even newer structured data. When it comes to schema.org, I know the team does look at the use of individual structured data items on the web. And if they see that some types are being used quite often and they're actually not officially recognized by schema.org yet, then that's something that sometimes encourages them to say, OK, maybe we should figure this out and get this verified with the other people in the group, so that we can put this on the more official list as well. But that's more something for, I guess, long-term thinking, and not something where implementing this markup will have an immediate SEO effect on your website.
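As a hypothetical sketch of the kind of markup in question, using a financial product type that sat in schema.org's pending area at the time (all names and values here are made up):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "CreditCard",
  "name": "Example Rewards Card",
  "annualPercentageRate": "19.9",
  "feesAndCommissionsSpecification": "https://www.example.com/card-fees"
}
</script>
```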
Yeah, hi, so it was my question. My understanding is that if we implement those sorts of markups, it might specifically help Google understand those kinds of entities. We're talking about certain financial products, whether it's a loan, a kind of credit card, debit card, those things. So it's just not recognized yet, but I have seen many websites start implementing these things, so I was just wondering. I mean, if you can implement it, you understand my sense a little bit. I think implementing something like this is perfectly fine. I just wouldn't assume that Google will pick it up and turn it into magic in the search results. It's more, I think, long-term and kind of, I don't know, philosophical thinking, where you're saying, well, I believe in this markup. I know it's currently not used, but I think it'll have a big effect maybe a couple of years down the road, and that's something you can always do. It doesn't have any negative kind of issues with regards to search if you implement it. But if you're expecting an immediate return on investment, then probably that markup is not something that would be critical for your pages at the moment. OK, thank you.

What's the current status of the mobile versus desktop index? Will the desktop search results be completely based on mobile signals? Is there anything related to this already set in stone? So yes, even on desktop, we'll use the content crawled from mobile to generate the search results. We'll still show the desktop URLs if we have separate URLs. So we'll still try to keep things reasonable: we'll make sure that users on desktop devices continue to see great and relevant results, but those will be based on the mobile-indexed content. With regards to anything being set in stone, I don't think anything on the web, or in particular at Google, is ever set in stone. There are always changes. We're making these changes fairly slowly compared to other changes that we've made in the past, because we do want to be cautious here. We want to make sure that we're doing the right thing, that we're not causing more issues for sites, but rather that we're doing what users expect. And if we see that things are going wrong, or that something's breaking along the way, we'll pause and figure out a way to deal with that, to resolve that issue or take whatever steps are necessary.

All right, so I think we just have a couple of minutes left. Let me just open it up to more questions from you all, if there's anything else on your mind that I can help with. Yes, John, I have one. Oh, please, go ahead; I already asked many questions, so you can ask this one. Hello? Yeah. Hello. Yes. Hi, this is Kalina from publicas.com. I posted a question regarding a manual action on the post. So basically, we've been removed since December 30th based on a hacked content alert, and we submitted a review request the same day.
It was shown as processed, but we still seem to be manually removed from the rankings. What's the best next action to take to get this removal lifted? Because we've gone through all the checklists, and we seem to be doing everything according to the guidelines. OK, I don't know. I'd probably have to take a look at what exactly is happening there. Do you have a forum thread where you posted the details, perhaps? Yes, we have posted it on the Webmasters forum, and the details are also in the comments of this event. OK, so I'll take a look there to see what I can pick up. It's kind of tricky sometimes to do something live here and figure out exactly what is happening right now. But if it's something where we've been able to clean up some things, or maybe not everything, maybe there's something that we can let you know about to try to clean that up. OK, cool. I'll copy that out. And if you can include maybe a link to your forum thread in the comments on the event, then I can double-check to see what else might be happening there. OK, cool. Do I post that here in the group chat or in the Google Hangouts link? On the Google Plus page, where all of the questions were submitted. Yeah, OK, excellent. Because then I can get back to you there, and if there's anything specific that you need to do or to change, then we can figure that out. Great, sounds good. Cool. All right. Yeah. Go for it. Thank you.

Yeah, so I'm just wondering how Google crawls a website which is actually rendering its content through JavaScript. Basically, I've seen many sites in India using React.js and everything, and most of the content comes through JavaScript. But if I view those pages, I mean without the JavaScript running, they are typically all blank pages. And they're doing pretty well in search, even for pretty general terms. So I'm just wondering: does Google look at some other signals as well, apart from the main content? Why are they doing so well without having that main content? We do render pages when we see that they're using JavaScript, to figure out what content would be shown to users. Sometimes that works a bit better, sometimes not so well. So you still kind of have to know what you're doing, but we do try to figure that out and to deal with those kinds of pages. There are also some techniques that some sites use to make it a little bit easier for us, such as pre-rendering the JavaScript site and serving us the static HTML version, somewhat similar to how you might serve a kind of better-designed version for mobile users when mobile users come to a website. So there are various techniques that people use to try to get that content into Google and to make sure that Google can understand it well. Sure. Thank you.
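A minimal sketch of the pattern being described (the content is hypothetical): the raw HTML the server sends is essentially empty, and the visible content only exists after the script runs, which is why the page source looks blank but the page can still be indexed once rendered.

```html
<!-- What the server sends: almost no content in the raw HTML -->
<div id="root"></div>
<script>
  // Content is injected client-side; a crawler has to render the page
  // (or be served a pre-rendered version) to see it.
  document.getElementById('root').innerHTML =
    '<h1>Product comparison</h1><p>Full product details go here...</p>';
</script>
```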
Hey, John, it's Brian. How are you? Hey, Brian. Long time no see. I know, long time no see. Question for you: when you were talking a minute ago about the desktop search being based on the mobile crawl, does that mean that content that's not visible on desktop can impact the rankings, if you're just looking at the mobile version? Content that's not visible on the mobile version would affect desktop. Right, that way, yeah. So if you have an m-dot site or a dynamic serving site, and your mobile site has just half of the content of your desktop site, then we only index the mobile version, and we kind of take that reduced version and try to rank your site based on that. OK. Cool. Thank you.

John. All right. John? Yes, one last question; I need to head out otherwise. OK, I'll make this simple. There's always a discussion about increasing site speed, and the question becomes: let's say you can get your load time down to, like, three seconds. Is there much benefit to improving it further, besides the obvious, people not leaving your site? I ask this because there are some recent stories about sites converting to some really fast cached versions, for example, and it's really benefited them a lot more than I guess we would have suspected. OK. At the moment, we try to differentiate between reasonably fast and really slow sites. So if you're already in the couple-of-seconds range, then that's reasonably fast. A lot of people see more indirect effects, though, things like the conversion rate changing, or people browsing more on a website if it's really fast, those kinds of things. And sometimes those non-SEO effects have a much bigger effect on the bottom line than pure ranking changes. True, true, true, true.

Along the same lines, and I asked this in the questions, but good content versus bad content: is there really much benefit to distinguishing between good content, bad content, and then really super amazing content? I think that's kind of up to you, to figure out where you want to focus your energies. I don't think our algorithms have just three buckets. We try to look at it across the whole range of different quality and also, more importantly, I guess, the relevance to individual queries. Right. So a range is really kind of what I was asking about, not like three buckets. Exactly. Is it more than just a binary kind of thing? Yeah, definitely. Thank you.

All right. So I need to head out. Thank you all for joining. Thanks for all of the questions. And I think we have the next Hangout lined up for Friday. So if I missed something, feel free to move things over there. And I wish you all a great week. Until then. Thanks, John. Thanks, everyone. Bye. Thanks, John. Bye.