All right. Welcome, everyone, to today's Google Webmaster Central office-hours hangout. My name is John Mueller. I'm a webmaster trends analyst here at Google in Switzerland, and part of what we do are these office-hours hangouts with webmasters and publishers, anyone with questions around web search and their websites. As always, if any of you want to get started with the first question, feel free to jump on in now.

OK, I want to ask one. All right, so we have pages on our website where a specific part of the page is taking too much time to load. It takes almost nine seconds, and we have ten seconds. So what we are thinking is whether we can block some of the resources from the page itself, since some of the content is dynamic. When I check after blocking those specific scripts, my page still renders perfectly fine. So if we can block those specific scripts for Google, for rendering purposes, will it make any kind of difference in terms of page load time or anything like that?

I'd be kind of careful with blocking scripts just for Google, because then it might happen that we can't render the page properly, that we can't see the full layout properly. And the tricky part there is, if we can't see the full layout properly, we don't know if the page is mobile-friendly or not. So especially on mobile, you wouldn't have that kind of mobile boost, because we don't know: is it mobile-friendly or is it not mobile-friendly? We can't see the page properly. So that's the kind of thing I'd avoid doing there.

Also, with regards to speed, at the moment we differentiate between kind of reasonably OK, fast, and really, really slow. So something that takes multiple minutes to actually load a page, that's something we would say is probably pretty slow compared to the rest of the sites. With the change to mobile speed, we'll look at that in a little bit more differentiated way. But we use a number of factors with regards to speed there, including some factors that we see from Chrome, where we see what users are actually experiencing. So just blocking specific scripts to try to make the page look faster for Googlebot probably won't change anything with regards to how we see the speed of your site. Instead, I'd try to see if there are ways to speed up the site in general, for both users and for Google.

Sure. The reason I'm asking is, I mean, you mentioned you're checking in Chrome too. Basically, I'm just trying to block those specific scripts. It's taking two seconds to load that one specific part of the script, and if I disable it, my page renders fine, on mobile as well as on desktop. It's the kind of script that only works after the whole page is loaded; it's there for functionality purposes.

Yeah, maybe there are ways to embed the script so that it loads after the page is actually rendered, kind of a delayed loading. Especially if you just need it for functionality, then maybe that's fine: you wait for the page to load, and then you load that script. It's a bit hard to say without looking at the pages themselves, but those are some things that I would look at. Maybe there are ways to implement that.

Sure. All right. Thank you, John. Sure. All right. Anything else from anyone else before we get started with all of the submitted questions? No? OK. Hi, John. All right.
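To make that delayed-loading suggestion concrete, here is a minimal sketch in TypeScript, assuming the slow script is only needed for functionality after the page has rendered; the helper name and script path are hypothetical:

```ts
// Hypothetical sketch: inject a non-critical script only after the page has
// fully loaded, so it no longer delays the initial render.
function loadDeferredScript(src: string): void {
  const script = document.createElement("script");
  script.src = src;
  script.async = true; // execute as soon as it arrives, without blocking parsing
  document.head.appendChild(script);
}

// The window "load" event fires once the page and its subresources have
// finished loading, so only then do we fetch the functionality script.
window.addEventListener("load", () => {
  loadDeferredScript("/scripts/secondary-functionality.js"); // hypothetical path
});
```

A `defer` attribute on the script tag is a lighter-weight alternative, though it still executes before the load event; waiting for `load`, as here, is the most conservative variant.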
I have a question. The mobile speed update, is it a ranking signal for both desktop search and mobile search? It's only for mobile. Only for mobile. OK.

And there's another question. We use Google PageSpeed Insights to check web page loading speed. Most of the time, it shows us that there's no data available for the page speed; it only shows the score. Now, for some of our websites, the score is very good, but when we use other tools to check the speed, we find that the actual loading speed is very bad. For other websites, it's vice versa: the score is very bad, but the loading speed is very good. So what does Google actually take into consideration, the PageSpeed score or the loading time?

We use a number of different metrics because, like you said, sometimes a page looks really good by one score and really bad by another. So we try to take the different measurements that we have and combine them in a way that we think makes sense and gives us an overall view of the speed of the page itself. So if you're seeing that in one tool it looks good and in another it looks bad, I would look into why the other tool says it's bad. Maybe there's something simple that you can change to make sure that, across the board, you're actually really good.

OK, so my next question: after we got the message that Google is going to take mobile page loading speed as a ranking factor, we saw some changes in our search rankings. Is this because of this update, or something else? The mobile speed change, I think, is rolling out only in July. So it shouldn't be something that you would see already. OK, thank you. Sure.

All right, yeah. Real quick, someone actually asked me about this just a few minutes ago, and that was the gray versus the green URLs in the search results. Do you know what I'm referring to? No. OK, on some searches, a few people have noticed this. Underneath the main title and link, you'll see the URL for the website. On a single results page, it'll have a mixture: some in green and others in gray. I didn't know if that was something that you guys were testing.

Possibly. I mean, that sounds like the type of thing that we would test: the colors and the layouts, the size of individual text elements. That's something that we test all the time to try to see what makes the search results a little bit easier to understand, and easier for people to process and find the right stuff. So I don't know what this specific test is looking for or what they're doing there, but that sounds like a typical UI test that we do: shift things around slightly, change the colors a little bit, and see what works and what doesn't work so well. OK, thank you.

All right, let's see what we have submitted here. My website currently uses noindex on product pages with automated translations. Once the product is manually translated by the content team, we then remove the noindex straight away. We are seeing Google take months to reindex a page with the manual translation, even though we have a dedicated XML sitemap to accelerate indexing. How can we accelerate the indexing of these pages? Do you think we're using noindex properly here? So this is definitely a good way to use noindex.
If you have these products on your site in one language, and you have translations in other languages and those translations aren't good yet, then using a noindex like this is perfectly fine. Depending on how you do your automated translations, it might also be the case that you could say, well, you'll just index the page without that block of automatically translated text. Maybe you'd just put the title of the product itself and let the page get indexed like that. And then once you have the translation cleaned up, you add that to the page as well. That might be an option.

But in general, the mechanism of having a noindex on a page, fixing the page, and then removing the noindex from that page, that's perfectly fine. Sometimes that does take a while to get reprocessed. We have to recrawl those pages, and depending on the size of the site and what else we're crawling from that site, that can sometimes take a bit of time. But in general, that's kind of the normal crawling and indexing process. So that's not really something where I'd say anything is particularly going wrong. What I might look at in a case like this, where you're seeing a really long time for content to get picked up again, is to see if we're crawling a lot of unnecessary URLs. So if your shop is really small and you don't have a lot of products in there, then probably this is less of a problem. But if you have tens of thousands of products, and you have millions of pages that can get crawled with all of these URL variations, then obviously, with a cleaner URL structure, we'd only have to crawl maybe, I don't know, a couple hundred thousand pages, and then we can get to these updated product pages a lot faster. So that's kind of what I would look at there.

Does a 410 work any faster than a noindex? To get it indexed again, or? To get it removed from the index. To get it removed, it's about the same. So if you want to remove something, then a 410 or a noindex is about the same. 410 and noindex are generally a little bit faster than a 404. But in practice, if you're looking at a period of a couple of weeks and you want to get this content removed, they're all about the same.

Do I still need to add both versions to Search Console, so the www and the non-www version? Yes, at the moment we recommend doing that, just so that you have all of the data. We're currently looking into ways to make that process a little bit easier. So we'll probably ask around for input on Twitter or somewhere else to see what your ideas are there, where basically you'd just add the root of your website, and then we'd automatically include the www, non-www, HTTP, and HTTPS versions in the same listing, so that you have all of the data in one place. Maybe it would even make sense to include the subdomains there. I don't know. We'd probably like to get your feedback on that. So probably we'll ask around for more tips from your side in that regard. But at the moment, if you want to make sure you have all of the data, I definitely recommend adding all of those variations, even though it clutters things up a little bit. I know that sometimes it looks a bit messy.

When trying to rank in the local map pack, how long should I expect this to take for a new website? What are the most important factors there? I don't know how Google Maps does the ranking in their listings there, so I can't really help with that. The only thing I know of is the Google My Business listing, and that's probably what you're already doing.
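As a rough illustration of the noindex and 410 mechanics discussed above, here is a sketch of how a shop server might emit both signals, assuming a hypothetical Express setup; the product data and route are made up:

```ts
import express from "express";

const app = express();

// Hypothetical example data standing in for the shop's real checks.
const products = new Set(["123", "456"]);
const readyTranslations = new Set(["123:de"]); // product 123 reviewed in German

app.get("/products/:id", (req, res) => {
  const id = req.params.id;
  const lang = (req.query.lang as string) ?? "en";

  if (!products.has(id)) {
    // Permanently removed: 410 Gone, which per the discussion above behaves
    // much like a noindex for removal, both a bit faster than a 404.
    res.status(410).send("Gone");
    return;
  }

  if (lang !== "en" && !readyTranslations.has(`${id}:${lang}`)) {
    // Header equivalent of <meta name="robots" content="noindex">; this stops
    // being sent once the manual translation is marked ready.
    res.setHeader("X-Robots-Tag", "noindex");
  }

  res.send(`<html><body>Product ${id} (${lang})</body></html>`);
});

app.listen(3000);
```

The `X-Robots-Tag` header is the HTTP equivalent of a robots meta tag, which is handy when the HTML itself comes out of templates you'd rather not touch per language.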
If a website is already optimized for mobile, is it still advantageous to implement AMP? I think that can still make sense. So there are a number of features that require AMP to actually work well, where we can cache the content and deliver it directly in the search results. And these are things that we can't do with normal mobile-friendly websites, because for security reasons within a browser, we can't serve your content from the Google.com cache. So that's one of the reasons it might still make sense for you to use AMP. I would definitely try it out and think about where your website would be shown in Search, and where it might make sense for your content to actually be served with AMP. I know a lot of websites are already moving over to AMP-only setups, where basically they only serve an AMP version instead of a mobile version or a responsive version. So that might be another option to look at. But in general, just because something works well on mobile doesn't mean that AMP doesn't make sense.

Yes? I have one question regarding that, not specifically AMP, but something about PWAs. Google recently said it now supports PWAs for desktop as well. So are there any specific guidelines we need to follow, or can we just do for desktop what we are doing for the mobile site, and it will be OK? Or do we have to follow a different implementation, let's say some combined setup of client-side rendering as well as service workers and so on?

No. So if you use a responsive design and use that for your PWA, that should just work. There's nothing like a separate desktop PWA that you would have to set up. As long as the design works on desktop, which is usually the case when you have a responsive website, then that should just work.

Should we read anything into the fact that we're seeing much better rankings on mobile than on desktop? I'm talking about a five-position difference, including on individual queries. I don't know. That sounds like a good thing, in that we're showing things a little bit higher on mobile for your site. I don't necessarily know where all of that would come from. It might be some mix of maybe local results, personalization, or, I don't know, it's really hard to say. But in general, I wouldn't see that as a sign of you doing anything wrong. It sounds more like you're doing things pretty well for mobile.

Has your team considered issuing general messages in Search Console, like "your site's content needs improvement", or something about links if the links were problematic? We looked into something like, I don't know, a quality meter or something like that for sites for a while in Search Console. But in general, it's kind of tricky, because the meter we would give for quality is essentially how relevant we think your site is in Search, which is essentially how we're already showing your site in Search. So the current ranking, clicks, and impression information we have in Search Console is essentially already trying to reflect how relevant we think your site is. And it's sometimes really hard to bring an artificial metric into Search Console to say your site is low quality or high quality, or it needs improvement or it doesn't need improvement, if we don't really use that same information in Search one-to-one.
So it would feel a bit artificial to me, in that we'd tell you your site needs improvement in this regard, but actually in Search we look at something completely different. So that's kind of tricky, I think.

The case that I think of is where I've seen sites that have 100,000 blank pages indexed in addition to their site, which is like 20 pages, and they may not know that there's all that empty content on their site that's causing problems, I guess. I understand what you're saying, but I think about these certain cases. What do you call them? Stub pages, lots of stub pages.

Yeah, I don't know. I think there is probably room to find some of these things that we can show to sites, where we can say, well, this is perhaps a reason your site is not performing so well: because you have way too many pages compared to the amount of content that you actually have, or something like that. But it's hard to frame that in a way that really reflects what we think would have an impact on Search, and I think that's always tricky. I mean, that's the same problem that you have with different SEO tools, where basically you get some information from this SEO tool, and you almost have to be an expert to understand what exactly this tool is looking at and what that actually means, to interpret: is this even relevant for me or not? Or is this just busy work that I have to do? Does it actually change the search results for my site or not? But if you guys have any ideas or recommendations in that regard, I'd love to hear about them. I mean, the Search Console team is working on the new Search Console, and if there are things that they could add to make it easier for you to create better websites, I think they'd be totally up for doing that.

What's the best way to conduct site surveys if pop-ups are no longer allowed on mobile? Are exit surveys acceptable? Essentially, what's important for us is that when a user comes to a site from the search results, they see the content that they were kind of promised in the search results. That's why we implemented this policy with regards to the pop-ups and interstitials, especially on mobile, where that's much more of a problem. What you do afterwards, when the user has made it to your site, is more up to you. So if someone clicks on a link within your site and you want to show a pop-up in between, because it looks like, oh, they're engaging with my site, then that's totally up to you. That's something that you could do. I don't know how exit surveys would work on mobile. Or scroll-based or time-based? Both scroll- and time-based pop-ups are also things that we would consider as blocking the initial view of the page. So that's something where I'd be kind of cautious, or at least set a really high threshold, so that you're really sure that users have had a chance to actually see the content on the page.

Two questions about structured data and mobile-first indexing. How do I add carousel markup for e-commerce category pages? I tried to add it per the Google guideline, but it only shows one URL; in the screenshot, the carousel has many URLs. How do I make that work? I don't know. I'd probably need to take a look at the page directly to see what is actually happening there. What I would recommend doing, if you're still struggling with this, is posting in the Webmaster Help Forum.
There are some people there who have a really good eye for structured data issues, who can guide you with regards to what you might need to change in your markup.

The current mobile site is an m-dot site. Should all content of the desktop site be the same as the mobile site? Yes, we recommend making sure that the content is the same, or at least equivalent, on the m-dot site as on the desktop site. Especially with the shift to mobile-first indexing, we'll only index the m-dot site in that case. So if there is content on your desktop site that's not on the m-dot site, then we would completely miss out on that. And that includes things like structured data. So if you have the structured data only on the desktop site but not on the m-dot, then we would completely lose that when we switch over to mobile-first indexing. At the moment, our classifiers do try to recognize those situations and try to save those sites for later. And maybe we can find a way to message those sites as well. But in general, it should really be the same on mobile as on desktop.

If we change to a responsive design, what should we do in order to prevent a ranking drop? Should we 301 all m-dot URLs to desktop URLs, or 301 to the equivalent URLs? We did a blog post on shifting from m-dot to responsive design, I think late last year sometime. So I'd double-check that. Redirecting is definitely a good idea, because people might be going to your m-dot pages anyway, and it would be good to get them to the right version.

We have two brands and a selection of websites, both subdomain-based and on dedicated domains, in the UK, and a similar setup across Germany, all with essentially the same products. The difference is that one is marketed to photographers and videographers, the other more to designers. The websites themselves also have the same structure and templates. How would Google deal with this? And should we be worried about content or structural issues, with Google possibly demoting them?

So generally speaking, if you have two websites and you have similar products, that's less of an issue. But once you have more than, say, a handful of websites that all have the same products, then it can happen that our algorithms or our webspam team looks at this and says, this is more like a collection of doorway pages, where essentially you have one product and you just have all of these different websites that list the same product in different variations. So that's one thing I would watch out for there. The best way to avoid that becoming a problem is, for each of these products, to pick a canonical website and say, well, this product applies primarily to maybe photographers, so I'll set the canonical to the photographer website that we have. And this other product applies primarily to videographers, so I'll set the canonical to that one. In that case, you can still list the same product in both of these shops, or in all of the shops if you have multiple shops, so users can go there and they can still buy it there. But in Search, we'll just index one of these, the canonical version. The advantage for you is also that we can combine all of the signals that we have into that canonical version, which usually makes that canonical version a little bit stronger in Search, so that it's easier to find as well. So that's kind of what I would recommend doing there.
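As a sketch of that pick-one-canonical approach, assuming hypothetical shop domains and product slugs, every shop's copy of a product page would render the same rel=canonical tag, pointing at the single chosen version:

```ts
// Hypothetical mapping: for each shared product, one shop is chosen as the
// canonical home; all other shops point their rel=canonical there.
const canonicalShopForProduct: Record<string, string> = {
  "camera-rig": "https://photo-shop.example",
  "grading-panel": "https://video-shop.example",
};

function canonicalLinkTag(slug: string): string {
  const shop = canonicalShopForProduct[slug];
  // No explicit choice made: emit nothing and let the page stand on its own.
  if (!shop) return "";
  return `<link rel="canonical" href="${shop}/products/${slug}">`;
}

// Rendered into the <head> of the product page on every shop, so signals
// consolidate onto the one indexed version:
console.log(canonicalLinkTag("camera-rig"));
// -> <link rel="canonical" href="https://photo-shop.example/products/camera-rig">
```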
Again, with two shops I wouldn't be so worried; if you're talking about more than a handful of shops, then I would definitely find a way to pick a canonical for these individual products.

John? Yes. I have one question regarding the canonical. We have a section on our website, and we want to move a certain part of the content from one section to another. But the thing is, we don't have a provision or functionality to redirect those specific sections. So can it be done through canonicals? Will that be fine? Or, if you put the canonicals in, can Google still crawl the previous pages and somehow try to rank the old URLs instead of the new ones?

We use a number of different factors to pick which URL to show. That includes redirects, but also canonicals. So I think if everything else is aligned, so that the internal links point to the new URLs and you have a rel canonical set to the new URLs, then probably we will shift over to the new URLs as well. You can also sometimes additionally add something like a JavaScript redirect to those pages. Sometimes that's easier to do, and when we see a JavaScript redirect, we also treat that as a normal redirect. So that would also tell us, hey, this new URL is the one that you actually need to index. Cool. That can also be done on subdirectories as well, instead of just by code? Thank you. Yeah.

All right. And there's a question on Twitter. Someone is asking, is this an SEO trick? I don't know, it's hard to say what specifically you're pointing at there. I think it is related to this timestamp. I mean, it says four days ago, but when you check on the page, the actual date is somewhere around 20 days ago or something like that. Oh, OK. Sometimes we pick up dates wrong. So that's something where I know the dates team is looking into cases like this. I can pass this on. But in general, we try to figure out what the date is that's relevant to the page, and we try to show that. And it's more a matter of just showing it in a snippet than any SEO trick.

So, John, in that tweet, what I'm showing is that in the Google search results, it says the page was updated four days ago. But that article is actually two years old, if you see the timestamp in the article. Yeah, I suspect we're seeing some other date somewhere else in the article. So maybe we're accidentally picking up a comment date or something like that. I wouldn't assume that this is some kind of SEO trick that they're trying to pull, because it wouldn't really change anything on our side with regards to ranking anyway. It's really just a snippet that we show there. But for the end user, it's not a very good experience. If Google is saying that this article was updated four days ago, then we expect the latest articles. Sure, yeah, I totally understand that. So that's something where, if you run across things like this, you're welcome to submit feedback at the bottom of the search results. There's a feedback link to let us know about these kinds of things. But it's sometimes tricky with dates, because sometimes an article is updated, and sometimes it's updated a lot later than it was originally published. And it's like, which date shall we show, the original date or the updated date? That's sometimes hard to say. But sometimes we also get the dates wrong, and that seems to be the case here.

All right. We have a website where, despite many attempts, we can't get the home page indexed. All other pages are indexed properly.
It's currently excluded because of "Submitted URL not selected as canonical". We tried a whole bunch of things, including a different domain. Nothing worked.

I looked at this briefly before the Hangout, and it does seem like something weird is happening there, in that we're picking a different domain instead of the one that you have. From the domain name, it sounds like it might be related to your domain, but the domain that we pick currently doesn't have any content, so it's kind of weird that we would run into this. I need to double-check with the team to see why we're picking that one. But if that other domain is also yours, then I would recommend setting up a redirect rather than removing the content there, so you're clearly guiding us towards the version that you want to actually have indexed. And especially if you're saying that you switched to a new domain and it didn't work, so you switched back, then maybe that's from that change that you did there. And if you switch domains like that, I'd really recommend setting up redirects so that we can find the actual version that you do want to have indexed.

We have a multi-location brick-and-mortar store. We have a single website, but for various reasons the two locations have separate inventory, and we need separate e-commerce storefronts. We're trying to decide whether to link those stores directly from the home page, or if we should have the home page link to /stores and then link to them from there. We seem to need to balance two competing interests. On the one hand, we want to get users to the relevant store with as few clicks as possible, and we don't want to bury products any deeper in the hierarchy than we have to. On the other hand, the hierarchy may be clearer to users and to Googlebot with home page, then /stores. So what should we do?

What should we do? In general, both of these setups would work. So I don't see any big advantage in having the URLs in separate subdirectories even further down. From our point of view, we don't count the slashes in the URLs. So if you put it into /stores and then /location, and that's how you want to keep your website on your server, that's perfectly fine. What does matter for us a little bit is how easy it is to actually find the content there. So especially if your home page is generally the strongest page on your website, and from the home page it takes multiple clicks to actually get to one of these stores, then that makes it a lot harder for us to understand that these stores are actually pretty important. On the other hand, if it's one click from the home page to one of these stores, then that tells us that these stores are probably pretty relevant, and that we should probably be giving them a little bit of weight in the search results as well. So it's more a matter of how many links you have to click through to actually get to that content, rather than what the URL structure itself looks like.

So a flat architecture may be more beneficial than an acceptably long kind of pyramid? Well, from just the URL itself, it doesn't matter either way. But with regards to making it easier to understand the context of these pages and how important they are, I think a flat architecture definitely tells us a little bit more. But it's still something where you have to be careful not to overdo it. So if you link to all of your pages from the home page, then they're all on there, so it's not something where you'd get much value from that.
So you still need some structure, some context around those pages. But if you have a handful of pages that are really important for you, then it would be perfect to link to them from the home page, or from any of the important pages that you might have on your site. Yeah. What also makes a big difference for us, especially if the home page is really important for your website, is that newer content is also linked pretty high within the structure of your website, so maybe even on your home page. So what a lot of sites have is this sidebar with new articles, or new products, or products that are on sale, or something like that. Anything that you want to push a little bit in the search results, that definitely helps us there as well. So if there's something new or something that changed on your website, and you think it's important, then make sure it's linked from the home page somehow.

John, instead of linking just from the home page, can we link it from other category pages as well? Those are getting a lot of traffic from Search. Would that be OK, or do you want us to link only from the home page? I think it's perfectly fine to link to it from different locations. I mean, that's kind of the organic way that you set up a website, so I would totally do that. If it's something that's important for your website, then make sure that it's really obviously important within your website, so that regardless of where Googlebot goes, it can see: well, this is really critical content; this is something I need to focus on right away. All right, cool. Thank you.

All right. I still have Webmaster Tools for HTTP, even though my site migrated about a year ago to HTTPS. Occasionally I'll get messages via Webmaster Tools for HTTP that I won't receive on HTTPS. I even see things like spikes in indexed AMP pages on HTTP occasionally. Is there an issue with having both? Should I get rid of HTTP? Usually, we should be able to figure out that everything has moved to HTTPS. But like you're seeing here, sometimes we still see some signals that point at HTTP, and we'll show some information there. So that's one of the reasons why I'd recommend making sure that you have both of these versions verified. Not having a version verified in Search Console doesn't change anything with regards to how we associate information with your website. So I would definitely keep both of those verified in Search Console.

I noticed when we got the message for the manual action that it only went to the messages of HTTP, but the manual action showed up in both. And we resubmitted on HTTPS, and I'm wondering whether I should also resubmit on HTTP, because obviously, as we've discussed, it seems like maybe there wasn't a reason for it at this point, and I obviously want it to go away, because it's clearly having an effect. So I want to make sure I cover all bases. But also, I do see these oddities, like the spike in AMP. So that's why I asked.

For reconsideration requests, it doesn't matter. So internally, when we take a manual action on a website, it applies to both HTTP and HTTPS. What might be happening in a case like that is that when we send the message out, we only send it out to one of the sites in the account, so that we don't overload your email with unnecessary messages that are essentially the same. So I know that there's some logic within Search Console that says, well, the webmaster just received this message anyway, so we shouldn't send the same thing again to the same account.
We should still show that information on HTTP and HTTPS. So I think that's working OK. It's just the message that you receive; it might go to one or the other. I don't know if there's any priority set up with regards to that or not. With regards to the spike in indexed AMP pages, that seems a little bit weirder, but I could also see that just occasionally happening: we say, oh, we have these pages, we think they're associated with HTTP, so we'll show them there. And then we reprocess them, and we say, oh, actually, they're HTTPS, so we shift them back over there.

My site is part of the Ticketfly hack problem. OK, I don't know what the Ticketfly hack problem is. I have a live music venue, and the site lists our calendar. Right now, all Ticketfly sites are down, so customers can't see my calendar. I used GoDaddy's forwarding tools to send it to my Facebook events page. Right now, if you type in the site, it'll forward to the Facebook page. If you search for it on Google, it lists HTTPS; however, that listing takes you nowhere. I verified my site in Search Console. How can I get rid of the S in HTTPS when you search?

That's always kind of tricky if you take down a site instead of redirecting it somewhere else. So what I might look at there is to see if, through your hoster, you can also set up a redirect for the HTTPS version. If you can set up the redirect for the HTTP version, then usually doing that for HTTPS is possible as well. So I'd double-check with them to see if that is an option. And when that happens, then we can obviously pick that up. Without the redirect, we might assume that these are separate sites. So we see the HTTPS version that goes to, I don't know, maybe an empty page, and we see the HTTP version that redirects to your Facebook page. So we can definitely index your Facebook page, but we don't quite know what to do with the HTTPS version, because it seems like maybe it's just temporarily broken, and by the time a user clicks on it, it'll be OK. We're not really sure. So I would definitely check with your hoster to see if you can set up a redirect for that as well. Usually, that's less of an issue. In the worst case, assuming that the site would otherwise be down for a longer period of time, what I would do in a case like this is move the DNS to a different hoster where you can set up a redirect. Because having that redirect to the page that actually has content for your website, I think that makes a big difference. Otherwise, what would probably happen is we would drop this page completely from our search results at some point. And then it's a matter of: do we understand that this Facebook page has replaced your existing website, or is this Facebook page something else that we happen to show in the search results somewhere further down? So if you can set up a redirect in any way, I would definitely aim to do that.

To what level can we trust Search Console data, namely clicks and impressions? Mostly, it doesn't match our data in Google Analytics or Adobe Analytics. The data is actually pretty good. I think one tricky aspect there is that these different analytics tools measure in slightly different ways. So that's something where you might see differences. In general, though, especially when it comes to the impressions and the average ranking information that we have in Search Console, that's something that's actually pretty good, and generally based on what we've seen users actually do.
So in particular, the average position is based on the search results that users have actually seen. It's not something that's computed theoretically; it's really based on what users have seen in the search results. So sometimes you'll see differences between maybe a rank-tracking tool that you're running on the side and Search Console. And usually, that's more a matter of the rank-tracking tool seeing one version of the search results, whereas Search Console sees what users actually saw in the search results. I think there will always be some differences across these different tracking tools with regards to clicks and impressions. To some extent, that's unavoidable. I know we do try to make it as accurate as possible. There are some things that we filter out in Search Console; in particular, when we see that queries are only done maybe once or twice, that's something we might filter out for privacy reasons. That's all listed in the Help Center. Usually, you can tell that this is happening when you look at the total count per query, or for your site in general, and then at the individual queries listed at the bottom: maybe if you add up those individual queries, it comes to, I don't know, 500 impressions, while the number on top says 700 impressions, and that difference is what we filtered out for privacy reasons. So that might also be worth taking into account when you're comparing these metrics.

I want to move some content of a French website to a new one. I'd also like to make the new website multilingual, manually translating the content into multiple languages. What's your advice for doing this as cleanly as possible? So I think what you're doing is splitting an existing website, kind of taking part of that content and putting it on a different website. Yes. Yeah? Yeah. So I think the main thing that you need to be aware of there is that when you split a website, or when you merge websites, it's a lot harder for us to process that compared to a normal site move. So that's something where I would just go in with the expectation that it's going to take a bit of time for everything to settle down. And it's not absolutely clear what the final state will be. So if you split out some content from one website and you put it on a new website, does that mean that the new website will be getting just as many clicks and impressions as the old website got? Probably not. Probably there'll be a difference. If you're also adding new content to the new website, then you'll see differences anyway. But I think just from an expectations point of view, it's worth keeping in mind that this is not a trivial change, for a larger website at least. So you need to be a bit patient.

With regards to what you need to do there, it's primarily to make sure that you have redirects set up for the individual pages, from the old ones to the new ones, and that, with regards to internal linking on the old website and on the new website, you have that covered as well. So, for example, if someone linked to one of those pages internally within the website, then make sure that internal link points to the new website instead, so that we really have all of the signals telling us: this content piece here moved to this website, and we should index it as this website. So I think that's kind of the primary thing from a technical point of view to do.
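As an illustration of those per-page redirects, here is a sketch of what the old site might serve during the split, assuming a hypothetical Express server; the domain and paths are made up:

```ts
import express from "express";

// Hypothetical per-page mapping for the split: each moved URL on the old
// site 301s to its exact counterpart on the new domain, not to the new
// home page, so signals transfer page by page.
const movedPaths = new Map<string, string>([
  ["/recettes/tarte", "https://nouveau-site.example/recettes/tarte"],
  ["/recettes/quiche", "https://nouveau-site.example/recettes/quiche"],
]);

const oldSite = express();

oldSite.use((req, res, next) => {
  const target = movedPaths.get(req.path);
  if (target) {
    res.redirect(301, target); // 301 = moved permanently
    return;
  }
  next(); // content that stays on the old site is served as before
});

oldSite.listen(8080);
```

The point of the explicit map is that each moved page has exactly one destination, rather than everything collapsing onto the new site's home page.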
To make the new website multilingual, I think that's a great thing to do in any case, and I would just make sure that you have hreflang set up properly for those pages. Usually, that's less complicated if you don't have a lot of different language versions. But in general, that's something that's kind of standard nowadays; a lot of people have those kinds of things set up. So I think this is a good move. You just need to be patient. How much time do you think I would have to wait? Weeks, months? So I would assume that if you're doing a domain move from just one domain to another, that's probably pretty settled down within maybe a week or two. If you're splitting a website, or if you're merging websites, I would consider that more on the order of months. OK, thank you.

All right. You want to go ahead? Yes. OK, so I have a big problem which has happened to our website. On one page of our website, Google is not able to pick up part of the content. We have tried checking in Fetch as Google, and there it is showing fine. And I've also checked the rendered HTML through the PageSpeed tool; it is also there. But when I copy a fragment of the text and do a site: query with my URL and that particular text, Google is not returning my website's URL. And even if I leave out my website and just search for the text string, it is not appearing on the first page. So what might be the issue?

I don't know. It's hard to say without looking at the pages themselves. So I think what you probably want to differentiate, which you kind of hinted at there, is whether there are technical reasons for this. So in particular: is the content actually in the static HTML, or is it being filled in with JavaScript? Is it being pulled in with an iframe or some other method? At least figure out: is it that Google is not seeing this content, or is it just that Google doesn't want to show it in the search results? In particular, sometimes we do have some kind of spam filtering set up, or if you're searching for something that our systems think is obvious spam, which is usually a sign that a site is hacked or something like that, then maybe we just won't show that in the search results. But if it's a matter of us not actually seeing the content at all, then it doesn't matter so much what you're searching; we just don't see that content. So differentiate between: is there a technical reason why Google is not showing it, or is there maybe some quality reason why Google is not showing it? And if it's more a matter of a quality reason, then that's something where I would just work on the website itself, and maybe think about: is this query really representative of what I want these pages to rank for?

Yeah, so that's fine. But the question is, I mean, it's OK if a website is not ranking; that is another thing. But Google should actually be able to capture all this text. Basically, one part of the page is getting crawled and everything, but a subsection of the page is not getting crawled. I have consolidated a lot of topics, basically, into this one page; it's basically one page. And somehow, Google is only capturing a certain section, not the rest of the section.
And yes, it is implemented in HTML; it's just in P tags. So I don't know what's wrong with that. If it's in HTML, and if it's a reasonably sized web page, then that's definitely more a matter of a quality issue rather than anything technical. I believe the limit for HTML pages for indexing is something around 10 megabytes, and with 10 megabytes we can pretty much pick up any HTML web page. It really has to be crazy long to actually not be findable.

I've posted this particular URL in the chat; could you check that particular page of my website? So, looking at the query that you posted there, I would just use a shorter piece of text to search. When I cut it off at about half and use that with quotes, then it seems to be picked up. But it's something where I don't see that as being so representative of us not actually being able to find the page. I wouldn't worry so much about that. I think we're able to index this page normally, and it's just that if you're searching for such a long piece of text, then probably our systems are just like, oh, I don't know what we should do here, and they give up.

OK. Yeah, I find it's better just to do a search on the title of the article, not including the domain name; that tends to work better. And like John said, if it's really long, just use the first five to ten words; that tends to work better as well.

So this one is the very first paragraph of my page content. That is the kind of problem we're facing: it's the very first paragraph, and it's just not getting picked up by Google. So maybe after that, maybe Google is not able to see the rest of my content, and that content covers a lot of different topics. Yeah. I mean, looking at the cached page, we're definitely able to see the full content. Yeah. I don't think it's a matter of not seeing the full content. It's probably just our search results saying: oh, you're searching for such a complicated query, it's hard for us to pull that together, we give up. So, like you mentioned, a name from the site and then maybe a title, or the first paragraph or first sentence, something short; that's what I would aim for there. Cool.

Any more questions from any of you? Sure. Hi, John. Go ahead. Sorry. Go ahead. Thank you. Yeah. Normally, if I have set up URL parameters, certain parameters on a URL are set so those URLs won't be indexed. And what's happening is that our users share those URLs with the parameters included. So I was wondering if Google will still treat those backlinks as a signal, if the URL with those parameters is set not to be indexed. In the URL parameter handling tool, you mean? Yeah, the handling tool. I don't know for sure how we would treat that. So in general, what would happen is, when we see a link from one page to another page, we try to assign that link between the canonical versions of those pages. So the question is: do we have a canonical page for the page that they're linking to, or not? If we don't have a canonical page, then we kind of drop that link, because we don't know from where to where it actually goes.
But if we see that there's a canonical version that we can use, then we'll treat that as a link to that canonical version of the page. So what you can sometimes do is do an info: query for the URL that you see people linking to, and see: does Google show any other URL in place of that URL? And if we do show a different URL there, then probably we're treating that URL as the canonical for the URL that you specified. So we would still be able to count that link between those two pages. OK, thank you.

OK, one of the previous questions someone asked was about a URL not appearing, the home page not appearing, and maybe an inner page appearing instead. And over the years, I'd seen things that were clearly bugs. But in other cases, I wondered: are there cases where a page is over-optimized, whether through content or links, and that just causes it to disappear? Or is that just kind of an open theory question?

If you're searching explicitly for the home page, then we should be able to show that, even if we think from a quality point of view it's not a great page. So if you're searching for something like the brand name and some information from the home page, we should still be able to show that. On the other hand, if you're just searching for general information that could lead to that home page, then that could be a situation where we say, well, the home page is so keyword-stuffed, we don't know if these keywords are actually relevant on the page, because they're just everywhere. But this lower-level product page perhaps seems more reasonable, so we'll show that one instead. That's something that I've seen happening quite a bit. But that we wouldn't show the home page when someone is clearly looking for the home page, that seems more like a bug. That seems like something that shouldn't really be happening.

True, true, when they're searching for the brand specifically. But in the other case, where it's just a generic kind of search, could that also be affected by, again, stuffing too many links, I guess, some kind of thing? That could happen. I mean, links are tricky, because our algorithms probably try to look at that on a site level. And for the most part, when we see problematic links, we try to just drop them and ignore them. So they shouldn't have that kind of a negative effect on a page level. So maybe the dropping in and out is nothing? It's just, again, if it's five pages deep, that doesn't even matter, really? I guess, kind of. Sometimes that happens. There are just fluctuations, and sometimes they fluctuate so much that you get some impressions, and sometimes you don't get any impressions, because maybe you've shifted over a page or two lower in the search results. And it's just fluctuating a little bit, because our algorithms are kind of unsure how we should be treating this.

Well, I don't know if anyone else has questions. I don't want to use up all the time, but I do have a few more questions. All right, go for it. OK, there was a particular project we worked on a couple of years ago where we built out a section of a site in a space where all the content on every other site was just kind of generic articles. I want to call them eHow articles; they weren't really useful to people. So we thought, hey, we could do this a lot better. And we built them out with a lot of supporting data and charts, very useful for people.
And we noticed that over three years, that section of the site, which was maybe a couple hundred pages, just continued to grow, even though we did not detect anyone sharing it or linking to it or anything like that. And we wondered if that was kind of normal behavior. By grow, you mean just get more visibility in Search? Yeah, yeah, just traffic from Search. And it was just a constant trend line upward. And again, the competitors probably had a lot more established sites and whatnot. But I mean, was our content being rewarded among these competitors? It sounds like it. I mean, it sounds like you were doing the right thing there. I don't know which site this is, or what the specific situation was, but it does sound like our algorithms were picking up that this is actually pretty good content, and that we should show it more in the search results, which sounds like they're doing the right thing.

So we just built a calculator, and it cost $5,000, because we thought it would be really useful to people, but it's not doing so well. And so we kind of wonder sometimes, OK, well, if you spend $1,000 on an article, is it a lost cause, kind of thing? It's just kind of tough to gauge. Yeah, I think that's the kind of thing where you'd want to check with the audience, instead of just looking at it from an SEO point of view, and figure out, maybe with user studies or something: is this really something that people are looking for? What are they looking for in addition to this? What's the next step, or the previous step, that leads them to the calculator or not? So, I don't know, for example, if you have some kind of calculator for insurance stuff, then maybe the additional information around everything insurance-related would be pretty useful, where people are maybe searching for "do I need house insurance?", and they have all of this detailed information about house insurance. And then, as a second step, they have that calculator: this is how you can calculate what it costs or what you need, those kinds of things. So instead of just having a calculator on its own, having the whole story that helps people cover the full path that they would need, that would probably make sense. But that's something you really need to check with your audience, instead of just looking at it from an SEO point of view and saying, well, I could just put out ten different articles that are kind of related to this, and that'll fix our SEO problem. You really want to make sure that you're covering something that people actually need, where there's actually a hole.

I know people are looking for the calculator. Kind of like that previous example where I mentioned the content: it actually was an insurance type of thing; it was about automotive insurance rates. And the big companies didn't do a good job. But on the calculator, we know a lot of people are looking for these types of calculators, and the ones that are out there are very dated. They don't work very well on mobile, and lots of other things. Incomplete, I guess. Yeah. I think that's always a bit tricky when you have a tool that basically lets you enter numbers and you get numbers back. Because when search engines look at this page, they see a bunch of text fields, and the value is what happens after people put stuff in there and can interpret what comes out of that.
And that's sometimes hard for search engines to figure out without the complementary information around it. The calculator is one example of the content. We'll do big articles and a huge amount of data mining to do interactive storytelling, that kind of thing. And it's really interesting, but it's also very time-consuming and expensive to do these things. Yeah, I can imagine. Yeah.

My second question: we had a site that used a domain that was keyword-based. At some point, we 301'd it to a domain that was more of a brand term, and it seemed to have difficulty after that. And we said, well, maybe it was difficulty with the 301, maybe it was because the domain name changed. Could that have an effect, I guess? Just the lack of, I mean, going from a keyword domain to a non-keyword domain?

Usually that's a pretty small factor. So I would assume, for the most part, you wouldn't see a big change in search visibility just from shifting from one domain to another. But it's hard to say what else happened during that time. What we'll often see is that people make some change on their website and they see a drop in rankings, and they think, oh, it's because of the change, when actually we were making other changes in Search as well, and the site probably would have seen a change in Search anyway. So that's really hard to gauge. The one kind of case where I have seen site moves being trickier is when the new domain name has some kind of problematic history associated with it, where it just takes a little bit more time for us to really understand that the content currently on this domain is actually quite different from what it was in the past, and that it should be evaluated on its new basis rather than on what it was in the past. But usually those are situations with a very dramatic change, where you can really see: you did this domain move, and once the move was processed, the visibility of the site disappeared completely. That might be because maybe that domain was hosting, I don't know, adult content in the past, or really problematic spam, and our algorithms are just like, oh, this domain is really problematic, we need to watch out for it, and it takes a really long time for that to settle down and get reprocessed. But if you're talking about something just subtly dropping after a domain move, that seems like it wouldn't be related to the domain name.

All right, so we're a bit over time, which I think is perfectly fine, but I need to head off to the next meeting. I'd like to thank you all for joining in. Thanks for all of the questions. Hopefully you found this useful, and I'll be setting up the next Hangouts probably later today. So if anything else is on your mind, feel free to drop those questions in there, and we can get that covered then. In the meantime, I wish you all a great time, and if anything comes up along the way, feel free to ping us on Twitter or, of course, reach out to us in the Webmaster Help forums. Thank you, everyone, and have a great weekend. Thank you. Thank you, John. Thank you, John. Bye. Bye.