All right, welcome, everyone, to today's Webmaster Central Office Hours Hangout. My name is John Mueller. I am a webmaster trends analyst here at Google in Switzerland. And part of what we do are these Office Hours Hangouts, where webmasters and publishers can come together and ask us questions around search and give us feedback. We have a bunch of questions that were submitted today, and a handful of people who are here live to ask questions. Next week, we'll be at Google I/O. So if any of you are going to Google I/O, feel free to drop by the, what is it, the web sandbox area, or one of the Office Hours that we're doing, and come say hi in person. Maybe I can get it set up so that I also do a Hangout while we're in Mountain View, to cover the Pacific time zone as well, so that people there don't have to get up in the middle of the night. We'll see. There's lots of stuff happening next week. All right. So to get started, do any of you want to get started with a question? Hi, John. I have two questions. The first one is a question we got from one of our clients. He's asking, for a blog post, what is the minimum number of words? How many words do you need to add? What we usually do, if we write any blog post, is try to make it a minimum of 500 words, because what we have found is that with 500 words we can give the minimum necessary information. But we never write more than 1,000 words, because we think that's not user-friendly. Users will get bored reading very long content. So from Google's perspective, which one is better? What do you suggest? As far as I know, we don't count the words at all. So we don't really care how many words your blog post has. I think from a practical point of view, it makes sense to think about this and say, well, I want to make sure that I have at least something useful that I can tell people. So it's more for your side to have maybe some minimum, say at least this many words or at least this amount of information.
Otherwise, maybe I should just write a tweet, or maybe I should wait until I have more content to put together. But it's not that we say there is a minimum or a maximum or an optimal number of words. It's really more about how you can bring that information across. And do you need a minimum number of words, or is maybe a blog post that says "we did not do this" or "we did this" already enough as an announcement to be useful on your blog? It's really more from a usefulness point of view. So here's my second question. How do you measure whether something is quality content? Because if it's two or three lines of content, that means it's not quality content, of course. So how do you measure it? When you say this is quality content, which factors do you consider? I think that's really hard. So we look at a lot of different factors to try to figure that out. Definitely not just the number of words or the number of lines. Something can have a lot of words in it but not say anything useful. So it's definitely not something where we would just count the words and say this is good or bad. What I usually recommend is having someone who is not associated with your website review your content, and see if it matches what you'd like to bring across, and if it comes across as something that's useful for them. OK, the last question; this is a different question. One of our clients has an e-commerce website. A few days ago, they reactivated some of their products. So they removed the products from the website, then added those products again. But when they added the products back, they did not add the quantity, so the website showed them as out of stock. Google crawled the website and picked up that information. And now when someone searches for the product, it shows up in search, but it shows as out of stock. They have since corrected the information, but the old out-of-stock status is still there in Google's search results.
So what can we do? They change the price regularly, maybe every one or two weeks, maybe two or three weeks. Sometimes a product goes out of stock. Sometimes it's in stock. So how can we do this properly, so that Google shows the proper information in search for them? The best way is to use a sitemap file. And in the sitemap file, use the last modification date to tell us that this page has changed. And you can send a ping for the sitemap file to Google and to the other search engines, to tell them that this is actually something that has changed and should get updated. So that's generally the best approach. What about the frequency? I saw that some people use a sitemap and add the change frequency. Is this something new? No, we don't use that. OK, the last one. Yeah, it's really the date. Because if you can specify a date, and you also tell us how often it updates, then we should just look at the date anyway. So we focus on the date. Also, priority, we don't use that from sitemaps. Thank you, John. All right, any other questions before we get started? Yeah, I have a question about a situation I have. For example, I wrote three articles. They have the same high-volume keyword, but if you look at the long-tail keywords, they are different. For example, "SEO 2019", it's in Russian, but I think that doesn't matter. And the second one, "SEO definition". And the third one, "SEO full guide". And what I saw is that the first of these pages ranked well for the term "SEO", just "SEO", I mean, without any other long-tail keyword. And after some time, one month, this page lost its ranking, and another page took that ranking. I was curious; perhaps Google tries to understand which page is better. And after that, I had the same situation again: a few of my URLs have had this ranking. With YouTube, for example, I don't have the same situation. Some of my videos, Russian videos, they rank for one keyword at the same time.
I can even rank a few of my videos in the top five. But in Google Search, Google tries to choose the best page for the term, and that means other pages can't be in the top 10. I think these things can change over time. So that's something where one page might be very visible in the search results at one point, and then over time, maybe over a couple of days or weeks, another page becomes more visible. And usually that's more a sign that these pages are not really strongly cemented in those rankings, but rather that they're fairly similar in ranking, and it just takes a little bit for one page to come out on top. So I don't think there is much, from a practical point of view, that you can do there, other than say, well, just because I'm ranking number one now doesn't mean I'll always be ranking like that. It can make sense to continue working on that, so that you're not just number one now, but also number one next week and over time, as other pages start to get more visible as well. Oops, I think you're muted. Oh, sorry, yeah. I mean, I don't want to compete with myself, my own content against my other work. Yeah, and I know that Google doesn't like it when you have very similar pages, even if they are not duplicate content; they have different meanings. But it means I can't rank two pages for one term in Google? No, that can happen, too. If these are two very good pages for that term, we might show both of those. It's not always the case. We really need to make sure that we're not showing the same content to the user twice or three times in the search results. But it can be different variations, where maybe you're searching for a bicycle, and you have information about bicycle trips and bicycle parts, and we don't know what the user wants. So we might show both of these. [inaudible] No, it can happen. Yeah. OK, thank you. Cool. OK, let me take a look at the questions that were submitted.
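The sitemap setup described in the earlier out-of-stock answer can be sketched like this; the URL and date are placeholders, and as noted in the answer, Google reads lastmod but ignores changefreq and priority:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/products/widget-123</loc>
    <!-- Update this whenever the page changes, e.g. stock status or price -->
    <lastmod>2019-05-01</lastmod>
  </url>
</urlset>
```

After updating the file, it can be pinged to search engines; for Google at the time this meant a plain GET request to `https://www.google.com/ping?sitemap=https://example.com/sitemap.xml` (the example.com URLs are placeholders).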
The first one I have here is about verifying on Google Posts. I don't have any information on how that works. It sounds like you're not able to claim the Knowledge Panel for that. My suspicion is maybe someone else has already claimed that Knowledge Panel, so it would be worth double-checking within your organization whether someone else has verified it. If you can't figure that out, maybe do a post in the Webmaster Help Forum and send me the link, and I can pass that on to the team to take a look. Then the second part is about search URLs. Should it be website.com/search/search-term, or is it better to have /search and then a query parameter, like ?q=search-term, for example? Is there an optimal variation for the search terms? From our point of view, both of those work. What I would recommend doing, though, is using the URL parameter, because that's a lot easier for us to recognize as something that might vary, and especially we can recognize it as something that a user might be submitting. So that's the direction I would take there. Using a parameter instead of a path makes it a lot easier. In particular, if there are other parts that are added on top of the search term, then it makes even more sense. So, for example, pagination or filtering: if you add those as parameters, we can learn that these other parameters are for pagination. Whereas if they're part of the URL path, then it's really hard for us to learn what the individual path parts are. And it makes it a lot harder for us to optimize our crawling of the website. So I'd recommend, especially for search terms, putting those in a separate query parameter, so with a question mark, q equals whatever, and letting it work that way.
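One reason query parameters are easier for machines to recognize: standard URL libraries can split them out with labels attached, whereas path segments carry no labels at all. A small illustration in Python (the URLs are made-up examples):

```python
from urllib.parse import urlparse, parse_qs

# Parameter-style search URL: each value is labeled, so a crawler can
# tell q (the search term) apart from page (pagination) without guessing.
url = "https://example.com/search?q=bicycle+parts&page=2"
params = parse_qs(urlparse(url).query)
print(params)  # {'q': ['bicycle parts'], 'page': ['2']}

# Path-style URL: only unlabeled segments, so nothing says which part
# is the search term and which is the page number.
path_url = "https://example.com/search/bicycle-parts/2"
print(urlparse(path_url).path.split("/"))  # ['', 'search', 'bicycle-parts', '2']
```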
That said, if you currently have the site set up the other way, with a path, then I wouldn't just blindly change that around, but rather think of it as essentially a type of restructuring of the website. So you'd need to set up redirects, and you'd need to make sure that any traffic going to the old URLs goes to the new ones, all of these things you might want to consider there. So don't just jump in and change everything around; do it in a thoughtful way. Two questions about Google My Business. I don't really have much insight into Google My Business, so I don't know how much I'll be able to help here. Two websites in similar niches; I chose the main category in one and a different one in the other business. I really don't know how Google My Business would handle that, so I'd recommend posting in the Google My Business Help Forum for that. And the same, I think, with regards to the local events and other features from Google My Business. So I'd watch out for those as well. The one thing with regards to events in particular is that you can use the Event markup on your web pages as well, and that is something that we can pick up with regards to web search. And we can show that in the Knowledge Panel on the side, events that we picked up from your website. And that's something that is visible differently in different locations. So it might be very visible in the US for some businesses, but that doesn't mean we would show it just as visibly in other locations. We try to figure out what the right balance is there, with regards to where we show which type of search result features. Then a question from the Portuguese Webmaster Help Forum, asking about some improvements shown in Search Console, which roughly translate as "logos". The poster has never seen that. What could that be? I don't know. So it's hard for me to say what that might be.
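For the Event markup mentioned above, a minimal schema.org Event snippet embedded in a page might look like the following; all of the values here are made-up placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Example Open House",
  "startDate": "2019-06-15T18:00",
  "location": {
    "@type": "Place",
    "name": "Example Store",
    "address": "Example Street 1, Zurich"
  }
}
</script>
```

This is the general JSON-LD pattern for events; the exact properties a search engine requires or recommends are documented in its structured-data guidelines.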
I know the Search Console team continues to work on the product and continues to experiment with new features. So maybe this is something they're trying out. It's hard for me to say offhand. Are there any additional considerations when setting up SEO on a single-page app, specifically a Salesforce Communities public site? We've set up the community according to the Salesforce documentation, with a sitemap file, but we can't get it to be indexed properly. I don't know. I don't know specifically about the Salesforce site. In general, when it comes to JavaScript-based websites, which most single-page web apps are set up as, there are some things that you need to watch out for. In particular, Google needs to be able to render the pages, so that we can pull in the content and use that for indexing. Depending on the type of JavaScript that you have on the pages, that might be a bit tricky, because we currently have a fairly old version of Chrome that we use for rendering. What I'd recommend doing there is, on the one hand, double-checking our documentation. We have developer documentation on rendering in a lot of detail. So that would be my first step. The other would be to use tools like the mobile-friendly test to check whether we can render these pages properly. With those, you're generally able to figure out how you can deal with this. I think the tricky part here might be that if you're tied to a specific platform and you can't necessarily change the JavaScript that the platform uses, then that makes everything a little bit harder. Because if that JavaScript is done in a way that isn't supported by Google for indexing, then we can't really do anything with that. You can't set any setting in Search Console to say Googlebot should do this in a unique way for this website. But I would double-check to figure out where the issue is actually coming from.
And if it's really coming from the framework that you're using, maybe get in touch with the framework providers. Maybe have them get in touch with us through the forums, or through the Hangouts and other channels. I have a website that's ranking for some keywords in different countries, although I'm not targeting any specific country. When I check in Search Console, the average position shows a rank of 11 or 21 for a keyword in a specific country. But when I check with a VPN from that country, it's not showing my website there. So what's wrong here? In general, there are two things that I could see happening here. On the one hand, testing through a VPN doesn't always give you the right answers. So that might be one thing. The other thing is, in general, the data shown in Search Console is not a theoretical ranking position, but rather something that we saw when we showed search results to users. So what could be happening here is that there might have been something temporary, where your pages were ranking at that position for users in that country. That could be something where we say, well, we'll try this out, and if it works well, then that might be a good place to show it. It could be that the temporary effect was more a matter of personalization, or maybe a matter of other sites that were shown in that area not doing that well during that time, those kinds of things. So usually you can tell whether this was a temporary effect or the default behavior by looking at the number of impressions that we show in Search Console. If the number of impressions matches what you would expect to see for those keywords during that time, then probably everything is working normally.
If the number of impressions shown in Search Console is significantly lower than the number of impressions you would expect, then probably it's more a temporary issue, where we showed it briefly at that ranking, and maybe a day or so later it's not ranking that well. What's the effect of redirecting 404s on a site to a specific 404 page, instead of just showing a custom 404 page directly? Both of those approaches work. So redirecting a URL to the 404 page is fine. Directly showing a 404 page is also fine. That's more a matter of what you can do to set things up on your website, and less a matter of specific pros and cons of either approach. Generally, we recommend making sure that a URL returns 404 immediately. Sometimes you can't do that and you have to redirect to an error page. That's fine. When moving content from a subdomain to the main domain, we saw a significant increase in traffic for that same content. Can you explain what might be happening here? Is there some advantage to having everything on the main domain or not? This is, I don't know, one of the most controversial questions that always comes up with regards to SEO: should I use a subdomain, or should I use the main domain? And in talking with our engineers and our quality team, from their point of view it's always: we treat these the same. The effect that you're probably seeing here is that we do look at websites overall a little bit, to understand the overall quality of a website. And it can happen that we say, well, a subdomain might be something that's more like a separate website, or it might be something that's more like a part of the main website. And that kind of difference could be playing a role there. So maybe, for example, if the blog is not seen as something that is really as wonderful as the rest of your website, then maybe that blog is something that's ranking a little bit lower.
On the other hand, if we see it similarly with regards to the quality overall, or with regards to being one website, then it will generally rank in the same position. So we see people all the time saying "I moved it to my main domain and it's ranking better", or "I moved it to a subdomain and it's ranking the same way". And both of these are essentially legitimate approaches. I wouldn't say that you need to go one direction or the other. Sometimes there are technical reasons why you want to go with a subdomain. Sometimes it makes things easier for maintenance and long-term tracking to have everything in one domain. So from my point of view, there is no magical answer to this question. There are lots of opinions, and there might be a lot of practical reasons that drive you in one direction or the other. And I would focus more on the practical effects. Let me just see here. All right, a question in the chat. On our website's homepage, we have code differences between the mobile and desktop versions. For example, we load images on desktop with a div class, and in the mobile version they're an image tag. Because of this, we can't use image alt attributes on desktop, unlike on mobile. Could that harm our SEO? Maybe. So that's something where, from our point of view, we would probably not see those images if they're not embedded as clear images within the web pages. So if you're using CSS to embed the images, then we probably wouldn't be able to pick those up. That said, we're shifting pretty much most of our crawling towards the mobile versions of websites with mobile-first indexing. So perhaps your site has already shifted over to mobile-first indexing. And if so, we would be using the mobile version as the basis for all indexing anyway. So if your mobile version is good, you have all of the content there, the images are linked properly, and we've shifted to mobile-first indexing for your site, then you should be all set.
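The difference the question describes can be illustrated like this; the class name and file paths are invented for the example:

```html
<!-- Desktop template: image pulled in via CSS. Crawlers generally
     can't attach alt text to a background image. -->
<div class="hero-image"></div>
<style>
  .hero-image { background-image: url('/img/bike.jpg'); }
</style>

<!-- Mobile template: a real <img> element, with alt text that
     image search can pick up and index. -->
<img src="/img/bike.jpg" alt="Red mountain bike on a forest trail">
```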
That said, other search engines might not be using the mobile version; they might be looking at the desktop version. And if they can't pull out those images, then obviously they won't be able to show those images properly in their search results either. Similarly, social media sites like to embed images from a site when you share a URL. And if they can't pull out those images, because they're not clear image tags, then they would also have trouble with that. So for Google, it's not a problem, right? I wonder whether this kind of code difference may harm our SEO or not. If we've shifted to mobile-first indexing and the mobile version has the images, then you're OK. OK, thank you, John. Sorry. All right, let's see. I recently started seeing double featured snippets in search. Is this a test? Is there any data behind it? What's happening here? I don't know. So I would assume that this is more a matter of us trying different things out, to see how users respond, to see where we can show the value of these websites a little more visibly. These things change all the time. These are search features where teams are always working to improve them, to show them in ways that work well for users, where we try to balance providing information in search and also encouraging people, of course, to go and visit the website directly. So that's something where I would expect there's never going to be a stable state where we say the search results are finalized and nothing will change, with regards to the size and the shape and the number of lines of text, all of these things. I think these are always going to be things that we will try to improve, because what works now will probably not be the same thing as what works in the future. So we constantly have to work on this. And it's something that I'd also recommend that other sites continue working on as well.
So don't just assume that because your website is doing well now, you'll never need to change it. A follow-up regarding a question from earlier, or one of the previous ones, regarding a site showing up in Europe but not in the US. I'm curious whether content is judged on a page level per keyword, or for the site as a whole. Only a subsection of the site is buying guides, and they're all under a specific URL structure. Would Google penalize everything under that URL holistically? Do a few bad apples drag down the average? So usually the word "penalty" is associated with manual actions. And if there were a manual action, as in someone manually looked at your website and said this is not a good website, then you would have a notification in Search Console. So I suspect that's not the case; otherwise, you probably wouldn't be asking like this. In general, when it comes to the quality of a website, we try to be as fine-grained as possible, to figure out which specific pages or parts of the website are seen as being really good, and which parts are maybe not so good. And depending on the website, sometimes that's possible and sometimes it's not, and we just have to look at everything overall. So it might be that we found a part of your website where we say, we're not so sure about the quality of this part, because there's some really good stuff here, but there's also some really shady or iffy stuff here as well, and we don't know how we should treat things overall. That might be the case. But again, it's really hard to say without actually looking at the website. So I'd really recommend that people go to, maybe, the Webmaster Help Forum and get advice from peers: how would they see this? Do they see problems with the content? And be as direct as possible if you want really actionable feedback: point out the content that you think is good, but also point out the content that you think might be bad, or might be improvable.
And really try to tease out some objective feedback from peers about your website overall. OK. All right. A few websites have started scraping my content and have been publishing it. We tried to contact their hosts for a DMCA takedown, without luck. Does having my content scraped and republished hurt my site? Should I disavow these URLs? So from our point of view, other sites copying your content wouldn't be something that would negatively affect your website. That's a very common situation, that sites copy content. If you do find that, then I would look into the legal approaches that you can take. Maybe you can also contact the site directly to have that resolved. In general, there are ways to approach that, so I'd look into those. That said, if you're not seeing those copies showing up in search for the queries that you care about, then it might not be the highest priority to focus on. Hi. Hi. Please allow me to ask: do sites that tend to talk about movies and TV download sites get demoted? No, those are fine. From our point of view, it's not the topic of a website that makes it good or bad, but really what you write there. So if you're creating really good content about movies and TV sites, and even download sites, then that's fine. Because some time back, I think I had issues with my blog's rankings, and I posted the issue on the Google Webmaster forum, and I was told that my site had content that relates to downloading movies and whatnot, and that could be the reason for the ranking decline, or something like that. I think the main thing you'd want to watch out for is that you're not providing content that essentially belongs to other people, such that it falls into the DMCA category.
So, for example, if you were providing movie downloads directly and you weren't authorized to do that, and the movie studios came to you and kept submitting DMCA complaints saying, you're not allowed to publish this because this is our content, then that's something that could obviously play a role there. Because then those pages would not show up in search, and overall that could be affecting your site. But just because you're writing about this content, if you're writing about it in a way that is useful for users, that's generally fine. Yeah, I do not know why, but I just noticed that whenever I write content that has to do with movie sites... it's a tech site, actually, but sometimes I tend to incorporate how to download from a particular site and whatnot. And I don't know, the site runs very well, and then, say, after about 14 days, traffic declines and all that stuff. So it kind of puts me in this position where I don't understand what I'm doing wrong. Yeah, that doesn't sound like an issue specific to that kind of content. That seems like something where our algorithms are maybe just on the edge, and not completely sure how we should show this website in search. And sometimes we show it fairly visibly, and then maybe some signals add up and we say, well, we're not sure, maybe we should show it a little bit less visibly. So my general recommendation there is to continue working to make sure that all of the signals really align and say, this is a fantastic website. With a tech website, one of the things that I've seen over time is that they tend to collect a lot of old information that's not so useful anymore. So maybe it's worth going through that and seeing, how can I make sure that all of the content that I have indexable is still relevant and useful for users? OK. Do you recommend we delete these old posts, or rewrite them, or something? That's up to you.
So generally, the quality teams at Google say it's better for webmasters to improve the lower-quality or older content. So maybe it makes sense to rewrite them. On the other hand, if you have a lot of content like that, it's kind of impractical to rewrite everything. So maybe it makes more sense to just put a noindex on those pages, and say, I will focus on better content in the future, and try to avoid the situation where I collect lower-quality content. All right, thank you very much. Sure. All right. Let's see. Here's my site; not all of my images are indexed in Google. What could be the problem? I took a quick look at the site that was linked there, and I see a lot of images that are indexed in Google Images. So I think overall you're doing things right. The thing to keep in mind when it comes to images, or web search in general, is that we don't index all content of all websites. And that's completely normal. So if you're seeing that, in general, images are being indexed, then we're probably trying to get as many of those images indexed as we can. But that doesn't mean that we'll index all images on a website. So my feeling is that everything is working OK here, and we're picking up enough content that we think is relevant to show in the search results. In the new Search Console, where can I submit a link for indexing? You can do that with the URL Inspection tool. Right at the top of the new Search Console, you have a field where you can enter a URL. You can enter a URL from your website and click the, I don't know what it's called, the button to test that URL. And from there, you can do a submit-to-indexing. So first you need to test the URL, and then you can submit it for indexing. Can you say something about the crawling issues Google is facing recently? I'm not aware of any specific crawling issues. I know we had some indexing issues a while back, but all of those should be resolved.
I'm not aware of anything specific regarding crawling at the moment, but it makes me worry that maybe I'm missing something, so I'll look around. My question is on URL structures. Would a URL like /services/my-service have the same ranking as /my-service? Or do higher-level pages get a ranking boost? There's no special ranking boost for being higher up in the URL structure. You can structure URLs however you think is useful for your site. We do look at where pages are linked within the website. For example, usually the home page is the page that is most commonly linked from external sources. That means the home page is probably the page that has the most signals, the most PageRank, associated with it. And if you have pages that are linked from the home page, then those are really easy for us to find. If they're linked only several steps away from the home page, then they're a lot harder for us to find. And accordingly, the signals that we have from the home page get diluted as pages get further away from it. So if your services page is only linked under the general set of products that you're offering, and then a category, and maybe a subcategory, and then there's a link to the service page, that means it's pretty far away, so it's a lot harder for us to pass signals to that page. And that's independent of the URL structure. On the other hand, if it's linked directly from the home page, then we can pass signals right away. Again, that's also independent of the URL structure. So I would worry less about how you structure the URLs, and look more into how you're connecting these pages within your website. How easy are they to find? How easy is it for Google to pass signals to that specific page? And with regards to the URL structure, the one place where I would look into this, and keep it in mind, is when it comes to monitoring the performance of your pages.
So in particular, if you have a lot of pages of a similar kind, for example product pages, then maybe it makes sense to have a similar URL structure for those pages, so that you can filter them out, both in Search Console and in whatever analytics tool you're using; so that you can look specifically at your product pages and see, overall, how do these pages perform compared to, maybe, your category pages. That's where a clean URL structure makes sense. But just with regards to SEO and how the signals pass, it's more a matter of how the pages are linked internally than how you're specifying the URLs themselves. The next question is about a recent change by Gary in the crawl budget blog post, with regards to embedded resources and XHR calls also being taken into account for crawl budget. So this isn't really a recent change. This is essentially how we have to balance the crawling of content. When we render pages, we take into account all of the embedded resources, too. And that's, I would say, expected, because we have all of these requests that go to the server, and we need to make sure that we're not overloading the server, which is mostly why we're doing this with regards to crawl budget. So it's not that we're trying to artificially limit things; we really just want to make sure that we're not causing trouble on your server. With that in mind, the question goes on: should I move these resources to subdomains instead of keeping everything on the main domain, or not? From our point of view, when it comes to crawl budget, we try to understand what the server is overall. And if we see that all of these subdomains, for example, are essentially part of the same server, then we'll treat that group as one server, and all of the requests count against that one server.
So that's something where, if you just set up separate subdomains on the same server and serve the same content, you're not really changing anything, because the same server has to process all of these requests. From our point of view, I don't think that would change anything for crawl budget in particular. On the other hand, if you move some of these resources out to a CDN, for example if you host your images on a CDN, if you have videos on a CDN, if you have other static content on a CDN, that can make a difference, because we can see that these are clearly separate servers, and we can balance the crawling across those two sources completely independently. Also for users that makes sense, because usually these pages can load a lot faster if the static content can be pulled in from a setup that is optimized for static content. That said, a lot of people focus on crawl budget who don't really need to worry about it. In particular for smaller websites, we can probably crawl everything on your website within a couple of days. So that's not something that you really need to worry about. Crawl budget is really more of an issue with really, really large websites, when we're talking about hundreds of thousands of pages or millions of pages. If all of these millions of pages have 100 embedded resources, then instead of 1 million pages, we have 100 million requests that we have to make to the server. That does make a big difference: if we're limited to maybe, I don't know, 10,000 requests a day, and you have 100 million requests that we would need to render all of these pages, then that does cause kind of a bottleneck. On the other hand, if you have 1,000 pages and each of those 1,000 pages has 100 embedded resources, then we're talking about 100,000 requests. And for the most part, that's a matter of two or three days of crawling for us to actually get everything, if we really needed to get everything right away.
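The back-of-the-envelope numbers above can be written out as a quick sketch; the figures used here (10,000 or 50,000 requests a day, 100 resources per page) are just the illustrative values from this discussion, not real Googlebot limits:

```python
def days_to_crawl(pages, resources_per_page, requests_per_day):
    """Rough days needed to fetch every page plus its embedded resources."""
    total_requests = pages * (1 + resources_per_page)
    # Round up: a partial day still counts as a day of crawling.
    return -(-total_requests // requests_per_day)

# A very large site: 1 million pages, 100 embedded resources each,
# at a hypothetical limit of 10,000 requests a day.
print(days_to_crawl(1_000_000, 100, 10_000))  # → 10100

# A small site: 1,000 pages with the same 100 resources per page.
print(days_to_crawl(1_000, 100, 50_000))  # → 3
```

The arithmetic makes the point of the discussion: the bottleneck only appears at very large scale, which is why small sites generally don't need to think about crawl budget at all.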
So from that point of view, I'd say crawl budget is a really cool theoretical topic. But instead of focusing on this for smaller websites, I'd focus on other things instead. There's lots of other stuff that you can do with regards to speed to improve things overall, with regards to quality and the website in general. So especially for smaller websites, focus on other topics. If you're working on a really, really big website, a big e-commerce site for example, then obviously managing the crawl budget is something that makes sense. Double-checking your server logs to see what Google is actually crawling, how frequently it's updating these files, and how you might perhaps combine different resources so that fewer requests are needed to render these pages, all of those are things that you can optimize. Which speed metrics are most important for crawling and indexing? So we talked about speed just briefly before. I think you mentioned crawling and indexing here, so that's one aspect that is slightly different from the general speed topic. In particular, what I would look at here is the difference between speed when it comes to crawling and indexing versus the speed a user would see when they visit your page. Those are completely different metrics, and they can be worked on in different ways. There is some overlap there, of course. But in particular, when it comes to crawling and indexing, we need to be able to request the HTML pages as quickly as possible, and from those HTML pages we can crawl the links on them. And the speed of the server, how quickly it responds to those HTML page requests, flows into the crawl budget discussion that we talked about before. So that's, again, something that kind of makes sense especially if you have larger websites. Then it's important that you can serve the content fairly quickly.
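The server-log check mentioned a moment ago could start as simply as this; the log lines are invented examples in a common combined-log style, and a real audit should also verify requesters via reverse DNS rather than trusting the user-agent string:

```python
import re
from collections import Counter

# Invented access-log lines in a common combined-log style.
LOG = '''\
66.249.66.1 - - [10/May/2019:06:25:24 +0000] "GET /products/blue-widget HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [10/May/2019:06:25:25 +0000] "GET /static/app.js HTTP/1.1" 200 900 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.7 - - [10/May/2019:06:25:26 +0000] "GET /products/blue-widget HTTP/1.1" 200 5120 "-" "Mozilla/5.0"
'''

REQUEST = re.compile(r'"(?:GET|POST|HEAD) (\S+) HTTP')

def googlebot_paths(log_text):
    """Count paths requested by clients claiming to be Googlebot."""
    counts = Counter()
    for line in log_text.splitlines():
        if "Googlebot" not in line:
            continue  # skip ordinary visitors and other bots
        match = REQUEST.search(line)
        if match:
            counts[match.group(1)] += 1
    return counts

print(googlebot_paths(LOG).most_common())
# → [('/products/blue-widget', 1), ('/static/app.js', 1)]
```

On a real site you would stream the log file line by line, and confirm that the IPs actually resolve back to Googlebot's hostnames, since the user-agent string is trivial to fake.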
That said, we do also render pages, pretty much most of them, after we've crawled them. So we also need to render them the way a user would render them. So reducing the number of embedded resources, making sure that these pages render quickly, that you don't have any render-blocking JavaScript, those kinds of things, those are all important for indexing as well. But from a user point of view, you might have some things that are slightly different. So in particular, things around caching, things about how a page is actually rendered over time, like which parts of the page are blocking. Those are aspects that you'd take a look at there. From Google's point of view, we use both lab measurements, so theoretical measurements of how quickly a page could load, as well as practical data that we've collected through things like the Chrome User Experience Report, where we get feedback from users saying this page is fast and this page is slow. And you see both of those in the PageSpeed Insights tool. So I would double-check that to see. But again, it's worth understanding the difference between the user side of speed and the pure crawling side of speed, and depending on where you suspect the problems are, digging into those aspects individually would make sense. Does a change in domain still work? For the past two months now, we've changed our domain to a more branded name. Until now, we've not recovered our rankings. We've done 301 redirects, told Google the URL has changed, and it still doesn't work. We're still at the bottom of the page, and we were normally ranked in the second position. So from my point of view, all of these domain moves should continue to work. I have seen someone else ask about this, so it's something I looked into a little bit as well. But I don't have any clear information saying that something is stuck or not working. I would imagine that site moves happen all the time.
And if there were a general issue with site moves for the last two months, there would be a lot of people complaining about this. So my suspicion is that maybe there are some aspects of the way that you've set up the site move that are not working as well as they could. That could be things like some redirects that were left out. It could also be something like moving to a domain that has a lot of problematic history associated with it, where it takes a while for our algorithms to figure out that actually this is a completely new website on this old domain, and we need to first re-evaluate the website overall, rather than just saying, this is content on this old domain that has this problematic history attached to it. So both of those are things that you'd want to watch out for. I would recommend maybe posting in the Webmaster Help Forum or something like this, detailing the old and new domains, so that people can take a look and see if there's maybe something technical that you missed. A site move is not always trivial to do, so missing something technical can happen. Or is there perhaps something associated with the old domain, if people look at archive.org, for example, or in the various SEO tools that people have? Maybe there are some things associated with the old domain that you'd need to take care of, or where you just need to understand, well, maybe this takes a bit of time to settle down, for Googlebot to understand that you're not associated with the problematic content that used to be hosted there. All right. I think we pretty much made it through here. Let's see. Then there's another question about links and the disavow file. So the way the disavow tool works is that we take those into account immediately when we recrawl those URLs that are specified in the disavow file.
So as we reprocess those URLs, we take the disavow file into account, and if they're listed there, then we drop those links from our graph. One thing to note is that we don't remove them from the links report in Search Console. They're still shown there, so that can be a little bit confusing. But essentially, we take those into account immediately. If you have a manual action and you submit a reconsideration request after submitting the disavow file, then that would be taken into account immediately as well. Then the web spam team would see that you've cleaned this up, and that's OK. OK, thank you, John. Sure. All right. Any other questions from you all? I have a question. Awesome. Go for it. So with the sunsetting of Google+, it's kind of hard to reach you these days. Part of what we do is basically monitor different analysts and different people all day long. And in the past 24 hours, I would say we saw some very questionable behavior that I want to alert you to. Mostly it was outlets rewriting years-old content and adding very clickbaity headlines to it, as if it were new information. And unfortunately, some of this really surfaced quite a lot in the past 24 hours, of which I have examples I want to pass on. OK, sure. You can probably just send it by Twitter. Or let me just drop my email address here; it should be in the chat. All right, great. Just send it my way. I can't promise that we'll take care of all of those and just throw them out. But it is something that the team does care about with regards to quality. So they'll take a look at that and see if we need to refine our quality algorithms a little bit. This kind of feedback, especially when it's timely and something new just happened, is really useful. Yeah, and by the way, it actually has nothing to do specifically with what we do. It's just that in the daily process of seeing this, it's like, ooh, that's not so great. Yeah. No, OK. Thank you. Sure. Yeah. Can I ask you a question?
Sure. This morning I was reading the Google general guidelines, and I got stuck on this point, 2.3, Your Money or Your Life. I understood that it depends on your content, for example if you offer that kind of service. But I didn't understand what you need to do. For example, other resources say that you need to have an About Us page with some photos, with some information proving your expertise, and things like this. And if you write blog posts, you can make an author page, you know? What more can you tell us about this? I mean, about Your Money or Your Life, what can I provide on my website? Yeah, I don't think we have any specific details on what you could be doing there. So that makes it a little bit tricky, because the guidelines that you're referring to are for the quality raters. It's not that we would take that one-to-one into account and say, you need to have this page and that text and this thing on a page. We don't really have that. However, there are a handful of people who have really looked into this topic and who have found a lot of really interesting examples, where they see this site was affected by one of the recent updates, and it changed things around by doing this, this, and this. Two names that come to mind are Marie Haynes, who has looked into this quite a bit, and Lily Ray, who has looked into this quite a bit as well. They've also done presentations on this. So you can probably find presentations from them with a lot of examples in them, to see what other sites are doing. And then you can think about whether maybe this is something that you should be doing as well. It's not so much an SEO tactic that you need to do like this. It's more a matter of understanding what users would look for when they try to judge the authority of a site, and then actually providing that information.
And sometimes when you see what other people are doing, you're like, oh, obviously, I should have noticed this right away. But when you're stuck looking at your own website, you often have a very narrow mindset. So seeing how other people are doing things can be really, really insightful. I'd take a look at their blog posts and presentations. Can you type the names of the presentations? I don't have the direct links. Oh, just the names. I can just post them in here. And I'm sure there are other people who have done really good stuff in this area as well, but those two have stood out recently. All right. Hi, John. Hi. I've got a question; I posted it in the chat. I launched a blog about two months ago, a blog that teaches the basics of blogging and digital marketing. I posted original content, 1,500-word posts that are easy to read, optimized for SEO, et cetera. But it does not rank for any keyword. You can't even find it when you type the title. So it doesn't rank on any page, not even on page 20. And we haven't been flagged or penalized or anything. I'm just wondering what the issue is and how to solve it at this point. That's really hard to say offhand. It sounds like it's indexed. So if you look for the URL, it's there? It is, it is, it is. So it's been indexed, but it won't rank at all on any of the pages. I've also submitted the URL. It just seems odd, because we've got another post that does rank. It doesn't rank really well, but it does rank. And this one just doesn't at all. I don't know. It's really hard to say without looking at the page itself. What I've sometimes seen is that when you focus on a site too much, you end up essentially keyword stuffing the content. And at some point, our algorithms will say, oh, these keywords are so repetitive here that we don't really think it's worth showing this page at all.
So if you've really, really overdone things with regards to optimization, then maybe you're running into that. And if you have other blog posts on the same site that are working in Search, then maybe compare the two and see, am I going too far here, or am I still similar to the other blog posts that I have? OK, but it's not a site-related problem. We should just keep working on it, writing more posts. Yeah. OK, all right, thank you very much. All right, we're getting close to the end; last question from you all. May I ask another question? If I use Chrome version 41, can I see websites through the eyes of Googlebot, I wonder? Similar, but not exactly. Not exactly the same, but similar. Yeah. I would generally avoid using an old Chrome, just because of all of the security issues. If you have that on the side and you just use it to double-check individual pages, that's fine, but I would not use it to browse. That's why you do all these updates. What would you recommend I use for this purpose? I would mostly use the mobile-friendly test to see how Google actually looks at the pages. With the mobile-friendly test, you don't need to have the pages verified, so that's really easy. The Inspect URL tool is another one that is really useful for this; for that, you do need to have the website verified. OK, thank you, John. Sure. All right, let's take a break here. It's been great having you all here. Thanks for joining in, and for joining in at, I don't know, surprising hours of the day. Thanks for dropping by. I wish you all a great weekend. And if you have a chance to tune in next week for Google I/O, feel free to join those live streams. Or if you're there in person, make sure to drop by and say hi. Otherwise, I'll set up the next batch of Hangouts. If anything is still on your mind, feel free to drop those questions there. Thanks, everyone. Thank you, John. Thank you so much. Have a good weekend.