All right. Welcome, everyone, to today's Google Webmaster Central Office Hours Hangouts. My name is John Mueller. I'm a webmaster trends analyst here at Google in Switzerland. And part of what I do is these webmaster hangouts, together with webmasters and publishers, like the ones here in the Hangout today, and the ones that submitted a bunch of questions already. This is a special set of Hangouts focusing on individual topics. Today's topic is common questions that we've seen, misconceptions, those kinds of things. I just put together all of these topics a couple of days ago, so people might have missed that there's actually a Hangout today. But that's fine. I guess this is more for those of you who noticed and who want to jump in and ask some questions.

All right. So I have a bunch of questions that I collected, that I just wanted to run through very briefly before we get started. If you do have any questions along the way, feel free to jump on in. Otherwise, we'll move over to the submitted questions afterwards, or to your questions, if there's anything that you'd like to get insight on as well.

All right. Where should I start? So one of the common themes I hear around these Hangouts, these office hours, is that you can't believe whatever Google says, because they're just looking out for themselves. And from my point of view, this is definitely a myth. There is no reason for us to put information out there on how to make a website that is technically wrong. That doesn't make sense. So this is really information that we put together with the engineering teams, especially when it comes to the technical side of websites: making websites so that they can be crawlable and indexable. There's a lot of information out there. There are lots of misconceptions out there. And we want to help clear that up. So these are things where you can definitely trust that we check them with the engineers, to double-check what we're talking about.

May I ask something here? Sure. Is it possible, or is it something you ever did (OK, it's much better now), that there's something you want people to implement, but changing the algorithms takes more time? You want people to do things in a certain way, and you want to handle that algorithmically at some point, but technically it's not done yet. But you still say, do it like that, knowing that it will be done at some point soon, so it will benefit webmasters who do it right from the beginning. Or maybe you just say it hoping that many people will hear it and do it, and eventually their rankings will be better because of that, even though it's not technically there yet. Have you done things like that? Or is everything you say already in the algorithm?

So, things can change over time. I think that's good to call out as well. Sometimes things that we discuss with regards to technical setups can change over time. So a really obvious one at the moment is, for example, the switch to mobile-first indexing, where we're going to start indexing the mobile page instead of the desktop page. And that might result in some technical changes that you need to do on your pages. And these are things that are long discussions that happen long before we actually can announce it, obviously, because we don't want to just randomly announce things that we aren't really sure about yet.
So those are things where maybe we will be telling you, you should do this, and this, and this, where we know that in the future maybe those steps aren't necessary or aren't as critical as they are at the moment. But we try to bring across things that are good practices, that should be done anyway, and that definitely don't cause any issues even going forward.

I see. What about the things that are on your own site, like the Google Webmaster Tools recommendations and good practices, are those updated? Because you, for example, refer us a lot of the time to that 21 or 23 questions blog post, I don't even remember, which is from, what, four years ago or something like that? I mean, do you keep the information updated, or should people basically just follow the different sources to see what's still true now and what has changed?

Yeah, so we try to differentiate between things that are kind of the current state, which includes things like our Google Plus posts, our videos that we put out, the blog posts that we have. All of these reflect the current state. And then we have the Help Center, which is kind of the steady state that keeps being updated. So with the blog posts, we won't necessarily update them whenever things change. But we will make sure that the current information is reflected in the Help Center, so that any time you go to the Help Center, you get the correct information. Obviously, sometimes we don't keep up, or there is something small that we might have missed. But these are things you can tell us about, and we will fix those, or we will find out about them anyway and fix them in the Help Center.

So you have close to 99% up to date? I would say more than 99%. I trust a lot that we keep up there. And I know that the writers in the Help Center prepare all of these things so that when we announce them, it's ready to go live in the Help Center. Sometimes translations are tricky, and there is a little bit of a delay. But for the most part, we can keep that up to date. With blog posts, we can't always update the blog posts. I think that's really hard to do. What we do sometimes do is put a label on top of the blog post and say, hey, this is a recommendation that we had in the past, but it's no longer valid. So it's kind of like a disclaimer on top, saying, hey, this is the old blog post, but keep in mind, this is not the case anymore. Like I think on the Ajax Crawling blog post that we initially did, we've added a label on top now. I'm not completely sure about the Ajax Crawling one, but we do this from time to time.

Anyway, it's awesome to know that the Help Center is great and up to date. That's great, because we at least have a reference point where we can go and find the latest information. Exactly. Thank you. That's definitely our goal.

OK, so on to more myths and misconceptions. Duplicate content will sink your website, or cause your rankings to drop, or something else crazy. For the most part, we can handle duplicate content really well, especially if you're talking about things like www, non-www, HTTP, HTTPS. All of these are things that we've seen for years and years, and our systems can deal with them really well. And if anyone comes to you and says, hey, your website has duplicate content, and Google will remove it from search, because you have slash index.html or just slash at the end of your URLs, and they both work, then that's definitely not the case.
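[To make that concrete: the usual housekeeping for such URL variants is a redirect or a rel canonical. A minimal sketch, with example.com as a placeholder:]

```html
<!-- Served on the duplicate variants (e.g. https://www.example.com/index.html):
     point every variant at the one URL you'd prefer to have indexed. -->
<link rel="canonical" href="https://www.example.com/">
```

[A server-side 301 from the duplicate variants to the preferred URL does the same job; as John says, Google will usually fold these variants together on its own either way.]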
For really, really large websites, duplicate content makes it harder for us to crawl, but it's not going to make your site disappear in search.

Kind of related to that, I have a bunch of these around crawling. Ignoring your crawl budget will cause your site to sink. So at Google, we don't really have this notion of crawl budget the way that people are talking about it externally. We hope to have a blog post about that sometime soon, probably later this year, to explain a bit more about how we handle crawling. But for the most part, if you have a reasonably sized site, then you don't have to worry about this. You don't have to hide internal links. You don't have to mess with noindex on individual pages because this page could be using up crawl budget, or mess with canonicals, or those kinds of things, for any reasonably sized site. Once you have sites that are several thousand pages large, or they're dynamic and they generate an infinite number of pages, then obviously you want to make sure that your server can handle the load.

This kind of goes into the next point. More crawling is better. It's another one of those common myths that we see. People go to Search Console and they use Submit to Index, or they submit sitemap files and resubmit them every day, to try to force Google to crawl their pages faster or more often. And that's not going to change anything when it comes to search. So that's not something that you need to manually tweak. More crawling just means more crawling. And the content we do have indexed, we can show in Search. But just because we crawl it more often doesn't mean that we will rank it higher.

May I have one more follow-up? Sure. It's related to the thing you said before, not really about the crawl budget, more about a link budget: internal links with nofollow. It's something which at some point, years ago, was done quite frequently, link sculpting, where you want to redirect the link juice to certain pages on your site. I never did it, and I never considered that it carries a big importance. I always thought that internal links are just ways to go from one page to another, maybe to say one page is more important, but not really to restrict pages because you want to have only three links from the main page to internal pages. Did it ever make any sense? Does it make any sense today to use nofollow on some links to maybe make a page more powerful, or is it just a myth?

The link sculpting part, I don't think, has worked in the way that people have assumed for a really, really long time. So that's something that changed at some point in our algorithms, and it's definitely not something that I would encourage people to worry about. For the most part, the effect I've seen is that people try to do link sculpting, PageRank sculpting, by, for example, nofollowing the links to their Terms of Service or to their Contact Us page, or things like that. And from our point of view, this makes the website even harder to understand, because we're really used to this model of, well, these Contact Us pages, these Terms of Service pages, are pages that we understand are linked from a large part of the website. And we can deal with that really well. But if you're kind of hiding that with nofollow, or hiding that through even crazier methods, then that makes it harder for us to understand that actually this is a Terms of Service page, and we can probably ignore it for the most part.
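[For concreteness, the sculpting pattern being described looks something like this (hypothetical markup), and it is the pattern John advises against:]

```html
<!-- Anti-pattern: nofollow on ordinary internal footer links,
     in the hope of steering PageRank toward other pages. -->
<a href="/terms-of-service" rel="nofollow">Terms of Service</a>
<a href="/contact" rel="nofollow">Contact Us</a>

<!-- Plain internal links are easier for search engines to understand: -->
<a href="/terms-of-service">Terms of Service</a>
<a href="/contact">Contact Us</a>
```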
The one place where it sometimes does make sense to look into things like nofollow is if you have things like infinite calendars, where you could always click next year, next year, next year, and it'll say, oh, here's the year 5,200, and here's this weekly calendar of no events that are planned. That's something where having a nofollow on those links can kind of help us to avoid clicking through to an infinite number of pages. So that's something where you might look into that. But with regards to PageRank sculpting for individual pages, that's really not something I'd recommend putting any time into.

And regarding internal linking, is it true that Google trusts internal links less than external links? I don't think you can say that. We look at links in a variety of ways, but it's not that I would say by default internal links are worse than external links. OK, thanks.

Another one. Let me run through some more of these, and we can open up for more questions from you all afterwards as well. So another one that I often hear is around crawl errors. Or, this is still crawling: I see a bunch of crawl errors in my Search Console account; I need to fix these, otherwise my site will be seen as being bad or low quality. And that's not the case. We see crawl errors as a technical response code that you return when we try to crawl a page that doesn't exist. And it's perfectly fine to tell us this page doesn't exist. It's not something that we would count against the website. It doesn't mean the website is lower quality, even if there are millions of crawl errors on a website. So looking at those crawl errors and double-checking that there aren't any there that you care about, that's definitely useful. But I wouldn't focus on those crawl errors and say, oh, this is something I need to fix.

Going off into more technical kinds of things: JavaScript can't be crawled, single-page apps are bad for SEO. Both of these are no longer the case. They haven't been the case for quite some time now, a couple of years at least. We can render JavaScript. We can process most JavaScript the same way that a browser can. Similarly, we can access single-page apps the way most browsers can. So these are things that we can crawl and index fairly normally, the way a browser would. Specifically around JavaScript and single-page apps, there are obviously things you need to keep in mind, so you'd probably want to look into the details of what it is that you should and shouldn't do. But for the most part, just because it's JavaScript, or just because it's a single-page app, doesn't mean it's bad.

John? Yes. Can I ask you a question? Sure. What about tables? Let's say I have a website, and instead of writing everything out, I can put it in a table and present it in a nice way, so the users will understand it better. But what will happen to the SEO part if I do that? Will the content be perfectly fine?

That's perfectly fine. Tables are fine. CSS is fine. If you use spans or divs, that's perfectly fine. With tables, one thing to watch out for is that we can still understand the context of the text that you put in there. So if you have a table and you have each line in a separate cell in the table, then it's hard for us to understand that this is actually one paragraph of text.
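[As a sketch of that problem case, with hypothetical markup: one paragraph chopped across table cells loses its context.]

```html
<!-- Harder to interpret: a single paragraph split across separate cells,
     so the text no longer reads as one unit. -->
<table>
  <tr><td>This sentence starts a paragraph,</td></tr>
  <tr><td>this cell continues it,</td></tr>
  <tr><td>and this one finishes the thought.</td></tr>
</table>
```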
But if you have one table cell and you put your paragraph of text in there, and you use tables as a kind of layout helper, that's perfectly fine. Tables don't always work that well on mobile, so if you have a lot of mobile users, you probably want to watch out for that. But it's not that we would say tables are bad for SEO.

Can I ask a question? It's 7 AM. I know you want to keep with your presentation, but I have to take my kids to school soon. It's 7 AM in California. Oh my gosh. You're crazy. OK, go for it.

So how is it that sometimes Google shows search results pages for websites? When you look at the SERPs and you click on one of the blue links, it takes you to a search results page for that website. If the Googlebot is not actually typing search terms into that website's search box, how does it give you a URL that's a search result?

We probably find them normally through crawling. So sometimes what happens is that, within a website, search results pages will be linked, kind of like tag pages might be. And if we find a link to one of those pages, we might crawl that page. And if we can index it, then we might show that in the search results. So it's not that we're going to these pages and guessing the search queries and trying to kind of randomly create URLs to see what happens. But we found these links somewhere, and they seem to work. They give us content, so we index that content.

So somebody somewhere created a link to that search results page, and Google indexed it? I mean, it might not be that they generated that link manually and said, oh, for this crazy combination of URL parameters, I'll put this link into a forum and Google will try to crawl it. It might be that they have a different kind of scraper or crawler, and they stumble across those. In a lot of cases, these are even linked internally within a website.

OK, but I read at some point that if you don't find the normal navigation menu, you sometimes do these kinds of searches yourself. I mean, not you personally, but. That's specific to kind of discovering new content, not necessarily to find content to index. But if we can't crawl the website normally, and we can recognize there's this kind of a search form there, then we might try to guess, to see if we can find links to the actual content. Not that we would take the search form and index it like that, but we would kind of take the links that we find in the search results and follow those.

All right. I have a question. Actually, we have a website where, like, in India, we have nine languages, and all over the world, we have some 60-plus languages. And from the user point of view, they will put a review in English. How can I show that in the different languages, in other parts, in other countries?

OK. So what you could do is provide the translated version, well, not of the content, but of kind of the boilerplate of the page. And that's something we can pick up as a separate version of the page. You shouldn't be using automated translations, especially not for user-generated content. But what I would recommend is, in one of the next topical hangouts, I think maybe end of next week, we'll have a session just on internationalization. And I would bring this topic up then, or put it into the comments for that hangout, so that we can look at it in a bit more detail. That's fine.
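[As background for that internationalization session: alternate language versions of a page are typically annotated with hreflang link elements, along these lines. The URLs and language codes here are placeholders:]

```html
<!-- On each language version of the page, list all variants,
     including the page itself; "x-default" is the fallback. -->
<link rel="alternate" hreflang="en-in" href="https://example.com/en/reviews/">
<link rel="alternate" hreflang="hi-in" href="https://example.com/hi/reviews/">
<link rel="alternate" hreflang="x-default" href="https://example.com/reviews/">
```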
Actually, the thing is, whenever we get many reviews from all the different parts, then when one is translated to English, does Google consider it as the English language or as a different language? I would look at that in the internationalization hangout, because there are some unique things to kind of watch out for there. OK, that's fine. Thank you.

All right, so just a few more. I don't think I have that many more, so patience. Doing site moves with fancy methods, like just setting a rel canonical, is better than using redirects. This is another myth that we sometimes see, where people will write these big blog posts and say, we did this fancy site move using rel canonical, or using, I don't know, some other crazy method. And from our point of view, this is actually more of a problem than a help for your website. Because with clean redirects, we can easily recognize that the whole website is moving from one domain to another, that it's a one-to-one type of move, and we can crawl a little bit faster to recognize all of those redirects, and we can move all of the signals that we have attached to your site a lot more easily than if you're using something fancy that we can't recognize as a clean site move. So from that point of view, always use 301 redirects when you're doing a site move.

And when it comes to redirects, we sometimes hear that using 302 redirects will actually cause your site a lot of harm. And that's definitely not the case. We differentiate between a 302 and a 301 redirect in the sense that a 302 is, by definition, a temporary redirect, where the original URL is the one that we try to index. And a 301 is a permanent redirect, which means the destination URL is the one that we try to index. So nothing gets lost. It's just a matter of: is the original URL indexed, or is the destination URL indexed? That's kind of the main difference there. If you use a temporary redirect and you keep it for the long run, kind of treat it as a permanent redirect, then we'll also start treating it as a permanent redirect and try to index the destination URLs instead.

Yes. Just to follow up on this: is it better, from Google's perspective, that I keep the same content while moving the site, and then work on the content of that website afterwards? Or is it also suggested by Google that I can just publish better content on the destination website? If you're doing a site move, then we expect that content to be available on the new pages, under the new URLs. So with a site move, when you do a redirect, you can't keep two copies of the content. You basically just have the final destination content. And obviously, working to improve your content is always good. But this is something I wouldn't necessarily combine with a site move. Just try to keep it simple and try to focus on one step at a time. Thanks.

All right. This kind of goes back to the original question that Bogdan asked in the beginning. It's like, what about that 23 or 24 questions blog post from Amit Singhal? Is there this one technical trick that I need to do to make sure that my website is seen as a high quality website? And that's definitely a myth. So it's not the case that you need to have 325 words on a page, and we will see that as high quality content. It's not the case that you need a keyword density of 1.2 in order for us to recognize that this page is high quality for this specific keyword. There's no technical trick to making a high quality website.
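[Before moving on, to pin down the redirect point from a moment ago with the raw responses (placeholder URLs): both status codes carry the same destination; they differ only in which URL ends up indexed.]

```http
HTTP/1.1 301 Moved Permanently
Location: https://new.example.com/page
```

```http
HTTP/1.1 302 Found
Location: https://new.example.com/page
```

[With the 301, the Location target is what gets indexed; with the 302, the original URL stays indexed. For a full site move, as discussed above, the 301 is the one you want.]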
Rather than any single trick, a high quality website has a lot of different factors that combine and work together. When users look at your website, they're not going to count the words to understand whether this is good content or not. They're going to kind of take in the whole picture of your web page. So there's no simple trick to making Google think your page is high quality.

Similar to the question about tables, can I interrupt you? So what kind of information do you gather when you determine whether this is high quality or this is low quality? That's really hard to say. We try to extract a lot of signals about the website to understand: what is the value of this website? How is this high quality content? What is the context of the pages on here, the relevance of these pages? And we take a lot of things together. So we don't have an algorithm that says, look for this meta tag, and if this meta tag is there, then that's like a plus point; look for this sign that they're not using tables, that's a plus point. There's no simple technical trick for that. And you can sometimes see that even in cases where the whole web page is actually blocked by robots.txt, where we can't look at the content at all. And yet, that page might be ranking number one for specific search queries. So that's something where we don't even know what is technically on this web page, but we know that this is something that appears to be really important and really relevant for some queries.

So you must be considering the bounce rate and all that, no? We try not to use user signals like that. Those are things that, especially if they're in Analytics, we don't have access to anyway. We use that kind of data to improve our algorithms, of course, to see, if we make this change or that change, where do we see that users are happier, like any other website would run A/B tests. But we wouldn't use that on a per-page basis.

Sorry, I'm actually using Pinterest. But the thing is, Pinterest users, they are coming to see pictures. So I have noticed that they click through to the website, and my content is really good. But they are coming to see some pictures, and I don't have enough pictures, so they leave very quickly. Would that be a problem for me in search?

Well, if you don't have the content that people expect from your website, be it pictures or text or whatever, then that's something where you're probably not a great match for those users. So they're probably not going to come back on their own. They're probably not going to recommend your website to other people.

It's not like that. My website is actually a travel agency website, but I'm always linking these things to a blog page, not the commercial site. The thing is, I cannot put a lot of pictures, like a website which truly focuses on pictures only. But I'm using Pinterest, so as soon as they see all the pictures, they go back to where they were before. That kind of situation.

I mean, that's ultimately something that we probably wouldn't notice from a technical point of view, but we would notice indirectly. If users are unhappy with what you're providing on your site, or if they're very happy with what you're providing, then that's something we try to pick up indirectly, through things like links to your pages.

Just four more, so don't worry. We'll have lots of time for more questions as well. Valid HTML, or HTML5, is required for Google to rank your pages.
And kind of like I mentioned before with regards to tables, that's not the case. Valid HTML in particular is something that we sometimes see people testing, and that's not something that we would focus on when it comes to search. So valid HTML itself isn't something I'd really worry about. However, if you're using structured data, then having valid HTML can make it a lot easier for us to pick up the structured data properly. Especially if you're testing things out with structured data, valid HTML makes it a little bit easier. It's not necessarily a ranking factor, but it makes it easier to implement things properly.

So, moving on to structured data, schema.org: use of structured data is a ranking factor, because if you have structured data, then obviously your site is high quality. That's definitely not the case either. This is something we see every now and then as well. Similarly, AMP is not a ranking factor. From our point of view, these provide a good user experience, AMP in particular. Structured data itself helps to improve the snippet that we show in the search results. It can result in things being shown kind of as a rich card in the search results, where there's a lot of additional information, which makes it easier for users to recognize: is this the right page that I want to click on or not? But it doesn't change rankings.

Outbound links on a site: are they important or not? From our point of view, outbound links are not a requirement for ranking. If you put outbound links on a site, and that's something that users like, then you're providing extra value for users, and that's a good thing to do. It's not something I'd kind of discourage. But at the same time, just putting a high quality outbound link on a page doesn't mean that your page is automatically high quality.

This kind of goes into the next one I have, with regards to affiliate sites. Google hates affiliate sites: it's something we often hear. And that's also not the case. So just because a site is an affiliate doesn't mean that we won't show it in the search results. Some affiliate sites provide a lot of extra value, and that can be really useful for users. So that, by itself, is not something bad. But of course, if you're an affiliate site and you're just taking a feed that you're getting, reformatting it, and publishing it one-to-one without providing any additional value, then that's something the web spam team might look at and say, this is pretty thin content; I don't know if we really need to index this.

And one final one, with regards to optimizing speed. Optimizing speed by milliseconds affects your ranking or your crawling. So if you use one of the speed testing tools and you see, I've made it 100 milliseconds faster, therefore I'll be ranking one slot higher, that's also not the case. For SEO, from a ranking point of view, we primarily differentiate between sites that are really slow and sites that are kind of reasonably fast. And if they're reasonably fast, we'll kind of treat them as reasonable sites. And for the most part, this isn't a matter of 100 milliseconds more or less, but a really significant difference. However, users definitely notice speed optimizations that you make on your website. And that can have a really significant impact on the way that users interact with your website, and then indirectly, of course, on things that we could pick up as well.
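[As one small, HTML-level illustration of the kind of optimization users feel directly (the host and file names here are hypothetical): resource hints that let the browser start fetching critical assets earlier.]

```html
<!-- Open the connection to a third-party host before it's needed. -->
<link rel="preconnect" href="https://cdn.example.com">
<!-- Start downloading a render-critical font alongside the HTML. -->
<link rel="preload" href="/fonts/main.woff2" as="font" type="font/woff2" crossorigin>
```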
So this is something where I would do speed optimizations for users primarily, not for any testing tools, but really look at what users are doing on your website and see how you can improve their flows, how you can guide them to look at more content, how you can make it easier for them to maybe buy things, make things faster on their mobile phones, for example, if you see everyone's using mobile phones.

Can you give a number here, from Google's standpoint? A number: is it under two seconds, under one second? No, it's OK. I don't have a number to share. I just have personal numbers, where, from a personal point of view, when I see a site taking, I don't know, five or ten seconds to actually start rendering any of the content, then I start getting impatient. And I wonder, maybe I should go somewhere else and see if this content is available somewhere else. But when it comes to search, we do try to differentiate the sites that are really, really slow. So that's not a matter of 10 seconds versus 11 seconds; it's those sites that you go and click on, and it just takes forever to actually get any of the content.

That's really good. So from Google's point of view, under two seconds is fine? If you're under two seconds, I think that's fantastic. Two seconds, three seconds, is a pretty high bar. It's really hard to reach sometimes. But if you're able to get that, I think that's really good. If you can go even faster, then I'm pretty sure that you will see significant changes in the way that users interact with your website. There have been tests by Amazon, by lots of other bigger websites, where they artificially slow things down by maybe 100 milliseconds, and they see significant changes in the way that users engage with the content, and in the way that they kind of buy things and convert. So that's something where the ranking itself doesn't necessarily help your website overall, because your business is not to get a ranking; your business is to actually sell something, to convert users into customers.

Hello, John. Just one quick question. I'm wondering, is there any prioritization logic behind prioritizing one domain over the other? Like, for example, if the business is based in Britain and the domain extension is .co.uk, would that prioritize that website for search results coming from the UK? And vice versa, for example in the USA, would that be the case if the domain is .com or .us? Thank you.

That's essentially geo-targeting. And we do use geo-targeting to help users who are looking for local content. So if we can recognize that your website is about local content, or is targeting users in a specific country, we can promote that slightly in the search results. That's something where a local top-level domain helps, and using a generic top-level domain and specifying the country in Search Console helps just as well. It's similar in the US: if you have a .com website and you set the country in Search Console to the US, then we can take that into account. Obviously, that doesn't mean that this website will always be ranking number one in that country. But especially if we can tell that people are looking for something local, we'll try to show it a bit higher.

So if I am coming from the UK and I search something, then any local business would show up first, because those are the prioritized domains, because they are based in the UK.
Because I am coming from the UK, they must show up first, yeah? First is a really hard place to kind of guarantee. When we look at the rankings, we think about what is relevant and what is less relevant, and we will kind of promote the local results if someone is searching for something local. But that doesn't mean that it will always be number one. So it might be that something is really, really relevant in the UK even though it's not from the UK. All right, that's understandable.

And one more thing, and I'm not going to take much of your time, just one quick thing, about the backlinks. I hear a lot about backlinks. Some people buy them; some people pay, let's say, magazines, local magazines, to vouch for them online, and that kind of stuff. And to be honest, that's unfair, because if they have the money, they have the wealth, they will buy everybody, and they will have a lot of backlinks. So what's the point, where's the logic behind that? Because then only rich people would have backlinks, and the others wouldn't have any. So is there any logic behind that? Is there any kind of algorithm to check if this is a legit backlink or not? If that's the case, can you explain it to me a little bit?

Yes, we do take links into account when it comes to ranking, but we watch out for things that aren't legitimate links. So if these are ads, for example, if they're marked with rel nofollow, then we take those out of our equation. If we can recognize that they're ads, or that they're otherwise paid for, then we take them out of our equation. These are things we have a lot of practice and experience with over the years, and we do try to treat them appropriately. Thank you very much.

All right, let me run through the questions that were in the chat and double-check that we don't have any other big submissions, and then we can open things up to more questions from you all.

How do I delete a part of a website in a perfect way? Should I use 301 redirects, or maybe a 404 or 410 if that content doesn't exist anymore? So if it's something that you removed from your website, then I would just use a 404 and let us know that we tried to crawl this and it no longer exists. We will drop that out automatically over time. You don't need to do anything special there. If you need to remove something urgently, you can use the urgent URL removal tools. They help us to drop that content out very quickly. One thing to keep in mind there is that they take out all variations of those URLs. So HTTP, HTTPS, www, non-www, all of those are removed as well.

And another one here: is it true that since January 2017, Google will penalize sites with pop-ups? What about pop-unders? Yes, we do have an algorithm coming out, I don't know if it's in January or further on in 2017, where, specifically on mobile, we will try to recognize interstitials. And we will take that into account when it comes to determining which of these sites are relevant, or more relevant, or less relevant. So that could be seen as a kind of penalty. With regards to pop-unders, I haven't really seen pop-unders on mobile, at least I haven't really noticed them. So that's probably less of a thing there.

Let's see. Yes. When we try PageSpeed Insights, the sites are getting a score of 60 out of 100. What is a good score we should have, at least for mobile? For PageSpeed Insights, I would use that less as a kind of baseline cutoff and more as a way to improve things.
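[On the removed-pages question above, the server side of the answer is just the status code. A minimal sketch of the exchange, with a hypothetical URL:]

```http
GET /old-page HTTP/1.1
Host: example.com

HTTP/1.1 404 Not Found
```

[A 410 Gone signals the same thing slightly more emphatically; either way, as John says, the URL drops out of the index over time on its own.]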
Coming back to PageSpeed Insights: if you think your site is slow and you see a score of, I don't know, 50 or 60 there, then that's something you can work on, and you can tackle all of those small problems that are flagged there. So it's not so much that I would say you need to have at least 70 for mobile and 80 for desktop or whatever. It's really more to encourage you to tackle these problems on a step-by-step basis.

Can I ask you a question? Go ahead. Sure. Sorry? Go ahead. OK. On the previous pop-up thing: most websites use these pop-ups to convert people, to convert guests into customers. So if you give a penalty to the websites which are using pop-ups, they cannot get the best out of their visitors.

I disagree very strongly with that. Because if you're showing a pop-up to someone the first time they come and visit your website, then they don't even know if your content, or whatever you're providing, is useful for them, because they can't look at it. So this is something where, if your only value is in sending people away by showing them a pop-up like this, then I would rethink that strategy. Your business should succeed by showing users the value that your products, your services, provide, and letting that speak for itself, rather than pushing it in their face and saying, hey, you need to buy this. It's kind of like you go into a physical store, and someone stands right in front of you and says, hey, don't you want to sign up before you can actually do anything in that store? That's not really that useful. That's not really that user-friendly. And that's the kind of behavior that really pushes me away from a lot of websites, where I just say, oh, I don't need these pop-ups. I don't really need to see this website, because they clearly don't respect my time, so I don't know why I should give them my time.

What if I do it after the first click? So I go with a principle of First Click Free. People enter from Google, read the page, and when they click on a second link on my page, so they go to a second page, using a cookie or something, I show them at that point a sign-up pop-up, for example. Is that OK, or is it still bad? I think that makes a lot more sense. I think you probably want to try things out and see where the right kind of threshold is. Yeah, but would I be penalized by Google for that? No. From our point of view, primarily what is important is the transition from the search results page to your content. So if they can click on a search result and go to your page, and they can read all of the content that you have on that page, then what you do afterwards is between you and the user. But they should be able to get through all of the content that we're recommending in the search results. Right, thank you.

I have a question for you. The last one, and I promise I'll shut up afterwards, but it's kind of important for me, and I don't know when I'll catch you again. I have a big problem with a friend, customer, I don't know how to put it, who has a blog. He's a politician. He has a personal blog kind of thing. And he insists on having, on the main page, only the latest blog post. So the main page is just one post. You click next, and you go to the previous post; you click next again, and you go to the post before that. That is practically identical content between the main page and the real post page. But that page changes daily, because he posts new things daily. How bad is that?
I kind of recommended against that, because I see it as duplicate content. I don't know what Google will index for the main page, for the actual page. I tried to tell him to add things to the page itself, like user-generated content, more comments, to support the idea of comments, some extra content or so. But in most cases, it's just the same post on the main page as on the post page itself. How could I handle this?

I think it might not be optimal, but it's not going to be critical. So if that's the way they want to do this, then that's the way they want to do this. It's not going to cause the website to disappear. In the worst case, what will happen is we will pick one of these URLs and make it canonical, but it will rank exactly the same.

But the best URL tomorrow will be another URL, because today's page 2 will be page 3 tomorrow. For the most part, I wouldn't worry about that. I think that would work fine. But maybe from a usability point of view, there are things to improve. From search, I don't see that as a critical problem.

Would it be bad if I used a rel canonical, which I would change daily? So the main page has a rel canonical to the real article page today, and tomorrow, when the main page is another article, the rel canonical points to that article. I think that would be more confusing. Because if someone is searching for the name of that person, or that business name, then suddenly individual articles are the most relevant result, when actually the home page is the most relevant one. So I would just let Google figure that out by itself. OK, thank you.

John, one quick question regarding the canonical. Actually, there was one client that I was doing local SEO for. It's a classifieds website, basically. What I was facing was some duplication, duplication in the sense that the website is built up in a hierarchy. For example, there is a property-for-sale section, and inside the property for sale, apartments for sale, so this is a kind of hierarchy. So if you just go straight to the child category, like domain.com slash apartments for sale, it will open the page. And if you go through the hierarchy, for example domain.com slash the first parent, then the second parent, then the child, it goes to the same page. So what I did was just add the canonical tag pointing to the end page, the page itself, for example apartments for sale. So whether you go to domain.com slash apartments for sale, or domain.com slash property for sale slash apartments for sale, the canonical tag points to the URL which I need to be ranked in Google. Is that OK?

That's perfectly fine. That's a very common practice, that you have different ways of reaching the same content, with potentially different URLs. And using a rel canonical to clean that up is perfect.

Another small question, regarding the 404s. This client wasn't serving any 404 pages; instead of a 404, everything was landing on the main home page. So when I added the 404, the way it goes now, in a hierarchy, is: first it's a 302 redirect, then after that it's a 404. Is that fine, or do I need to just go directly to the 404? A redirect to a 404 page is fine as well. That's also very common. That's something we have to deal with, so that should work. Got it. Thank you so much.

All right, let me just see. There's a follow-up with regards to the pop-up question.
So basically, Google wants the user to be able to see the main content first, and then after that you can serve them a pop-up or an interstitial that can help with the conversion, mostly in that regard? Yes, so we definitely want to make sure that when the user clicks on the search result, they see the content that we promised them. So if you click on a search result, and the snippet says there is this information available for you, and you think, this is the information I'm looking for, then you should be able to find that immediately when you go to those pages. You shouldn't need to click past any interstitials. And especially on mobile, this is something that's really frustrating. You have to find a small x, or you have to find the link that says, no, I am stupid, I don't want to sign up for your newsletter, which makes the user feel bad. So these are all things that essentially come back to us as a search engine, where users come to us and say, well, Google, you promised me that I would find this content on this page, but instead I'm just being shown this big advertisement that's being disrespectful to me and of my time. So that's something where people come back to us and say, Google, you did a bad job, and that's something we're trying to fix here. So whatever you can do to make sure that when users click the search results, they're able to see the full content that was promised in the search results, that's kind of what we're aiming for.

John, just a quick one. Sure. I'm about to leave, actually, but just a quick one. I need your golden advice, all right? What's the best practice for ranking in Google? I know there are a lot of factors involved at the very same time. I know that's the secret, but tell me, what's the best way of ranking? And is it all right to keep some of the 404 pages? Because we recently changed from the old URL structure to a new URL structure, and we are getting over 700 404s now. Some of them are ranked and indexed, so those we can handle, but for the rest, I don't find any good alternative that would be beneficial for the user experience, so I would leave them as 404s. So that, and the ranking, and as I said, just the golden advice. Don't give me the secret, all right? Just tell me how it works.

OK, so I have a couple of minutes' time, and I need to tell you the secret to ranking number one in Google. This will be tricky. So, I guess, taking a step back, there are two things that I would really focus on. On the one hand, making sure you have a technically sane website, one that technically works the way that it should: it works everywhere, on all devices, is fast, responsive to user interaction. On the other hand, you want to make sure that the products and services you provide are things that users really, really care about, so that they will go back to your website. So that, in the worst case, if Google didn't exist at all, if your website didn't show up at all in search, people would still flock to your website, because you're providing something really valuable. So it's kind of like: first make sure that technically things are perfect, and then make sure that your website, your business, would be perfectly fine if search didn't exist at all. Really easy to implement, I guess, or probably not, but it's something where you don't need to focus on individual things. Once you've really made sure that technically things are OK, you should really, really focus on making sure that users absolutely love what you're doing.
They recommend your business. They recommend the work that you provide, and they keep coming back. And these are the types of signals that we pick up on indirectly, where users recommend your site, and we pick up the links, and we can show that in search. So that's kind of what I would aim for there. Thank you very much.

And just one little thing. In terms of backlinks: as you know, everybody else is buying magazine and press links, dofollow; they have the money. Should I, as a marketing specialist, buy as well? Or should I just leave it to come naturally? Well, if you're asking me, you should definitely do it naturally. If you buy links, that's something that would be against our Webmaster Guidelines, and the web spam team might even take action on that. And cleaning that up is sometimes really tricky. So I would definitely not recommend trying to cheat your way through the search results. Sounds great. Thank you very much, and have a good one. Thank you very much. Take care. Bye-bye.

All right. One last question that hopefully doesn't ask for the secret to life. Can I ask? Sure. Go for it. I also mentioned it in the comment section. I have a new blog, and I'm going to create content for this new blog. But I don't know which keywords I should focus on first, because it takes time to create good content. So which keywords should I focus on first? Should it be the main keywords which I'm targeting, which are very, very competitive, or the keywords that Google is already ranking me for?

That's something I don't have an absolute answer for. I think you need to know your audience. You need to know your topics really well. And based on your experience with that topic and your audience, you can work out where there's maybe missing content, or where the content that's available online doesn't really match what people are actually looking for. So that's kind of what I would aim for there. I wouldn't artificially just take five keywords and write content for those. But really use your knowledge of the area and of your users, and work out: where are things where I could provide additional value that doesn't exist elsewhere? Or where are things where I could provide value that's significantly better than anything else that's out there on this topic? Got it. Thank you.

Can I ask one last question? All right, one last one. Then I have to wrap up. Yeah. When you submit a URL to Google, actually, I was facing this kind of trouble yesterday. Is there some kind of limit on me? Because I changed the title and description, and then after six hours I had to change it again, so I submitted it again afterwards. So is there a limit, like I can submit it once in 24 hours, or something like that?

Sure, you can submit it as often as you want. That's less of a problem. I don't think you'll get a lot of value out of it if you just keep changing things, because then we can't kind of understand the context of those pages for the long term. So if you need to submit things because they change quickly, and they're individual URLs, you can use the Submit to Index tool in Search Console for that. If you have a lot of content that you're changing, if you're redesigning your site, if you're redesigning bigger chunks of content, then you can use a sitemap file to let us know about that.

I was just curious, for example, how much time will Google take, when I submit the URL, to read all the content and put it in context?
So I can just see the impact of the title and description I just changed. We don't have any guarantees there. Sometimes we can pick that up within a couple of minutes; sometimes it's a matter of a couple of days. It really depends a lot on the website and on various other factors. So it's not something where we'd always say, within 10 seconds we'll be able to reflect that in Search. So when I see the impact in the ranking, I'll know, sure, OK, this is the impact of that title and description, yeah? Yeah, yeah. OK, thank you so much.

All right, I need to take a break here. I have the next meeting lined up. We have a bunch more of these topical hangouts lined up for December, I think like two or three a week. So if there are more questions on your mind, check out the list of hangouts that we have available, pick the one that matches your topic best, and we'll focus on those questions when we get there. Thanks a lot for joining. Thanks a lot for watching, and I hope to see you all again in one of the future hangouts. Thank you, John. Bye-bye. Bye, everyone. Bye.