All right, welcome, everyone, to today's Google Webmaster Central Office Hours Hangout. My name is John Mueller. I am a Webmaster Trends Analyst here at Google in Switzerland, and part of what I do is talk with webmasters and publishers like the ones here in the Hangout and the ones that submitted a bunch of questions already. As always, if any of you are fairly new and want to ask the first questions, feel free to jump on in.

If possible, I'll start, because I have a question that I doubt would get answered otherwise, so I'll just ask it here. It's a case I encountered recently with one of our sites. Here's the problem. We had a brand, let's say oldbrand.com, the domain name. Due to some copyright issues, we decided that for a period, until everything legal clears, we'd use a different domain name, both in promotion and as the main domain. What we did was 301 the old domain, the main brand, to newbrand.com, and we're basically promoting this newbrand.com. The problem is that at some point, I estimate somewhere between six months and one year from now, we will probably solve all these legal problems, and we'll want to get back to the old domain name, oldbrand.com, which we've used for years now. My obvious solution would be that at that moment, we would just reverse the 301 redirect. Is that cool, or is there a better way to implement it?

That's fine. I mean, sometimes you have these situations where you have to do it anyway; there's no way around it. Usually it's better to avoid a lot of moves back and forth, but if this is what you want to do, if this is the long-term goal of your business, and it involves going to one domain temporarily for a while and then moving back, then that's the way life is. I wouldn't worry about it.

But will it negatively affect us? Or should we better stay with the new domain once we've moved, if moving back is a big loss?

We should be able to pick that up. I think we should be able to handle that well. I suspect you'll just have these temporary fluctuations when you move, but that should settle down fairly quickly.

Instead of a 301, should we better just promote the new one and use rel canonical?

No, I would use a 301. Do a clean site move to the temporary domain, and then when you have the final domain, do a clean move to that one.

OK, thank you very much.

Sure.
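To make the mechanics of such a clean 301 move concrete, here is a minimal sketch, not anything John prescribed: it assumes, purely for illustration, that the old domain is served by a small Python (Flask) app, and that oldbrand.com and newbrand.com are placeholder names. An equivalent redirect rule in your web server configuration achieves the same thing. The key point is a permanent, 1:1 redirect that preserves the path and query string.

```python
# Minimal sketch of a clean 301 site move (hypothetical domains).
# Every URL on the old host redirects permanently to the same URL
# on the new host, which keeps the move easy for crawlers to follow.
from flask import Flask, redirect, request

app = Flask(__name__)
NEW_HOST = "https://newbrand.com"  # placeholder target domain

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def permanent_move(path):
    # Preserve the path and query string so every old URL maps 1:1.
    query = request.query_string.decode()
    target = f"{NEW_HOST}/{path}" + (f"?{query}" if query else "")
    return redirect(target, code=301)  # 301 = permanent move
```

Reversing the move later, as discussed above, would just be the same kind of rule pointing in the opposite direction.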
John, I have a similar question, because now I have different questions for you, now that we've fixed the other issue. Because now we have those two sites, which are both now recovering, one faster than the other, and we're trying to decide what to do, both for Christmas and then forever into the future. So does it really matter which one we 301 to the other for now? And just as Bogdan said, if we then decide that the long-term plan in the new year is actually to use the other one, is it going to cause problems that we 301 for three months, but then we change our mind and use the second domain forever more?

I think that should be fine. So you will definitely have this fluctuation when you do the move, and again if you do the move back to one domain and then do the move to the other domain a couple of months later.

And now that 301s pass the signals along kind of immediately, does it matter which way we move? Because one domain is 10 years old and strong, and the other one is only a couple of years old. Does it matter if you move the strong to the weak or the weak to the strong? The sum is still exactly the same, because the strong signals add to the weak or the weak signals add to the strong, so what you end up with at the end is exactly the same result. Is that right?

Probably not exactly the same, but for the most part, it should be pretty similar. What I'd do is give it a bit more time to settle down, and then think about what you really want to do in the long run. So kind of see where the current situation settles down in search, and combine that with your long-term goals. Do they overlap, or are they pointing in different directions? If they're different directions, then it's harder to make that decision.

Yeah, we're seeing fluctuations. And because they're both live and they both have identical content now, we're seeing Google try to decide which one it wants; for different searches, it picks a different domain. And our problem is, we'd love to give it more time to see which one Google eventually determines, but we're a gift company, so 50% of our sales are in the next six weeks. So we would be much better off picking one, so it combines the signals for Christmas, and then moving afterwards, because then we have a full year to worry about other decisions. Whereas right now we could be diluting both rankings. What do you think is the best way to go ahead and handle those next six weeks?

OK. So I think picking one is definitely a safe strategy. I would tend toward picking the one that currently gets the most traffic, just to make things like that easier.

They've both now got almost identical traffic. One had a lot more, one had a lot less; now they're basically going in opposite directions.

OK.

But if there's any risk of loss of rank at all, then presumably we should fold the weaker one into the stronger, because any percentage loss matters.

If you can send me an email afterwards, I can double-check and see if there's anything where I would say, oh, this is really weird and something to watch out for. I think in a general situation like this, with normal websites where nothing crazy has been happening with the website, I don't see any reason why you shouldn't be able to just pick whichever one.

All right, I'll send you something on Google+. Take a quick look to see if there's anything. I don't want to push you, though, but would you mind letting me know quickly, because we've only got a few days to Christmas? Perfect. I'll leave you all alone now.

All right. OK.

Yes. I only have news questions, which John doesn't like to answer. Anything you have on November 10th? Do you want to confirm November 10th?

November 10th, a Thursday. I have no idea. I think you asked Gary, like, did we do any algorithm changes? And most likely we did; we do algorithm changes every day.

I think I asked. He just tossed off the answer saying, yes, we did a change; we do changes all the time.

Yeah, we do changes all the time. And it's also sometimes tricky, because we make these changes, we discuss these changes for a long time, and then at some point they roll out, and it takes a certain time before they're actually visible. So it's not like we would meet on the morning of November 10th and say, oh, we should push the button today, and then everyone will see it today.
There is always this time until the results are actually visible. So it's really hard for us to just say, well, this was just one specific change, and that's what you're seeing there. Also, a lot of these changes affect sites in different ways, and different kinds of sites. So it might be that most sites didn't see any change at all, and some sites see some really strong changes. So that's really hard to generalize.

Yeah, it just seems like the chatter is heating up a bit, and people are speculating, maybe Panda got pulled back a little, and then there's Penguin or something else. So are you guys aware of anything specific to that date?

I don't think we have anything specific to announce for that date, yeah.

All right, thank you. Sorry.

All right, let me ask one more.

All right, go for it.

On another site, I noticed that a lot of our pictures are copied by other sites. More exactly, they're not really copied; they link directly to them on our site. In this case, we are considering blocking hotlinking, because as we see it, they just load our server with extra requests. They take our bandwidth; not that this matters too much, but still, it adds up. And I don't know if there are any benefits. But before doing it, I thought of asking you about this, because maybe we get some benefit: since they use the pictures from our site on their sites, maybe it somehow counts like a link. Maybe it somehow increases the popularity of our images in image search. So I was wondering, is there any benefit for us when people do this? Or is it basically just stealing our server resources while we get nothing in exchange?

We do use that for image search, so that's something that does play a role there. When it comes to hotlinking protection, our main problem is when people go to image search, and they click on a search result, and they can't see those images. So in any case, if you're doing some kind of hotlinking protection, make sure that visitors who are coming in through Google aren't affected by that. If you're blocking other visitors, that's kind of up to you.

Yeah, obviously. But this will mean that probably a couple of sites will just not use our images anymore. Major ones will download them and upload them to their own sites. And I'm somehow afraid that, OK, we save server resources, but we may lose some search engine factors, because maybe these images they use still count somehow. So do they count or not?

So we do use this for image search. So if image search is important for you, then maybe that's something to look into. But what I would do is just do an A/B test: take 10 popular images on my site and treat them like this, take 10 other popular images and treat them in a different way, and see what happens. Do people really use them less frequently? Do they copy them? Do they just use something else? Does it really change the load on your server in a way that is measurable? Because if it doesn't change anything on your server overall, then you're creating extra maintenance hassle on top of everything else, and that's probably not worthwhile.

OK, thank you.
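As a concrete illustration of the advice above, here is a hedged sketch of hotlink protection that never blocks visitors clicking through from Google image search. The hostnames are placeholders, and a Referer check like this is just one common way to implement it, not something John prescribed.

```python
# Sketch: allow image requests with no Referer (direct visits, privacy
# tools), from our own pages, or from Google; refuse other hotlinkers.
from typing import Optional
from urllib.parse import urlparse

ALLOWED_HOSTS = ("oursite.example", "google.com")  # placeholder allow-list

def may_serve_image(referer: Optional[str]) -> bool:
    if not referer:
        return True  # no Referer header: always serve
    host = (urlparse(referer).hostname or "").lower()
    # Exact match or subdomain match (e.g. images.google.com).
    return any(host == h or host.endswith("." + h) for h in ALLOWED_HOSTS)

# An image-search visitor coming from Google is served, while a page
# hotlinking from elsewhere is refused.
assert may_serve_image("https://images.google.com/imgres?imgurl=x")
assert not may_serve_image("https://scraper.example/post/1")
```

This would also make the A/B test John suggests easy to run: apply the check to one set of images and not the other, and compare server load and image-search traffic.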
All right, let me run through some of the questions that were submitted here. Does Google Panda determine page quality based on user signals, or mostly on flat technical metrics? So I double-checked what we announced for the Panda algorithm in general, and we didn't really talk about the specific signals that we use specifically for Panda. So that's something I don't really have a good answer for you on. We try to use various signals to figure out the quality of pages, to understand how these actually work, and a lot of that comes together in some of the quality algorithms that we have, like the Panda algorithm.

If a website has a partial penalty which impacts unnatural links, should we try to disavow those links, or can Google handle this alone? Is there any issue while this information is in Search Console, and when does this message stop showing up? So essentially, what happens with the partial penalty, or the partial manual action with regards to unnatural links, is that our algorithms are kind of excluding these already from being used. So on the one hand, we're trying to take care of it for you, so you don't have to worry about it too much. On the other hand, if this message in Search Console bothers you and you really want to clean everything up, then that's something you could clean up. But for the most part, we've already kind of taken those into account and taken those out of our link graph.

What would cause multiple domains in a specific industry to experience identical drops and rises in ranking for a particular keyword? This industry is of the adult nature, and the keyword is sex toys. The sites include a bunch of sites. At the beginning of the year, these domains consistently ranked on page one in Google, but later dropped off of page one and experienced various ranking changes. What's up with that? So in general, things can change in our algorithms. It's not so much a matter of our algorithms targeting specific websites or specific industries, per se, but rather that perhaps the algorithmic changes that we made are particularly visible on these types of sites at the moment. One thing, I guess, from our point of view, is that it's not that we don't show any content for these kinds of queries; there is just other content that's being shown. And the other sites that are being shown might be saying, oh, this was a great change, and Google should keep this change, while these sites here might say, well, I want my old rankings back. So there's always some change that happens across these kinds of sites. But in general, we don't go into specific algorithm changes when they affect sites like this.

So if you see things where you'd say it looks like there's an error on Google's part, or it looks like the overall search result quality is really bad for these queries now that these sites have changed in rankings, then that's something you could send to us. And we talk about these kinds of things with the quality team regularly, to see: are we doing the right thing? Are we still showing good and useful search results? Or did something change in our algorithms such that the search results we're showing now are really not that good? As an example, one thing that helps us a lot when we have these discussions with the quality team is if there's a very clear difference in the quality of the search results that we show. If we look at the search results and the top 10 are still all kind of the same, then from our point of view and from the quality team's point of view, there's no reason to change the search results again, or to change the algorithms to flip the results back, because it's already more or less the same.
On the other hand, if the top search results are really bad, and the ones that are missing or that dropped in ranking are really, really good, and really the most important search results for these kinds of queries, then that's something where the quality team would say, OK, it looks like we're doing something wrong; it looks like we need to fix this. And usually they take that and really try to resolve those kinds of issues. So the cleaner the queries and examples you can send me, where it's really something that's wrong in the search results, where the search results are not useful for users, the more useful that is for us to discuss with the quality team. Whereas if it's just more of the same kind of up and down, then that's less useful for us to discuss.

John, if I can do a quick follow-up on that: what do you do in cases where, for example, all of the results are relevant to the users? Maybe they're e-commerce magazines, or some shops, or something like that, but they do a lot of link spam. So in that industry, you know that most of the websites are doing a lot of link spam. Do you just value links, or any other factor, less for those types of queries, or doesn't it really work like that?

It depends. So on the one hand, we do try to handle this kind of situation algorithmically where we can. On the other hand, sometimes we have to take manual measures to take action on that. And sometimes that's really awkward, where we look at all of these sites, and it's like the top 20 or 30 are all doing crazy spamming. On the one hand, we need to take action on them; on the other hand, there's nothing more reasonable to show there. So that's always a really hard trade-off. Luckily, for the most part, if we do take manual action on some sites, there are enough legitimate and good sites that bubble up and take their place. Sometimes what also happens is that we're able to either algorithmically or manually take out, for example, the links that are causing a problem, or the spammy techniques that are causing a problem, and just focus on the rest of the site. So to kind of say, well, we see you're doing all of this keyword stuffing, but our algorithms are able to filter that out. And you might still be ranking high, but it's definitely not because of this keyword stuffing; it's based on all of the other good signals that we have.

OK, so it's not that you might weigh some factors way less versus other queries, necessarily, just because you see a lot of problems with people abusing that factor?

Yeah, not specific to individual queries; I don't think that would make sense. But in general, we use a lot of factors for crawling, indexing, and ranking, and we don't weigh them all the same, all the time, for all queries and all kinds of sites. So that's something that's usually kind of useful for webmasters, because you don't have to do exactly the same as your competition to rank in the search results. You might do this really well, and others might do that really well, and we take all of this together and say, well, overall, we should rank them like this.

Just to jump in very quickly: so are you saying then that if there is a site that happens to do something very well, then even if it really spams or breaches the Webmaster Guidelines, you might kind of turn a blind eye to that?
Because you see lots of complaints all the time of people talking about sites like Forbes and so forth that are mysteriously at the top of the SERPs, even though they have interstitials, and they have ads all over their content, and some of the content isn't that great. But then there are other things that they're really good at. So is that kind of how it works? Google might ignore certain bad things about a site if there are certain really great things about that site?

It's really hard to generalize. So I don't know, for example, with the Forbes example, there might be some other things happening there. But it's definitely not the case that we would say, oh, this is a good site; therefore, they can do whatever they want, and we won't penalize them. That's definitely not the case. Even with our own sites, when we do something bad, we will take manual action on them. And sometimes that does result in them ranking lower; sometimes it doesn't visibly change the ranking for some keywords and just affects other keywords. But we don't exclude sites that are, shall we say, important or big or well-known from any manual actions or from any algorithmic actions just because they're big or well-known.

I didn't mean big or well-known. It was more that if some sites are actually really great at some things, yet also breach the Webmaster Guidelines or do some really cheeky stuff, might you kind of turn a blind eye to that simply because they're great at certain other things? It wasn't a big-brand type of question.

So I guess to some extent, we do try to balance things out and figure out what is relevant on these sites. And there are, I'd say, a bunch of spammy techniques where, if we're kind of certain that we can filter them out, that's something that won't necessarily affect other things on the site. For example, keyword stuffing. That's something where we have algorithms that work really, really well, in the sense that we're able to look at the rest of the site instead and say, well, this part here is keyword-stuffed, we'll just ignore it, and they won't get any positive effect from it; we'll focus on the other signals on the site. But with other signals, where we can't really tune them out completely, it's not the case that we'd say, well, we'll just ignore this on this site and focus on the good parts.

OK, thanks.

All right. We discussed last week, in the German Hangout, that Google has a problem indexing lazy-loaded images. In the case that Google is not able to see an image below the fold, what do you think about a server-side solution where we essentially cloak to Googlebot? So I guess from that wording, you can already kind of hear that I don't think cloaking is a good thing, in the sense that just because Google isn't picking up the lazy-loaded images, and other search engines probably aren't either, doesn't mean that you should be cloaking to Googlebot and treating it differently. You should really be serving the same content to Googlebot as you would be serving to other users. With regards to lazy-loaded images, especially if they're far below the fold, that's something where I'd look at the situation and think about: is this really a problem that you're battling with? Or is this something where you just think Google could be doing something differently compared to what it currently is doing?
So for example, if you have an e-commerce site, and you have a category page that has thousands of articles on it, and you lazy-load the product images, which probably makes sense, then you would load the first page completely, and the rest of the images you would just load when the browser scrolls down to that position. But from Google's point of view, we probably want to see the product images on the product page, not necessarily on the category page. So we might not see those product images that are further down on the category page, but those probably aren't the most important images on that page anyway. Instead of seeing them there, we would see them on the individual product detail page, where there's usually also a higher-quality image that we can appropriately use for image search. So, taking a step back, one of the things I would look at there is: is this really a problem that you need to solve?

Cloaking is something I would really discourage as a way of solving any problem like this. On the one hand, because it's against the Webmaster Guidelines, obviously. On the other hand, it adds a lot of maintenance overhead that can break in subtle ways that you would not even notice, because you're showing Googlebot something very different from what users would see. And therefore, if it breaks for Googlebot, you might think your site is all working as it should be, while actually it's not working at all for Googlebot. So that's one thing to watch out for there. There was a similar question, I think, further down or in one of the other threads, about Googlebot just using the data-src attribute and using that to load lazy-loaded images. I think that might be something worth discussing with the team, but on the other hand, that's very different from what other search engines would do, so I don't know if that's really a good solution there. One thing we have looked at, for example, with AMP pages, is that you could use a noscript section and put the image link in there. Googlebot will ignore the noscript section; other search engines might pick that up and actually show that.
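As an illustration of the noscript idea mentioned above, here is a minimal markup sketch; the class names and file paths are made up. The lazy-loading script reads the data-src attribute and swaps it into src on scroll, while crawlers that don't execute JavaScript can still find the plain img tag inside the noscript section. As John notes, how each search engine treats the noscript section varies.

```html
<!-- Sketch of a lazy-loaded image with a noscript fallback.
     The lazy-load script swaps data-src into src on scroll;
     non-JS crawlers see the plain <img> inside <noscript>. -->
<img class="lazyload" data-src="/images/product-123.jpg" alt="Product 123">
<noscript>
  <img src="/images/product-123.jpg" alt="Product 123">
</noscript>
```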
Do you have in your algorithms a signal for a page, when that page is the last one or the only result the user clicked on when searching for specific topics? Would such a signal be used in any way to enhance or demote rankings of a specific page or site? As far as I know, we don't have any signal like that in our algorithms. It's really, I don't know, hard to actually understand that well. I think there's a study somewhere with regards to how people search in sessions, and how they sometimes take breaks in their sessions. Sometimes they come back to a problem a day later, or a couple of days later, so the last link that they clicked on the previous day is not really the last link that they clicked on for the whole session. So it's really a tricky thing to actually understand well, and from a per-page or per-site point of view, I don't think that would really make sense. We do use some user signals when it comes to understanding which algorithms are performing better or worse, to kind of test algorithms. We do a ton of A/B testing in the search results, where we make subtle, small changes in search to see how users respond to that. Do they find the content they're looking for? Do they not find it? That's something we use quite a bit. But doing that on a per-site or per-page basis would, I think, be really problematic or tricky.

What is worse from Google's point of view: when a site doesn't have any loyal users, or when a site doesn't have any great links? I know both are bad, but which scenario is worse nowadays? I don't really know. It kind of depends on your site. But I kind of worry that if you're looking at it from this point of view, you're looking at your site in a very narrow way. If you had a physical business, what would be worse: that people don't walk past your storefront, or that people don't see your advertising? I think you're focusing on the wrong metrics there. So that's something where I would think about what is more important for you. What do you prefer to happen with users and your website? What is your ultimate goal? And your ultimate goal shouldn't be to figure out which metric Google prefers so that you can then game that metric.

My site got some great links and started to rank better and attract more visitors from search. But then my site dropped, and traffic is lower than before. I think that might be because those new users didn't like my site enough to recommend it to others or to come back, and the new search traffic didn't stick. Am I thinking in the right direction? Maybe. I think from a search point of view, from our point of view, these kinds of things can change over time. I wouldn't necessarily say that there's really a conclusive connection between those two aspects. But this is something that you can definitely check with your users directly. Figure out what they're doing on your site; watch your analytics to see how far they go within your site. Do they interact with your content? Are there things that you can test on your site to see how people actually react? That's not something that we would pick up directly for search, but it's something that might indirectly have an effect. Where, like you said, people didn't recommend my site enough, that's something that indirectly we might be able to pick up.

Can a normal term...

Sorry, can I jump in?

Yes. Go ahead.

OK. So we have this unique problem where we have a huge website and a huge company, and then we have subsidiary websites for all regions, and they have the same content, because it's the same brand. So what we end up doing is cannibalizing all sorts of results everywhere in the world. Like, if I'm in Canada right now and I'm searching, I'll find results from New Zealand; results from different parts of the world alternate, coming up first and second. But the content is almost the same. So going forward, what is a good strategy? Technically, we're doing almost everything we can, but it's just been status quo for some time.

I would focus on things like hreflang, to better give us information on which version of your content is equivalent across different languages and countries. So that's something where I think there's often a lot of potential to make it easier for search to figure out how these things are connected, and which version should be shown in which situation.
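For the multi-regional setup described above, a minimal hreflang sketch might look like the following; the domains and paths are placeholders. Each regional page lists every equivalent version, including itself, the annotations have to be reciprocal across all versions, and an x-default entry covers users who match none of the listed locales.

```html
<!-- Sketch: hreflang annotations in the <head> of each regional page. -->
<link rel="alternate" hreflang="en-ca" href="https://example.ca/widgets" />
<link rel="alternate" hreflang="en-nz" href="https://example.co.nz/widgets" />
<link rel="alternate" hreflang="en-us" href="https://example.com/widgets" />
<link rel="alternate" hreflang="x-default" href="https://example.com/widgets" />
```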
All right, let me grab some more questions that were submitted, and then we'll open it up for more questions from you all directly. Can a normal term, like, for example, how to lose weight fast, become a navigational query for a particular resource, based on how users react to the results over time? So theoretically, anything can become a navigational query, but this seems like a very obscure thing to become a navigational query in this specific example. That's something where I doubt our algorithms would quickly notice anything there, because there are just so many people trying to find general information on queries like that. And this is not so much a matter of making something a ranking factor, but more a matter of understanding what users are trying to find, and what they find relevant for their specific queries.

If I have a 1,000-word in-depth article written by an expert in a natural, conversational tone, does that mean this article is automatically high quality in the eyes of Google, or is it high quality only when users say so? So no, just because you have 1,000 words written by an expert in an SEO-friendly, conversational tone doesn't automatically make it high-quality content. What I would recommend doing there, if you're writing content for your site, if you're creating a website, is to go back to Amit Singhal's post from a couple of years ago about 23 questions you can ask yourself with regards to high-quality content, and to go through those with someone who's not related to your site. Really have someone neutral look at your site, go through some of the flows, complete some of the tasks within your site, compare that to other people's sites, and really try to get critical advice on what you could improve. And chances are, it's not a matter of 999 words versus 1,001 words written by this expert, or written in a natural, conversational tone; it's probably not that. Chances are there's a variety of things that come together to help us understand when a site is actually high-quality content and when a site might not be.

Would you say the amount of users that can go through Google to my page, or my site overall, via generic queries is adjusted separately based on site quality signals and links, like 5,000 visitors daily based on links and 5,000 visitors daily based on other site quality signals? Or is it more mixed, where you take links and site quality signals together and say, OK, 10,000 visitors daily for the site overall? So this kind of touches on a misconception I see every now and then: we don't have a quota for websites. We don't say 5,000 visitors this day, 7,000 visitors the next day. That's not how search works; that's not how users work. Sometimes people search more, sometimes people search less. Sometimes your site is more relevant, sometimes it's less relevant. There's no quota. There's nothing where we would say, oh, we will send 500 visitors today, and when we've reached 500, we'll stop sending traffic. That's not how search works. So in a case like this, that's not how we would do things.

John, I have a follow-up on the previous one. You just muted me when I was asking.

Oh, sorry.

No, no, no problem. It was about quality content. Here's a problem: most people, regular people, people who share content, consider quality content something written at, I don't know, a fifth-grade level. What about academic content? It is, obviously, high quality, because, as the previous question put it, it's written by an expert, but it is at an academic level which most people won't understand. So that content won't get many shares, but it actually is high quality. How would Google handle this?

We would generally try to figure out what the right audience is for that, and to show it to them.
So that's another thing that I sometimes see mentioned: you say it was written by an expert, and maybe it was written by an expert, and it's fantastic content, but it's at such a high level that for the general audience, it's not really approachable. So just because it's high-quality content, and really useful and good information for some people, doesn't mean that it's relevant for everyone. That's something to keep in mind. And sometimes we see this with really good websites as well, where they have a lot of useful medical information, for example, but it's written in a way that the normal user can't really process, and people can't find it in the search results because they don't know which medical terms to actually search for. So you might search for headache, but the page actually mentions, I don't know, the medical term, if there is one for that. So just because we don't show a page for a generic query doesn't mean we think it's lower-quality content.

Oh, so it will still be seen as high quality; from a trust point of view, it will be really good, but it may not be popular, and it may not show for a lot of search queries.

Yeah, yeah. Again, it's something where just because it's written by an expert, and just because it has a lot of text and it's written on this website, doesn't necessarily mean that it's automatically high-quality content. But these things, of course, all add up, and they give us a bigger picture of what the site is about, which audience makes sense for the site, and what kinds of queries it should be relevant for. And that's something that is really hard to quantify into specific aspects.

So is it recommended, when possible, to perhaps lower the quality a bit so it caters to a larger mass of people and is more popular?

Not necessarily, but maybe, maybe. So if, for example, you have, let's say, medical content again, and it's written for doctors, and you notice that normal people are trying to find information, and there's a lot of really bad information out there for them, then maybe it makes sense to have a copy of this written for normal people as well. And that would have more chances to rank in Google; it would rank for different types of queries. It would be for people searching for, why do I have a headache? And the doctors might search for, I don't know, the specific symptom. And it's different content for different people, and sometimes it makes sense to cover both of those grounds. Sometimes you say, well, my content is so technical, I don't want the average person looking at it, because they will just be confused; then maybe just keep your technical content.

Would it make sense to have two versions of an article, one for a more qualified audience and one for regular people, or could this somehow be seen as duplicate content?

It wouldn't be duplicate content. I think that's definitely an option if you have content like that. I mean, you see that in the news often, for example: some scientists will publish an article, and that article will be online, maybe the most in-depth technical document of its type online. But other people will write about it in a way that's easier to understand, and maybe the broader audience will go to, I don't know, this news site, or newspaper, or blog, or whatever is writing about it, while the more technical people will know which terms to search for and go to the technical document directly. And it's OK if both appear on the same site on the same day?
Sure. I mean, this is different content for different audiences. I don't see a problem with that.

OK, awesome. John, I have two quick questions, and maybe one not as long as the other five.

OK. All right.

The snippets showing the old dates, like on YouTube: is that being fixed really soon?

Is it being fixed really soon? I don't know. We did pass that on to the team; I think Gary tweeted about that as well. I don't have an update at the moment.

Dates like 2013, 2011. Next is Search Console. Can you give us a hint about what is meant by infrastructure updates? I know you guys are doing major infrastructure updates to Search Console. Does that potentially mean we're getting 12 months of data?

Well, we hope that sooner or later you do get your 12 months of data. I don't have any insight into what specifically we can announce there. These are essentially general updates that we're doing at the moment. It's not that we're taking everything, rewriting it completely, and it'll be completely different next week; sometimes you just have to catch up with the rest of the infrastructure.

So just to be clear, when you finish this infrastructure update, will we see new features or more data? Or is it just going to be behind-the-scenes stuff that we don't see?

Pretty much behind-the-scenes stuff. It's like, I don't know, you have everything running on Windows XP, and then you move to Windows 10. There's a lot of work involved in getting everything shuffled over, and in the end, it'll be running on a new system. Not that Search Console is running on Windows XP, just to be clear.

We're actually doing a massive infrastructure update for one of my software tools. And with that, as soon as we do the infrastructure update, within minutes we're actually launching a bunch of features, because the new infrastructure can handle them. So from a software perspective, it makes sense to do these infrastructure updates: one, obviously, if you're slowing down; two, if you need more space; or three, if you want to add new features that need it. So, OK.

I mean, sometimes you just have to upgrade things to the newest stuff. And those are changes that take a lot of time, but don't necessarily result in anything user-visible.

OK, my final question is mobile-first related. Obviously, there's a lot of discussion going on around mobile-first; people are talking about, for example, what happens if they only have AMP content. I kind of want to see it from your perspective: what are you and Gary and the people at Google seeing? Are SEOs and webmasters asking the wrong questions? Are there topics that come up that SEOs and webmasters shouldn't be focusing on around this?

I don't see anything specifically where I'd say they're asking the wrong questions. I think it's good to try to take a step back and understand what this means for my site, for my clients, for new sites that I might build. I think asking questions is great.

But I think, at least from my perspective, when I take a step back, I'm looking at it in a way that the fundamental change here is that you still have one index. Nothing's really changing with that, outside of the way you are populating or grabbing the data for this index: you're grabbing it now from a mobile perspective, as opposed to a desktop perspective. And if a mobile crawler hits a desktop-only site, you'll still be crawling the desktop-only site, just kind of as a mobile-rendered version of it.
So I think that's the big picture of this. And when people start panicking that my desktop site's not going to work, or I don't have a mobile site, or whatever, it's really nothing different, outside of a mobile browser calling the web and that being how you're grabbing the data. Is that the big picture?

That's pretty much it, yeah. So at least from our point of view, we're trying to keep this at something where, obviously, we want to reflect the mobile web, but we don't want to tell everyone to change all of their sites and do everything differently, and suddenly everything they learned before was wrong. We're trying to keep the old setups working the same as much as possible. I suspect there might be things here and there that do need to change, but for the most part, we're trying to essentially make this change on our side and reflect what users would see, for the most part, when they use their mobile devices to access these websites.

OK, thank you.

All right. Let me try to grab some more questions, and then we'll have more time for your questions as well. Some pages are ranking based on site quality, and some pages rank based on links. When I have pages on my site that don't have any external links pointing to them and they're receiving traffic, do they drain the site quality juice more than if they had links from reputable sites? I don't think we have anything like site quality juice, so I'm not really sure where this question is headed. But for the most part, we do try to take into account over 200 signals for crawling, indexing, and ranking, and we try to combine them to understand what is most relevant in the search results for individual queries. Sometimes that includes things around links; sometimes it includes things around the rest of the web and the rest of the website. So it's not something where I'd say you're draining this and using that. We try to take all of these signals together and look at the bigger picture.

A question regarding Google's quality updates, the updates that SEOs call Phantom updates, which did not officially get announced by Google and which look at search quality. Let's see, where does this go? Are quality algorithm updates by Google refreshed in any specific time frame? No. These are essentially just normal algorithm updates that we do all the time. We make lots of changes in our algorithms: sometimes they're bigger, sometimes they're smaller; sometimes they affect fewer sites, sometimes they affect more sites or more queries. It's really across the board. And these are things that are not set to specific time frames. It's not that we would launch an algorithm and say, OK, this is a new algorithm, and all new algorithms take six months to update and show new data after six months. Some algorithms are faster; some algorithms take more time.

Can sites recover their rankings if they fix all usability issues before the next refresh of that specific algorithm part? Yes. In general, our algorithms do try to run regularly; that's why we use algorithms, rather than trying to do things manually. So if we see that things change on a website, or with regards to a website, then we will take that into account the next time the algorithm is run, or the next time the algorithm looks at that data.

Thanks, if I could just jump in with one more small question, if you could tell us. So regarding this July 7th update, Phantom 4 it's called: from that point on, I've been watching something like 60 pages each week.
They had lost a lot of visibility, from an SEO point of view. So my question is: to today, no page recovered its rankings. And, well, only about 10 of the pages really fixed their issues, from what I can tell so far: like, they had too-big ads, they had not really good content, things like that. But even those pages that changed a lot did not recover yet. And we saw with the Phantom updates that came before that it took a few months, or took until the next refresh, so this looks similar. Because it's not officially announced, we can't say anything more, but could you just say whether this could be some manual thing that is run periodically, like Panda and Penguin were in 2012, 2013?

I really don't know which update you mean, so that makes it a bit hard to say what specifically is happening there. I think one thing to keep in mind is that we make updates to our algorithms all the time. So it's not the case that there is any fixed recycle time or any fixed update time for any of these updates. Just because it's named, I don't know, Phantom 1, Phantom 2, Phantom 3, Phantom 4 externally doesn't necessarily mean that this refers to anything similar internally. So it might be that these are totally unrelated algorithm updates that happened, and some of them are able to reflect things as we recrawl and re-index, and others need a bit more time to understand the bigger picture.

How often do you reverse an algorithm? Do you guys revert a lot of things, you know, restore, revert? I don't know.

So it's really rare that I see things where we roll something out and the engineers are like, oh my god, this is crazy, we need to turn it off again. That's really rare, because we do try to test these things very extensively ahead of time, just to make sure that we don't run into that situation. What does happen regularly, though, is that we say, well, this algorithm is no longer needed; therefore, we'll turn it off and delete the code. We don't want to burden our code base with things that are no longer effective: things that either don't have a visible effect anymore, because other algorithms are handling it better now, or that perhaps even work in the wrong direction compared to the newer algorithms that we have, where we prefer to focus on the newer ones. So that's something that I think any software company needs to do: they need to regularly maintain their code. And maintaining doesn't mean that they always tweak things, make things better, and add more and more things; they also have to regularly take out the old code, to clean things out, so that when things do go wrong at some point, someone can look at the code and say, oh, I understand what is happening here, and it's not just a big pile of spaghetti that nobody has any idea what it's actually doing.

Yeah, but it's a bunch of people. It's not like one person does that; I mean, it's a whole team that decides how to go about that, right?

Yeah, that's definitely the case.

John, just quickly, I wanted to ask you about AMP. So if a site is really steady, always steady, responding in milliseconds, there's no real need to go AMP, right? If it's a local business, the server is a virtual private server, their site always responds in milliseconds, and it's being checked on a regular basis, then there's no real need to go AMP, right?
Well, AMP isn't only about speed. Speed is definitely a big factor there, but being able to embed the content in the search results page immediately, being able to embed the content in other places, using the content delivery network: all of that can make sense, also for a smaller local business. Obviously, it's something where you need to balance how much work it is to actually do against how much value they actually get out of it. And it might be that for some sites, at the moment, you'd say, well, it's too much hassle to actually implement this in a way that works well. And for others, you might say, well, it's just an option in WordPress, and I can select that, and everything will work automatically.

OK. All right, thanks.

John, can I ask something really quick?

Sure.

It's actually a Penguin question, so I hope you don't kick me off for this. Let's assume that we actually have one client that had been penalized, and they decided they wanted to just get it over with and move to a new domain. So they basically have two domains now, and the new domain has been doing really well. But it's a hassle to keep products and everything updated on two domains. And now that Penguin isn't really demoting sites, just ignoring the bad links, would it be a good idea to redirect the old domain to the new one, just so they only deal with one site instead of two? Or should we still go through a full disavow process for the old site?

It's hard to say in general. I think for most sites, you'd want to clean up these issues anyway, because there's always the possibility of a manual action as well, and there might be other algorithms that are looking at that too. So for most sites, I would try to clean that up anyway. If you're in a situation where you already have two separate websites and you don't know which one to pick, then obviously combining them probably also makes sense. But I would try to take a step back and look at the overall picture, rather than just blindly saying, oh, you need to clean all of this up, and then you can finally move. Maybe it makes sense to just say, well, all of this old stuff is so bad and so terrible that we really need to make a clean cut anyway, regardless of Penguin.

Right, it's just that they still have users coming to the old site, so they really do want to make a redirect, not just delete the whole domain. Yeah, OK, I guess we'll do a disavow anyway.
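For reference on the disavow option discussed above: the disavow file uploaded in Search Console is plain text with one entry per line, where a domain: prefix covers a whole site and lines starting with # are comments. The entries below are placeholders, not recommendations for any particular site.

```text
# Sketch of a disavow file (placeholder entries).
# Lines starting with # are comments.
domain:spammy-directory.example
https://link-network.example/some-page.html
```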
One other quick thing: we talked about the crawl stats, and how one of our websites had really increased in the time spent downloading a page. Would you say that over a certain value, this can actually severely impact rankings? Or is this just about users, because obviously there's an impact on users? I was just curious whether there's a certain threshold over which it might actually impact how the site performs.

That's purely a technical thing. What will happen is that we'll crawl less when we see that the server is slowing down. And if crawling less means we can't get to your new and updated content, then we can't show that in search. But if we can still show all of your content anyway in search, then, yeah, that's probably less of a problem.

So John, can you tell us what's behind you? What's behind you? Yeah, what is that? Your algorithms, I don't know.

It's the finish line. We're done.

Yeah, finish line. A lot of work has been done. Yes, last second.

Yeah, no, it's like the decoration here, I don't know.

That's amazing, yeah. Pretty fancy.

All right, with that most important last question, I'd like to thank you all for joining. I'll set up the next Hangouts again. The Friday one this time won't take place, because I'm at a conference in Austria, but the next round should be happening in two weeks.

John, I tried to send you a message on Google+, but I think I shared a post with you instead. I'm sure you'll work it out.

All right, great.

And John, did you get my email? I sent you an email also.

Probably, yeah. All right. I get lots of emails; I need to figure out something like an AI to do my email for me. All right, so with that, I'd like to thank you all for jumping in with all of the questions, and I hope to see you all again in one of the future Hangouts. Bye, everyone.

Thanks for hosting this. John, have fun in Austria. We'll meet you in Poland.