OK. All right. Welcome, everyone, to today's Google Webmaster Central Office Hours Hangout. My name is John Mueller. I'm a Webmaster Trends Analyst here at Google in Switzerland — actually in Mountain View today. So yeah, that's confusing. And I have a special guest, Jennifer Slagg, from The SEM Post, who's in town for the recent SMX conference. And I thought it would be great to have someone join us. So thanks for coming. Oh, thank you very much for having me. Very cool. All right. So some of you have questions already. We can start with those. And a bunch of questions also got submitted, so I'll go through some of those as well. Do you want to get started? Yes, sure. So we have a question about the featured snippet, or position zero, in organic search results. The question is: we are trying to get the featured snippet for a few keywords, and so we implemented a few changes. The result was that we obtained the featured snippet, or the zero position, on google.co.uk and google.com, but we are not getting the same result on google.it. We are a business in Italy; we operate in Italy, so our main objective is to get the featured snippet on google.it. We don't exactly understand why this is happening. We implemented the changes, I would say, a couple of months ago, and within 24 to 48 hours we got the featured snippet on google.co.uk and .com, but not on google.it. So we wonder why there is this difference, what the reason could be. We can also provide a few examples. We think that our featured snippet, in terms of quality and in the way it answers the question of the user, is much better than what our competitors are providing. But we don't understand why it doesn't work in Italy and it works on other Google search pages. I don't really have a useful answer for that, because this is essentially algorithmic, and it depends on a lot of different factors, where we kind of automatically try to figure out which kind of snippet we should show in which place in the search results. So that's something that can vary from location to location in the different Google versions, the different country versions, and also across languages. So there is no simple one fix that you can do to make that appear. But it sounds like you're doing the right things. If you're seeing the featured snippet for other countries, then I would continue focusing on that, and that's something where I imagine over time the algorithms will pick that up. But there's no guarantee. It's kind of like a different kind of snippet, and that's something that sometimes we put together in different ways. So is it possible that the google.com or the English version is more advanced in a way, and the other version of the algorithm will catch up later? Is that a possibility? That's our hope, in a way. Sometimes that's the case. So sometimes there are features that we launch in maybe English or in the US first, and then over time we launch them somewhere else. So sometimes that's kind of the case. But for the most part — and I think that's true with featured snippets — if you see those in Italy in general, then we have launched the feature in Italy; it's available in Italy. But our algorithms look at it in a slightly different way, depending on the whole ecosystem, where we think it makes sense to show them. So I suspect over time this will evolve a little bit. I wouldn't call it more advanced or less advanced. It's just different.
Like if you have one website and you have German content and English content, then maybe the German content ranks, but the English content doesn't. And it's not that one is better than the other; they're just different search results. OK. Thank you. Do you have any questions? Yes, I have another question, also related to featured snippets. My question is, we tried to understand why many of our competitors — always the same ones — have gained many results in featured snippets, while many other competitors have not. So I don't know. We think that probably it could be because of the trust of the domain of these competitors; that could be one of the reasons, or not. So I would like to know. It's really hard to say. I mean, why does a site rank? We use a lot of different signals to try to figure out when we should show a site, and that's usually not something where you can just say, oh, it's because of this one thing that the site is there. So I don't really have a simple answer for that. OK. Sorry. I know — you ask these questions, and I tell you I have no answers. I feel bad. But these are things where I suspect you're on the right track. You kind of know what you're doing, it's working out for the most part, and these things settle down over time as well. So that's it. Let's hope for the best. And I don't have any other questions. OK. Fantastic. All right. Jennifer, a question for you. You were recently at the SMX conference. How was that? It was really good. Obviously, Gary from Google was there, doing a couple of sessions with me and talking about different Google things. Obviously, mobile-first was, like, the hot topic of the day, as well as some of the recent algo changes. There's some interesting mobile-first stuff that came out — that you guys are still working on the algorithm to figure out the links issue and all that. So it's still coming, still in the works, but it seems like it's going to be at least another few months before we kind of see more from that. Probably the biggest takeaway I took from the mobile-first stuff is the fact that mobile page speed will somehow be incorporated into it at some point, which is a pretty big deal, because so many people are focusing on desktop page speed, which is also good. But the thought was, oh, well, if Google doesn't care how slow our mobile site is, then why should we care? But thinking from a user experience perspective, if I visit your site on mobile and it's super slow, I'm just going to go to your competitor instead. So I was pretty happy to see that mobile page speed was going to somehow get rolled into the mobile-first index. Yeah, I think that's something that definitely makes sense, because at the moment we're basically looking at the site and saying, well, does it work at all on mobile? And kind of taking the next step and saying, well, does it work really well on mobile, is actually kind of a logical progression, I think. Yeah, a lot of people have less patience on mobile. Like, if you're on desktop and you're clicking a site and it's taking a while, if you really care about it, you just flip tabs and do something else, and then you eventually come back to it if you remember. Whereas on mobile, if I'm sitting there looking at my phone, I'm like, why isn't this launching? Or why isn't it loading all the way? I'll just click back and go to the competitor. Just go somewhere else. Yeah.
So I guess that kind of comes back to a fundamental thing on Google's side, where we kind of say, well, if we serve search results that don't work for you, then it's not, from our point of view, the problem of the site that you clicked on — it's our problem. Exactly, because we don't think, oh, this stupid site; we think, why is Google giving me this slow site? I want them to give me super-fast ones. Yeah. Yeah, yeah. All right. Let me see what all came in. A bunch of questions already. OK, here's a question from Pete. To reinforce the relationship between two internal pages, should you have the internal link going in both directions as opposed to one direction? Do both ways count more? Interesting question. I haven't actually seen it like that. But I'd say it really kind of depends on the site architecture. So it probably makes sense to have it in both directions if these pages are kind of leading to the same things. But I wouldn't see it as a requirement that all internal links need to be bi-directional. Because sometimes it makes sense to go one direction, where you're following the flow, and sometimes it makes sense to offer different ways of kind of handling something. So do both ways count more? I don't think across a site that would make a big difference. All right, another question from Pete. If your site is showing some errors in schema.org, can that be harming any of your rankings, in that Google discounts the info gathered from it, or is none of it used by the algorithms for search rankings? No. So we do try to understand the content a little bit better with schema.org markup. But it's not the case that if you have errors in your markup, you would be ranking lower than if you didn't have any of the markup on there. So that's something where, if we can't recognize the markup — and lots of sites have broken markup — then we just don't use that part of the markup. And for the most part, structured data is something that we use to better understand a site and to show rich snippets. But it's not primarily used for rankings. All right. Hey. Hello, how are you? Pretty good. How are you? Yeah, good, good. I have a question regarding a specific website and an HTTPS migration that we did in February. Everything went smoothly — there were no problems. We've done it loads of times before and it went well. And then in this last week, we noticed in Google Search Console that the search impressions had gone down for the HTTPS site, and the impressions for the HTTP site have gone back up. Now, the HTTPS is still live, but the HTTP shows in search. When you click through, it redirects, as you would expect, through to the HTTPS. And I've just typed an example in the box. If you type in the brand, you'll see it in the Google search results, and if you hover over it, it's HTTP. But when you click through, it redirects to HTTPS. And the only change we've made in the last week was to move the alternate language tags further up the HTML, because they weren't getting picked up and now they are. But other than that, we haven't made any change. So it's quite odd. I mean, traffic's not gone down — traffic's remained the same, rankings have remained the same — but obviously it was a little alarming to see the impressions drop. And I put some screen grabs in the comments before, as well as in the Google Plus link. So it's just an odd one. I was just wondering if it was an anomaly, or if it was an issue that would perhaps make it do that — if this is a common thing or an uncommon thing. Just a bit more insight, really, to feed back to the client.
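For reference, the "alternate language tags" the questioner mentions are hreflang annotations, which Google documents as link elements in the page head. A minimal sketch, with placeholder URLs, might look like this:

```html
<head>
  <!-- Hypothetical URLs: each language version should list all alternates,
       including itself, and the annotations should be reciprocal across versions. -->
  <link rel="alternate" hreflang="en-gb" href="https://example.com/en-gb/" />
  <link rel="alternate" hreflang="en-us" href="https://example.com/en-us/" />
  <link rel="alternate" hreflang="x-default" href="https://example.com/" />
</head>
```

One plausible reason moving them up helped: these elements only count while the parser is still inside a valid `<head>`, so stray markup above them (for example, an element that implicitly opens the `<body>`) can prevent them from being picked up.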
I don't know. I'd have to double-check on that with the team. I know there was some problem with picking the right canonical between HTTP and HTTPS in the last couple of weeks, but I haven't seen it the other way around, where we kind of prefer the HTTP over the HTTPS. But it's an interesting case. I'll definitely take that to the folks here. One thing to kind of keep in mind is that when we pick a canonical, we use a number of different factors, which includes internal links, external links, sitemaps, redirects, rel canonicals — all of that. So if there is anything that's kind of conflicting in there, then sometimes we'll pick another version. It sounds like you have experience doing these migrations and you've probably checked off all those boxes. So maybe it's just a matter of our algorithms doing something weird that they shouldn't be doing. I'll take that to the folks here to kind of see what we can do there. OK, thank you very much. Cheers. In the meantime, I'll keep going over it with a fine-tooth comb, but I can't spot anything. So yeah, it's just an odd one. Thank you. Yeah. I don't know. From just looking at the way it's shown in search, I'd suspect it's something more on our side than something you're doing on your side. And it might just be that we say, oh, well, it used to be HTTP for a long time; therefore we'll just show it as HTTP, even though you've made these changes in the meantime. OK. Sometimes algorithms can be weird, but the feedback is useful. OK, great. Thank you. Thanks. I have a question, John. OK. It's about voice search. Marking up content with schema tags — will it help voice search? Marking up content with schema for voice search — I don't think we'd use that at the moment. As far as I know, we've never announced anything. There are some types of markup I could imagine might be useful for that. So I believe there is a question-and-answer markup that you can use, which probably makes sense to implement if you have these question-and-answer sites. But as far as I know, I don't think we've announced anything specific where we'd say, this is the type of markup you need to do for voice search in particular.
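John doesn't name a specific vocabulary here, but the question-and-answer markup he's alluding to is commonly expressed with schema.org's Question and Answer types in JSON-LD. A minimal sketch, with hypothetical text and no claim about how Google actually uses it for voice:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "QAPage",
  "mainEntity": {
    "@type": "Question",
    "name": "How do I migrate a site to HTTPS?",
    "text": "What steps are needed to move an existing site from HTTP to HTTPS?",
    "answerCount": 1,
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Install a certificate, set up one-to-one 301 redirects, and update internal links and sitemaps."
    }
  }
}
</script>
```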
I know that voice search was an important topic at this SMX West, and it's increasing. One of the important things — I mean, this is a question for you and Jennifer, too — is that using different kinds of browsers is already annoying. We know that. I mean, we're web developers or web designers. And it's a new era. People are going to use Amazon Echo, Google Home, and maybe Facebook and other companies are going to do something similar. Now, someone — I don't know who — mentioned that Amazon Echo is using Bing search. They're not using the Google search engine to show results. Well, let's say I'm talking to a friend. He's using Amazon Echo, and I'm using Google Home. And I'm telling him that, OK, we use voice search and get different results for a specific search term. So the question is, isn't it possible for you guys to sit together and do something to prevent this type of situation? The browser compatibility issue is really annoying. I don't see that happening in the near future. I don't know. So I totally understand the thought that testing across browsers is already hard, and then testing in a way where you can't copy and paste the results is like — how do you even monitor your rankings on voice searches? I have no idea. But this is such a new area that I think there's a lot of opportunity for people who are thinking about these types of issues early on, and who come up with tricks and solutions for some of these problems. So that's something where, if you're keen on this topic, I would keep focusing on it. And some tools are out there that work across these platforms, like API.ai, as a way of creating agents that work across the Google and Amazon platforms, and as far as I know Facebook as well. That might be an option. But I think... Sorry? You said it's an API — which specific API? Can you post it? API.ai — it's API.ai. OK. Yeah, it's a service we bought recently, too. But it works to kind of create these intelligent agents that you can use in multiple places. But these are topics that I think will be very top of mind for a lot of businesses and websites out there over the coming years. So if people are really interested in this topic and want to kind of dig in and try to figure out what recommendations could be made to small businesses, to large businesses, with regards to voice search, then I think that's a fantastic opportunity. And from my point of view, it's still so new that, for example, none of these devices are sold in Switzerland, where I live. So the first thing I did when I came here was go to Best Buy and buy all this stuff, so I could try it out, too. And it's really — I don't know, it'll be an interesting time. I think it's kind of like how the search results were in the beginning, in that people first needed to kind of understand how a search relates to my business, how do I tie things together, which things work, which things don't work. And there will be lots of things that work on one device that don't work on another device. And finding a way to kind of target the right devices, or target both devices in a reasonable way, I think will be a challenge. But we'll see how it kind of settles down. I also think it's fascinating — I did a presentation at SMX East on voice search, and I did a lot of looking into how voice queries are different compared to how you type something in. And on voice, it's so much more questions. You're asking questions as opposed to typing in your three-word phrase. But when you're doing it by voice, you're actually asking a question that includes that three-word phrase. So there's a lot of different intent in the way that people are searching for stuff by voice as well, which I find really fascinating. And I think it's eventually going to make it so that people are going to be conscious about creating content to match what voice searchers are looking for, as opposed to what we're looking for in a more keyword-based situation. I think the hard part will really be to kind of find a balance that works across all of these different platforms. Because lots of people will still be using normal web search and browsers and phones. And voice at the moment is such a small part. So it's something where browser compatibility is already tricky, you know, with Firefox, Chrome, Microsoft Edge, and all the different browsers. So I'm just thinking about that. And you know, technology shifts really fast.
Now let's say Amazon Echo and Google Home become maybe $10, $5, something like that — the companies make a decision that, OK, we're going to make these devices really cheap for users. And people like that — I mean, if there's a chance that, OK, we don't have to use a keyboard, people will use their voice to search. So that's why I was thinking, OK, maybe this is a question for us. Yeah, I think that's a fascinating topic. And I think there is also a lot of opportunity for people who kind of provide help for businesses to better understand this, but also people who create tools or people who create platforms and say, OK, I will create a service that lets any — I don't know — pizzeria kind of connect with that and make it possible for anyone to order pizza locally through my voice service, to kind of enable this voice platform for normal small businesses that don't really understand the web that far. But that's something where I suspect, definitely over the next year, the next couple of years, you'll see a lot of changes, a lot of really innovative things coming out, and a lot of opportunity for people who say, well, I'm not going to only focus on keywords and links; I'm going to think past what we have now and think about what people will be doing in the future — how the next generation of users is going to actually interact with the internet, which might be by voice, which might still be by keyboard, because maybe the keyboard is actually better than voice. I don't know. It really will be an interesting time, I think. I think there's also the generation thing, too. When you look at it, younger people are way more likely to do voice, whereas — well, not us specifically, but older people are a little bit more resistant, I guess. My mom — if I talk into my phone for a search, she's like, that's creepy. It's just so odd to her. Whereas my daughter and her friends voice search all the time, and they'll be in a crowd of people comparing what their results are, because they understand there's personalization and stuff in what you get as well. It's kind of interesting to see how the adoption rate is going across age groups. Yeah. I mean, yeah, we'll see. Let's see. Maybe older people will feel that, OK, no, I don't have to use another device; my voice is enough. I don't know. I mean, maybe it ends up that you think about who your audience is, and if you're targeting an audience that's more keyboard-oriented, maybe you'll do a different type of website. If you're targeting a newer audience, then maybe you'll do something that works more with voice search. But I'm pretty sure there are ways to cover both areas. OK, sure. Thank you. All right, let me just refresh here to kind of make sure that we're not missing anything. So if people are watching, you're welcome to jump on in as well. I posted the link in the Google Plus post, and maybe it's still here. There it is. Yeah, it's still here. So there's room. Looks like there are a bunch of people watching live, so feel free to jump on in. But let's take the next question that was submitted here. We're a news publisher; we'd like to give an optimized experience to users coming from certain referrers, like Facebook. The idea is to show different news templates — title layout, ad slots — to users from different referrers. However, the content of the page will always be the same. So what's the best approach here?
Using rel canonical, as mentioned, with regards to duplication? From Google's point of view, what we essentially want to see is that a user clicking through from the Google search results would see the same content as Googlebot would see. So what you do for anyone else — which could be Facebook or Twitter or whatever else — is essentially up to you. From a user's point of view, I'd suggest not making it too different; but if it's just a different layout, then that's kind of up to you. With regards to the technical implementation, if you're using the same URLs, then you kind of have to deal with that on your own; if you're using different URLs, then rel canonical is a good way to combine that with your preferred version. With rel canonical, you need to keep in mind that it's a signal for us. It's not a strict directive where we'd say we will always pick the one that you specify as the rel canonical. We might sometimes pick the one that you're saying is not the rel canonical. So that's something to kind of keep in mind there. I suspect from a user's point of view, this could be really confusing.
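As a concrete illustration of the different-URLs case John describes, the referrer-specific version would carry a link element pointing at the preferred version — a minimal sketch with placeholder URLs:

```html
<!-- On the alternate version, e.g. https://example.com/article?ref=facebook -->
<head>
  <link rel="canonical" href="https://example.com/article" />
</head>
```

As he notes, this is a hint rather than a directive; Google weighs it alongside redirects, internal links, and sitemaps when choosing which URL to show.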
All right. Is all the content on an internal page that becomes a 301 totally discarded, from the algorithm's point of view, and therefore free to use on another page? Yes. I mean, it's kind of a weird question, because you're saying, I moved my content to a new URL, but at the same time you're saying, I want to reuse the same content somewhere else, which is kind of a strange situation. But from a technical point of view, if you redirect one URL to another, that content is no longer used in search. So you can do whatever you want with it. But even if it were still used, it's not the case that we would penalize a website for duplicate content. You might have the same content on different URLs, and we'd try to pick the right one. I have a question. All right. Go for it. Hello. Good afternoon. We had a little problem with Google changing our titles. And we don't understand what is happening, because Google just changed the titles without any kind of warning or justification. So we have a title, we have a well-defined H1 tag, but Google changed the title in the results. Why does that happen? It's basically our algorithm trying to figure out what the best title is. That sometimes happens when we see a title that uses a lot of duplication, for example, or that is fairly long; then sometimes our algorithms will try to find a better title. And the thing to keep in mind is that this is something we do on a per-query basis. So it might be that if you search for this, then we show that title, because we think that best recommends your website to users searching for this. And if someone searches for something different, then we can show a different title or a different description. So it's a kind of A/B test to try to find the best title to improve the CTR? Yeah, it's not so much an A/B test. It's more that we're trying to best show how this page fits in with regards to what this person was searching for at the moment. So it's not the case that we're saying your title is bad and you're being penalized for having a bad title. It's more a case of our algorithms saying, OK, you've provided this information, but we think it's clearer to the user to kind of phrase it like this. So sometimes it's a sign that you can improve your titles in general. For example, if all of your titles just say "Home," then obviously that's not very useful to users, so it makes sense for us to change that. But sometimes, even with titles that are kind of normal, we will rewrite them. And this is especially true when it comes to mobile, where there's less room and we have to make things shorter. Something I will mention that I notice a lot in practice: if you're doing a site: query for your site, for some reason Google will totally rewrite those titles way more than if you're doing a more legitimate search to bring up that page in the regular search results. The title will be way different. As well, if you're doing a search for keyword, keyword, and then your site name, most of the time Google will include your site name in that title tag, whereas if you drop the site name, it won't show it anymore. So that could also skew what you see when you're doing testing to see what Google is doing with your titles. Thank you. Thank you very much. Can I ask just one more question? I know it looks like a stupid question, but I have clients, and my clients question me about something. Google currently crawls about 2.8 million pages per day on our website, so this is a lot of content that Google sees and crawls and indexes. We can see these pages in the results, and we know that Google already knows about the new content we change on a page. But these changes take a lot of time to have an effect on the positions. So I know it's a hard question, but how much time does Google take to really understand my content and my website structure and change my positions? It depends. So that's not a very useful answer, I guess. But one thing to keep in mind is that we don't crawl all pages all the time. So things like home pages or the higher-level category pages we'll try to re-crawl every day, and we'll see those changes fairly quickly. But lower-level pages within the website we crawl maybe every couple of weeks, or even every couple of months. So that's something that might take a bit longer to actually be picked up. You might see us crawling millions of pages every day, but that doesn't mean we crawl the whole website every day. It might be that we're re-crawling the home page a lot of times because there are changes there, but the really low-level pages might get re-crawled every month or so, or even less often. So just from a technical point of view, for us to even see the change can take from one day to several months. And then afterwards, reprocessing that is usually pretty fast — maybe, I don't know, a day or so for us to actually understand that this page is different and use that for indexing. But the whole ranking side does sometimes have kind of a delayed effect, in that you put something out there, or you change something on an existing page, and it just takes a while for Google to say, oh, for this new query, actually, this page is pretty good; we will start showing it a little bit more. And that's something that sometimes does take a bit of time. But fast-changing things are still possible. It's not that we wouldn't be able to pick up new things very quickly. If some topic is suddenly in the news and you do a blog post or news article about it, we'll try to pick that up within a couple of minutes or a couple of hours and actually show that in the search results. It's more a matter of if you change, like, a product page.
If you change — I don't know — a page about blue shoes to pink shoes, then it's going to take a bit of time for us to understand, oh, the context of this existing thing changed, and actually now it's a little bit more in this direction, so we should show it like that. Thank you. Thank you very much. You're welcome. All right, let's see what we have left here. There are tons of questions still submitted, so I suspect we won't get through all of them, but you're welcome to kind of copy them into one of the next Hangouts as well. Two identical websites in every way, but one recovered from a manual link action. Does the Google algorithm work more slowly for a site that was once penalized? No. If a manual action is resolved, then it's resolved; it's kind of taken out of the whole setup. But if you're saying you have two identical websites and you've done kind of link building in weird ways, so that you did get a manual action, then that sounds like something where maybe those sites are so similar, or kind of borderline on their own, that that's something you'd want to look at anyway. So from my personal point of view, I try to recommend that people kind of combine sites rather than take them apart. If you're saying these are essentially identical websites, then you probably get more value out of making one stronger site rather than having multiple websites that are essentially the same. Hey, John, could I jump in with a quick question? All right, go for it, Josh. How are you guys doing today? Good, I hope. Yes. Fantastic. My question is, how well is the mobile-first index testing going? How far along are you guys with that? How far along? So I guess the underlying question is, like, when is it rolling out? We don't have a time for that yet. So that's something where we're testing various things. Our primary goal is to make it something that site owners don't have to worry about. So that's kind of the direction in which we're testing — like, what can we do to make it so that, for the normal website, this is not something where they actually need to make any change? So that's kind of the direction we're heading there. And the tests that we're doing are, on the one hand, to see how this could affect the average website, or the existing websites that we have out there. And if there are issues that we run across in those tests, what can we do on our side to kind of solve those issues for the site owner? So that's kind of the direction we're going with these tests. It's not so much that we're testing and we're just gonna switch it on from one day to the next. We really want to make sure that this is something reasonable and that it works for both sides, because if we turn something on and half of the websites fall out of the search results, then that's our problem. That's not your problem. Then of course those sites don't get any traffic from Google, but people are gonna be upset with us, because they search for something and they're like, oh, you're not showing me what I was able to find last week. It's like, you're doing a bad job, Google.
So that's kind of what we're trying to do with these tests: figure out which aspects are kind of weird, that we can't understand properly on the existing web, and which aspects are things that maybe site owners should change, or should change in the long run — and how can we make it so that, if we think site owners need to change something, it's easy for them to understand what they need to change, easy for them to test where these issues are, and easy to get the right information they need to make that change within a reasonable time. So not from one day to the next, but more like over a period of months, or even a year or longer. Are we still thinking that the desktop site is gonna be ranked off the mobile signals? Is that still what we're thinking? Yes, that's still essentially the goal. I mean, that's kind of the end direction. Well, end direction is hard to say, but it's kind of the big trend that's happening anyway, in that lots of people are using only mobile devices. So — the lights are going off again. That's the kind of direction where we think, like, in a couple of years we'll... oh God. OK. Larry Page is trying to tell you something. You're telling us too much information. It's like someone's listening in, saying, stop talking, John. So I imagine the Googlebot crawler will change — you're gonna switch the UA to the mobile agent. I imagine the Caffeine indexer won't change — you're just gonna index the same content and spool out dupes or whatever you find there. These are all questions, not statements. So Googlebot doesn't actually change, in the sense that we already crawl with the mobile crawler and the desktop crawler. It's just that we mostly crawl with desktop at the moment, and we'd mostly crawl with mobile afterwards. So that's kind of the shift that would be happening. So you'd go down to the one crawl? Yeah, yeah — mostly. We'll probably still be checking the desktop site to make sure that we have kind of an understanding of the mapping of desktop to mobile pages, so that we can show the desktop link when you're searching on desktop, because it wouldn't make sense to show the m-dot site to desktop users. And I guess Hummingbird isn't gonna change much — you're still gonna parse the same entities. Yeah, I can't say they won't change, because all of these algorithms change all the time. You guys are always making progress. But I guess we're wondering, are any of the 200 ranking factors really gonna change? Please tell them all to us. No, don't tell me all of them. I just wanted your gut sense of, yes, suddenly this factor on mobile is gonna be something to watch out for. I'm pretty sure that some of these will be adjusted over time. And that kind of depends on what we see from the mobile web, how that works out. But I don't expect things to be thrown upside down or anything crazy like that. It's more like, what do we need to do to kind of keep things in a reasonable shape, so that the search results reflect what the web is showing. Of course, yeah. As you know, we're just trying to make sure we're preparing our clients and being the midwives of this change for you, to make sure that it's as seamless as possible. So thanks very much, Chuck. Yeah, thanks. So this is something where we are trying to keep that in mind. And that's something that we're pushing very hard for from our side as well — when the engineers come to us and say, oh, well, sites just need to make this meta tag change, as if sites make meta tag changes all the time.
We tell them, well, you know, a simple meta tag change is easy for you to tweak, but for some businesses it's a matter of, like, five months until they can change a meta tag. And that's — Where do I find that in Dreamweaver? Yeah. But these are things that are really hard for some normal businesses — to, like, go out and make significant changes across a website. And that's something where we push back really hard on the engineers and on the product managers and say, we can't just go out and tell people that they need to make this change by next week. It's not a matter of one week to the next. So some of you might be able to tweak your blogs very quickly and, like, enable that new plugin, and everything will work well. But for bigger sites, and especially for smaller sites that basically set up their website once and keep it for two to three years, those are not trivial things to do. And that's something where we need to keep that in mind. And we're really trying to keep your side in mind as well — it might be good for your business if everyone comes to you and says, oh, please make my site mobile-first-indexing ready, but for the ecosystem as a whole, that's still a big problem. Of course, yeah. We're all just trying to make it as seamless as possible. Yeah, that's great. All right, let me grab some more questions that were submitted. Text versus picture internal links: do they get treated in the same way, or are there different ways that they pass weight? So essentially, internal links are the same when it comes to us crawling and passing information within a website. But if it's a link with only a picture, then we don't have a lot of context for it specifically. So things like the alt attribute on an image help us to kind of understand, in a sense, the anchor text for that individual link, and give more context on how it's relevant for the other page. So if you just have a picture and you compare it to just a text link, then you will see a bit of a difference in how we treat that. Does Google use heat mapping to help with ranking a page? Not that I know of. I think it's a really great way to review your pages and how users are interacting with your pages, but I don't think we use that at all with regards to ranking. Can product reviews be used as a part of Panda scoring, or are reviews used as a ranking signal? So reviews are content on a page, and content on a page can be good or bad. So that's something which we definitely use for ranking. We definitely use that for indexing — provided, of course, we can see those reviews. And with regards to Panda specifically, I don't think it's a factor of Panda that we explicitly look for reviews, but it's content on a page. So we can use that to understand, is this a high-quality page or a low-quality page? That does help us. I've been receiving dozens of spammy links lately from .xyz domains and hacked websites. I know you say not to spend a lot of time reviewing and disavowing links, but what should we do? So from my point of view, the easy solution here is, if you're worried about this and you see those links, then just disavow them. You fix that problem. You don't have to worry about what Google does with that. If you can fix it yourself, you don't have to worry about it.
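For context, the disavow file John refers to is a plain-text list uploaded through Search Console's disavow tool; a minimal sketch with made-up domains:

```
# Hypothetical disavow file (UTF-8 text, one entry per line)
# Disavow everything from an entire domain:
domain:spammy-links.xyz
domain:hacked-site-example.com
# Or disavow individual URLs:
https://another-example.xyz/spam-page.html
```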
If you're in a situation where you never looked at the backlinks and there happen to be a bunch of these .xyz domains or other hacked sites, then probably we're already taking care of that for you. But if you're worried about it, you can always fix it yourself. Oh man, some people seem to have trouble with the link. Oh man, I didn't hide the link — it's, like, in the normal comments. I can't actually click on it to check if it works, but Josh made it in, so other people made it in. So it's— You have to be very clever. Very clever, OK. All right. No, I can explain it if you want more people to know. All right, tell me what I did wrong. Oh, you didn't do anything wrong. It's the way Google Plus works: if you're watching the full screen of all of the Hangouts, you're not gonna get the extended comments that you posted it in. You have to isolate this one Hangout, the March 24th one, view all comments, scroll down, and find where you put it in the middle of the 70 comments. Oh, OK. That means you have to be very dedicated, and like John Mueller a lot, and want to come in here and talk to you. That's good. Well, I'm glad you managed. That's great. OK, so maybe I need to put that in the description next time. But I mean, it's good to have a small group, too. All right, here's a question. Both my home pages have disappeared completely from Google search — already for two weeks — for no reason. No manual actions, no security issues, nothing: pages indexed, sitemaps sent, but nothing appears in search. I don't know why. So there can be lots of reasons why pages disappear from search. What I would recommend doing there is starting a thread in the help forum, because the people in the help forum are really good at kind of guiding you towards the types of issues that might apply here. It could be anything from maybe a meta tag on your page, to maybe a setting in your CMS, to maybe you accidentally did a URL removal on your side. There are lots of different options here, but it sounds like the people in the help forums will probably be able to guide you in the right direction fairly quickly. Does getting an external link to a page set to noindex, follow count as a weaker signal than, like, a link to a Facebook page? No, it doesn't count as a weaker signal. But if you're linking to a noindex page, that's not something we can show in a search result, so it's not really that useful. We do follow the internal links on a page like that and kind of forward those signals. But obviously, you're making it a little bit harder than it probably needs to be.
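For reference, the noindex, follow state discussed here is set with a robots meta tag in the page head — a minimal sketch:

```html
<head>
  <!-- Keeps the page out of the search results, while its links
       can still be followed and pass signals on. -->
  <meta name="robots" content="noindex, follow" />
</head>
```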
We know usage of content plays a substantial role in SEO nowadays. How would you estimate the importance of on-site product reviews? So that's essentially, like I mentioned before, content on a page. And product reviews, comments, whatever you have on your pages — if we find them when we crawl and index, we will try to take that into account. Hey, John, I have a question for you. All right, go for it. Yeah, so we had a site that recently received a manual action for spamming structured data, and we had some questions about resolution, mostly centered around whether there is any risk with Google in submitting multiple reconsideration requests. As we slowly peel back the layers of what may have caused the manual action, we're concerned about what the risk may be of submitting multiple reconsideration requests. So we process reconsideration requests one after another, which means if you submit one reconsideration request and you don't wait until that's processed before submitting another one, then we'll ignore the second one. So just to clarify, there's not a penalty risk of submitting multiple requests and that causing Google to take further action? It wouldn't cause further action. But what the web spam team sometimes does, if it looks like you're going back and forth a lot — and I don't think that applies in your specific situation — is this: some sites get a manual action, they fix it because they know exactly what they did wrong, do the reconsideration, and then a week later they get another manual action for the same thing. In that kind of back-and-forth scenario, if you do that a couple of times, the web spam team will say, OK, next time when your reconsideration request comes in, we're just going to wait a bit — maybe a couple of weeks — to process it, because we know you're just going to switch it back. So they kind of give you a bit of a time-out there. But if you're doing this to try to solve an issue, then that's not really the same scenario. So from my point of view, that's fine. What I would also do is maybe check with the help forums. There are some people in the help forums who are really, really smart with structured data, and they know those policies by heart much better than I do. So if you go there and post a sample URL, they can tell you pretty much, this is OK, this is not OK. And that helps you narrow things down quite a bit faster. Sure. And I guess one very fast follow-up question — and you tell me if you can answer this — but does a structured data manual action have the ability to delist you from search results, versus just affecting the structured data display? It only removes the rich snippets — so, kind of, the rich search results. It doesn't change rankings. It doesn't remove you from search. It's really only specific to that individual item. Thank you very much. I appreciate that answer. All right. Andrew had a question as well. We can't hear you. Oh, no. It's a question of how to turn the microphone on. Oh, man. OK, you have a question in the comments. OK. Let me see. OK. Our site rebranded about two years ago. Starting in November, the indexation of the old URLs jumped up, and the old domain is now ranking for a number of non-brand queries. What can we do to fix this? I don't know. So that's something — I briefly looked at that search result and saw that as well. So I don't know what specifically would be the thing to do here. With regards to canonicalization, we do take into account different factors. Redirects are obviously one thing; rel canonical, internal links, and sitemaps help us as well. But I suspect if you did this two years ago, then we should know about that. This is something I probably need to take up with the team here, to kind of explain what is actually happening there. Because it seems like something more on our side rather than on your end. So probably not something you're doing wrong — probably something our algorithms are either confused about or doing wrong on our side. OK. Let's see. Since the Fred update — oh my gosh — we've seen a gradual decline in our search ranking. We've followed all of the Google guidelines, so we can't understand why we've fallen so far. We were at the top of page two, bottom of page one. And now the lights are turning off again.
Because of Fred. Fred is turning off the lights. Any help that you can give us would be great. OK. I know, these lights. Man, it's worse than Fred. We're not moving enough. OK. Any help. So essentially, if you're following the Google guidelines and you're doing things technically right, then that sounds to me like there might just be quality issues with regards to your site — things that you can improve overall when it comes to the quality of the site. Which means there's no simple answer. There's no meta tag that will make your website higher quality. In general, you probably need to take a step back, get random people who are interested in that topic to review your site, to compare it to other sites, to go through — I don't know — a survey, to see what you could be doing better to improve the quality of your site overall. And ideally, don't just tweak things to kind of subtly improve the quality to see if you can get a little bit higher, but really take a stab at it and try to figure out what you can do to make a significant dent in the quality of the site overall. We never used sitemaps, and all of our pages were always indexed well. Yes, that's good too. In the context of an HTTPS move, is there any reason to add sitemaps now, or can we rely on Google picking up the 301s and doing the good job it did for years? We do recommend using sitemaps for things like this, because we can understand that pages changed a little bit faster. But if your website is fairly small and we've always been able to keep up, then we can probably just deal with that on our own as well. So lots of sites move to HTTPS or do domain moves without sitemaps, and that should work fine too. I think if you're asking this question and you have a small website, then generating a sitemap file is probably pretty easy to do and not really that crazy of a thing to add on top — a little bit of extra security with these kinds of moves. But it's probably not the critical thing that will make or break your site. All right, one more here. Gary mentioned that there's no domain authority. But what does John say, more or less? I would say Gary knows this extremely well. He works with the ranking teams regularly. So if Gary says something, you should just believe him and follow what he says. So that's something where I don't really have anything more to add to what Gary has been saying. And let me just take one more here. Search Console is showing soft 404s for our www version, but the canonical is on HTTPS. We redirected everything. Our site is slowly being de-indexed. What's happening here? That sounds like more of a technical issue, where maybe you're redirecting in a wrong way. For example, what might be happening is that you're redirecting everything to the home page. That's something that we would pick up as a soft 404, and we would start dropping those URLs. So if you're doing a redirect, make sure you're doing it one-to-one between the individual URLs and not redirecting everything to the home page of the other version.
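To illustrate the distinction John is drawing, here is a sketch of a one-to-one redirect versus the everything-to-homepage anti-pattern, using hypothetical nginx config (the same idea applies to any server):

```nginx
server {
    listen 80;
    server_name example.com www.example.com;

    # One-to-one: every old URL redirects to its exact HTTPS equivalent.
    return 301 https://example.com$request_uri;

    # Anti-pattern (deep pages collapsing to the home page can be
    # treated as soft 404s):
    # return 301 https://example.com/;
}
```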
Hey, John. Hi. I managed to join in at the last minute — if I can have just one short-answer question. All right. Links to images: somebody is hotlinking images from my site. Do they count as a link for the whole domain or not? I don't know. Good question. So they definitely count for the image. I don't think we would apply that to the rest of the website, because the image doesn't really link out to the rest of the website. OK. May I have one more? OK. We work too late. One thing — we have our servers quite overwhelmed, and our systems guy proposed a solution which I find kind of strange. He proposed to move the www version to the cloud and have it cached, and keep the non-www version on our server only, with a rel canonical to the www version. But basically, they will exist separately; they will be accessed at different URLs. The non-www version would not be cached and will show all user-generated content in real time, while the www version — which will be the one we link to, the one promoted, and the one in the rel canonical — will be cached and will display things with a few hours' delay. So one is basically faster. Yeah, the www version. It's faster, but it doesn't have everything, while the non-www version has everything. But would that be a problem? Yeah — I think it's probably not so much of a problem, but it could be confusing for users, and it could make it harder for you to diagnose issues. What I would do is just test it: maybe take a relevant part of your website where you say, I don't know, these URLs receive the same traffic as this set of URLs, and just set that up for those two pairs and see how it works out. But might I notice a penalty from Google or something like that? I don't think there would be any reason for Google to penalize a website or demote a website because of this setup. But I could imagine that users might notice a difference, and it might be that Google indexes one version a little bit better than the other one. But it's not the case that we would manually search for someone who is doing this kind of advanced hosting. Ideally, Google should index only the www version, because that's the one I promote with the rel canonical. That's the one. Yeah. I would just try it out. I would test it, because then you don't have to believe me — then you can say, well, my numbers show this. OK, but it's not like we will get ourselves banned or lose lots of links. No, you won't get banned or removed from Google, of course. OK, thank you, thank you. Have a great evening. Thanks. All right, let's take a break here. There are still tons of questions left, but if there's something that you really urgently need answered, I'd post it in the Webmaster Help Forum — there are lots of really awesome people there who know their stuff. If you want to ask it in the next Hangout, I think the next ones are set up already, so you can just copy it into those Hangouts as well, and we can try to get through them there. All right, thanks, everyone, for joining. Thank you for joining me as well. Thank you for having me. And I hope to see you all in one of the future Hangouts. Thanks, guys. Have a great day. Thanks, John. Thanks, John. Thanks, John. Bye.