OK, welcome, everyone, to today's Google Webmaster Central Office Hours Hangout. Today we have a special version of the Office Hours Hangout, particularly for Latin American websites, Latin American companies, to kind of go through some of the topics that have been top of mind for them. Andy has helped to organize this a little bit. Do you want to take a stab at it, Andy? Sure. So for those of you who don't know me, I'm Andy Young. I am the VP of Product and Growth at Cossack Ventures and a mentor in the Google Launchpad program, where we bring, or Google brings, 15 or 16 Latin American companies up to San Francisco to mentor them and to help them. So between that program and the work here at Cossack, we try to really help Latin America grow, build better products, become better webmasters. I've been working at Cossack for a long time. A lot of you have worked with me or in the Launchpad program, and we're hoping that John, who is the rock star of the Google Webmaster Trends team and taught me most of what I know, can help teach you guys a lot more. That sounds great. OK, so it looks like there are a bunch of questions that were submitted already, but maybe we can start off with live questions from any of you. Anything that's been on your mind with regards to web search, websites, the Google side of things, that maybe we can help answer. I'll jump in first. This is something that I know a lot of our companies talk about and wonder about. Latin America has Brazil, and it has the rest of Latin America that speaks Spanish. Very often, the words are very similar, but just a little bit off. So the big giant down here is Mercado Libre, and they have Libre and Livre, one letter off. And a lot of companies deal with different brands. Any advice on how to have search presence when you have to deal with two different brand names in regions that are right next to each other? 
So in general, the method that we have for this kind of problem is the hreflang link. So this is something that you would use to link two different pages together and say, this is the version for this country. This is the version for another country. And then depending on where the user is searching from, we'll try to swap out the URLs in the search results. So it doesn't affect ranking. It's not that it would boost those websites in those countries, but it'll try to make sure that we're showing the right version in those countries. And the hreflang links can be a bit tricky to set up initially because you have to link all of the different versions together. But one thing you can do that probably has a fairly large impact already is just do it for your home page. So just take maybe your most popular page, your home page, or take a handful of the most popular pages and just manually link those together with the hreflang markup where you specify, this is this language for this country and this is that language for that country. And we recognize that these links are bi-directional, so you're confirming this markup on your pages. And then we'll use that to show the right version in the right search results. So we don't even need to understand like the subtle spelling changes there. We essentially just recognize this user is in this country and searching in this language. Therefore, we'll show this version. All right, so let me start running through some of the submitted questions. And at any time, you're all welcome to kind of stop me, pause me, ask for more questions, ask related questions. Maybe some of these questions will get you into the creative question asking mood as well. All right, so the one I have here on top is what about Chrome data? Can you see and use Chrome data in Web Search rankings? Can you see what a user does on my page when he comes to visit from Google? 
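Stepping back to the hreflang answer for a moment, the markup John describes looks something like this. This is a minimal sketch, assuming a hypothetical site at example.com with a Spanish version for Argentina and a Portuguese version for Brazil; the same full set of link elements goes in the <head> of every listed version:

```html
<!-- In the <head> of each version of the home page: -->
<link rel="alternate" hreflang="es-ar" href="https://example.com/ar/" />
<link rel="alternate" hreflang="pt-br" href="https://example.com/br/" />
<!-- Optional fallback for users who match neither country: -->
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

Each page lists all of the alternates, including itself, which is what makes the annotations bidirectional and lets Google confirm the markup from both sides.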
Do they visit this internal link or leave a comment or click on my Facebook share button, et cetera? So I don't know the specifics about Chrome data, but in general, this is something that's very confidential within Google. We don't use the Chrome data across a variety of products as far as I know. And as far as I know, this is not something that we would use in Search at all for rankings. One place where we do sometimes use kind of this user data information is just when we look at the search results, where people are clicking, and only to evaluate which of our algorithms are working well and which of our algorithms are not working well, and specifically when we do experiments. So when one of the engineers comes up with an idea and says, I think this would make a fantastic change in our search rankings algorithm, then we'll do internal tests first and then we'll maybe do a really small test group to see how the test users react to that. Do they like this change? Do they end up clicking on the first results because we're bringing the better results on top? Or does this change bring lower quality results up that they're not really clicking on that they don't really appreciate? So at that level, it kind of makes sense for us to look into user behavior data, but on an individual page basis, and especially when you're looking at a sub-page basis, are they clicking on the Facebook button or not, I don't think that would be something that we'd be able to use within Search. I think that's just very noisy data and probably not that useful, even if that could be refined. Let's see, there is a question from Bruno in the chat. We're wondering what's the impact on using WPA applications to optimize our mobile experience since Google has some trouble with JavaScript? So I assume you mean PWAs, Progressive Web Apps? Is that possible? 
If so, with Progressive Web Apps, so Progressive Web Apps is essentially a special kind of website that has a certain amount of functionality that works really well on mobile browsers. So for example, it can work offline with a service worker that kind of runs in the background and updates the content and makes sure that your pages are available when connectivity is bad. And these are things that, from the user's point of view, are really good things to have because they make your website essentially more like a native application within a mobile phone. So that's something that works really well. With regards to Search, one of the problems with Progressive Web Apps, or I wouldn't really call it a problem, but one of the aspects that's very common there is that they run on JavaScript. So it's essentially a big JavaScript framework that's running for your website. And traditionally, search engines have had trouble with JavaScript. However, Google has worked really hard in the last few years on making sure that we can actually render pages the way a normal browser can do that. So for the most part, we can execute the JavaScript on a page, and we can render a page like a normal browser would, and we'll index the page like that. However, there are some aspects that don't work perfectly well when it comes to Progressive Web Apps or JavaScript applications in general. And for that, there's a blog post that someone from the Chrome team, Tom Greenaway, I believe, I'm not sure about the name, did maybe in December with regards to making sure that Progressive Web Apps can be indexable. And there are a bunch of tips that you can follow to make sure that that actually works. So you can definitely create a Progressive Web App that works really well on a mobile phone, and that's also perfectly well indexable within Google's kind of web search environment. So that's something that's certainly possible. Sometimes it takes a bit more work to kind of get that to work together. 
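The offline behavior mentioned above comes from registering a service worker. As a rough sketch of just the registration step, assuming a hypothetical worker script hosted at /sw.js:

```html
<script>
  // Feature-detect first: older mobile browsers have no service worker support.
  if ('serviceWorker' in navigator) {
    // "/sw.js" is a placeholder path; the worker script itself would handle
    // caching responses so the page keeps working on a bad connection.
    navigator.serviceWorker.register('/sw.js').catch(function (err) {
      console.error('Service worker registration failed:', err);
    });
  }
</script>
```

The feature detection matters here: the page should work normally as a plain website when service workers are unavailable, with the offline behavior layered on top as an enhancement.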
So depending on how far you want to go, sometimes that's easier, sometimes that's a bit harder. All right, can a site page design have any reflection if you're suffering from Panda in the sense of internal click-through rate, bounce rates, or is it mostly content-based? So Panda is a quality algorithm that we rolled out a bunch of years ago that tries to bubble up higher quality content in the search results. So that's something that looks at the site overall. It looks at everything across the site and tries to figure out, is this really a high-quality site that we would want to recommend to users? Or is this more of a lower-quality site that's maybe not so worthwhile to recommend to people? And for that, there is a blog post that we did way in the beginning of the Panda algorithm, which had, I think, 23 questions to ask with regards to high-quality content. So that's something where I would definitely take a look at that blog post and try to follow along with what we would recommend with regards to high-quality content. And that can include things like the design. So that's something that users will notice when they come to your site. So from our point of view, a website is not just the text on the page, but actually the whole construction that's available there. So all of that kind of tries to flow in with regards to our algorithms. OK, a question about RankBrain. RankBrain uses vector proximity to understand the meaning of words. They usually use two dimensions, x and y coordinates. Does RankBrain process language based on more dimensions, like 3 or 4, et cetera, that integrate not only semantic meaning but intention and feeling, et cetera? So the short answer is, I don't know the specifics around RankBrain. It's more of a system that tries to understand the queries themselves, less about the content on the page itself. 
So trying to take a query that someone brings to Google, where we see a lot of queries for the first time every day, and try to figure out what is the user actually looking for here, and what can we do to make this content more easily accessible for the user? So with regards to that, I wouldn't focus so much on the coordinates and how many dimensions RankBrain processes within its kind of machine learning algorithms, but just kind of realize that we see tons of queries for the first time every day, and we have to try to figure out what is the user trying to achieve here, and automatically bring the answers to those queries. So that's not something where we could route that query to a team of engineers that will quickly figure out, oh, they're looking for this page about this Super Bowl ad that showed up last week. That's something that our systems have to do automatically. They have to do that in a fraction of a second. All right, let's see. With regards to AMP, is it possible to think that once mobile latency becomes very low, as technology advances and speeds increase dramatically, AMP will no longer be needed? Let's say latency in 2021 is three times faster than today, and everything is fantastic. Will we still need AMP, essentially? I don't know. That's an interesting question, of course. But from a practical point of view, my take on this is at the moment, it's very much a big problem. So latency is a big problem. The bandwidth on mobile is a very big problem. Also, the resources on a mobile device are very problematic in the sense that mobile CPUs are very limited in their processing power. Mobile devices tend not to have that much RAM. So all of these things add up, and they mean that you can't take a big complex desktop page and just run it on mobile, and it'll be really fast and snappy. It's just like all of these different things add up, and they make it really hard to make really fast mobile pages. 
And from our point of view, what we've just noticed is that when pages are slow, people tend to kind of limit their time in the interactions with those pages. And that's something that's not particularly new. In particular, e-commerce sites have done studies on this time and time again, where they will kind of artificially slow down a page by even just a couple of hundred milliseconds, and they will see a noticeable difference in the click-through rate, in the conversion rate of those pages. So that's something that we've just seen that speed is really an important factor. And if we can do something to make mobile pages a lot faster for a lot of people, then that's something that, at least for the moment, makes a lot of sense to actually do. With regards to 2021, that's still quite some time away. And judging from the way that the internet has evolved in the last couple of years, obviously, things will be a lot better, a lot faster by then. But I really doubt that this problem will be solved across the world. And keep in mind that there are a lot of people that are just now coming online on the internet in a variety of countries. And that's a really big market, where if they have bad connectivity at the moment, and your site says, well, most people have really fast internet at the moment, then they're going to go somewhere else. They're not going to stick around on your site and wait to see if your site finally loads and loads something that's interesting to them. They'll just go to a competitor that has something a lot faster. And from Google's point of view, I think the interesting aspect is always that we see any of these problems as our problem. So if we see that we're sending users to pages where they're frustrated because they're slow, then that's our problem. Because users come to us with a request. They ask us for the best page for this specific topic or this specific query. 
And if we send them somewhere where we know that they're going to have a bad experience, then that's essentially our problem. We're doing them a disservice. So the more we can do to make sure that the internet as a whole speeds up so that anywhere where they go, they have a good experience, the more we really want to drive in that direction. On topic? Yes. There are a lot of Brazilians in the call right now, and that's one of the places where Google Web Light exists. Should we consider that like a slap on the wrist when we see that in the logs, that you think our site is particularly slow? Should we take that as a negative, or do you guys just do it to everyone in that region? I don't know the specifics with regards to the triggering of Web Light there. So I know in some regions we do that more because we really notice that connectivity is a problem. And in other regions, I believe we might do that based on the user's capabilities, where we recognize, oh, this is a user on a bad connection or has a low end device, where it probably makes sense to do that. So I'm not completely sure where Brazil falls in that area. But I think there is a meta tag that you can put on your pages and say, hey, I don't want the Web Light treatment. Awesome. I think Launchpad just joined. All right. Fantastic. Great to see some more people. They're just Googlers right now. Wow. OK. Let me just post the link also in the public thread so that if anyone else wants to join in and there happens to be a slot free, you're welcome to jump in. It looks like the slots keep coming and going fairly quickly. All right. Here's a really common question that we see, from Christian. Does Google use the same version of its algorithms in Latin America? We see daily how the search results, where maybe Penguin and Panda don't seem to be applied as much, or maybe the competitive queries don't see as many changes as you might see in other countries. 
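As an aside on the Web Light question above: I believe the opt-out John is thinking of is the no-transform directive, which is normally sent as a Cache-Control HTTP response header and can also be expressed as a meta tag. Treat the exact form as something to verify against Google's current documentation rather than as confirmed here:

```html
<!-- HTTP header form (set on the server): Cache-Control: no-transform -->
<!-- Approximate meta tag form, on the page itself: -->
<meta http-equiv="Cache-Control" content="no-transform">
```

The header form is generally the more reliable of the two, since it applies before the page body is even parsed.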
And I guess from our point of view, from Google's side, we try to make sure that our algorithms work as globally as possible. So there are lots of countries out there. There are a lot of different languages out there. Content is in languages that we might not even understand. And we want to make sure that the work that we do is as globally applicable as possible. So it's not something where we'd say, well, this is an algorithm that is specific to this country or this language or anything like that. So that's something that we try to do as globally as possible. Sometimes it does make sense for us to say, well, this is something we're going to launch in the US first because we know that market really well, or we have some partners there who are able to implement the special markup that's needed, where we can guide them through the complicated process in the beginning and hold their hand a little bit until we're sure that the markup actually works the way that we want it to work. And then we'll start rolling that out a little bit more. For example, with AMP, that's something where we started with a smaller set of countries. We've been expanding that over time. And I believe that should now be pretty much global, or pretty soon will be globally available. So that's something where sometimes it makes sense to start small and kind of grow a little bit bigger. But for the most part, especially when it comes to ranking algorithms, when it comes to the core search algorithms, we do try to make sure that they're as global as possible. Because we don't really have a chance to kind of give one-to-one treatment to all of the different countries in search results. All right, here's a question from Laura in the chat. We were wondering about differences between the US and Latin America regarding semantics. Is it correct to assume that improvements in English semantics are also available in Spanish and Portuguese? Do you see any differences? 
So I assume this is more around the kind of querying where you enter a search query and you kind of see the results come back. And kind of similar to before, we try to make sure that things are working as globally as possible. However, we do sometimes have to take individual language feedback into account and see, well, this is something that maybe a team put together in California, and it works really, really well in English because we understand English really well. But maybe in some other languages, it doesn't work that well. So specifically in Europe, we see that sometimes for smaller countries, smaller languages, something that should be working well globally just doesn't work so well in that individual language. And we don't understand those queries as well as we could. And that's feedback that's really useful for us. So if you see that happening, where you enter a query in your language, in your country, and you see search results that are really bad, or where you can tell, oh, Google is messing up the meaning of these two words, then that's important feedback for us to have. You can either send us that by email. For example, you could send that to me via Google+. Or you can use the feedback link on the bottom of the search results to send us kind of general feedback about those search results. Or if you're thinking this is really embarrassing and Google should really do something about this, you could even go into the web search help forum and post about that there to get feedback from other people as well. So all of those methods are, I think, valid ways to give us feedback. And from Google's general point of view, we try to make sure that this works as well as possible globally. However, since I'm based in Europe and we have all of these different European countries, I do see that sometimes we have weird quirks that show up in some languages, but not so much in others. All right, a question from Bruno in the chat. 
What's better, using AMP or investing in PWA or making a responsive site? Talking about positions in the search results and organic traffic, all of these are possibilities. There's even a way of kind of combining the PWA with the AMP side, where you can create kind of the structure that works as a PWA but also has the AMP content. So that's something that you could also look at. But from my point of view or from my side, it's really hard to say which technology is better because these technologies change over time, the capabilities change over time. And the best advice I would give there is to make sure that you're able to be as agile as possible with regards to your website, that you can kind of isolate the technology part from the content that you're providing or from the services that you're providing so that when a new technology comes up, you think, oh, this is fantastic. This worked really well for my site, that you're able to kind of switch to that fairly quickly or when new types of structured data markup becomes available, that you're able to implement that fairly quickly. So it's not so much a matter from my point of view of saying this technology is the best and you all need to do it, but rather kind of remain agile and remain kind of thinking about the technology as it comes out and think about how it could apply to your business. And don't just kind of pick up technologies just because they're available and say, oh, we will invest all of our IT resources into implementing our website in this technology and then totally forget about your actual business needs. All right, here we have one with regards to rich cards. We haven't been using rich cards. We understand the improvements in search appearance, but does it also impact ranking? No. So with regards to rich snippets and rich cards, these are things that we show slightly differently in the search results, but they don't affect the ranking. So it's not a ranking change. 
It's not a matter of us looking at these pages and saying, well, they use a specific markup. Therefore, they must be high quality. Therefore, we should rank them higher. But rather, we show them differently in the search results. And that's something where you can kind of highlight a little bit more what your pages are actually about, and that can have value in itself. Also, some of these different types of markups that you can use are shown in different places in the search results. So it might not be that we would change the ranking of your individual listing, but maybe you're also shown in the carousel on top, which kind of, in a way, is something that you could see as a change in the visibility of your content in search. All right, any questions from the Hangout in the meantime? I would have one. All right, go for it. It's a bit of a strange one, but maybe you can point me somewhere if you can't answer me directly. I was looking in Google Search Console for a site I own, for some keywords, and I was pretty much looking at the click-through ratio. I've been trying to increase this for a while by tweaking the title, adding structured data to make it look prettier, and things like that. But I kind of have a problem. I don't really know what I should aim for. For example, could you tell me, for a site that's number one in Google, what would be a good click-through ratio? Something to aim for? I don't actually have any numbers that I can share; maybe we have some somewhere. I think it really also depends on the type of search results that you're aiming for, where it can be very drastically different. For example, if someone is searching for your name, and if they don't click on your site, then you're doing something really, really wrong. But on the other hand, if someone is searching for a generic term and your site ranks there, then obviously there's a lot more competition within that search results page. 
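Coming back to the rich cards question for a moment, the markup behind rich results is structured data, most commonly JSON-LD in a script tag. A minimal sketch for a hypothetical e-commerce product page; all names and values here are made up for illustration:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "image": "https://example.com/widget.jpg",
  "description": "A hypothetical product used for illustration.",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "BRL",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

As discussed above, this kind of markup changes how the listing can be displayed, not how it ranks.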
So that's something where you would see very different click-through rates, depending on the type of query that people are doing and the type of content from your site that's actually showing up in those search results. So that's something where you can't just look at the aggregate click-through rate and say, oh, this is good or this is bad. But where you have to think about, what is the user actually doing here, and why is my click-through rate like this? And is that OK? And it might be that that's perfectly fine. If you're showing for more generic queries and you're getting a lower click-through rate, then that could be perfectly fine. That's not something that you would need to compare to, for example, someone searching for your brand name specifically, where they always click on your site. So that's kind of the thing I would try to differentiate. I think it's a bit frustrating, because I am number one, and I think I'm the most relevant one. But of course, that's maybe subjective. And I only get something like a 65% click-through ratio. That's pretty fine, maybe, but I think it's pretty low for being number one. I don't know. It's really hard to say. For example, I don't even know for our official blogs what the click-through rate would be there. I'd have to look that up and see what is actually shown there. I see. And the question I asked in the chat, what happened? How did the first people that got into the hangout manage to join, because you only posted the link later? It's magic. No, because this is a kind of a specially themed hangout, specifically for Latin American companies, we decided to invite some people that we knew were kind of matching the audience into the hangout a little bit earlier. But for the other ones, it will not be like that, right, with the next ones, the next hangouts? Yeah. So usually I try to keep them as generic as possible. But this is kind of a special one where I wanted to make sure they had a chance. 
I was just thinking that I had missed something. No, no. OK, thanks. I have a few more questions, but I'll let others ask one first. If no one else does, I'll jump in again. All right, great. We'll come back to you. All right, a question from Bruno. How do competitors rank better than us in some keywords if they don't have optimized page content or a lot of inbound links coming from other sites? So I guess that's a really common question that we hear a lot, is like, why is my competitor ranking above me? That's not fair. My page is definitely better. And from our point of view, that's really hard to say. On the one hand, I don't want to judge the quality of these pages at all. But the important thing to keep in mind is that Google uses a lot of signals when it comes to crawling, indexing, and ranking. We say we use over 200 signals for that. And all of these things kind of add up. And it's not so much that you have to copy exactly what other people are doing in the search results, but because we use all of these different signals, one site might be focusing more on this, and another site might be focusing more on something slightly different. And all of that is perfectly fine. So that's not something where you need to copy one to one what other people are doing, or where you can easily compare one to one, like this one is ranking high, and why is my site ranking below that, and just compare two or three factors out of those 200 and say, well, these factors all would say my site should be ranking higher. There are lots of things that can come together there. So my recommendation there is generally not to focus so much on competitors, as hard as it is, but to think about what you can do on your site to make sure that your site is clearly the best of its kind overall. 
So instead of trying to copy what other people are doing and chasing them, think about what you can do to really make sure that if Google shows search results for a specific query, if they don't show your site as number one, then when Google engineers see that, they would say, oh, this is a bug in our algorithms. We really need to fix that. So that's something that from my point of view is something that's really important for me. When people come to me and say, well, for this specific query, I'm ranking below all of these other sites, and I'm just as good as all of these other sites, then that's not something I can take to the engineers and say, hey, we should be swapping the order here because these people are clearly just as good as all of the other ones, and they deserve a chance to be shown as well. But it's a lot easier for me to go to the engineers and say, hey, for this specific query, we're not showing this site. And that's clearly the absolute best of its kind. That's not something that's even open for discussion. That's just what people are searching for and what they're trying to find. Then the engineers will take that and say, oh, well, this must be a bug in our algorithm somewhere, and they'll try to figure out which of our ranking signals are messing up and picking the wrong version or the wrong site rather to show in the search results at the top position. So that's something where instead of trying to be just as good as other people, I'd really try to figure out a way to be significantly better than that. So you go on and ask, so what can I do beside the obvious stuff like keyword research, optimizing content, rich snippets, et cetera? It's difficult to do something different since we're all e-commerce sites. Yeah, sometimes it is difficult. And I think that also gives an opportunity to you because it's equally difficult for your competition as well. They can't magically pull a rabbit out of a hat either. 
So that's something where sometimes with creativity, you can achieve some things that other people aren't able to do so far. And sometimes it's also worth taking that position and saying, well, we're ranking number two for this query. And improving that is something that we don't really know what we can do. But maybe we can take this chance and say, well, number two is also pretty good. And instead of focusing on search so much, we'll focus on different kinds of traffic sources as well. So instead of kind of like limiting yourself to just search, kind of make sure that you're diversifying your traffic sources and also getting traffic through social media, that there are different ways that people can reach your site and kind of buy things on your site as well. And then you don't have to worry so much about ranking number one, ranking number two, and ranking number three. But rather, you have kind of this stable foundation of people coming to your site over and over again on their own as well. Can I ask you a question related to this, John? Sure. While we're on this topic. Go for it. One of the big things about ranking here is these are a lot of homegrown companies. So the question is, you know, in e-commerce down in Latin America, how can they stand their ground against the SEO giants like Amazon and TripAdvisor when they come down? Which, you know, they're not necessarily big here right now. But if they decide to come to Brazil, to come to Argentina, how do you maintain your edge? That's sometimes hard. Yeah. I don't really have a magic answer there. I think that's always kind of the tricky situation. On the one hand, it's good because you're kind of like a step ahead of these big companies. And generally speaking, big companies, they don't move that quickly. They can't kind of pivot as quickly as smaller companies can. So you have that kind of automatic advantage kind of built in. 
But on the other hand, if they do decide to enter your market, then that might make it a lot harder for you. So I would kind of take this time as something where you can build up your value and make it kind of like I mentioned before, that you have diverse traffic sources so that you're not reliant on just search, for example. Because if a big company comes in and they almost, I don't know, dominate the search results for the queries that you're focusing on, then that would be kind of problematic. On the other hand, if you have a diverse set of traffic sources, then even if search kind of goes down a little bit, you're at least able to keep up. You're not in panic mode automatically. And then you can still figure out, do I want to kind of go head to head with this big company? Or do I want to focus on a niche that's not so interesting for this big company? And try to find the path that works for you, because you kind of know what you want to achieve within your business market there. And sometimes it makes sense to say, well, I will focus on a specific niche out of this big business market, and that's still big enough for me. Or sometimes it makes sense to say, OK, well, this big competitor is coming into the market, and I can take them on. I can easily beat them in my market, because I know my market a lot better than this global competitor that's kind of slowly coming into the area. So that's something where there are different paths that you can take. It's not always easy. There is no simple trick to saying, well, I was number one last year. Therefore, I'll always be number one by putting this meta tag on my pages. The whole web environment changes fairly quickly. All right, here's a question from Michael about Google Local Listings. Is there a connection between Google Local Listings and Google Organic Listings? I asked because I see some spammy sites in Google Places ranked number one, but I don't see the company in the organic results. 
So I don't know specifically how Google Local ranks their pages. That might be a situation where we show a site in the local results, but we don't show it so much in the normal web search results. Sometimes that can be a normal situation, where a site just doesn't rank so well for local queries, but we know it's local. We know it's something that people want to find in search results, so we'll show it in the local results and not show it so much in the normal organic results. So in general, there is no connection between the local listings and the organic listings; they're kind of independent. And because of that, I don't really have much insight with regards to how the local rankings actually get put together. With regards to Penguin, the recent changes observed in the search results, which according to some SEOs like Barry Schwartz have to do with private blog networks in Penguin, do the affected private blog networks belong to the English market, or are there also private blog networks affected in Latin America or Spain? I don't know specifically what this is referring to. Barry reports on a lot of interesting things. But in general, kind of like I mentioned with the other aspects, we do try to make sure that our efforts are as global as possible, where if we can spend maybe 10% more time to make something work worldwide and have a much higher impact, then that's something that we'll be likely to do. So that's something where I would imagine that things like private blog networks, which are basically just like link networks, are something that we'd be able to tackle on a global scale. Obviously, sometimes these things involve changes that are done algorithmically, where we can tweak our algorithms to better understand that this is something more along the lines of abuse within Web Spam that we need to tackle differently. 
And sometimes there are things that we can tackle manually where we say, well, we're not able to catch this completely from an algorithmic point of view just yet, but we can do something manual to limit the damage that these things are causing for the moment. And then in a case like that, we'll have the manual Web Spam team take action and figure out what they need to do to clean that up. John, I'm going to say it in a less politically correct way: some sites in Latin America got a slapdown in the last one, in the traffic that we follow. OK. OK. I really don't know what those changes were there. I've been out the last couple of days, so I haven't been able to keep up with all of that. But it's good to hear that at least some of the work that the team has been doing is visible as well. How do you treat embedded YouTube videos on a page? Do you treat it like an external link to a specific resource? Does it hurt if I embed external resources on a page from a ranking point of view? So maybe the last question first. No, it doesn't hurt if you put external resources on a page. Depending on how we can render those pages, we might be able to take some of the content from those external resources and include that in the page. For example, if you use JavaScript to pull in content from an API and you display that on your pages, then that's something we could theoretically take into account and say, well, this is on those pages when we render them, therefore we can index it like that. With regards to YouTube videos, I guess that's a little bit different because videos are kind of not textual content. We can use those as video landing pages, though. If we recognize that you're embedding YouTube videos on a page, we can understand that this page might be a suitable video landing page when someone is searching for something video related. 
So it might be that we take the thumbnail of the YouTube video and show your page in the normal search results when someone is searching for something where we think a video might be a good result for them. Does AMP have any effect on the mobile first indexing update? So maybe just taking a step back first, the mobile first indexing update is a change that we want to make in our search results, where we've noticed that more than half of the people are using smartphones to access our search results. So we want to make sure that our search results actually reflect what the average user would see when they do use our search results. So with that, we want to switch our indexing from using the desktop version for the index to using the mobile version for the index, so that our index reflects what the majority of our users would see when they actually visit a page. So that's kind of the mobile first indexing change in a nutshell. That's something where we haven't worked out all of the details yet. We announced that on our blog a while back, just so that people don't freak out completely when they see these kinds of crawls and indexing happening, but also so that people can think about how this change could affect their website. So for example, if your mobile page doesn't have any of the videos or images embedded that your desktop page has, then we might lose those videos and images when it comes to search, when we switch to the mobile version. So that's something that you could kind of prepare on your end as well. And similarly with structured data, if you're not marking up things on your mobile pages, then that's something where we might not be able to pick up that structured data when we switch to the mobile version. So if you can check whether that's the case on your pages and fix that before we actually make these changes, then that would be really useful for your website. 
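As a rough illustration of that parity check, you could diff which embedded media appear on the desktop and mobile versions of a page before the switch happens. This is just a sketch: the HTML snippets and the helper are hypothetical, and the regex is a stand-in for a proper HTML parser.

```python
import re

def media_urls(html):
    # Collect the src URLs of images and embeds from an HTML string.
    # A crude regex sketch for illustration, not a real HTML parser.
    return set(re.findall(r'<(?:img|iframe|video|source)[^>]*\bsrc="([^"]+)"', html))

# Hypothetical desktop and mobile versions of the same page.
desktop = '<img src="/img/product.jpg"><iframe src="https://www.youtube.com/embed/abc123"></iframe>'
mobile = '<img src="/img/product.jpg">'

# Anything present on the desktop version but missing from the mobile
# version could be lost once indexing switches to the mobile version.
missing_on_mobile = media_urls(desktop) - media_urls(mobile)
print(missing_on_mobile)
```

The same idea applies to structured data markup: compare what the two versions expose and fix the mobile side first.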
Obviously, we will let you all know a little bit more when we know more about the mobile first indexing changes so that you can prepare in a little bit more structured way. But that's something you could also do in the beginning. With regards to AMP, at the moment, AMP is generally used as an alternate version to the mobile page. So it's not something that we would always show in the mobile search results. So at least at the moment, we wouldn't use the AMP version as the mobile first page for indexing, with the exception of a situation where the AMP page is actually your mobile page, where you don't have any other mobile page at all, where you might not even have a desktop page at all. So you can use AMP as a framework for making the website completely. And in that case, the AMP version of your page is your normal version of the page. It's the desktop version, it's the mobile version, and that's the version we'll index as well. So for example, the ampproject.org website is completely in AMP. So in a case like that, of course, we would have to use the AMP version as the page that we would index. On the other hand, a lot of other sites use AMP essentially as an alternate URL. And in a case like that, we would just use the normal mobile version for indexing and not the alternate version on the AMP page. Let's see. Another question here. SEMrush has recently released a backlink audit. Any idea how trustworthy and efficient their toxic index is? What are the risks when we disavow backlinks in Search Console? I don't know. I haven't actually seen their backlink audit tool or data there. So it's really hard to say. With regards to links, obviously, there are a bunch of third-party sources that are able to crawl the web as well. The web is generally open and crawlable. So that's something where other sites might have a reasonable understanding of how the links are on a site or across the web overall as well. 
And they can work out things like this is probably problematic and this is probably OK. So with that in mind, it's possible that they have something really fantastic that's helpful for sites when they look at issues around links. And from my point of view, this is particularly useful if you're not sure what previous SEOs have been doing with your website. If you've been working on your website normally and you haven't been building crazy links, you haven't been doing crazy things with regards to web spam and linking schemes, then probably you don't need these tools. On the other hand, if you're picking up a project or a website from someone else and you have no idea what they've been doing, or from a previous SEO who you know liked to play around with weird things, then maybe it makes sense to review what they've been doing with links. And maybe these third-party tools help make your work a little bit more efficient by letting you focus on the actual issues rather than looking at millions of links individually. Let's see, another question here: we're looking forward to improving our use of canonical, but we're worried that it could hurt some of our keywords. For example, our search results for presents and gifts can be similar. Should they be canonically related? Where should we draw the line? So that's a question we see every now and then as well. The thing to keep in mind with the rel canonical is, on the one hand, this is a signal. It's not a directive. It won't automatically always combine things. This is mostly done as a way to protect sites from shooting themselves in the foot. So for example, we've seen lots of sites set the rel canonical to their home page, which would mean, if we followed that directly when we crawled the site, that we would only be indexing their home page and dropping all of the other pages. And that's probably not what the webmaster was trying to do. So our systems try to catch that and react accordingly. 
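To make that concrete, the rel canonical is just a link element in the page head. A minimal sketch, with hypothetical URLs, of filtered variants of one listing page all declaring the same clean URL:

```python
def canonical_link(url):
    # Build the <link rel="canonical"> tag declaring a page's preferred URL.
    return f'<link rel="canonical" href="{url}">'

# Sort and filter variants of the same listing can all declare one clean URL.
for variant in ("https://example.com/shoes?sort=price",
                "https://example.com/shoes?page=2"):
    print(variant, "->", canonical_link("https://example.com/shoes"))

# Genuinely different pages, like a presents page and a gifts page with
# distinct content, should each keep their own canonical; pointing one at
# the other risks dropping it from the index.
```

As John notes, this is a signal rather than a directive, so an obviously wrong canonical may simply be ignored.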
So similarly, in a case like this, we'll look at these two pages and if we say, well, these pages aren't really the same, then maybe we would see this rel canonical and say, well, probably a mistake from the webmaster side. We can probably ignore that because these pages are essentially different. One talks about presents. The other one talks about gifts. Some of the individual blocks maybe are the same, but actually, they're different kinds of pages. So that's one thing that could happen, that we essentially just say, well, these are two different pages. We can't combine them into one page. The other thing that can also happen is that we say, well, maybe the webmaster does know what they're talking about and we'll just follow this advice. What could happen in a case like that is that we'll lose the other copy. So we won't have that version in our index anymore. You might set the rel canonical from the gifts page to the presents page, but what will happen there is we'll lose the gifts page. So if someone is specifically looking for gifts content and otherwise your gifts page would be really useful for that, then we would lose that. We wouldn't have that to show in the search results. So that's something where you kind of need to think about what it is that you're trying to do there and which pages you have that you do want to have shown in search. And you can kind of work with that from that point of view. So my general recommendation is to use the rel canonical for situations where it's really the same page. Maybe the order of the items has changed slightly because you have kind of an ordering and filtering aspect involved there. That's something that might be OK, but if you have different content on these pages and you want them to rank individually, then I would not use the rel canonical across those pages. Let's see, another comment from Bruno. AMP won't be used for indexing probably, but won't it be a ranking factor? 
Since it improves user experience and load time, won't we rank well using AMP? At least at the moment, AMP is not a ranking factor. I think that's a good situation to be in because it kind of leaves it up to you whether or not you want to implement AMP. One thing I try to encourage teams internally at Google to avoid is to kind of push their own agenda and say, well, we would like to have more adoption of our specific feature, therefore we'd like to make it a ranking factor. And from my point of view, a ranking factor should really be something that improves the relevance of the search results, not just something that kind of drives more adoption. And if it does improve the relevance, if we can determine that AMP content is more relevant to users, then maybe that's something that can change over time. But at least for the moment, that's not the case; AMP is really not a ranking factor. So whether or not you implement AMP is more a question of what do you want to do for your users? And sometimes content implemented with AMP can be blazingly fast. And if people can reach your site and read your content blazingly fast, then we found that they're more likely to stick around on your website and actually do other things on your website as well, which is probably a much bigger win for you than any kind of subtle ranking tweak in the search results. Canonical, noindex, nofollow. Are they still important nowadays? Or does Google just know automatically what to crawl, what to index? These are definitely still topics that are important. Canonical, like I mentioned, is a signal for us. It helps us to combine pages. Noindex is something that's more like a directive, in the sense that you're really saying, well, this page, after you've crawled it, Google, I really don't want you to show it in the search results. It's a clear directive. So that's definitely important as well. Nofollow can be important as well. 
Where you say, well, I can't vouch for these links. These are comments that people left in reviews, for example, or blog post comments. I have no idea what they're linking to. I can't vouch for these. Or maybe there are advertisements on a page. Then putting a nofollow there is a good thing for us. So these are all, I'd say, still important topics. It's not the case that Google has so much magic machine learning technology in the back that it would be able to look at any page and say, oh, I understand completely what this page is about and what the webmaster is trying to do here, I don't need to understand any of the HTML. That's not the case. I don't see that coming anytime soon. So a lot of these technical elements, a lot of the technical SEO aspects, I think they'll continue to stick around. And these are good things to learn. A lot of CMSs do a lot of this automatically for you now. But it's still something good to understand, because search engines still follow all of these small technical details step by step. All right, let's see. Is there a way to avoid the Google AMP cache for AMP pages? I don't think so. So if you're serving AMP pages and we're showing them in the search results as AMP pages, then we would generally serve them through the Google AMP cache, because that's the fastest way to serve this AMP content when it comes to search. So there was recently a blog post by the AMP team about the AMP cache in particular and how it helps improve, not the search quality, but the speed within the search results, where we're able to preload and pre-render content above the fold so that when people click on the AMP result, it's already there. It's not something that needs to go to a server and be fetched and all of that. Let's see. You go on and ask: just to explain, we want our returning visitors to access our site logged in. So if Google makes a proxy to our pages, this will affect our login system. 
Yes, it is something that you kind of have to plan in when you're creating AMP pages. So if you have a login system, there are things you can do, such as the AMP iframe that you might be able to embed to kind of guide the user to their logged in information. But in general, since the AMP pages are on a CDN, where we're hosting the static pages, you don't have the full flexibility of a fully dynamic page. So that's something where you kind of have to think about where it makes sense to implement AMP and where you can kind of go with a mixed approach, and where you'd say, well, this is too dynamic for us to cache on a CDN in a static way. We probably either have to wait for more functionality on the AMP side, where I know things are coming, or maybe just say, well, at the moment, this is not something that would be suitable for AMP. And when it comes to AMP, you can implement this on a per page basis. You can say I'll do this set of pages with AMP and this set of pages without AMP. And that works perfectly fine for us too. All right. Andy has a question from someone who can't get in. When completely redoing a site and its content but keeping the brand, are there best practices to not lose all of the history? So this is a tricky one. From my point of view, there is no simple way to kind of keep everything that you've had so far. With regards to history, obviously there are things you can do to maintain the structure and the context within a site, in the sense that if you're changing URLs on a website, then obviously redirect from the old ones to the new ones so that we can pass any of the signals that we've collected over the years to the new URLs. So that's obviously one of the early things that you should be doing. And we have a lot of that covered in the Help Center articles. The more complicated part is if you're completely revamping your website, then obviously you'll have different content. You might have different URLs. 
You might have different contexts within those pages. And that's something we have to relearn. That's not something where we can say, well, the old website was ranking really well, therefore the new website will be ranking really well plus all of the new stuff on top of that. But instead, we'll be looking at the new version saying, well, this is the new website, and this is how we're going to rank your content. So there's no automatic kind of trick that you can do to just transfer everything old to something completely new and different and say the new and different version will be ranking just as well or even better than the old version. So that's kind of the tricky aspect there. On the other hand, that also means there's opportunity for you. So if you're doing a redesign and you're adding things like SEO elements, clear structure within the pages, clear internal linking structure, then all of that can mean that the new version of your site is actually, from a search point of view, in much better shape than the old version was, and we'll be able to kind of focus on the new version as well. But like I mentioned, it can go the other way as well, where if you do a redesign and suddenly everything is broken internally, the internal linking structure is a complete mess, the URL structure is really weird, then that means that the new version of the site is something that we have a lot of trouble with, and that will make it harder for us to rank. Bunch of questions still coming in. Let's see how far we can go before I fall over. With regards to thin content, supposing I can make a page that is very similar in structure and content to a lot of pages on my website but still delivers what the user was looking for, thinking of a specific long-tail query, would that hurt my rankings? That sounds like something that could be a good idea. So if this is something that really can stand on its own, then that's something I'd theoretically try out and kind of see what would work there. 
Usually the aspect of thin content that's more problematic is when it goes towards doorway pages. So when you're taking this big bucket of keywords, just swapping out those keywords and automatically generating all of these variations of pages, and they all lead to the same kind of conversion funnel, then from our point of view, that's just a set of doorway pages, and that's something our algorithms and our manual web spam team would look at and say, well, this is really not a good use of Google's resources. These are not high quality pages. These are not things that we would want to show to people. In your situation there, it does seem like you're doing something that's unique for users for those specific queries. So that's maybe something that's more OK. So I would try that out and see how users actually respond to that. And if you're seeing that users aren't really responding to that that well, then iterate on that and kind of see where you can go from there. Gabriel, are you still on? Because they're working with accounting law, so every law affects things a little differently. Oh, great. 90% similar content, but relevant to an individual law, just to give you a little more detail. But in that case, those are individual laws. And that's something where I could see it might make sense to make individual pages for that. Another idea might be to kind of concentrate things on fewer pages and say, well, this page is about this set of laws, and here's some content about that. There are always trade-offs when you go too far and say this page is for this one specific topic. Because that means those individual pages don't have a lot of value of their own. Whereas if you concentrate things together, then you could have a really strong page on this bigger topic. 
So kind of finding the right balance between a really strong concentrated page and this bunch of individual pages that are harder to kind of evaluate on their own, that's sometimes tricky. But that's something that you have to do. You know your content best. You know your users best. You know what works for them. All right, question from Pedro. Let's see, I've been asked if you can use more than one page on the website for x-default. The documentation isn't clear, but my recommendation is there should only be one x-default page on one site. Is this correct? No, so it depends on how you define site. So essentially, x-default is, within one set of hreflang, the default version. So the set of hreflang links is between the different language and country versions of one specific page. And for that set, you can specify one x-default. So you can say, well, this is my Spanish page, my French page, my German page, and my x-default is the German page or an English page or whatever you want. And you can do this on a per page basis. So that's something where across the site you can have multiple pages that are x-default if you've marked up multiple sets of pages for hreflang. On the other hand, if you've only marked up the home page with hreflang, then obviously you'd only have that page as an x-default. And another thing to keep in mind when it comes to hreflang: if you only have one language version, then you don't need to use hreflang at all because we don't have anything to swap out. So can I clarify, because this is a bit — I've been asked this a couple of times. I understand that for a set of languages, a group of pages, you can have one x-default, but imagine that I have, well, let's simplify this, like four pages in Portuguese and four pages in German. And I want to also use my English pages as x-default, so can I create like four x-default pages that correspond one to each of the languages? Sure. No, okay. 
Because otherwise you end up having a whole website made out of x-default pages if you go through this. Yes, but you would have that on a per page basis. So for example, for your blue shoes page, you would have blue shoes in English, blue shoes in Spanish and maybe in French. And you would say one of these is the x-default for blue shoes. So that's something you would do individually for that set of pages. It's less the case that you would say this whole website is x-default, because you can't mark up the whole site with one hreflang link. It really needs to be between individual pages within a website. Okay, thanks, because the documentation kind of hints that it's just one, but it's not totally clear, so that's fine. Okay, I'll double check that in the morning, good idea. All right, a question about sitemaps. How effective is a sitemap in helping Google index all of our pages? What are some other best practices? So a sitemap is pretty much the fastest way to let Google know about new and updated pages within your website. So with the sitemap file, if you specify a last modification date, then we'll know that this page has changed recently and we'll try to recrawl that as quickly as we can. The important things within a sitemap file are really the URL and the last modification date. The change frequency and the priority we pretty much ignore because we found that they don't really provide that much useful information, but the other two are really important. With regards to the URL, obviously you want to make sure that things are as consistent as possible within your website. So the URL you list in your sitemap file should be the same URL as the one that you link within your website, and it should be the same one that you use as the rel canonical if you specify rel canonical on those pages. 
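A minimal sitemap file of the kind described here, carrying just the two fields John says matter — the URL and the last modification date — might be generated like this (the URLs and dates are hypothetical):

```python
from datetime import date
from xml.sax.saxutils import escape

def build_sitemap(entries):
    # entries: list of (url, last_modified_date) pairs.
    urls = "\n".join(
        f"  <url>\n    <loc>{escape(u)}</loc>\n"
        f"    <lastmod>{d.isoformat()}</lastmod>\n  </url>"
        for u, d in entries
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{urls}\n"
        "</urlset>"
    )

sitemap_xml = build_sitemap([
    ("https://example.com/", date(2017, 1, 15)),
    ("https://example.com/blue-shoes", date(2017, 1, 10)),
])
print(sitemap_xml)
```

changefreq and priority are deliberately left out, since John says those are essentially ignored; the loc values should match the canonical, internally linked URLs.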
So keep things as consistent as possible, and as correct as possible with regards to the last modification date; that really helps us with crawling and indexing those pages. All right, we're pretty much at the end of our time, but if there are any questions from any of you here live in the Hangout that I need to address, let me know. No more questions? That's good too. Can we do the next one in Spanish or Portuguese? Oh man, not for me, but I can see if I can find someone. That sounds interesting. Sure. All right, so with that, thanks a lot for coming. Thanks for all of the questions. And I have, I think, two more Hangouts set up in the normal flow, a German one coming up tomorrow and an English one coming up on Friday. Probably not ideal time zone wise for you all, but I'll set up more in the weeks to come as well. So thanks again for organizing this, Andy, and thanks again to everyone who was able to join in. It's been great having you here. Thank you, John. Bye, everyone.