All right, welcome, everyone, to today's Webmaster Central office hours Hangout. My name is John Mueller. I'm a Webmaster Trends Analyst at Google in Switzerland, and part of what we do are these office hours Hangouts, where people can join in and ask questions about their websites and web search. A bunch of questions were submitted already, and we'll try to go through as many of them as possible. But if any of you want to get started with a question, you're welcome to jump in now.

Hi, John. Hi. I have two questions, actually. The first one is about charts. Our client runs surveys to get information from users, and they write blog content based on those surveys. The question is, how can we add a chart to a blog post? Should we add it as an image, or should we add some sort of code to render the chart? Which one would be better?

I think it depends a bit on what you want to achieve with the chart. Usually I would just add these kinds of things as an image, and make sure that you have an understandable alt attribute for the image as well. So if there's any critical information in that chart that you need to get across, put it in the alt attribute, so that we can pick it up as text and so that people who can't see the image can also get that information. But in general, I would just leave them as images. I don't think these kinds of charts are going to do anything fantastic in Image Search, because it's hard to imagine that someone is looking for that particular chart using Google Images. But essentially, an image is probably the best approach there. I don't think you would get a lot of value out of turning that chart into HTML and putting the numbers and labels into text, because that's something you can just as easily put in the body of the blog post, or in the alt attribute.

And the next question is about infographics. An infographic is still a bit different from a plain image.
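As an aside on the chart answer above, the alt-attribute suggestion might look like the sketch below. The file name and the survey figures here are entirely hypothetical, just to illustrate putting the chart's critical information into text:

```html
<!-- Hypothetical example: a survey chart embedded as an image,
     with the key numbers repeated in the alt attribute so they
     are available as text. -->
<img src="/images/survey-results-2020.png"
     alt="Bar chart of survey results: 62% of respondents prefer
          mobile games, 38% prefer desktop games.">
```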
Now, how can Google understand that this one is just a simple image and that one is an infographic? How can Google understand this?

I don't think we need to understand the difference. Or, why would we need to understand the difference? For us, an image is an image. We can pull out some information from an image; if there is text in the image, we can try to understand that. But essentially, if you're putting a lot of information into an image, then that's something that's primarily an image and not primarily text. So if you're doing something like an infographic, like a really long graphic, I'd recommend making sure that you have the text, or the bulk of the text, in some kind of blog post or other textual format as well. OK, thank you, John. Sure.

Oh, hello, John. Hi. My question is regarding content scraping, which is also covered by Google's quality guidelines. We have a website where we cover mobile games that come out on the Google Play Store, and we write about those games. We write original content about each game: the storyline, the gameplay, and all the characters. But we have a competitor who just blatantly copies everything from the Google Play Store, and with that copied content they rank better than us. So what can we do to improve our rankings in this kind of situation? We are doing everything original, and they are just copying from the Google Play Store.

I think sometimes that's frustrating, but essentially the important part to keep in mind is that we use a lot of factors when it comes to ranking. What I've sometimes seen is that one site will do a lot of things really well, and then some individual things really badly, and it will still perform fairly well, because overall we look at that and say, well, there are lots of good signals here too.
So as a competitor in a situation like that, I would recommend making sure that you're focusing on all of those signals as well, by trying to find ways to improve your site overall, not just specifically with regard to the content. For example, and I don't know your site, so I'm just making up a situation: what could happen is that you have really good textual content, but it's embedded in a site that overall looks really poor. So that's where I'd take a step back and, instead of just focusing on the text, try to think about your whole website overall.

We have considered all these points, and I think we have good visibility overall for the website. We have a good number of backlinks compared to them, our website looks good, it loads fast, and everything is good. It's just the content part where I believe we are lagging behind, because they just copy it, and they just rank better. We're still not able to figure out the exact reason why they are ahead of us. I don't know if our guess is right.

Usually in these kinds of situations, it's not that there's one thing they're doing that is the reason they're ranking better, so that you could just copy that and do it as well. It's really a combination of different things. So that's something where I'd really be cautious about just saying, well, on this page we have better content than this other page. Really try to take a step back and look at your website overall; maybe there are other parts of your website that are more problematic, and try to find ways to improve all of that. And that's not something that happens like you fix the HTML code today and tomorrow everything is perfect. It sometimes takes a long time to really improve things overall. Sure, sure. Thank you. Sure.

All right, let me run through some of the submitted questions, because I think last time we barely got through any of them. Let's see, the first one. I don't know what I can say here.
Would you confirm whether Google is making any changes to the search results? I'm seeing ranking losses and big fluctuations in existing rankings for my Australia-based website.

So, we make changes all the time. From that point of view, I can pretty much confirm that we have made changes to the search results. But I don't think that's really useful on its own, because if we're making changes all the time, the real question is why you're seeing changes now. Recognizing these kinds of situations is a good first step. But I would also recommend letting it settle down and seeing what happens in the end, checking in with other webmasters and others running sites to see what kinds of changes they might be seeing as well, and then thinking about, on the one hand, what you can do to improve your site overall, so that it's a little bit more stable and not reliant on the one particular factor that you happen to be working on, and on the other hand, looking at the search results where you've been seeing changes and thinking about how all of that could fit together. So in that regard, I don't really have one answer like, well, you're seeing changes in Search, therefore you should pull the handbrake on your website and everything will stand still. That doesn't happen. These changes in Search are things that we do to try to improve the search results. Sometimes we do get it wrong, but a lot of the time I think we head in the right direction. And it's worthwhile to find ways to improve your website and keep up with how the web is improving overall.

Google has stated that deep linking to an app has a positive ranking effect on the associated website it's being linked from. Is this accurate today?

So, I don't know if there really is a positive ranking effect on the website that you would get just from linking to an app. That feels like a stretch; I'm not sure where exactly that's coming from.
And the question continues: is this still accurate today? Provided I'm doing all the right things to make my site indexable and providing authoritative content that users can trust, what sort of lift should I expect from it?

Yeah, like I said, I don't think this is something that we do today, because essentially you can link to all kinds of content, and just because you're linking to that content doesn't mean that your website is better. If you take it to web pages, this is something that people have been doing since the beginning: they'll publish something low quality or spammy, and they'll link to Wikipedia and CNN and Google and say, well, look at all of these authoritative websites I'm linking to, you should trust my content. And just because you're linking to something good doesn't mean that your content is good. Obviously, sometimes links do provide extra value to your content, but there is no kind of magical effect from just linking to things that are also good.

Would you advise utilizing a reverse proxy server for moving from a subdomain to a subfolder structure? Any risks or considerations that shouldn't be overlooked?

In general, if you're changing the structure of your website, I would recommend just setting up redirects and doing it the normal way. Setting up some kind of reverse proxy for something basic like this seems, in my opinion, like a lot of extra work with a lot of moving parts that can break. So I'd recommend setting up the redirects. Whenever you change the site structure, you will see some fluctuations in Search for a while, but that's something that will settle down. And we have a lot of practice with redirects, so usually that should work out well.
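For a subdomain-to-subfolder move like this, the "normal way" usually means per-URL 301 redirects. As a rough sketch, assuming an Apache server with mod_rewrite enabled (the domain names here are hypothetical), the subdomain's configuration might look like:

```apache
# Hypothetical example: permanently redirect every path on
# blog.example.com to the matching path under example.com/blog/.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^blog\.example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/blog/$1 [R=301,L]
```

Equivalent rules exist for nginx and most CDNs; the key points are a single hop, the path preserved, and a 301 status code.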
I think, especially in the beginning, there are some situations where maybe setting up a reverse proxy makes sense: if we really can't crawl your content properly, or if you have a really complicated infrastructure that you need to hide from users and from search engines, then maybe setting up a reverse proxy to map your complicated URLs to cleaner URLs could make sense. But for the most part, for site moves like this, for site structure changes, I wouldn't bother with all of this extra machinery to hide those changes.

I've seen several WordPress sites get hacked, resulting in a warning being displayed in the search results. If a reverse proxy was used and this warning was present, would the warning extend to the rest of the domain, or would it only pertain to the pages that are hosted on the WordPress CMS?

I don't know how broad that kind of setup would be. I think what you're saying is that you would have your WordPress site in a subdirectory or something like that on a main domain, and you're asking whether the warning would affect the rest of the domain. I don't know. It depends a bit on how easily we can recognize which part of the site is running WordPress. I mean, the ideal approach is not so much to mitigate the risk so that, when you get hacked, Google only shows the warning for part of your website, but rather to try not to get hacked at all, or, if you do get hacked, to fix it as quickly as possible. So investing a lot of time and energy into setting up a special configuration where the hacked warning will only be visible on a small part of your site seems like time and energy that you could be spending on making sure that your website doesn't get hacked at all.

In Search Console, in the search performance report, when I filter by a query, the report gives me certain stats. Then when I add a country filter plus the same query, I get totally different stats: a higher number of clicks than before.
How come I get fewer clicks if I don't filter by a specific country? I don't know; it would be useful to look at some examples. If you could send me some, I'd be happy to take a look at them with the team. Sometimes what happens there depends on the way that the filtering happens in the back end, with regard to how much data we have for individual parts of a site and how the sampling of that data happens overall. In practice, what you can usually rely on is that the higher number that we show in Search Console is the more correct one. So maybe that helps; maybe that makes things more confusing. If you have some specific examples, I'd be happy to take a look.

One of the typical SEO suggestions is to avoid having Google index poor content pages. Typically, user profile pages are like that: maybe just a username and a couple of pieces of information. The question is, is it better to put noindex on profile pages to avoid Google seeing those less valuable pages?

You can do that if you want to. In general, I wouldn't say that all profile pages are bad, but it is something we do sometimes see: spammers use profile pages as a way of placing their links, or placing specific content that they're trying to rank for. Other times, the profile pages just have basic information and are not very useful. If it's just a matter of basic information that's not very useful, I generally don't think you need to put a noindex on it. What happens in practice is that we look at your website overall and try to determine which parts of it are important for Search, and those are the parts that we focus on. So even if you have parts of your website that are less useful, like maybe some of these profile pages, we will still try to focus on the things that we think are useful and relevant to show in Search. It's not that those less complete profile pages would be pulling your website down.
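If you do decide to keep sparse profile pages out of the index, the standard mechanism is a robots meta tag on those pages, as in the snippet below. One caveat worth noting: the page must remain crawlable (not blocked by robots.txt) for Google to see the tag at all.

```html
<!-- In the <head> of each profile page you want excluded
     from the index: -->
<meta name="robots" content="noindex">
```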
The one time when it would probably make sense to do something here is if it's a really gigantic website and you have tons and tons of these kinds of emptier pages, and that makes it really hard for us to crawl and index the website. In particular, if you have several million pages and maybe 80% of those are these more empty profile pages, we would spend a lot of time focusing on the empty profile pages before recognizing that they're actually not so important for your website, time we could be spending somewhere else. So from that point of view, I wouldn't blanketly say you always need to put noindex on your profile pages. I would look for things like spam; if people are abusing those profile pages, clean that up. If the absolute number of those profile pages is in a reasonable range and they're not showing up visibly in Search, then I wouldn't really worry about it.

Do you think a 301 redirect chain, where multiple redirect rules pile up, can hurt a website? For example, http://example.com to www.example.com to https://example.com. Is that something critical to look at, or is it just us who notice those paths, while search engines and users experience the final URL?

Yeah, for the most part, we focus on the final URL. Depending on how you have your website set up, you commonly end up with this kind of chained redirect setup, where you redirect incrementally closer to the URL that you actually want, and that's perfectly fine. The one thing that can sometimes happen is that if we find the ancient URL at some point, and that's the only thing we know about your website, then what Googlebot will do is follow five redirect steps, and then the next day follow the next five. But once we've recognized where your final URLs are, we'll focus on those final URLs. It's not that we'll follow that redirect chain every time we try to crawl.
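To make the chain idea concrete, here is a small illustrative sketch (not any real Google or SEO tool) that resolves a redirect mapping to its final URL and counts the hops, roughly the way a crawler conceptually would, including a cap on chain length and loop detection:

```python
def resolve_chain(redirects, url, max_hops=10):
    """Follow a redirect mapping (old URL -> new URL) to the final
    URL, returning (final_url, hop_count). Raises if the chain
    exceeds max_hops or loops back on itself."""
    hops = 0
    seen = {url}
    while url in redirects:
        url = redirects[url]
        hops += 1
        if hops > max_hops:
            raise RuntimeError("redirect chain too long")
        if url in seen:
            raise RuntimeError("redirect loop detected")
        seen.add(url)
    return url, hops

# The common legacy chain from the question (URLs hypothetical):
chain = {
    "http://example.com/": "http://www.example.com/",
    "http://www.example.com/": "https://www.example.com/",
}
print(resolve_chain(chain, "http://example.com/"))
# -> ('https://www.example.com/', 2)
```

The takeaway matches the answer above: a short legacy chain resolves fine, and the fix for frequently used chains is simply to link to the final URL directly.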
So that's something that I've seen tools sometimes flag, saying, OK, if you look at this obscure URL, you'll find seven redirects in a chain, and your website will disappear from Search because you have seven instead of five. That's definitely not the case. What I would do in a general situation like this is double-check that when users are using your website, they're using the final URLs, and that you're not constantly redirecting between URLs. If you have multiple redirects set up like this and it's just legacy changes that have happened over time, I wouldn't worry about it. If it's something that people hit all the time, though, if you look at your server logs and see that these redirect chains are being used regularly, then I would try to figure out how you can link to the final URL rather than to the initially redirecting URL. But the redirect chain itself is not something that I would really worry about.

Our company is in the process of merging and rebranding two domains. Site one and site two both have unique content, and they cover similar topics. Site one is our flagship site with more value. The new domain we acquired for the merge and rebrand just happens to be the .com version of site two, so we want to use that. Since site one is more important, we want to first redirect site one to site two, and then redirect site two's .org to site two's .com at a later date. Would this be problematic for Search? Would Google penalize the site, since this could be a confusing signal?

So, we definitely wouldn't penalize a site for that. From a web spam point of view, this is definitely a non-issue. It's a technical issue that needs to be figured out, but it's not something we would consider to be abuse. So hopefully that takes that worry off the table first. That doesn't mean that you won't see any fluctuations with changes like this; that's kind of a different question.
In general, moving from one domain to another is something that we have a lot of practice with. If you're just moving one-to-one, all of the URLs from one domain to another, I would expect that to just work out. Maybe you'll see some fluctuations for a couple of days, maybe a week or so, but usually that should settle down fairly quickly. On the other hand, if you're merging things, in particular if you move site one into site two and site two already had some content, then it does take significantly longer to settle down. In this case, I don't know if the final site already has content or not, or whether you're merging things into it.

What I would generally do is try to separate out the individual steps, so that you can more easily recognize where things have gone wrong or where something might have happened that you need to double-check. A common scenario is that you do this move from one domain to another, and at the same time you change your CMS, you change your URL structure, and you change the layout of your pages. If you do all of that at the same time and something goes wrong in Search, then figuring out why that happened is going to be really hard. Whereas if you do it incrementally, you might recognize, oh, when I changed my URL structure, things went wrong; therefore, as a next step, I need to figure out what in my URL structure is different compared to before. Maybe I can improve things, maybe I need to roll the URL structure back, whatever. By being able to separate out those individual parts, it's a lot easier to figure out what you need to change. So that would generally be my recommendation there. I wouldn't worry so much about the unique content part, or the similar topics, or all of those things. From our point of view, it's really more a matter of: are you merging things, or are you just moving things?

A question about pop-under ads.
My site started ranking, and I have these pop-under ads on it. The ad only opens once in 24 hours. Can this affect my ranking on Google? I don't want my rankings to drop.

I don't think anyone wants their rankings to drop. I don't know what specifically you're showing with the pop-under ads. In practice, we do take a variety of signals into account when it comes to understanding the quality of a page, and we look at things like whether the top part of the page is visible or not. Pop-under ads, I don't know exactly how you're implementing them, but if you're opening a window behind an existing window, that feels like a really old technique, and I don't know if browsers even support that well anymore. So from my point of view, I would look more into the usability side of this and figure out: why are you doing it, what is your goal, and does it actually work for you? Then, as a second step, once you've figured out whether it's really useful for your users and for your site, think about how it might affect Search.

We have a question about mobile usability. Search Console states that we have "clickable elements too close together" and "content wider than screen" on some pages. We tried a lot of things, like increasing the font size and adding padding and margins to the clickable elements, to fix these issues, but none of them worked. The actual problem is that Google Search Console doesn't show us where exactly the problem occurred, or on which clickable elements. It's also interesting that when we use the live test in Google Search Console, we don't get any errors on the same pages.

OK, so what I would suggest doing there is: if the bulk of your pages don't have this message, and if the live test doesn't show this message, then I would just ignore that warning.
What might be happening is that at individual times, when we try to render those pages to see how the content on those pages is laid out, maybe we're not able to process the CSS for those individual pages, and then we flag it like this in Search Console. So if the live test says it's OK, then I would trust the live test.

I have a question about nofollow versus no href on internal linking. I don't want Google to crawl our menu links on all types of category pages. What would you suggest here: remove the href from the menu links, add nofollow to the menu links, or disallow the menu container?

Generally speaking, you don't need to hide things like that. I mean, I don't know your specific site setup, but in general you wouldn't need to hide links on a site like that, because if it's something that you don't want to have indexed, I would put a noindex on the destination page; it's generally not something that you need to prevent from being crawled. If you do want to avoid forwarding signals internally, then using a nofollow there is fine. Using a disallow on a menu container, I don't think that would necessarily work in most cases: that would mean you'd have to have your menu as a separate HTML file that is loaded in an iframe on the page, kind of thing, and that's a really complicated site setup that I haven't seen in a really long time. But if you really just don't want to forward any signals through those links, I would use nofollow. For the most part, though, you probably don't need to do anything special here.

A question about PageSpeed Insights. Why does the Google AdSense script cause poor PageSpeed Insights performance? It's loaded async, but Google marks the "ads by Google" script as a problem in these reports. It's weird, because it's as if Google is suggesting: don't use Google AdSense, it's bad, it slows down your pages.
So, I don't know about that specific snippet of text, but in general, our speed-testing tools are agnostic to the kind of markup and scripts that you add to your pages. It's not going to be the case that our speed-testing tools will say, oh, this ad script is perfectly fine because it's made by us, even if it's really slow. So if our speed-testing tools say that a particular element is slowing down your page, and that particular element happens to come from Google, then essentially you have a page that is a bit slower. Users don't care where the ad is coming from or where the slowness is coming from; they just care that the page is slow. So that's kind of an aside with regard to why the Google AdSense script is flagged as making your pages slow.

I don't know the specifics, so that's something where I would check in with the AdSense folks to see what other options you have, where you might be able to implement those ads in a way that doesn't slow down your site, or maybe there are ways to implement the ads on your site that don't directly affect the page speed metrics. For example, if you have those ads further down the page, then maybe the Largest Contentful Paint, the First Meaningful Paint, all of those metrics are actually OK, because the slower part is loaded after the first page view.

The other thing to keep in mind is that the PageSpeed Insights score in particular is not a magic number that you have to improve and get to 100. It's a combination of different speed factors, and it's meant to help you figure out where your website's weak points are. So if you add a script to your page and the score goes down from 98 to 95, that's probably still OK; it's worthwhile to dig in especially when you see really big changes in those speed scores. In the future, we plan on using the Core Web Vitals rather than just the PageSpeed Insights score.
The Core Web Vitals are also, I believe, shown in PageSpeed Insights now, but separately. So I would take a look at that to see how this particular script affects the Core Web Vitals, and also check in with the AdSense folks to see if there are ways to implement things that still work really fast for your site.

I noticed my site's rankings suddenly dropped around May 2020, and all my ranking posts were not ranking anymore. I don't know what's wrong with my rankings.

I don't know either. That's a very vague question and a very broad timeline, so it's really hard to say. What I would recommend doing here is posting in the Webmaster Help Forum, including your URL, the queries where you've been seeing changes, and the pages on your website that you think should be ranking a little bit better for those queries, and trying to talk with other people who've run into similar problems to figure out what you could be doing to improve things.

What I have sometimes seen is that sites will look at metrics like the total impressions or the total clicks for the site, and not consider the individual queries and whether those queries were really queries that the site was particularly relevant for. For example, I have seen situations where a site will write an article about how to log into Gmail, and if that article happens to rank for a query like "Gmail", it will get a lot of traffic there. When our algorithms figure out that, actually, this is not really the best thing to show for that query, the site will see a big drop in traffic. If you just look at the absolute numbers, it will look like something really big is happening to your website, but it's actually just that one query. And if you step back, you might notice, well, maybe my site isn't really the best one to rank for that particular query.

I made several tests and found out that my high CLS and LCP are coming from AdSense ads.
Oh my gosh, more of these. So, other than removing those ads, can you give an idea of how to solve the problem? A fix or a guideline from Google AdSense would also be nice.

So again, we're not going to change our guidelines just because products from Google are also sometimes slow. Rather, we encourage the whole web to find ways to improve its speed overall. If you're seeing issues around specific Google products, I would go into those Google product forums and really work with the folks there to help improve that, and I'm happy to help out if there's anything I can do. But in general, teams at Google don't get special treatment here; we can't give them confidential information on how to magically make things faster. They have to do the normal things that any other provider would need to do. So I'd primarily try to work with the teams whose products you're trying to integrate.

Is crowding the Google search results with multiple domains from the same company, selling the same product, against the Webmaster Guidelines? Our business is being pushed down the page 1 results for thousands of keywords, and we want to know how to deal with it.

That's something that I think our algorithms do sometimes struggle with. A lot of the time, I think, we get it right, and every now and then something will pop up where suddenly lots of different domains from one company are showing up in the search results. That's something we do try to work on. So it's not so much a question of whether it's against the Webmaster Guidelines or not; rather, our algorithms sometimes struggle to get the right balance. Showing multiple results from the same website is perfectly fine. We generally try to limit that to maybe one or two results per website, but for individual queries it might make sense to show a lot more, and there is no magic number that we would say is the total number of results we would show from one website or one business in the search results.
So that's something also worth keeping in mind there. The one area where this does touch on the Webmaster Guidelines is if you're doing something like creating doorway sites or doorway pages. If you're creating significant duplication, with all of the same content, and just tweaking things by putting it on different domain names, then that is something the web spam team might take action on if they find it. Using the web spam report form is a good way to bring that up on our side. But keep in mind that the web spam report form is something we use to improve our algorithms; it's not something where the web spam team will take that input and apply a manual action one-to-one.

Two questions. Does Google index image metadata? If yes, does it influence ranking?

We do pick up some image metadata, and I believe it's primarily used for things like understanding the image license and the copyright information. It's not really a ranking factor, but it is something that we do sometimes show in Google Images: when people see the preview of an image, we can sometimes show the metadata from the image there. So, not a ranking factor, but always a good thing to have, especially if you have licensing or copyright information that you want to provide.

Does Google index metadata in MP3 and other audio formats? Does that influence ranking?

I don't think so, but I don't know for sure. I think the tricky part there is that I don't think we actually index MP3s at all. We might be able to pick those up as videos sometimes, depending on how you're embedding them on the page, but otherwise I don't think we would index MP3s individually. So that's more a matter of providing information on the landing pages where you're linking to those MP3s. And obviously, the information on a landing page is text, and we can use that for ranking.

How many internal links should we use on a page?
Let's say there's one in the menu and one in the text. What is the best approach? Will it lose link value if the same internal link is used multiple times?

Use as many internal links as you want; that's kind of the guideline. It's not that there is any magic number that you need to hit, and with regard to using the same link multiple times, that's all generally fine. The one thing I would watch out for is more that your site has a clear structure, so that when we crawl your website, we can understand which pages are related to each other, which pages are roughly equally important, and which pages are subpages of other pages, so that we can understand the context of your pages a little bit better.

One thing that SEOs sometimes like to do, especially when they first get hold of a crawling tool, is to create a really flat structure, where all pages link to all pages. You just have to know one URL, and then you know all URLs of the website; if you crawl the site, it's just one flat list where all pages look equally important. From our point of view, that's not really that useful, because we don't understand what the relationship between those pages is. From a pure crawler point of view, it looks like all pages are on level one, or however you count that, but that's not necessarily good. It's usually better to have a balanced structure, where you have some category pages, maybe some subcategory pages for the individual categories, and the detail pages further down below, so that we can understand what the relationship is between those pages. And with regard to internal links, that essentially maps back to the structure: if you have links from your category page to the subcategories of that category, and from there to the individual products, then that's a nice structure to have. It's not that you need to put all of your site's links on all of your pages.

OK, let's see. We've still got some time.
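The click-depth idea behind that answer can be sketched with a few lines of code. This is purely illustrative (the toy site and its URLs are made up): a breadth-first search over internal links shows how a hierarchical structure gives each page a meaningful depth from the homepage, whereas a fully flat structure puts every page at depth 1.

```python
from collections import deque

def click_depths(links, start):
    """Breadth-first search over an internal-link graph
    (page -> list of linked pages), returning each reachable
    page's minimum click depth from the start page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# A hierarchical toy site: home -> categories -> products.
site = {
    "/": ["/shoes/", "/hats/"],
    "/shoes/": ["/shoes/sneaker-x/"],
    "/hats/": ["/hats/beanie-y/"],
}
print(click_depths(site, "/"))
```

In the hierarchical version, the category pages sit at depth 1 and the product pages at depth 2, which mirrors the category/subcategory/detail relationship described above; if every page linked to every page, everything would collapse to depth 1 and that relationship would be lost.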
There's one longer question about a specific site. I'll just skip that one for now because it seems like something more complicated to look at. I notice these spammy sites are linking to my pages. I sent them an email to stop, but they didn't. I disavowed them. I don't want them to affect my ranking because I didn't put them there. Some pages are linking to my pages around 20 times with some weird author. Is disavow the only thing I can do here? The disavow tool is perfect for these kinds of situations. If you really want to make sure that Google's algorithms do not take this into account at all, disavowing is perfectly fine. In practice, you probably don't need to disavow. If these are really spammy pages, if these are just kind of like links on random pages of the web, then that's something we probably ignore already. It's not something that you would need to disavow. But if you're worried about this, worried enough to actually send them an email, and you really don't want them to be taken into account, then the disavow tool will do that for you. Structured data questions on a new site: FAQ page structured data. Is it OK to mark up a main content landing page containing FAQs with FAQ schema, rather than create a separate FAQ page strictly for FAQs linked to each other? From a user point of view, we think it would be better on the site to have all the FAQs listed on the main content landing page. The landing page will have an accordion-style index on top with hashes to jump to the FAQ content for ease of access. Generally speaking, you can do that. I don't know if it really makes sense for your users, but that's kind of a question between you and your users. It's less a matter of can you do it or not. The important part with FAQ content, and structured data rich results in general, is that we expect this content to be visible on the page. So if you're hiding the whole block of FAQs, that would be problematic.
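For reference, FAQ markup like the question describes is usually added as JSON-LD. Here is a minimal sketch with invented questions and answers, built in Python just to show the shape of the markup; the same question-and-answer text also needs to be visible to users on the page:

```python
import json

# Hypothetical FAQs; Google expects this same text to be visible on the page.
faqs = [
    ("How long does shipping take?", "Orders usually ship within 2-3 business days."),
    ("Can I return an item?", "Yes, returns are accepted within 30 days."),
]

faq_markup = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": question,
            "acceptedAnswer": {"@type": "Answer", "text": answer},
        }
        for question, answer in faqs
    ],
}

# This JSON would go in a <script type="application/ld+json"> tag on the landing page.
print(json.dumps(faq_markup, indent=2))
```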
If you're showing the questions on the page and if you click on the questions and it expands and shows the answer, that would be fine. Video structured data, same landing page as above. We have multiple videos listed in carousels. Better to have individual pages for these videos and add supporting content on the page. If so, how to mark up the videos on the landing page and point Google via markup to authoritative best content page containing the best amount of useful metadata and schema data. So the best amount of metadata and information on the page is not something that you can really quantify. You have to kind of use trial and error, test that with your users, and figure out how much information makes sense to have on one page versus splitting that across multiple pages. When it comes to videos, the thing to keep in mind is that when we recognize there's a video on a page, we will show one video thumbnail for that page. So if you have multiple videos and these are essentially video landing pages for those individual videos, then I would tend to put those videos on separate pages just so that you have one clear landing page for those individual videos. So that's kind of from my point of view the way that I would look at it there. If you have all of the videos on the same page, then it's really hard for us to tell which of these videos is the important one that we should show as a thumbnail, for example. Some simple links are attached under each results description in the search results. Using a scrolling carousel, I can't find any guide pages on that. What's the difference with the exposure type and site links? How does Google select those links? Can you explain more about the purpose and the algorithm of additional attached links on the search results? Not so sure exactly which kind of links you mean there, because search results pages have gotten a bit complicated with lots of different things there. 
When it comes to things like site links that we would show under a page, that's something where we try to understand where it makes sense to show additional information to the user. And oftentimes, that's based on things like us recognizing that users searched for one thing, but actually, they're not looking for the most important page on those keywords, but rather some general information around those pages. So maybe they'll search for your company name, but instead of clicking on your company homepage, they'll click on Blue Running Shoes, because that's a site link, because actually they were looking for running shoes from your company, and they just remembered your company name as a query to look for. So that's kind of the reason we would show things like site links. And for site links in particular, it helps to also have a clear site structure, kind of as I mentioned before, rather than linking everything with everything else. If you have a clear structure on your site, we can tell that these are kind of subpages of this one primary page. Then it's a lot easier for us to automatically generate site links for those pages, because we know that they kind of belong together. It would be interesting to get some insights into recent major changes of the PageRank algorithm. Do you have anything interesting worth mentioning? Are seed sites something that is implemented? So I have no information about any major changes of the PageRank algorithm. I think we've kind of worked with that for a really long time. So from that point of view, I don't really know what specifically you're looking at there. The thing to keep in mind also with PageRank is we do use PageRank in search, but we use lots and lots of other factors. So sometimes it's something that people tend to focus on a lot just because we published information about it once. And links are this kind of thing that you can implement and kind of manipulate a little bit on the web.
And you think, well, I can influence this PageRank algorithm. And probably you can influence the PageRank algorithm, but there are so many other things involved with search that focusing on one specific tiny element of all of the ranking or general indexing algorithms is not really going to help you. OK, we still have a few questions left. And John, sorry. Good morning. It would be very good if I could ask a question because I have just a few minutes left. It's about GDPR and all the privacy stuff. Because of that, we are implementing two-click embeds on our website for social media and YouTube. It's like when you come as a user to our website, you see a placeholder and there's a text like, here would be recommended content from YouTube. And then you have to click on a switch, and then you give your permission to see this kind of third-party content and that stuff. We are wondering if this affects our visibility and rankings on Google, because Googlebot can't see this YouTube video or social media embed, because Google can't click buttons. Yeah, so I think that kind of embed has been around for a pretty long time. And generally speaking, I think that's a fine approach to take. The one place where I could imagine this playing a little bit of a role is in recognizing which videos are embedded on a page when it comes to video search and kind of the video thumbnail. But what you can do in cases like that is use the video markup rather than just the video embed. So our systems do recognize the common kinds of video embeds and try to figure out the thumbnail automatically. So if we recognize a YouTube embed or Vimeo embed or whatever all of these common formats are, we can figure that out automatically. But you can also use structured data to tell us about this. So in a case like this, instead of relying on the embed code, I would use structured data and tell us, hey, I have the video here and this is the thumbnail image.
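A VideoObject block along those lines might look like the following. This is a sketch with invented names, URLs, and dates, not a definitive implementation; the JSON would go in a script tag of type application/ld+json on the video's landing page:

```python
import json

# Hypothetical example values for a single video landing page.
video_markup = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "How to lace running shoes",
    "description": "A short demonstration of three lacing techniques.",
    "thumbnailUrl": "https://example.com/thumbs/lacing.jpg",
    "uploadDate": "2020-01-15",
    # embedUrl points Google at the player itself, even if the visible
    # embed on the page is behind a consent placeholder.
    "embedUrl": "https://www.youtube.com/embed/VIDEO_ID",
}

print(json.dumps(video_markup, indent=2))
```

With this in place, the page can still be recognized as a video landing page even when the embed itself only loads after a user clicks the consent switch.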
This is the file or the other information that you have for the video. And then we can still recognize the video is on this page. We can use this page as a video landing page and show it appropriately in the search results. Our approach would be to just use the embedUrl schema property so that Google sees the URL of the embed. Is that enough, or should we use the VideoObject schema for this, for YouTube embeds? I would try to use the VideoObject. Yeah, I'm not sure how we would use just the embed URL. But with the VideoObject, you're really giving us that information. For the other kinds of embeds from social media content, usually I'd find that less problematic, unless there's something in that embed that you need to have for indexing. Like if you have a series of Twitter embeds on a page, and if we can't see those tweets at all, then the text in those tweets we can't really associate with the page. But the fact that this text comes from tweets, and here's a link to the tweets, that doesn't matter to us. So for the other kinds of social embeds, it's less of an issue. I think even for Instagram, when it comes to images, I believe their images are embedded in a way that we can't actually index the images through the embed anyway. So for the non-video types of social media embeds, I don't think that would be a problem. OK, thank you. Sure. OK, maybe we'll just switch over to more questions from you all. I also have a little bit more time, so we can stay a little bit longer if any of you want. Feel free to jump in. Can I ask a question about intrusive interstitials? OK. So first of all, thanks for doing these Hangouts. I'm a big fan. Thanks. There's some clarification that I need around the interstitials, regarding serving them to organic visitors versus bots, and about full-page interstitials. So let me ask. Let's say there have been tests done, and we see that the conversion rate for an e-commerce website is the same for buying products.
So from a user perspective, a full page or a lower third is the same from a conversion perspective for buying products. But for signing up for the emails, it's hurting the sign-up rate, the registration rate. So from what I understand, Google devalues a full-page interstitial. Is that correct? On mobile. At least on mobile. Yeah, yeah. So is it OK to serve? Let's say I can figure out how to improve the registration rate. Is it OK to serve a lower third to organic visitors, but for visitors coming in direct or from other marketing channels, to serve the full page? Is that OK? That's generally OK. The thing to watch out for there is that Googlebot, when we crawl and index your page, doesn't send a referrer. So that's something where, if users coming in from search see an improved view, that's fantastic. You just need to make sure that that improved view is also the one that Googlebot sees when it crawls and indexes the pages. Otherwise, we wouldn't be able to take that into account. Correct. So it's OK to essentially serve the lower third to Googlebot and organic visitors. So let's say now somebody's coming in from California, same as Googlebot, and it's the full page, and we've got the privacy policy. And it's a full page. Is this OK? Will I get devalued for that? We try to recognize legal interstitials and ignore those. So things like privacy policy or data protection interstitials, those kinds of things, we try to recognize and skip over. The important part is that you're showing this on top of the HTML page, not instead of the HTML page. So things like redirecting to a separate interstitial URL, or showing the interstitial instead of the actual content, would mean that we wouldn't be able to actually crawl and index the content. But if it's a div on top of the HTML, that's perfectly fine. OK. And last question regarding devaluing. So I understand obviously organic is what I live by, and that's my job. So it's very important.
But let's just say from a registration standpoint, that full page, business-wise, for the email, is more valuable. How can I measure if I'm getting devalued? Is there a way? Can you suggest something? So for example, my main competitors, they do not have a full page, except for one. Until now, I don't know exactly how to measure or prioritize this within the company. I don't think you can. I don't think that's possible. So it's one of those things that we use in the ranking algorithms, where we will kind of try to take that into account and essentially rank the website a little bit lower if we recognize that it's doing things like the content above the fold on mobile not being there, or being filled with ads, filled with an interstitial, that kind of thing. But it's not that there's a flag in Search Console, or you see a warning, or anything like that. Last question, and I don't know if you can tell me the answer for this, but let's say the ranking drops three or four positions consistently. So I get to that first, second position, and then it drops three, four positions. Is that somewhat of an indicator that this could be hurting me, or related to the devaluation? I would intuitively say no, just because my understanding is that especially something like the intrusive interstitials change is probably more of a softer factor, and it's something that wouldn't be applying across the whole website. In particular, if people are still looking for your brand name, then I wouldn't expect to see any ranking change for that. Whereas if people are looking for... I see it on specific pages, related to eyeglasses. So if people are searching for eyeglasses and you're one of the hundreds of competitors out there, then that's something where I could see kind of a change being visible in search. But it shouldn't be across the whole site. It should be really kind of on those more generic queries, where there's more competition and it's slightly harder for your site.
OK, thanks so much. I really appreciate it. All right, I think we're at time. So I'll stop the recording here. But if any of you want to hang around and chat and ask more questions, you're welcome to do that as well. Thank you all for dropping in. Thanks for joining and asking all of the tough questions, submitting some really good questions. And hopefully, I'll see you all again next time. All right, have a great weekend, and let's stop the recording.