 All right, welcome, everyone, to today's Webmaster Central Office Hours Hangouts. My name is John Mueller. I am a Webmaster Trends Analyst here at Google in Switzerland. And part of what we do are these Office Hour Hangouts with webmasters and publishers all around web search related questions. As always, there are a bunch of questions that were submitted, but if any of you want to go ahead with the first question, feel free to jump on in. So hey, I have a question about the new PageSpeed tool. So basically, we started to check it even on daily. And we found that on the morning, we get an issue, so errors for, for example, lazy loading images. But in the evening, this issue disappears. So we're trying to understand if it's something that may be from a PageSpeed team, they're changing, they're still testing it, or it's actually an issue that we need to solve on our digital assets. It's hard to say what exactly might be happening there. In general, what we try to do with these tests is we try to keep them as stable as possible. So I don't think the PageSpeed team would be sitting there saying, oh, in the morning, I like it like this. In the evening, I might get different. I assume it's more a matter of your page kind of being on the edge there, and sometimes it's enough to trigger these alerts, and sometimes it's not enough. So whether or not you should fix that depends a little bit on the issue itself and on what you can find out about that. Maybe it's something really easy that you can change. Maybe it's a matter of having to redesign something significantly, which is not that easy to do, which maybe you'll save for some later time. Yeah, cool. The new PageSpeed tool, I think, is pretty neat because it includes the Lighthouse tests, and it runs them for you. So it kind of saves you the question of whether or not this test would be the same on other devices as well. So I think that's a pretty neat idea to kind of combine everything and make one test that covers pretty much everything there. And you can still run the Lighthouse tests locally in your Chrome as well if you want to double check to see where exactly this is happening or what if you want to kind of tweak things on the page and see how that works out with regards to the issues that are flagged. All right. Any more questions from you all before we get started? I can shoot in with just one more question. All right, go for it. Elene from Norway here. So if you have a lot of subject pages because the old tag structure said that if you have one word, a lot of times it will be automatically generated as a tag, and then a new page will be automatically generated for that subject. If you have a lot of those pages, would it ruin the whole site? Because we do have a lot of subject pages, so my suggestion will be to remove them, but just to double check here. So I assume these are kind of like tag pages where you have a list of articles that match those keywords, something like that. Sometimes they can be OK. Sometimes they're a little bit low quality. So what I would think about is see these as normal pages on your website and think about whether or not you want to have them shown in search. And if you don't want to have them shown in search, then maybe no index them. If there's a way for you to differentiate between good tag pages and kind of not so good tag pages, then that might be useful. But I generally see these like any other page that you have on your website. And think about, do you want it indexed or do you not want it indexed? 
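As a rough sketch of the tag-page advice above (treat them like normal pages and noindex the ones you don't want shown in search), a template could decide between index and noindex based on how much the page actually lists. The threshold and names below are illustrative assumptions, not anything Google prescribes.

```python
# Sketch of the "noindex the thin tag pages" idea: the threshold and the way the tag is
# rendered into a template are assumptions to tune for your own site.
MIN_ARTICLES_FOR_INDEXING = 3  # assumption: tag pages listing fewer articles count as thin

def robots_meta_for_tag_page(article_count: int) -> str:
    """Return the robots meta tag to emit on an auto-generated tag page."""
    if article_count >= MIN_ARTICLES_FOR_INDEXING:
        # Enough content to stand on its own: let it be indexed normally.
        return '<meta name="robots" content="index, follow">'
    # Thin, auto-generated listing: keep it for users, but ask search engines not to index it.
    return '<meta name="robots" content="noindex, follow">'

if __name__ == "__main__":
    for count in (0, 2, 12):
        print(count, robots_meta_for_tag_page(count))
```

The same decision could also be delivered as an X-Robots-Tag HTTP header if the pages are harder to template.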
OK, thank you. All right, let's see what we have submitted today. In a new search console, I can see if I filter a query one by one, I'd be able to see this query in all its metrics. I can also see the pages that Google has shown for the query. My question is, what is the consideration made by Google in showing these multiple pages as landing pages and how to best optimize for Google to show only the ones that we want? So I think first off, one thing that's important to know in Search Console is there's a difference between how you look at those pages in Search Console with regards to the performance report. So in particular, if you look by query, then we include all of the pages that were shown for those individual queries. Whereas if you look by page, then we only show the information for that specific page. So the reason for that difference is for individual queries, it's possible that multiple of your pages are shown at the same time. And for that, we only track the top most one of your pages there. So that's kind of, if you compare those numbers and you say by query and by page, you'll see that there's a difference there. And that comes from kind of that difference. With regards to telling Google which page you want to have shown for which query, that's not something that you can easily do. There is no kind of mechanism on our side for you to say, for this query, I want this page to be shown and not this other one. But rather, it's our algorithms that try to figure out what is the most relevant page for this particular query. So what you can do if you're seeing a page that you don't want to have shown from your website is to think about what you can do to make that page more relevant. And that could be with regards to the content on the page. So you could maybe perhaps clarify that this is really the page for this particular topic and maybe remove some of the information from the other page and make sure it's really well visible on the primary page that you want to have shown. Another thing you can do is to make it more relevant in general across your website through the internal linking on the website. So to really clarify that when we crawl and index your website overall, we see this as a page that is really important. We see it perhaps linked from the home page. We see it perhaps linked from other pages within the website. And all of these things tell us that this is something that you think people generally want to go to. So you'll make it easier for them to go there. And we'll pick that up, and we'll try to show it a little bit more visibly in the search results as well. Let's see. The question goes on, I think, is there a way to input a group of queries to be shown in Search Analytics? Not that easily. So what you can do is enter a part of a query and we'll show everything that matches that query. So for example, if you have one keyword and you have different variations of that, you can just enter that primary keyword in Search Analytics, and it'll show those different variations as well. Another thing that you could do is just export the table to Google Spreadsheets or to Excel if you're using Excel, and then use the filtering there to refine things the way that you want to have them visible. All right. I watched a November 13 video about the 200 ranking factors. Do you know where a list of the newly updated ranking factors is? I don't have a list of all of the ranking factors. 
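On the export-and-filter suggestion above: since there is no built-in way to enter a whole group of queries, one rough approach, sketched below with pandas, is to export the performance table and filter it for a set of query variations yourself. The file name, column names, and query list are assumptions; match them to your actual export.

```python
# Rough sketch of the export-and-filter approach for a group of related queries.
# Assumes a performance report export named queries.csv with Query, Clicks, and Impressions
# columns; adjust the file and column names to whatever your export actually contains.
import pandas as pd

QUERY_GROUP = ["blue widgets", "widget blue", "widgets in blue"]  # hypothetical variations

df = pd.read_csv("queries.csv")
mask = df["Query"].str.lower().apply(lambda q: any(term in q for term in QUERY_GROUP))
group = df[mask]

print(group[["Query", "Clicks", "Impressions"]])
print("Total clicks for the group:", group["Clicks"].sum())
print("Total impressions for the group:", group["Impressions"].sum())
```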
I think it's also something that in general, it's not really worthwhile to focus on that too much because if you're trying to focus on these individual ranking factors and you easily lose track of the bigger picture, namely where Google wants to go with regards to Search, where it's a matter of making really high quality websites that are relevant for the user. And it's not so much a matter of, oh, you have to have exactly this meta tag in this place, and this keyword mentioned three times and not four times, because ultimately, that doesn't make a page more relevant. So I would focus more on the long-term goal because if you focus on the long-term goal, then you don't have to worry about these small fluctuations and changes in the individual ranking factors. I noticed that a page on my site has a new backlink on it, and it now bounces around on the top of the second page to the bottom of the second page to the third page and back and forth. I assume this is in the search results. I heard this is normal. If it's normal, how long does it take until Google knows that it's a good link or if I need to disavow it? So in general, if people are linking to your pages, that's kind of a good sign. If this is a link that you're not involved with, that you didn't place there yourself, then I really wouldn't worry about this. This is something that will settle down over time. You'll see more of these links over time. It's not a matter of you needing to review every single link that goes to your website, thinking about, is this one I want to keep or is this one I don't want to keep? In practice, most websites don't have to use a disavow file at all. So all of these links that come to your website, some of them will be clearly awesome, where they bring you a lot of traffic. People write about your website, the work that you do, and that encourages people to go there. We might pick that up and say, oh, this is really a good sign that this website is great. And other links that you'll get are maybe perhaps completely random, perhaps even dropped by some spammer. And for the large part, we can ignore those. So it's not a matter of you needing to go in and manually picking and choosing which links that you want to keep or which ones you want to take away. In Search Console, 99% of our pages are excluded, with discovered, but currently not indexed. It's been like this for a number of years, even though we have links from some high-profile newspapers and websites. What could be causing this? And what could I do to get these pages indexed? Could it be a problem that Google isn't able to crawl the high number of pages that we have? So in general, this sounds a bit like something where we're seeing a lot of pages and our systems are just not that interested in indexing all of these, where they think maybe it's not worthwhile to actually go through and crawl and index all of these. So especially if you're seeing discovered, but currently not indexed, that means we know about that page. That could be through a sitemap file. It could be through internal linking. But our systems have decided it's not worth the effort, at least at the moment, for us to crawl and index this particular page. And especially when you're looking at a website with a large number of pages, that might be a matter of something as simple as internal linking, not being that fantastic. It could also be a matter of the content on your website maybe not being seen as absolutely critical for our search results. 
So if you're auto-generating content, if you're taking content from a database and just putting it all online, then that might be something where we look at that and say, well, there's a lot of content here, but the pages are very similar or they're very similar to other things that we already have indexed. It's probably not worthwhile to jump in and pick all of these pages up and put them into the search results. So what I generally recommend doing there is, first of all, if you're really seeing 99% of those pages not being indexed, I would, first of all, perhaps look at some of the technical things as well. So in particular, that you're not accidentally generating URLs with different URL patterns, where it's not a matter of us not indexing your content pages, but just getting lost in this jungle of URLs that all look very similar, but they're subtly different. So things like the parameters that you have in your URL, upper and lower case, all of these things can lead to essentially duplicate content. And if we've discovered a lot of these duplicate URLs, we might think, well, we don't actually need to crawl all of these duplicates because we have some variation of this page already in there. So the technical thing is the first one I would look at there. And then the next step I would do here is make sure that the internal linking is actually OK, that we could crawl through all of these pages on your website and make it through to the end. You can roughly test this by using a crawler tool, something like Screaming Frog or DeepCrawl. There are a bunch of these tools out there now. And for the most part, I think they do a really great job. And they will tell you essentially if they're able to crawl through your website and show you the URLs that were found during that crawling. And if that crawling works, then I would strongly focus on the quality of these pages. So if you're talking about 20 million pages and 99% of them are not being indexed, then we're only indexing a really small part of your website. That means perhaps it makes sense to say, well, what if I reduce the number of pages by, I don't know, half? Or maybe even reduce the number of pages I have to 10% of the current count. By doing that, you can make those pages that you do keep a lot stronger. You can generally make the quality of the content there a little bit better by having more comprehensive content on these pages. And for our systems, it's a bit easier to look at these pages and say, well, these pages that we have now, which might be one million pages of your website, for example, actually look pretty good, we should go off and crawl and index a lot more. So those are kind of the three directions I would take there. First, make sure that you're not accidentally generating too many URLs. Make sure that the internal linking is working well, and try to reduce the number of pages and kind of combine the content to make it much stronger. Hi, John. Hi, John. Sorry. You can go first. No worries. So I have a related question. OK. A webmaster reached out to me and gave me a site. He has some kind of a site that compares programs. Well, he has something like a million crawled or discovered but not indexed pages. So my question is, well, what's the best way to consider or to decide if these pages want to be indexed, or if they're just pages that don't obey the guidelines and shouldn't be on Google? I don't think there is a simple technical test that you can do.
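On the earlier point about accidentally generating near-duplicate URLs through parameters and letter case, a rough sketch of that kind of check is below: collapse each crawled URL to a normalized form and look for groups that only differ in case or in tracking parameters. The parameter list and input file are assumptions; feed it the URL list from whichever crawler you use.

```python
# A rough sketch of the duplicate-URL check described above: group URLs that collapse to the
# same "normalized" form once case and common tracking parameters are ignored.
from collections import defaultdict
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

IGNORED_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid", "sort"}  # assumption

def normalize(url: str) -> str:
    parts = urlsplit(url.strip())
    # Lowercase host and path, drop parameters that don't change the content.
    query = [(k, v) for k, v in parse_qsl(parts.query) if k.lower() not in IGNORED_PARAMS]
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(),
                       parts.path.lower(), urlencode(sorted(query)), ""))

def duplicate_groups(urls):
    groups = defaultdict(list)
    for url in urls:
        groups[normalize(url)].append(url)
    return {key: found for key, found in groups.items() if len(found) > 1}

if __name__ == "__main__":
    with open("crawled_urls.txt") as handle:  # one URL per line, from your crawler of choice
        for canonical, variants in duplicate_groups(handle).items():
            print(canonical, "<-", len(variants), "variants")
```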
That's something where you have to take your experience and look at those pages yourself and say, is this something that you think Google should be showing in the search results or not? And especially if you look at the mass of the pages, you can look at them and say, well, are all of these pages individually things that should be shown in the search results? Or maybe is there something that we can do to make a stronger page with fewer URLs? OK, thank you. I think it's always tricky because it's tempting to combine everything into very, very few URLs. And then you have gigantic pages which aren't very useful to users either. So finding that middle ground is sometimes tricky. And I think that's something that can also change over time, where if we see that kind of the pages that you have are really strong and good, maybe it's something that you can build out and say, well, I will take this page and split it out into two pages instead of one page. And I can make sure that these two individual pages are actually very good and useful. So that's a bigger number. He has something like 16 million pages. He has some kind of algorithm that creates and displays a pretty comparison of two types of programs. But still, I'm not really sure if it's worth it to get indexed. So especially if it's automatically generated, then that's something where I would be very critical when you look at those pages. Because it's very easy for someone to take a database of a million products and set up this comparison or compare this product with that product. But actually, when a user looks at that page, it's not that useful. So that's something where just because you can create so many pages from a database doesn't mean that it's a good use of all of the resources. Hi, John. Hi. We have faced a strange issue a few days ago. Not a few days ago, the last two months. Two months ago, we rebuilt a website and launched the website. So when we launched the website, we noticed the site has a certificate, but there is no redirection from HTTP to the HTTPS version. So we did this redirection. We set it up properly. We checked everything was working well. But about one month later, we noticed that we lost all the ranking for the keywords. Like for most of the keywords, there is no ranking for the website. And then today, when I checked the ranking, I found that for some keywords, we got back our ranking. So this is the first time we've faced this issue. We've never faced this issue before. The good side is, we are getting back our ranking. But the downside is, for the two months, we did not get any organic traffic. So what is the issue, or why does it happen? I don't know. So I think that's the main answer. But one of the things that is always a bit tricky with these kind of things is sometimes there is a technical issue with regards to moving a website. If you're making this kind of a change, a move to HTTPS is essentially moving a website. So there could be something technical that maybe wasn't working that well with that specific move. It could also be that there were just some general search quality ranking changes that took place at about the same time. So that's something that definitely did happen. We made a number of updates to our core algorithms in the last couple of months. And these are things that we do all the time. But sometimes they have a more visible effect than others.
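As a quick aside on the HTTP to HTTPS move described above, a hedged spot-check like the following (using the requests library, with placeholder URLs) can confirm that sample HTTP URLs permanently redirect to their HTTPS counterparts and that the final pages don't carry a stray noindex. It is a rough sanity check, not a complete migration audit.

```python
# Rough spot-check for an HTTP-to-HTTPS migration: does each HTTP URL permanently redirect
# to its HTTPS twin, and is the final page free of an obvious noindex?
# The sample URLs are placeholders; run it against a representative sample of your own pages.
import requests

SAMPLE_URLS = [
    "http://www.example.com/",
    "http://www.example.com/category/page-1",
]

for url in SAMPLE_URLS:
    response = requests.get(url, allow_redirects=True, timeout=10)
    first_hop = response.history[0] if response.history else None
    redirected_ok = (
        first_hop is not None
        and first_hop.status_code in (301, 308)
        and response.url.startswith("https://")
    )
    # Crude check: looks for "noindex" anywhere in the HTML or in the X-Robots-Tag header.
    has_noindex = ("noindex" in response.text.lower()
                   or "noindex" in response.headers.get("X-Robots-Tag", "").lower())
    print(f"{url} -> {response.url} redirect_ok={redirected_ok} noindex_found={has_noindex}")
```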
So if you make a site migration right about at the same time that we make a change in our algorithms and you see a change in rankings at the end, then it's really hard to tell, is it because of this core algorithm change that Google made, or is it because of the site move that I made? We saw another thing. When we made this change, we created a Google Webmaster Tools property for the HTTPS version, and we submitted the sitemap again for that version. So about one month later, when we noticed the ranking was dropping, we checked how many URLs were indexed by Google. So we found that Google de-indexed all of the HTTP URLs but had only indexed half of the HTTPS URLs. So what can we do? Why is it not crawled at the same time? Is this some limitation of Googlebot? That shouldn't be a limitation from Google's side. So site moves, especially moves from HTTP to HTTPS, are things that we have a lot of practice with. And if it's implemented properly on a website, then usually we can pick that up within a couple of days. So that shouldn't be something where URLs disappear completely and then they appear sometime later in the HTTPS version. So my guess there is maybe something went wrong with the migration, with regards to the redirect, with regards to maybe the rel canonical, or maybe there is a noindex involved somewhere, something along that line. It sounds like, since you mentioned that the rankings are back now, that things in the end settled down properly. But it might be that with the redirects or with the robots.txt or with the sitemap files, something was not quite the way that it could have been. Thank you. Thanks, John. Sure. All right. A question about the Trust Project. The Trust Project, an international consortium of news organizations, says Google and Facebook have been using the indicators of the Trust Project and their associated tags in various ways, such as in their algorithms or by displaying the trust indicators themselves. So is belonging to the Trust Project a ranking factor for news? If not, is it possible that it will be in the future? I don't know. In general, I suspect these kind of associations and memberships are not something that we would use as a direct ranking factor. So that's something where I suspect you have more of an indirect effect here, in that if other people trust your content more because they realize that you're actually taking part in these organizations, and these organizations are maybe organizations that don't just accept anybody who randomly wants to join the organization, then that's something where you'd probably see an indirect effect there. I don't know if we would use something like this as a direct ranking factor. I think that could be kind of tricky. And with regards to whether or not it could be a ranking factor in the future, I think that kind of goes in the same direction, in the sense that these kind of organizations and memberships with these organizations I think are generally a good thing to help move the ecosystem forward, but it's not necessarily something that we would say means that the content is somehow more relevant or it's better content when we would show it in the search results. So from that point of view, I don't think that this is something that we would use as a direct ranking factor. One in three of my fetch and render requests in Search Console returns a temporarily unreachable error. I can't see this reflected in my logs. Are you aware of any issues with the fetch and render tool?
I'm not aware of any general issues that would be causing one in three requests to fail, but it is something that we sometimes see from users in the forums and elsewhere when we talk with them. So there are a few things that kind of come together here, especially when you're rendering a page with the fetch and render tool. We try to fetch all of the embedded resources as quickly as possible for this tool, for the rendering. And in doing that, we have fairly strict deadlines with regards to how quickly we need to add that content back. And that's something where if some of this content can't come back in time, then that content might be seen as temporarily unreachable. And this is something that's a bit unique to the testing tool and not something that would generally be reflected in Search, because in Search we don't have to bring the result back immediately. We don't have to look at the absolutely current version of these embedded URLs on these pages. We can cache those a lot more aggressively. We can take our time and say, well, we need to render this page. And before we do that, we need to get all of this embedded content. And that's something that we can kind of schedule and work around with there. So in Search, we're a little bit more flexible. We have a bit more flexible deadlines. We have a lot more caching that's happening for the embedded resources. So that's something where just because you see temporarily unreachable errors in the fetch and render tool doesn't necessarily mean that that would be a problem when it comes to Search as well, especially if you're seeing that most of your requests are actually rendering well. And that's kind of a sign that technically there is nothing really broken there. It's really just a matter of us not being able to get all of this content as quickly as possible. There are two things that you could do to kind of help there. On the one hand, you could read up about the crawl budget. We have a blog post about that from, I think, about a year back from Gary, where we talk about how we limit how many requests that we do to a server and what you can do to kind of improve that. So make your server faster is one approach there. The other thing that you can do is to think about ways to reduce the number of embedded URLs that are required to render your page. So this is a general thing that we recommend, in particular with regards to speed. So for example, if you have, I don't know, tens or I don't know, hundreds of URLs that are needed to actually be downloaded in order to render a page, then that's something that can make it slower for your page in general. And that's something that oftentimes you can find ways to either reduce those requirements or to combine those URLs into kind of a shared URL. So for example, if you have a lot of different CSS files, maybe there is a simple way that you can compile all of those CSS files into a single file. And by doing that, you save a lot of requests similarly with JavaScript. Similarly, if you have different tracking systems set up on your page, maybe it makes sense to find a way to reduce the number of unique tracking setups that you have and just focus on maybe one or two setups that really reflect what you need to see. So those are kind of the directions I would go there. Let's see. Yeah. Yeah, I have just a follow-up question on your last question or last answer itself. So I wanted to know recent time. Recently, in our website, our page download time has been increased a lot. 
And because of that, we have also seen the pages crawled per day have dropped significantly. So just my question over here is, is there a correlation when Google thinks that the server is slow or it's taking too much time to download a page? Can it also affect the crawling aspect and also the ranking, let's say if Google is not able to fetch our updated content as frequently? Could it be a possibility that it caches my pages for a longer period, and then it might affect the ranking itself? Yeah, those are a lot of different aspects. So I think, first of all, it's important to know that crawling does not mean ranking. So if we can't crawl a page now, but we can crawl it tomorrow, then that wouldn't mean for us that this page is lower quality, that we shouldn't be ranking it as well. So just because we're not crawling a page as often as maybe before or as often as we could otherwise doesn't mean that we would rank that page lower. The one aspect there that does come into play with regards to the crawl rate is if you have new content that you keep putting on your website and we're not able to keep up with crawling that, then obviously that new content isn't something that we'd be able to show in search. But if these are existing pages that we're just not crawling that frequently, then that doesn't really matter. So you don't need to artificially push the re-crawling all the time of the content that you have. With regards to the time spent downloading, I believe that's in the crawl stats in Search Console. That is something that does play into the general crawl budget theme, in that if it takes long for us to download individual URLs from your website, then we will generally crawl fewer of those URLs every day, just because we don't want to overload your server. So that's something where if you go and look at those crawl stats in Search Console and you see that they're quite high, then that's something where I would try to figure out why that is happening. If that's maybe a script on your pages that's just taking longer to run, maybe a database that's fairly slow in the meantime, all of these things can kind of come together there. And sometimes there are simple approaches that you can take to fixing that, like implementing caching or implementing a CDN or something to make it easier for Googlebot and users to get a cached version of those pages. So that's something I would certainly look at there. But again, if your website is not changing that frequently and we're able to keep this content that you have on your website in our index, then it's not the case that you need to make crawling faster. Because we will rank your pages anyway. It's not that we will rank them lower if we don't recrawl them that frequently. Thanks, John. Hi, John. I have one more question. One of our clients has an e-commerce website. It is a gift website, actually. So during Christmas, they launch a Christmas landing page, and they add all the Christmas gifts on this page. But after Christmas, and they do it every year, they take off the page. They remove the page totally from the website. And when Christmas comes, they publish the page again. So does it affect the ranking? Because every time they are taking it off, Google finds 404 errors, and they're indicated in Google Webmaster Tools. And then when they publish the page, it takes time to index the page again. Is it because they are taking it off every time?
I think that's always a tricky situation if you have something that's so seasonal that you want to remove the content afterwards. Because when you remove the content and we remove it from our index, then the next time we find the content, again, we have to think about, is this content really here, or is it just going to disappear again? So that's something where I don't know what the best approach there would be. I think if you can keep your Christmas content up for a reasonable amount of time, that it's not just a matter of days that we have available to actually index this content, then probably that will work. In general, you also need to make sure that it's clearly relevant within your website as well. In particular, on your home page, elsewhere on the website, kind of link to your Christmas content. So it's not just that we can find those URLs, but actually so that we see that it's actually something really important that you think is relevant there. And that's something that you can do for seasonal content in general. So one thing you could also do is use one single URL and reuse that depending on the individual seasons that you want to target. So if you have, I don't know, let's say Thanksgiving content, and then you have Christmas content, and then you have Easter content, or whatever you have, then it might make sense to have one page that's just kind of like for seasonal activities, essentially, where you swap out the content depending on what season you're trying to target. That would make it a little bit easier for us to recognize this is actually something really important because that one URL would collect links over a longer period of time. It'd be something that people could refer to and link to in the long run. And that's something that would help us to better understand that this is actually pretty relevant. Even if the theme of that page changes over the course of the year, as long as you're not changing the theme of that page day by day, then I think in general that would work out. Thank you, John. Then we have another question here. We currently serve a country selector overlay on our websites. The Googlebot render preview isn't showing anything below the fold. It's not possible to scroll. Is Googlebot actually able to see my content? It depends a lot on how you implement this kind of a country selector. Our general recommendation for this situation is that you use a banner instead of an interstitial. Because by using a banner, we can always pick up the rest of the content. And by using a banner, users, when they go to those pages, they'll still be able to see all of your content. They can still use that selector to go to maybe their local version. But they'll still be able to see the rest of the content if they want to look at the currently index version or whatever version they happen to click on. So that's kind of the direction I would take there. You can, in some cases, use an overlay to kind of do that a little bit more forcefully. But it's always a bit tricky with regards to kind of interacting with the user and making sure that it actually works well for search. So I generally recommend working around this extra difficulty and instead just using a banner instead of an interstitial. And Charlie? Hi. Yeah, hi. So I have a kind of similar question over here. I mean, it's not with an interstitial, but rather to have a banner. So we are kind of a personal finance website. And we do run certain promotions over certain products. 
So we tend to show certain banners on our website, pages like a few, you can say, sections of the website, which might be not relevant for the theme of the page. So is it somehow going to affect my that page quality? Like I mean, for instance, like if the page is about home loan, but I am showing them a promotion of the credit card. So will it kind of make any negative impact on the page? In general, these kind of things wouldn't be something that we would see as being problematic. But you do want to look at those pages overall and think about what is the primary theme of this page and how can I make it clear to users and to search engines that this is really what this page is about. So in general, it's not a problem if you look at, say, one article page in the site bar, it lists related articles or it lists new articles that you also have on your website. That's something that's a very common kind of setup on a website. But you kind of want to limit all of those related or unrelated things that you're linking there to maybe a minimum so that it's really easy for everyone to go to those page and say, well, this is the primary content. This is what we should rank this page for. This is what users should look at when they go to this page rather than all of these other kind of distracting elements as well. So that's generally the recommendation I would have there. It's not that from an SEO point of view you shouldn't have any kind of, let's say, distracting content on a page. I think that's kind of normal in any bigger website. But you should try to limit it so that it's really easy for us to pick up the right content. Sure, sure. Thanks. All right, kind of sudden rise of 404 errors. So thousands of new unwanted pages generated by CMS that we removed caused the rankings to drop. These pages were never meant to be there. If these pages were never meant to be there and if they never had any content on them, then that's perfectly fine. We see 404 errors all the time. It's not a matter of us saying that a website is lower quality just because it has 404 errors. If anything, having 404 errors is a sign that technically you're doing things right and that when a page doesn't exist anymore, you give us an error code. And that's the right way to handle it. So that's generally not something I would worry about. I would, however, in a case like this, if you're moving from one CMS to another or if you're making significant changes on your CMS, try to look at those pages and think about where are these pages coming from? Is there perhaps something with the internal linking that's broken? Is there perhaps something with the CMS setup that used to provide content on these pages and now doesn't? And kind of trying to dig into those errors, especially if they're fairly new, especially if you've made significant changes in the CMS, just to double check that there isn't something kind of where you're accidentally leading people to a 404 page that you can fix on your site as well, where maybe instead of a 404 page, it should have been a redirect, all of these kind of things. So that's something where, from an SEO point of view, 404 errors are perfectly fine. You can have a ton of them. Some sites have millions of 404 errors. That's perfectly fine. If you see a strong rise in these, especially after you've made changes with your CMS, then I will just double check to make sure that these aren't caused by anything that you can influence on your site. 
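As a small illustration of the "maybe it should have been a redirect" point above, here is a minimal sketch of catching known old URL patterns after a CMS change and returning 301s instead of 404s. Flask and the specific path prefixes are assumptions; most CMSs and web servers have their own redirect mechanism that would do the same job.

```python
# Minimal sketch: catch known legacy CMS paths and 301 them to their new equivalents
# instead of letting them 404. The framework and path patterns are assumptions.
from flask import Flask, redirect, abort

app = Flask(__name__)

# Old prefix -> new prefix, discovered by reviewing the 404 reports after the CMS change.
LEGACY_PREFIXES = {
    "/old-blog/": "/blog/",
    "/produkte/": "/products/",
}

@app.route("/<path:old_path>")
def legacy_redirect(old_path):
    requested = "/" + old_path
    for old_prefix, new_prefix in LEGACY_PREFIXES.items():
        if requested.startswith(old_prefix):
            # Permanent redirect so search engines transfer the old URL's signals.
            return redirect(new_prefix + requested[len(old_prefix):], code=301)
    abort(404)  # genuinely gone pages should keep returning an error code
```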
In E-A-T, is there any percentage that divides expertise, authority, and trust? For example, expertise needs to be 50%, authority 30%, and trust 20%. No. I think, in general, so E-A-T comes from our quality rater guidelines. And in general, I think it's very useful to look at those quality rater guidelines because they give some idea of where we would like to head with regards to search. But it's important to realize that the quality rater guidelines are just some guidelines that we give our quality raters when we try to evaluate algorithms. It's not the case that we take the quality rater guidelines and, one-to-one, turn them into code that does all of the ranking. So the quality rater guidelines are not, one-to-one, our ranking algorithm. So instead of focusing on these things in minute detail, it probably makes more sense to focus on the bigger picture and think about, so Google thinks about authority and expertise and trust when it comes to websites. Therefore, maybe I should work on those things, because I know I'm not showing any authority on my website, because maybe I don't have any names on my website, or maybe it's just all content that I happened to have published long ago and I never really got around to actually showing users where this content is actually coming from. So these are things I would take as general feedback for a website to help improve it. But it's not something that you would want to implement, one-to-one, and say, well, I need to have exactly this factor exactly like that, and then I will rank number one. That's not going to be the case. Is the item type Service supported by Google for reviews and ratings? The review guidelines mention services, but the supported item types don't include Service. So I would see the supported item types on the developer site as the authority, if you will, on the types of markup or, what should I say, kinds of objects that you can use this markup for. So we work very hard to make sure that the documentation on the structured data side, especially in the developer documentation, is as up-to-date as possible with regards to the specific types and combinations that we support. I realize we support a lot of different types of structured data, and sometimes it's tricky to figure out which types apply to your particular case. But in general, we try to be as explicit as possible in those guidelines with regards to what we support. That doesn't mean you shouldn't use anything that we don't support. It's always a bit of a chicken and egg problem, in that if we think about the next type of structured data that we do want to support, and we see nobody is actually using it like that online, then perhaps we'll try to find something else. So if you feel like you really need to use this particular type of structured data for your specific use case, then by all means, use that, even if we don't actually use that directly as a visible element in the search results. But if you do want to focus on the aspects that we visibly use in search, then that's probably the requirements that we have in the documentation. If the documentation is not clear with regards to your specific use case, then posting in the forum is great. Letting me know directly is also an option. Then I can take a look at that with our documentation team and with our structured data teams to see, do we need to clarify this more? If this is something that we find people are getting confused about, that we could help them be less confused about, then we'll try to do that.
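As a hedged example of sticking to a type the documentation does list, the sketch below emits review and rating markup on a Product rather than on an unsupported Service type. The values are placeholders; the current structured data documentation is the authority on which properties are required.

```python
# Sketch: emitting review/rating markup on a documented type (Product is used here as an
# example). The values are placeholders; check the structured data docs for the exact
# required and recommended properties before relying on this.
import json

product_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Gadget",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.3",
        "reviewCount": "87",
    },
}

def json_ld_script(data: dict) -> str:
    """Wrap a structured data dictionary in the script tag a page template would embed."""
    return ('<script type="application/ld+json">'
            + json.dumps(data, indent=2)
            + "</script>")

print(json_ld_script(product_markup))
```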
We're definitely not trying to make structured data more confusing than it could be. It's always hard. One of my sites has tons of 404 errors due to an error in an SEO plugin. And now whenever a user visits my site through Google, it gives a 404 to them. OK, so I think this is one of those cases where looking at the 404 errors and the sources of the 404 errors is useful. So in particular, if we've been indexing pages and now they return 404, that's probably a bad thing. What I would do in this particular case is try to find a pattern for those pages that are currently indexed and leading users to 404 pages, and set up redirects so that you can redirect users to your actual content. In Search Console, when checking the status of a resource for a site, it displays a message about a security problem, harmful content. At the same time, in the section for URLs of pages with problems, the information about the last time it was detected is missing. Checking the domain with the Transparency Report's Safe Browsing page also shows nothing. After sending the site for a review, the manual measures taken due to the security problem are removed. But after a few days, they're displayed again in the account. I don't know. Feels like this is something we probably need to take a look at individually. So if you have a thread in the forum, that would be something that we could take a look at there. Let's see. Oh, looking at the screenshots that you linked to, the issue that's shown there is uncommon downloads. In general, this is something that goes away over time. And if you submit a review request, we'll take a look at that as well. But it might also be something that you can help influence on your site. So in particular, if you have downloads on your website that change all the time, for example, if you make your downloads so that they're unique every time a user tries to download them, then that could be a reason for us to flag this as an uncommon download, because that download will be essentially a one-time download that we can't easily check before a user downloads it. So an uncommon download is not something that we would say is always problematic. It's just, like the name says, something that we haven't seen a lot of and that we haven't been able to double-check. So you can help us to double-check these by requesting a review. Or if you know that you're generating a lot of different download files for people that are all unique, then maybe think about ways that you can have fewer downloads that are easier for our systems to double-check before users start to download them. I believe there's also a Help Center article with regards to what can be done there. John, can I ask a related question also? Sure. Sure. OK, so we have some kind of add-on for push notifications. This add-on has some kind of resources that it downloads from external domains. Well, we don't have any alert for malicious content or something, but when we search for these domains, we see that many public articles mark them as spam. So we're wondering if we need to stop using this add-on, or maybe require a special, only-for-us domain to host the push notifications. How do you see that they're flagged as spam? We just search the domain in Google and we see some review sites or some kind of security websites that mention that these domains have malicious content. OK, so not something from Google. So one thing I would try to differentiate is if it's just seen as spammy or if it's actually malicious content.
Because if it's just seen as spammy, then generally, that wouldn't be that much of a problem for us. I mean, it's perhaps a sign to rethink the relationship with that company. It's like, are they really doing the right things for our users with these push notifications, or is our setup with these push notifications through that service perhaps something that users don't really appreciate? That's kind of something to look at on the side. If it is malicious, then that's something where you probably want to be a little bit even more critical. Because most likely, you're including their JavaScript within your pages. So if their JavaScript does anything problematic, or if their website, for example, regularly gets hacked and the JavaScript is modified by third-party people as well, then essentially, you're putting this content on your website as well. So if that content were, for example, to serve malware in some extreme case, then your website would be serving malware. And that's something our systems, when they look at your website, they just see, oh, this website is serving malware to users. We will flag your website as serving malware. Because when users go there, they get malware. So those are kind of the different approaches I would take there. Kind of try to figure out, is it really malicious? Is it really problematic, the code that they're providing? Or is it more a matter of it being seen as spammy, and maybe it's kind of more of a quality thing for you to work on? OK, thank you. Hey, John, quick question. I've been trying to see how I can message you on Google+ with the specific examples for the websites that are not as relevant, but I can't see it. Can I email you instead with the examples? Let me just drop my email address here. Thank you. And also, we just removed 30,000 pages, but these are old pages from 2000. Do you know how long it will take for Google to see that they're not there anymore? Probably a long time. So if these are old pages and they haven't been changed in a long time, then I suspect we probably only recrawl them every couple of months. So probably, if you're looking at a lot of pages, I would look at maybe perhaps a timeline of three to six months, something like that. If it's something that you need to have removed quickly because they're very visible in search and lots of people are clicking on them and you don't want them to go there, then you can work with the URL removal tool. There's a limit to the number of individual URLs that you can submit there, but maybe there are things like subdirectories that you can submit that take out all of these old URLs more easily. OK. Yeah, because I'm afraid that old doctors' answers regarding, like, headache, some 70 URLs of those, are ruining the ranking of a new article on headache, because Google has seen, oh, there's a lot of low quality pages here. So by having that there, new articles won't have the chance to rank. Is that possible? That's sometimes possible. That can happen. So we do try to look at a website overall to see how the quality of the website is overall. And if you have a lot of low quality content, then that could be something that makes it harder for us to recognize the high quality content. If the content is just older and it's kind of OK, it's just older, then that's not really a sign that it's low quality content. So that's not something I'd really worry about. But if you're saying that you have old content and new content that's essentially on the same topic.
Maybe it makes more sense to redirect the old pages to new ones so that you can kind of combine things rather than just delete them. OK. Perfect, thank you. OK. All right. Let's see. There's one question here about, I think, mobile-first index. We have separate URLs for desktop and mobile, so example.com and m.example.com. The m.example.com, in terms of hreflang, refers to the desktop version of the pages. Should we change this to the m.versions? Yes. Ideally, you would have the m.versions of the page linked by hreflang to the other m.versions. So it should always be the same type of page linking to the other page of the same type. So mobile to mobile, desktop to desktop for hreflang. That's especially the case with mobile-first indexing when we primarily index the mobile version of the page. Then we need to have that hreflang link between the mobile versions. To some extent, we can still figure that out if you have the canonical set to the desktop. And the desktop also has the hreflang. But ideally, it would really be mobile to mobile for hreflang. And I believe we documented this a while back. So I try to look at that in the developer documentation. So John, just a brief addition to our question. OK. Yeah. By the developer section, you mean the best practices, just? I believe so. So there it is written down that the mobile hreflang shows to mobile then. But yeah. But you saw that we are worried about the maybe internal linking on our mobile side. And anyway, we should switch. I would do that. Sometimes it's very easy to make that change. Sometimes it's a bit trickier. But especially with mobile-first indexing, it's very important for us to really see the connection between the mobile pages. OK. Thanks very much. Let's see. Not that many questions left. One really long question, which is really tricky to get through live in the Hangout. That's something probably would be easier by through the health forums or in a shorter question, if you can make that. Let's see. Some of the other ones that we have here, new fashion brands. Webmaster becomes aware of adversarial SEO techniques. She fears these will damage the brand's sales and image. Namely, someone tries to lower the brand's search rankings and customer reviews by posting spammy contents on external discussion boards. Third-party websites now have fake news articles about this brand's product quality. How could Webmaster fight this, technically? Tools like Webmasters are good for alerts, but is there anything that's missing? How does Google protect Webmasters from attacks happening outside of their websites? So for the most part, we're aware that this happens, and it's not something I would really spend too much time on. If you're seeing that there are problematic links to your website, you would use a disavow tool. If you're seeing that things are being done with regards to your brand, with regards to your website in ways that you think might be maybe even illegal, then that's something you can take action on on that basis. That's not something that Google would be able to essentially help you with. We can't make that kind of a judgment call. But if you're seeing this is happening on some third-party websites, then you can get in touch with them and see what you can do to help clean this up. So that's something where I would generally try to get this cleaned up, because it makes it a bit easier for you to focus on your core business. And you tend not to lose as much sleep over what or not this actually has an effect. 
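Returning to the hreflang question above: for a separate m-dot setup, the mobile pages would carry hreflang annotations that point at the other mobile versions. A minimal sketch follows; the hostnames and locales are placeholders.

```python
# Sketch of the "mobile hreflang should point to the mobile versions" point, for a separate
# m-dot setup. The hostnames and locales are placeholders for illustration.
MOBILE_VARIANTS = {
    "en": "https://m.example.com/page",
    "de": "https://m.example.com/de/page",
    "fr": "https://m.example.com/fr/page",
}

def hreflang_links_for_mobile_page() -> str:
    """Build the hreflang link tags to place on each mobile page (mobile pointing to mobile)."""
    lines = [
        f'<link rel="alternate" hreflang="{lang}" href="{url}">'
        for lang, url in MOBILE_VARIANTS.items()
    ]
    # The desktop pages would carry the same set of annotations pointing at the desktop URLs,
    # alongside the usual rel="canonical" / rel="alternate" media pairing between the versions.
    return "\n".join(lines)

print(hreflang_links_for_mobile_page())
```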
With regards to links, again, that's fairly easy to do with just the disavow tool. For the most part, I don't think this is something where you'll see any big issues. How important are pages per session and average session duration in ranking a website? In Google Analytics, our benchmarking page values seem to be a bit lower than others in our industry. Should this be something we try to improve and focus our efforts on? So we don't use Analytics when it comes to search. So just because you're seeing something in Google Analytics doesn't mean that we would use that as a ranking factor. I think maybe that helps as a first step. However, especially when it comes to speed, we do use speed as a ranking factor, at least as a small one. And speed is something that has a fairly big influence on how users interact with your pages. So oftentimes, there's a much stronger effect when it comes to speed with regards to user behavior, how many pages users look at, those kinds of things. So that's something where if you're seeing that your website is slower or worse off with regards to speed compared to maybe your competitors, then that's something I would certainly focus on, even if it's just for the users themselves. Because if users find a fast website, in general, they tend to stick around longer. They tend to click around more within the website. They tend to convert a little bit better. That's something that especially some of the larger e-commerce websites have done numerous experiments on. Some of these have been written up. And you can really see that even small changes, like a couple hundred milliseconds, where you'd think nobody can even notice that small of a speed difference, do have a measurable impact on how many users convert and how they convert. So definitely I would look at speed if that's something that you can do better. All right. So I think we have pretty much everything covered here. What else is on your mind? What else can I help with? I have one related to Elene's question. Is it good practice to send an XML sitemap for the deleted, 410 files? A sitemap for deleted URLs. So you can do this, but I would do this only temporarily. So if you've just removed a bunch of your pages and you want Google to recrawl them a little bit faster, the sitemap file can help us to do that. But in the long run, you should remove that sitemap file, so that we don't run into the situation where we think, oh, you're telling us these URLs need to be indexed, but they haven't been there for such a long time, like you're confusing us. So temporarily is fine. For the long run, I would not do that. OK. There is also one question in the chat, if it's relevant for you. OK, let's see. Do brand search trends and social signals affect the ranking? So for the most part, I don't think we use social signals at all. I don't know what you mean with regards to brand search trends. I think in general, if people are searching for your brand, that's usually a good sign, because on the one hand, that means they explicitly want to go to your website. On the other hand, there's usually not that much competition if someone is explicitly looking for you in search. So that's one of the reasons it can make sense to really think about how you want to be seen in search in general, like what brand you want to use online, what domain name you want to choose, having something that really makes it clear that when someone is searching for this, they actually want to go to your website. They don't want to read about this term that they search for.
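As an aside on the temporary sitemap for removed URLs discussed above, a minimal sketch of generating one is below; the URLs and date are placeholders, and the file should be taken down again once the URLs have been recrawled.

```python
# Sketch of a temporary "sitemap for removed URLs": list the deleted URLs with the date they
# were removed so they get recrawled sooner, then delete this sitemap again once they have
# dropped out of the index. URLs and the date are placeholders.
from xml.sax.saxutils import escape

REMOVED_URLS = [
    "https://www.example.com/old-article-1",
    "https://www.example.com/old-article-2",
]
REMOVAL_DATE = "2018-11-30"

def removed_urls_sitemap(urls, lastmod) -> str:
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc><lastmod>{lastmod}</lastmod></url>" for u in urls
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>")

with open("removed-urls-sitemap.xml", "w") as f:
    f.write(removed_urls_sitemap(REMOVED_URLS, REMOVAL_DATE))
```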
They really just want to go to your website. And that's something that from search point of view, if they really want to go to your website, then we'll show your website. There's no real competition for that kind of a query. So that's something that I think usually makes sense to spend a bit of time to think about what you can do to make sure that users are able to remember your website, your business, and then explicitly go back to your business should they want to go back. John, I have two questions here. So one, so when somebody, let's say, users searching for your brand terms along with the products, let's say, Amazon or Amazon, maybe some other things, well, so does Google relate this thing? Like, I mean, this particular brand is relevant for this specific product. And maybe in the algorithm, it's something they can detect it like, OK, so people are searching for these products. And they are quite frequently using the other brand name. So maybe Google can kind of push that ranking for that specific brand up in their search result page. I don't know if that would generally make sense. I think if people are searching for a kind of a product category and your brand, then obviously we'll try to bring that combination out in the search results, but that doesn't necessarily mean that if people are just searching for that product category, that your brand is automatically the most relevant one for that. So that's something where if they're searching with your brand name, then obviously we'll try to show you there. But it doesn't necessarily mean that your general rankings would be higher just because some people are searching for your brand name as well. Sure. My other question is regarding to having content on a website with you can say similar topics. I can say because of the nature of the query or the page itself for the topic, it's like we have divided into two. The one where we are showing the kind of listings, where it makes sense to come in and to allow users to navigate to our different pages from that specific topic. And other page we have specifically talking about the usefulness of that product. So what happens is our main page is not ranking really well. But on the other hand, where we are providing all this content based page, so it is providing pretty well. So I'm just wondering if we are able to merge these two pages all together, will it make a better kind of in terms of search visibility wise? Because I'm having also a doubt to worry, if I kind of merge this, this page would become huge. And maybe my user might not want to kind of scroll through all these things. So just wondering how we should go about that one. I would test it. I don't think there is a clear yes or no answer to should we combine these pages or should we keep them separate. I think that's something that I would generally try to test and see how users react to one page. Also, perhaps think about what you can do to make one page that's not just this page plus the other page, but maybe something more organically combined that's maybe the shorter or maybe that works better for the users. But especially if you're looking at a bigger website that has a lot of these kind of pages, then try to take a representative sample of those pages and find a way that you can do some simple A-B testing to double check to see if your hypothesis is correct, that maybe fewer pages would be better in this case or maybe these two pages separate actually work a lot better than when they're combined. 
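As a rough sketch of the testing idea above, one simple approach is to bucket a representative sample of pages deterministically into the two treatments and then compare conversions rather than page views; the page assignment and the numbers below are made up for illustration.

```python
# Rough sketch: deterministically bucket a sample of pages into "combined" and "separate"
# treatments, then compare conversions, not page views. The numbers are made up.
import hashlib

def bucket(page_url: str) -> str:
    """Assign a page to a treatment group in a stable, repeatable way."""
    digest = hashlib.sha256(page_url.encode("utf-8")).hexdigest()
    return "combined" if int(digest, 16) % 2 == 0 else "separate"

# Hypothetical results collected after the test has run for a while:
results = {
    "combined": {"sessions": 4210, "conversions": 157},
    "separate": {"sessions": 4187, "conversions": 139},
}

for group, data in results.items():
    rate = data["conversions"] / data["sessions"]
    print(f"{group}: {rate:.2%} conversion rate over {data['sessions']} sessions")
# A proper readout would also include a significance test before acting on the difference.
```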
And that's something where you need to also think about what your overall goal is for this page. So don't just focus on things like page views or whether people are scrolling to the bottom. But if you're trying to convert users to buy something, to sign up for something, then track the conversions and see if users are actually completing that action, rather than just the number of page views or the number of rankings that you have for that particular page. Yeah, true. So basically, one page is the one with the potential for the conversions. And that's our issue, that this page is not ranking well. Basically, it doesn't have that much in-depth content or the kind of usefulness on how we are making something better for the users. On the other hand, the other content is very, you can say, text rich, or really good content where the user can actually interact and can see how this particular product is actually helpful for me. And ultimately, this content actually leads users to my main page. But it's like, when users are looking for certain queries, my content pages do better than my product page. I think that's something that's probably really best to just test. Yeah. All right. Thank you. All right. OK, any other last questions before we head off into the weekend? What else is on your mind? No more questions. But thank you for your help today. All right. Great. Thank you. Well, I have one. OK. An advanced one. So we have a site that ranked for some kind of topic. He rebranded. Now he potentially wants to rank on another topic. The website has many subdomains. So one of the subdomains is going to create an AdWords campaign with the topic, the old topic. So we know that it maybe has some kind of indirect effect on organic. But we're wondering if it can get us a negative effect on the organic positions for the old topic that we don't want to get mentioned again. So there shouldn't be any organic effect from running an ads campaign for something like this. So that's something where maybe you would see an indirect effect in that people go to your content. They really like it. They recommend it to other people. And that's something that we could pick up on. But there wouldn't be any effect from just running ads on a website. And that's kind of like you mentioned, like this worry that maybe it has a positive effect. Maybe it has a negative effect. That's something that we hear every now and then, that people say, oh, running ads means that your website shows up lower in search so that Google can run more ads. And others say running ads means your page shows up higher in search because Google is getting money for the website. Both of those are not the case. It's not that there's any connection between the ad side and the search side when it comes to ranking. So that's something where you can really focus on your website as it is from an organic point of view. And try to find the right approach there. I'm kind of curious, when you mentioned that you already have a lot of different subdomains for this website. Sometimes it makes sense to combine these various subdomains into one kind of more primary website. Well, language related. OK, oh, language related. OK, that's, yeah, I think that's completely natural. Yeah, that's no problem. Still, we have some kind of, well, we didn't have someone to ask. It's some kind of, for example, the site was in the Forex field. Now he's switching to more, kind of, audit-office-like investing.
And he wondering if AdWords campaign for Forex can have effect on organic positions for Forex, for example. No, that shouldn't be the case. OK, yeah. I see one question in chat. Sometimes video ranks in the search results. What could be considered there? Can a short video outrank a 5,000 word blog post? Sure, it can. It doesn't have to, but it can. So I think, first of all, there's no specific word count limits that we're looking at when it comes to pages. So sometimes we'll see that come up where people are saying, oh, but my page has 700 words and this other page has 200 words. Therefore, my page should rank higher. It's not the case that, at least as far as I know, that any of our algorithms count the number of words on a page and try to make ranking decisions just based on the number of words. Sometimes pages have very few words on them and are very useful. Sometimes pages are very long and are also very useful. So all of those options are possible. So I wouldn't blindly focus on the word count. I think word count can be useful if you don't know the website overall and you're trying to figure out which of these articles are really high quality and in-depth and which of these articles are kind of thin that maybe we should focus on more. Then across your website, maybe the word count can play a role in roughly giving you some information on those articles that you have there. But in general, for ranking, it's not that we would take word count as something like a one-to-one ranking factor. And when we look at things like videos versus textual content, that's something that's always very tricky to kind of balance that ranking. Similar with regards to news, with the news carousel, with regards to images, all of these things can come together to some extent. And we have to look at very different factors on a page where we have a lot of textual content versus a page that maybe has a lot of images on it or a big video on it. And ranking those two is sometimes pretty tricky. So we have to think about what we believe or whatever that we believe would be the most relevant result for users in those cases. And the mix of the type of search results, they can and generally do change over time. So for example, what I recently noticed with regards to the big fires that they have in California, if you search for some of those fires now, you get a lot of video results. Because obviously, videos are a great way to show the effects of these fires there. I imagine if you look at those queries in a couple of months, you'll probably see more textual results and fewer videos because it's more a matter of kind of getting background information on those topics rather than looking at live videos or almost live videos. So these things do change over time. And there's no simple kind of way to compare a video result versus a textual result and say, well, this is something that we'll always rank above this content here. That mix of different types of content, that's very dynamic and that can change over time. And you will generally see changes in ranking with regards to video versus text content over time. I think that also makes it a little bit interesting with regards to search because it does show that there's no simple ranking factor that just works for all types of content, and these things change over time. So it kind of keeps it a little bit exciting and also kind of helps you to realize that you don't have to cover all of the bases when it comes to ranking. 
You can focus kind of on what makes your content unique and valuable on its own. Let's see, the message in the chat goes on with regards to Cyber Monday deals and MIDI videos appearing before sites. I think these kind of super time-limited effects are sometimes a bit tricky, and they're tricky for our algorithms sometimes to figure out as well. So that's something where I don't have a general answer for this and say, well, for Cyber Monday, we would never show videos because why would we show videos? Sometimes it makes sense to show videos. Sometimes it doesn't make so much sense. And sometimes our algorithms take a bit of time to figure this out, and by the time they figure it out, maybe that date has already passed. So it's kind of a bit of a tricky situation there. All right, let's take a break here. It's been great having all of you here. Lots of great questions that were submitted. And I hope you all found this useful, and I wish you all a great weekend. Thank you. Bye-bye. Bye. Bye. Thank you. Thanks, John. Bye. Bye.