All right. Welcome, everyone, to today's Webmaster Central Office Hours Hangouts. My name is John Mueller. I'm a Webmaster Trends Analyst here at Google in Switzerland. And part of what we do are these Office Hour Hangouts where folks can join in. Anyone who is working on a website and has questions around their website and web search, we can discuss these questions and find, or try to find, some solutions. A bunch of questions were submitted already on YouTube, which is awesome. So we can go through some of those. But as always, if any of you want to get started with a question, feel free to jump on in. I have a question, if I may. Sure. I wonder if Googlebot can crawl user-triggerable events on a page or not. What kind of pages? I am asking whether Googlebot can crawl user-triggerable events on a page or not. Like scroll events or JavaScript events, I am talking about those as well. Usually not. So how can I show this to the Googlebot? Like dynamic rendering or HTML snapshots? What kind of a technology can I use? Yeah. So anything like dynamic rendering works, and anything that you can do to make it so that the content loads with the link rather than an interaction, that helps as well. So let's say I am using a pop-up, which can be seen after a 30-second session. So I wonder if Googlebot can see this or not. Maybe. Maybe. So we render the page, and we try to give it a bit of time to settle down. But if you're doing something where after 30 seconds, something changes on the page, I don't know if that's a long enough wait that Googlebot has already rendered the page, or if Googlebot is still rendering the page. OK, I have another question. So let's say I am refreshing the web page after 30 seconds, and I am redirecting it to another URL. So can Googlebot see this or not, because it is a kind of sneaky redirect, as you know. Yeah. So maybe we can see that. It's similar. It's against these red lines, right? 
Yeah, if you're waiting a certain period of time, then it's possible that we can see that. It's possible that we don't see that. OK, thank you, John. Yeah. Hi, John. Hi. We have got a question from one of our clients. They want this clarification. So the question is, if they show their Google reviews on their website, will that affect their ranking? Will that help to improve their ranking? What are they showing on the site? Google business listing reviews. So the reviews they are getting on their Google business listing, if they show those reviews on their website, how does it impact their ranking? I don't think it would impact the ranking at all. So it's not that we would rank it higher just because of the review content. If they use the markup for the reviews for their business on their business homepage, we would not show those review stars. And maybe there is something in the reviews that is content that they wouldn't otherwise have on the page. Like if a customer writes about their service and they don't have the service described on the page, then that might be something where it could rank. But just having a Google review or any kind of review on a page doesn't, by definition, make it rank better or worse. And one more question. So we have found recently that some of our clients have a lot of issues with their structured data. And most of the structured data is actually generated from the markup. So that markup is actually generated by WordPress plugins. So it is very difficult to fix that structured data. Because if we need to fix this, we have to fix the entire plugin. So will it affect the ranking, having wrong structured data on the website or invalid structured data on the website? No. It won't affect the ranking. But it might be that we don't show the rich results for that. Like if the markup is such that it doesn't match the requirements anymore, then we wouldn't show those rich results. So it wouldn't affect the ranking. 
But it might affect how it's visible in search. And with that, it might affect how people click on the results. So even if they rank the same, if users can recognize that this is really a good result with the extra structured data, then maybe they'll click more. So it's something where I think it's worthwhile to try to get these fixed. But if you can't do it, you can't do it. But then I would consider double checking with the plugin creator to see if they can improve that. Because if they fix the plugin, then that's fixed for everyone. So I think that would be useful. OK, the last question. This is about redirection. So one of our clients has a lot of pages related to one keyword or variations of the keyword. So what we are trying to do, we are trying to remove those pages and try to put all the content in one page. And we redirect those pages to that one page. Now, those web pages rank in Google search results. So if we do this, will the new page get the ranking, or will that take time? That will take time. So if you combine multiple pages into one page, then it's not just adding the visible impressions from all of these pages up and saying, well, it'll rank like three times as much. It'll take a bit of time. In the beginning, you'll probably see a short-term effect. And then over time, you'll see that it settles down. I think, in general, it makes sense to combine multiple pages that have the same topic into one stronger page. So I think that's a good approach. Thank you, John. Sure. Morning, John. Can you hear me OK? Yes. Great. So we sell floor pane online. And one of our main category pages is floor pane. Now, we're struggling to rank in Google for that page or that category. Instead, Google's choosing, for that particular term floor pane, to rank one of our product pages. Now, we've tried various optimization techniques, sorting out the canonical URLs, the H1 tags, on-page content, everything. 
And we're still getting issues with getting that category page to rank. I just wonder if you could offer us any advice. OK. So it's basically a product page that is ranking instead of the category page that you'd like. That's right. I mean, the product page is still somewhat relevant, but obviously not as relevant as the category page, which has got the keyword in the URL and things like that. OK. And the category page is indexed. So if you. Yeah. OK. Definitely indexed. OK. So I think from a technical point of view, that's pretty much kind of OK in that if it's indexed, then we can try to rank it. What is probably, or I don't know, some of the things that you could look at here. One thing is to make sure that the category page is well linked within your website. So if you have multiple products that are all in the same category or related to that category, then link to the category page so that when we crawl the website, we can really understand this category page is actually really important. We should focus on it. So internal linking. Yeah. Internal linking is one thing. Another thing that I've sometimes seen, especially with e-commerce sites that kind of struggle with this kind of a problem, is that they go to an extreme on the category page in that they include those keywords over and over and over again. And what happens in our systems then is we look at this page and we see these keywords repeated so often on that page that we think, well, something is kind of fishy with this page with regard to these keywords. Maybe we should be more careful when we show it. So it might be that you're essentially going down, oops, Jonathan, I think you're presenting your screen. It might be that you're kind of overdoing it with the category page in that it would perhaps make sense to kind of move back a little bit and say, I will focus my category page on these keywords and make sure there's good page for that, but not go too far overboard. 
So that when we look at this page, we'll see, well, this is a reasonable page. There's good content here. We can show it for these terms. We don't have to worry about whether or not someone is trying to overdo it with those keywords. Yeah, makes sense. And quickly, do you think that, obviously, backlinks to that specific category page will help? Alongside, obviously, kind of natural backlinks to the website in general? Yeah, I mean, that's something that doesn't cause any problems. And from our point of view, in general, backlinks from other websites are something that we would see as something that would evolve naturally over time. So I don't think you'd need to go out and kind of artificially build backlinks to a category page like that. Yeah, yeah. I think what I would also do in a case like this is kind of go with the assumption that you won't be able to fix this very quickly. Not that it's impossible, but kind of assume that it's going to stick around a little bit, because sometimes our algorithms do take a bit of time to adjust and find a way to make it so that when users land on that product page, they realize there's actually a category page that might be more useful to them. So something like a small banner or some other visual element on the page, so that when users go to that product page, they can find their way to the category page fairly easily, so that you don't have to worry about the short-term problem that maybe the wrong page is ranking. And in the meantime, you can kind of work on creating a reasonable solution for the category page itself. OK, all right, thanks for your help. Sure. John, one related question about rich snippets. OK. So about the recent changes in rich snippets. So if we have a hotel website, and we embed the review from Google My Business, is it possible to show these rich results in search or not? 
If it's a review about your business, that's hosted on your business website, then we would not show that. But does it matter if we use a third party like Google My Business? Yeah, it doesn't matter where the review comes from. So if you use, I don't know what they're all called, then that's something where, if it's a review about your business that you're hosting on your business site, we would just not show that in search. Product reviews are fine. So if you're reviewing a product and that's on one of your product pages, that's perfectly fine. OK, thank you. And one related question about the question from the lady. So your recommendation for category pages is, sometimes you see some keywords, target keywords on the page, and this does not look very natural. What do you mean? These keywords are in the product titles, or? So what happens in practice there is our algorithms look at this page and think there's a lot of keyword stuffing happening. And then we will be kind of extra critical when it comes to that keyword and that page. So sometimes we see that you have a category page, I don't know, about t-shirts, and all of the products that you link have like blue t-shirt, red t-shirt, green t-shirt, and it's like everything t-shirt, t-shirt, t-shirt, and then at the bottom you have half a Wikipedia page. Kind of like, t-shirts are made out of cloth. And t-shirts this and t-shirts that. And then that's something where, when our algorithms look at this, there are like 500 mentions of the word t-shirt here. Maybe this is not the best page to show for this topic. Yeah, I see. Thank you so much, John. Sure. John, if you listen, I have another question. Sure. Let me run through some of the submitted questions first. And then I'll get back to everyone. Always good to kind of make sure that the people who weren't able to join in person also have a chance to get their question asked. I work for an agency that has mostly European clients. 
For almost all clients, we see a big spike in impressions around the 22nd and 23rd of September. And the next day they went back to normal. Is this a bug in Search Console? I am not aware of any bug related to that. So I don't think there was anything, at least from what I know. Hey, John, if I may, just related to that. There's something I sent you a couple of weeks back, and it was around the 29th of September, where we saw a huge sort of increase in searches for this particular keyword. It went from an average of about 7,000 to 8,000 searches per day to around 4 million per day. So last month, right? Yeah, yeah, that was last month. And that sort of continued. I mean, the numbers are becoming smaller, but it's still in the millions. And when we've investigated it, we've sort of looked in Google Trends as well, and we've seen the spike for the same keyword as well. And it's a bit odd, just because some countries, for example, if you open up Google Trends and you put the word loans in there, you'll see that even in Germany, in all these different countries that don't tend to search for the term loans, it just spikes up. And it correlates with the Google Search Console data, which obviously leads me to believe that there's an issue, not with Google Search Console, but with Google Search. There's some sort of bug or something that's causing impression spikes. Yeah, I think the team is looking into that. I don't know what the solution will be there, if we can kind of recalculate the Search Console data and all of that, but I know they are looking into that. Oh, but that sounds a bit different from this question, though, where it's like, in September, on two days, they saw a spike. So the question is from me, actually. Oh, fantastic. So for one of the clients, which does e-commerce, we saw a spike from 300,000 impressions a day to over 700,000 impressions. The next day, it went back to 300,000. And I have examples from multiple clients in different areas. 
Could I ask a colleague of mine that has your email to send you some examples? Sure, sure. Okay, thank you. Sounds good, yeah. I'm happy to take a look. I mean, one of the things that might be playing a role there is if this is, well, I guess probably not. But we have the new fresh data in Search Console, and that's calculated in a slightly different way to get the data as quickly as possible into Search Console. So during that time when it's fresh, you might see slightly different counts there than you would see when it's kind of settled down. But it sounds like if it's from the 22nd of September and now it's the 27th, then probably that's not what you see. But happy to take a look. Okay, thank you. All right: on our site, fetching and rendering of above-the-fold images seems to work well with the URL inspection tool. Images don't get indexed in image search though, and don't get displayed in rich snippets. Above-the-fold images are lazy loaded. Is indexing for lazy loaded images above the fold not supported? Lazy loading images, if you do them in a way that they work in the URL inspection tool, that is definitely supported for image search, and with that also for rich snippets, whatever rich snippets rely on images. What might be happening here is that these images are otherwise blocked for image search. So that could be with the... There's a robots meta tag. I forgot what it's called. To block images from being indexed on a page, that might be something that is playing a role here. It might be that the images themselves are somehow blocked from being indexed, in that we can still crawl them but we can't index them for image search. So I'm not 100% sure, but it could be that you're blocking the images with robots.txt, for example, and when we render the page, we can still fetch those images for rendering, but we just can't use them for image search. So something like that, I suspect, is happening here. 
If we can fetch the images when we render the page, then the lazy loading part, that sounds like that would be okay. And in general, images can be lazy loaded. So that should all kind of work. If you can't get this to work, then send us the URL, maybe post in the Webmaster Help Forum or drop us a note on Twitter, so that we can take a look to see where things are getting stuck. Who do you contact at Google when a malicious site generates 80,000 links to your site and tanks your rankings? You can send me a note if you want. In general, though, this is something that our algorithms are pretty good at dealing with. So that's something that we should be able to deal with without any issues on your site. Lots of spammy sites link all over the Web, and they generally don't cause any problems there. So if you're seeing bigger changes in rankings, then it might also be that these are just normal changes in ranking. If you're really worried that the site is causing issues with regards to search, then you can use a disavow file. With the disavow file, you can submit a domain. So if there's one site that's generating 80,000 pages that link to your site in a way that you don't like, then you can just submit that domain in a disavow file, and all of those links will be ignored. So that's something that you can kind of take into your own hands if you want. You can also send us a note. We can double check with the Web spam team. But in pretty much all of the cases that people have sent my way, things were fine on our side. We were already essentially ignoring those links. Can you give an example of an excessive link exchange? So there are situations where a link exchange is natural. For example, you write an article that links to a restaurant, and the restaurant might link back to the article. Yes, that kind of back and forth linking is completely natural and not something that I would worry about. 
It's quite often the case that you have a customer and you link to your customer, and your customer is really proud to kind of sell your product, and they link to you as a supplier of the products. This kind of back and forth linking is completely natural and not something that I would worry about. The excessive kind of link exchanges are really when tons of sites are linking across each other. So we sometimes see that you'll have a group of sites that work together, and in their footers, they'll cross link to all of the sites that are kind of involved in this kind of link exchange setup. So that's really something where, if you're really obviously doing that just for the sake of search engines, that's something that we might pick up on. If you're doing this in a natural way, in that sometimes people link to you and sometimes you link to them, then that's not a problem. A question about infinite scroll and pagination. What's the SEO friendly method? Is the article from 2014 still valid? Yes, you can still use the setup from the article from 2014. The important part when it comes to kind of infinite scroll is that we're able to reach all of the pages that you have involved, and the best way to do that is to link to all of those pages individually. So if you have, for example, something that's set up to do infinite scroll for you, then make sure that you also have kind of pagination links on that page, so that we can crawl the individual pages individually as well, and so that we can pick all of these pages up ourselves. The one thing that's no longer necessary from that old article is the rel="next" and rel="prev" links. Those are things that we no longer use when it comes to search, but normal links in pagination, perfectly fine, a great way to deal with infinite scroll pagination. Is it mandatory to just have one H1 tag on a web page, or can it be used multiple times? So we get this question multiple times as well. You can use H1 tags as often as you want on a page. 
There's no limit, neither an upper nor a lower bound. H1 elements are a great way to give more structure to a page, so that users and search engines can understand which parts of a page are kind of under different headings. So I would use them in the proper way on a page. And especially with HTML5, having multiple H1 elements on a page is completely normal and kind of expected. So it's not something that you need to worry about. Some SEO tools flag this as an issue and say, like, oh, you don't have any H1 tag, or you have two H1 tags. From our point of view, that's not a critical issue. From a usability point of view, maybe it makes sense to improve that. So it's not that I would completely ignore those suggestions, but I wouldn't see it as a critical issue. Your site is going to rank perfectly fine with no H1 tags or with five H1 tags. Are there plans to have Google My Business data and reporting rolled into Search Console? Similarly, are there plans to have the Search Console geographic drill down go further than just country? So I'm not aware of either of those plans. I think it would be pretty cool to have more Google My Business data in Search Console. In the past, these have been clearly separated. So it's not something that's really overlapping, but maybe that makes sense. I don't know. To be honest, I don't know exactly what is shown in the Google My Business dashboard. With regards to further drill downs than just country, I don't know if that would be easily doable in Search Console, just because of the way that the UI is set up. But that seems like a good feature request. So what I would do there, and probably also for the Google My Business data, is to make sure to submit feedback directly in Search Console. The team actively goes through that feedback on a regular basis. So if this is something that lots of people think would be really useful, then who knows? Maybe they can find a way to make it work. We fold our text with CSS so that the page is not too long. 
This is seen as a positive by users surveyed. In former times, it was said that this could be done for mobile pages. We also do this on our desktop page. Since we switched to mobile-first indexing, our idea is it wouldn't hurt desktop. Is that true? Or are our long explanatory texts no longer so important from Google's point of view, because you only see the headline and the rest is only visible after clicking? So I think there are two aspects here. One is, if users don't need these texts, then maybe you don't need them on your pages anyway. So that's kind of the more general thing. If you're folding these long texts back with CSS and people have to click on them to actually see the text, you can track on your side if users are actually using that to expand that text. And if nobody is expanding that text, then maybe this text isn't really useful for people, and maybe you can save yourself a lot of kind of questions and trouble by just getting rid of the text that would otherwise be kind of folded away like that. So that's, I think, the one thing to kind of first figure out, because if you can remove that text, then you don't have to think about, like, how do I implement it? With regards to implementation, when it comes to mobile-first indexing, we will index only the content on the mobile version of the page. So if you have different content, different HTML that you serve to mobile users and desktop users, we will only use the mobile content. And we will use that content to rank both the mobile and the desktop page. So if you're changing something on your mobile site, then that would be reflected in the desktop search results as well. It's not that we would say, well, the desktop site says this, so we'll rank it like this, and the mobile site says something different, so we'll rank it slightly differently. So in that regard, if you've changed it on the mobile site and we switched to mobile-first indexing, we're already using that for indexing. 
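As a concrete sketch of the tracking idea mentioned above: the folded text could use a native `<details>` element, with a small listener that counts how often users actually expand it. The class name and the `/track` endpoint here are illustrative assumptions, not anything from the question.

```javascript
// Sketch: fold long explanatory text with a native <details> element.
// Under mobile-first indexing, whatever is in the served mobile HTML gets
// indexed, folded or not; folding only changes what users see first.
function buildFoldedSection(heading, longText) {
  return [
    '<details class="explainer">',
    `  <summary>${heading}</summary>`,
    `  <p>${longText}</p>`,
    '</details>',
  ].join('\n');
}

// In the browser, count expansions to learn whether anyone actually reads
// the folded text ("/track" is a made-up beacon endpoint):
function trackExpansions(doc) {
  doc.querySelectorAll('details.explainer').forEach((el) => {
    el.addEventListener('toggle', () => {
      if (el.open) navigator.sendBeacon('/track', 'explainer-opened');
    });
  });
}
```

If the beacon counts stay near zero, that is a signal the folded text may not be worth keeping at all, which is the first thing suggested in the answer.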
Why do sites that have thousands of spammy backlinks get ranked higher? So yeah, I don't know. That seems like a bad thing, right? So on the one hand, we can sometimes get things wrong. So that's always an option that maybe we're getting something wrong, maybe we're not recognizing problematic things on a site. And in cases like that, always feel free to send us a spam report. That's something that the web spam team really appreciates. On the other hand, sometimes sites have lots of spammy backlinks and they rank despite those spammy backlinks. So what often happens is on the one hand, we try to ignore all of these spammy backlinks. So maybe a site is just ranking based on other signals. There are also cases where our algorithms might look at this and say, well, there's so many spammy backlinks here that we can't just ignore them. We have to assume that there's kind of a lot of malicious intent here and we have to be even more careful. And in cases like that, we might demote a site slightly in search, so it wouldn't rank as visibly. However, it might have tons of really good signals that are also associated with that site. And that might mean that we show it a little bit lower, but it's still fairly high in the search results. So just because something has spammy backlinks doesn't mean we would not show it in search. We try to take those spammy backlinks into account essentially by trying to ignore them or trying to treat them appropriately. But that doesn't mean that the site wouldn't show up in search. And I think a really good reason for us doing it like that is that there are lots of people who are confused and don't really know what they should be doing when it comes to search. And they get advice from all kinds of people. And these are people who might know what they're talking about and there might be people who have no idea what they're talking about. 
And a lot of these sites that are a bit confused, they end up doing weird things more or less accidentally, because they don't know what they should be doing better. So this is something that we saw quite a bit, I don't know, in the early days, where people would be doing crazy keyword stuffing on their pages because they thought the more that you mentioned these keywords, the more the site would rank for those keywords. And when we looked at that, we realized that these were not malicious spammers who were trying to abuse our systems, but rather just people who didn't know what to do. So that's kind of where our idea comes from of, well, if we can just ignore the problematic things that they do, then we can focus on the good things that they do, and we can rank a site based on that. So that's, in short, kind of why you might see this kind of situation where you see a competitor or another site ranking fairly well in the search results, and you dig into that site to try to figure out, like, what kind of magic trick did they find? They have all this spammy stuff and they're still ranking high, and it's like, should I do spammy stuff as well? And on the one hand, I don't think it makes sense to copy what other people are doing, but on the other hand, it's also worth thinking that maybe there are other things that the site is doing really well, and it's not so much the spammy stuff that is making the site rank well. Are the core updates based mainly on content, or a mix of content and links? The core updates that we make essentially affect the core ranking algorithms that we use, and for that we use lots of different signals. So it's not just the content, not just the links. Branded terms, for example, Harley-Davidson with a hyphen versus as two words. I write content for authorized Harley-Davidson motorcycle dealers, and I keep losing my rankings because customers don't search with the dash or with the hyphen, and for trademark rules, I need to keep the hyphen. 
So what can we do there essentially? So there are two things here. On the one hand, I took a look at some of the search results for with the hyphen versus without a hyphen and we do recognize that these are synonyms and we do try to treat them the same more or less. So that's something where I wouldn't see that the hyphen itself as being a reason for your pages not to be able to rank. That's one thing. The other thing is that if you want to appear in the search results for what people are searching, then maybe it makes sense to have content written in a way that matches what people are searching for. So hyphen or not is probably like on the scale of things like a trivial thing and it's not going to be affecting that but you could imagine situations where maybe you have a pharmaceutical product which has a fancy medical name and has a kind of a general colloquial name as well and users might be searching for that easier name because that's something that they hear about from their friends and if you're only using the fancy medical name on your pages, then you're going to have trouble ranking for those terms and whether or not you have trademark rules or guidelines that say that you should only be using this kind of long medical name, if you don't use the words that people are using to search for your pages, then it's going to be trickier to rank for those terms. It's not impossible. Like we can understand like in the case like with a dash or not, that's something easy to understand. Even in the case of a kind of a long medical name versus a colloquial name, that's something we can try to figure out that these are similar or synonyms but if you're not mentioning what people are actually searching for, then you're going to have a hard time. So if you're writing content for users and you know they're searching in a particular way, then try to take that into account. 
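To illustrate why the hyphen is the trivial end of that scale, here is a toy normalization function, purely my own illustration, not Google's actual matching code: hyphens and spaces can be folded away mechanically, whereas a colloquial name versus a medical name has no such mechanical mapping, so the page actually has to use the words people search with.

```javascript
// Toy illustration only: hyphen vs. space is easy for a search system to
// normalize away, which is why "Harley-Davidson" and "harley davidson"
// can be treated as the same term. A fancy medical name vs. a colloquial
// name cannot be bridged this mechanically.
function normalizeTerm(term) {
  return term.toLowerCase().replace(/[-\s]+/g, ' ').trim();
}
```

With this, `normalizeTerm('Harley-Davidson')` and `normalizeTerm('harley davidson')` produce the same canonical string.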
Duplicate content, I'm sharing all my content across different platforms like Medium, LinkedIn, Tumblr, and cryptocurrency blogging platforms. That one kind of stands out a little bit. Will I be penalized if Google crawls the same exact articles across the web? No, you will not be penalized for having the same exact content across the web, even if it's on cryptocurrency blogs. But you are competing with yourself. So if you have exactly the same content on multiple different websites, all of these pages can rank and all of these pages can collect signals in that maybe people link to these articles because they're really great articles but they're all still competing with each other and if you're writing in a very competitive niche, then that kind of competing with yourself could be making your pages less visible. So it's not that Google would be penalizing your site or your content and say, well, this guy has his content even on a cryptocurrency blog. It's more that you're competing with yourself. Like this article on LinkedIn is competing with the one on Medium, competing with the one on Tumblr. All of these things are essentially competing for the same terms. So usually what I recommend is to try to find one platform where you can be kind of keep your content and have that content be really strong. So instead of having all of the signals that are associated, kind of some going to Medium, some going to LinkedIn, some going to a blog, have all of those signals concentrated on one version of your content. That way we can recognize that this content is actually really important, that it's something that is kind of unique out there and especially in the competitive landscape, we can take that into account and say, well, like these other articles are pretty good but look at this one article that's like seen as the authority on this topic, then that makes it a lot easier for us to rank that piece of content. 
So my recommendation there, especially if you're kind of struggling with getting your content shown in search is try to concentrate things on one platform as much as possible. Sometimes it makes sense to reach out to other platforms and also give some information there. But the goal I think should be in the long run to make sure that your version of this content, wherever you want to kind of place your primary place of residence, essentially online, that that version is seen as kind of the authority on this topic rather than that you're spreading things around to everyone else. Then cloaking, infinite scroll, I think we talked about this a bit. So infinite scroll just you ideally use the pagination links. Then a question about the coverage report. Oh my gosh, this is a fairly long question. Probably need to look into that separately. So if you're kind of writing really long questions specific to your site, specific to something kind of that you're seeing with your pages, I'd recommend taking that to the Webmaster Help Forum where people can kind of drill into the specific problem that you're seeing and might be able to give you some advice and tips on what you can do to make that a little bit easier. Last week you replied on Twitter about updating a URL. An example, consider that a previous Webmaster has created an inappropriate URL, say very long ones, and now we're updating to shorter, smarter ones. How does Google handle that? I always see traffic drops in the URL after updating it. It takes some time to recover. In general, anytime you make bigger changes on a site, you will see some time for things to settle down again. If you're doing this on a one URL basis where you take one URL and you redirect to another one, that's usually something that we can figure out fairly quickly. 
If you're doing this on a broader scale, like you're cleaning up an old URL structure that someone created with these gigantic URLs, then keep in mind that we essentially need to reprocess the whole website to understand the new context of all of these pages on your site. So that's something that will take a bit longer, where I'd expect, I don't know, on the order of a couple of weeks to a couple of months at least for things to settle down again. I am a webmaster of a new site. Even though we publish several times a day, we appear in Google Discover every now and then. I've recently noticed that Google indexes our content 30 minutes after we publish it, which, for a news outlet, is an eternity. Could this be an issue regarding our sitemap, or some meta tag that we're missing, or something about our content? It could be any of those, yes. It sounds like you're on the right track there. In general, from a content point of view, that's something that is definitely important. From a crawling point of view, there are things you really need to watch out for so that we can pick up content quickly. So in particular, what I'd try to make sure is that you have hub pages on your website. So if you have a news website, make sure that your homepage lists the recent articles, and if you have different categories, different kinds of news on your site, that those category pages list all of the recent articles. What usually happens with news sites is that we crawl those homepages and category pages a lot more frequently than everything else, and we'll try to pick up links to the new articles as quickly as possible. And if we can crawl those pages and find links to the new articles, we can try to index those new articles as quickly as possible.
So even without anything on the sitemap side, that's one thing I would always make sure to do: that we can go from these hubs within a news site and find all of the new articles extremely quickly. With a sitemap file, you can help us even more. A sitemap or an RSS feed, both of those work. What's important for both of these is that you have the last modification date of the pages specified there as well, so that when we look at the sitemap file and find a new URL in it, we can look at the date and see, well, this is something that is new, we really need to pick that up as quickly as possible. So that's the main thing there. The other thing, which is probably more on a basic technical level, is that especially for news websites, we have to be able to crawl quickly so that we can index quickly. So for that, we need to be able to access the server quickly, and we need to have the pages come back quickly. All of these speed elements are kind of critical for us. And we have an article on our blog, I think from last year, that talks about this a little bit. It's the crawl budget article from Gary that goes into some of the aspects involved in being able to crawl a website quickly, or not being able to crawl it so quickly. So those are the directions I would take there. For most cases, there's not just one magic bullet that you can tweak and suddenly you'll be crawled and indexed in minutes. Usually it's a combination of different things, and especially if you're seeing 30 minutes, then most things are probably already right. Maybe it's a matter of tweaking some of these things to make it a little bit better, so that you can get that time down to, I don't know, a minute or two. Let's see. I think we have like 10 minutes left. Maybe I'll just switch over to questions from you all.
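What a sitemap entry with the last modification date looks like, using a hypothetical article URL:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Hypothetical example entry; the <lastmod> date (W3C datetime format)
       signals that this URL is new or freshly updated, which helps it get
       picked up quickly. -->
  <url>
    <loc>https://news.example.com/2019/02/some-breaking-story/</loc>
    <lastmod>2019-02-15T09:30:00+00:00</lastmod>
  </url>
</urlset>
```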
I'm sure there might be some things that we can help clarify as well. Hi, John. Hi. How are you? Pretty good. How are you? I'm great. So here's my question. Let's say I have two brands in the same niche, a very competitive niche, say brand A and brand B. Those two brands have basically the same content. Just for the example, let's say we're talking about the travel industry, so both brands have hotels and flights and so on. Now, each one of those brands has a different CMS, and the business wants to migrate to the CMS of brand A. So to take brand B and migrate it to the CMS of brand A. Now, the difference between those two brands is in the URL structure. In brand B, the URL structure goes much deeper. So if we're talking about hotels, you'll have cities on brand B, while on brand A you'll have only states, for example. So on this aspect, the risk is already known: if I migrate to the URL structure of brand A, I will lose all those deeper pages and we'll lose traffic. But my other question is, because those two brands, once they're on the same platform, on the same CMS, will look the same, so same UI, UX, and same URL structure, might this be a risk of, I don't know, duplicate websites? Is having two brands that look exactly the same, with the same products and events, a risk for brand A, in case brand B migrates to its platform? Probably, yeah. So there are two aspects there. One is essentially a redesign of a website. So you're taking the existing website and you're redesigning it: creating a new URL structure, creating new layouts, creating new sets of internal links by moving from one CMS to the other setup. That's something that can take quite a bit of time to settle down.
So especially if you're changing the URL structure, if you're changing the layout of the site in general, then that's something where I would expect that this would take a significant amount of time to settle down. And it's something that you need to be really careful about when doing the migration. So tracking things like all of the old URLs, making sure that they all have redirects to the new URLs. If you're merging pages or splitting pages, then make sure you're tracking all of that separately. And that's something where from a crawling and indexing point of view, we need to be able to understand the new structure of the website first. And that can take quite a bit of time, especially if you have a website that has a ton of content. So when you're talking about travel sites, often there's tons of content out there. And moving that to a new URL setup, that's going to take, I don't know, I'm just throwing numbers out, but probably three to six months at least to settle down. So that's kind of the one thing to watch out for. The other thing that you already mentioned as well is that if this new setup that you have matches a different website that you're running, then especially if the content also matches, like you're selling the same products, the same flights, the same hotels, you have the same descriptions on both of these sites, then it can happen that we will look at these pages and say, well, these are essentially equivalent. The brand name is different on these, but the content is exactly the same. We will pick one as a canonical. And we'll just index that version, the canonical version of that content. And what might happen there is that suddenly everything switches over to one website or some content switches to one and some content switches to the other one. It's kind of hard to say ahead of time how that will pan out. But if it's the same content, then technically that makes sense. It's not that this would be a bad thing from our point of view. 
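Since this advice hinges on tracking all of the old URLs and making sure each one redirects cleanly, part of the migration checklist can be automated. A minimal sketch in Python; `validate_redirect_map` is a hypothetical helper and the URL paths are made up, not anything Google provides:

```python
def validate_redirect_map(redirects):
    """Check a proposed old-URL -> new-URL redirect map for common problems.

    redirects: dict mapping each old URL path to its new target path.
    Returns a list of human-readable problem descriptions (empty if clean).
    """
    problems = []
    old_urls = set(redirects)
    for old, new in redirects.items():
        if not new:
            # Every old URL needs an explicit target; a gap means a 404 after launch.
            problems.append(f"missing target for {old}")
        elif old == new:
            problems.append(f"self-redirect: {old}")
        elif new in old_urls:
            # The target is itself being redirected, producing a redirect chain.
            problems.append(f"chained redirect: {old} -> {new} is itself redirected")
    return problems


# Hypothetical example: two deep brand-B style URLs folded into one new page.
redirect_map = {
    "/hotels/california/san-francisco/grand-hotel-2018-review": "/stays/california/",
    "/hotels/california/": "/stays/california/",
}
issues = validate_redirect_map(redirect_map)  # no issues for this map
```

In a real migration you would also crawl the old URL list and confirm each one returns a single 301 hop to its mapped target; this sketch only checks the map's internal consistency.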
So assuming that this can happen across these two websites, one thing you might want to consider as well is to say, well, if Google will see these as the same website anyway, why don't I just migrate both of these sites to one website and make one really strong website, rather than have two that are essentially providing the same content? So that's one thing to think about there. If these two brands are really positioned differently and, from a content point of view, you really have different content, maybe one is, I don't know, targeted at business travelers, the other one at, I don't know, last-minute people who are just traveling for fun, then maybe it makes sense to keep those separate. But then I would make sure that the content is also really different. And then I don't see a problem with having two of these sites doing similar things, if you're really targeting completely different target areas and you're really providing content that's different when people look at these pages. But if you're providing the same content across both of these sites, I would assume that our systems will say, well, this is kind of the same; maybe we should just fold it together. Yeah, well, basically it's not exactly the same content. The user value is pretty different and the target audience is a bit different, even though we're talking about the same niche or the same industry. Nevertheless, my big concern was that, just because of the similarity of both the URL structure and the design of the website, it might be considered duplicate. So I'm asking more specifically: let's say the content is different, but the brands look pretty much the same, you know, apart from different colors and different icons and logos and stuff, and the basic structure of the website is identical. I think if it's purely the structure, that's no problem.
If the content is really equivalent or identical, that's where we would think about whether maybe it's the same thing. So you can think about it kind of like a blog. A lot of different blogs are set up in the same way: they have the same URL structure, the same kind of heading and sidebar layout. But they're very different blogs, so it makes sense to index them individually. However, if the content were also the same, then that would be a case where we would say, well, they look the same, they have the same content, maybe we should fold some of these together. So that's what I would watch out for. It's not the design or the URL structure that I would worry about when it comes to folding things together; it's really the content itself. Got you. Thank you so much, John. Hey, John. I've got a quick question, if we can go back to structured data. Is it possible to have FAQ page markup structured data and product review ratings show up simultaneously in the search results? I don't know. You can try it out and test. So some types of structured data or rich results we can combine in the search results, and some kind of clash, in the sense that they just don't work well together in the search results, so we only show one or the other. When it comes to situations where we show one or the other, it's not possible for you to give a prioritization, like, I'd prefer to have my product review shown and not the FAQ if you have to pick one. Rather, our algorithms will try to pick one and not show the other. Sorry for that. It's possible. We have it. Oh, you have it? OK. Oh, my gosh. Jonathan can run off and implement it everywhere now. Cool. John, may I ask you a question? Sure. OK. My competitors and I are hiding our advertisements on our pages from Google. I am wondering if that is harmful or not, because they are content and we are hiding them. But they are ads, not real content.
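For reference, FAQ rich results come from FAQPage structured data on the page. A minimal JSON-LD sketch with a made-up question and answer; whether it appears alongside review stars is, as discussed, up to the algorithms:

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "Do you ship internationally?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Yes, we ship to most countries; delivery times vary by region."
    }
  }]
}
```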
In general, you wouldn't need to hide ads on a page. So that's something where, from our point of view, having ads on a page is normal. It's not going to be a problem. What I do know is that a lot of ad networks have their ads blocked by robots.txt, just so that they don't get kind of bad impressions. Those kinds of things are perfectly fine. Oh, OK. OK, thank you. OK, let me just pause the recording here. You're welcome to stick around a little bit longer, but this way we have a reasonable length for the YouTube recording. Thank you all for joining in. I hope this was useful, and I wish you all a fantastic weekend. And now to find the pause button.