All right. Welcome, everyone, to today's Google SEO Office Hours. My name is John Mueller. I am a search advocate at Google here in Switzerland. And part of what we do are these Office Hour Hangouts, where people can join in and ask their questions around their website and web search. A lot of questions were submitted already on YouTube, so we can go through some of those. But if any of you want to get started with the first question, you're welcome to jump in. Hi, John. I've got a question about passages. OK. So obviously, the new release of the core update regarding the passages has been announced. So it's more about how Google will see the structure of a paragraph. Because obviously, more recently, we're seeing more conversational blog posting, like on Backlinko and people like that. And I'm wondering if there's kind of a minimum word count or character count within a paragraph for Google to realize that this is a paragraph? I don't know. So I guess there are a few things that fall in there. I don't have the details of all of the passages things. A few comments. It's not a core update. I mean, we wouldn't consider it a core update. I think core update is kind of an arbitrary term anyway. But it's not what we would consider a core update. It's more about ranking these passages from existing pages rather than indexing them individually. So more about recognizing this is a big page and this is a part of the page that is particularly relevant to this query that is coming in. So we'll focus on that part of the page. So it's not that there's a separate passage index or anything like that involved. It's really more about understanding the page and the different parts of the page and being able to recognize which of those parts are relevant for a user's query. I don't have much more details past that to share from our side.
I did notice there are some folks that have been digging up patents and papers and kind of the more, I don't know, what would you call it, educational content or theoretical content around some of these topics. And they mentioned there are things like you should make sure that you have clear headings and that you have well-structured content on your pages so that we can recognize these sections, which to me is kind of obvious. Like, if you want a search engine to recognize a part of your page, then you should structure your page properly so that it's easy to recognize. But maybe that's kind of a direction to head. In general, with a lot of these changes, one thing I would caution against is trying to jump on the train of trying to optimize for these things. Because a lot of the changes that we make like these are essentially changes that we make because we notice that web pages are kind of messy and unstructured. And it's not so much that these messy and unstructured web pages suddenly have an advantage over clean and structured pages. It's more, well, we can understand these messy pages more or less the same as we can understand clean pages. So if you take a clean page and you try to make it messy so that it works well for this new kind of setup, then I don't think you would suddenly have any advantage over what you had before. Or if you already have clean pages, if they're already easy to recognize by search engines, if they have clean titles and headings and they focus on individual topics, then that's essentially what search engines need to be able to understand what this page is about and when to show it to users. OK, thank you. Sure. Hi, John. Hi. I have actually three questions. So last week, I asked you about changing websites. So this week, I'm going to ask you some more questions about website changing. So this is about one of our clients. So they are moving their website from WordPress to Joomla.
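As an illustration of the clear headings and well-structured content mentioned above, a page whose sections are easy for a search engine to recognize might look like this in plain HTML; the topic and headings here are invented for the example:

```html
<!-- A hypothetical long-form page with clearly delimited sections -->
<article>
  <h1>Complete Guide to Home Coffee Brewing</h1>

  <section>
    <h2>Choosing a grinder</h2>
    <p>Burr grinders produce a more even grind than blade grinders...</p>
  </section>

  <section>
    <h2>Water temperature</h2>
    <p>Most brew methods work best between 90 and 96 °C...</p>
  </section>
</article>
```

Each heading states what its section is about, which is exactly the "easy to recognize" structure being described.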
And we are expecting that it will affect their current URL structure. So we are planning to use 301 redirection to minimize any sort of ranking loss. But even if we do the 301 redirection, is there any possibility of losing rank initially after launching the website? Yes. I mean, it's kind of frustrating to hear, but yes, there's always a chance that something will go wrong and that you will drop in rankings. And I think it's not so much that something randomly will go wrong and break. And it's like nobody knows why it suddenly breaks. But if you change the website structure in the sense that you're changing all of the internal URLs, you're changing the internal linking, usually that also means you're changing the layouts, all of these things can play a role in how we understand the website. So things like suddenly this page has no internal links or only one internal link, and it used to be very important internally, then that's something that can affect how that page ranks in the search results. So it's not so much that you set up the redirects and then suddenly something randomly breaks. It's just you have a new website structure, and that new website structure is different from the old one. And it might be that the new structure is much better because you worked so hard on making sure that it works well for SEO. It has clean titles and headings. The internal linking works really well. You use the right anchor text, all of that. And the new structure might be ranking much better. It might also be that you change to a different CMS and you install the default setup from the CMS and the default setup is not optimized for your business or your site, then the site's ranking could go down. So from that point of view, it's something to keep in mind that 301 redirects are one part of a change within a website, but there's a lot more involved when you make a bigger change on a website. So the next question is about the same client.
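For the redirect side of a migration like this, the 301s are typically set up in the web server configuration. A minimal sketch in Apache syntax, where both the old WordPress-style and new URL patterns are invented for illustration:

```apache
RewriteEngine On

# Hypothetical old blog-post URLs (/2019/05/post-name/) mapped
# to a hypothetical new structure (/articles/post-name)
RewriteRule ^2019/05/([a-z0-9-]+)/?$ /articles/$1 [R=301,L]

# A single page that moved to a new path
Redirect 301 /about-us /company/about
```

The pattern-based rule covers whole classes of URLs, while one-off moves get explicit redirects; the actual mapping depends entirely on the site's old and new structures.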
So they are actually a nonprofit organization. So they have a lot of clubs in New South Wales, in Australia. And each club provides various types of activities, like boxing training, gym and fitness training, these types of training. Now the current website has an individual page for each activity. So they have two types of site. One is corporate, another one is a club. So corporate and club both have individual pages for each activity. And each of those activity pages has rankings on Google. Now what they are planning to do, they are planning to consolidate all these activities into one activity page for the corporate and club sites. Now, we are assuming that there will be ramifications of doing this. I just need to know how much it can affect them, because they are planning to consolidate all these pages into one page, and some of the activities are boxing, some are badminton. So it will be very difficult to write a title tag for all these activities; that would be a long title tag. So we are not sure what to do. So could you? Yeah, I think that's also one of those questions where it depends. It sounds like you have multiple pages now and you want to reduce the number of pages, right? Kind of just very roughly. Yes. Okay, so essentially what would happen theoretically is that if you focus your site more on fewer pages, then those fewer pages will be stronger in search and they could be more visible in search. So generally that's a good thing. On the other hand, it also means that those fewer pages will be less focused on those individual topics. So someone is searching for that variation that you currently have a page for and afterwards you have folded that into kind of a general variation. Then it might be that that search is no longer very relevant to those pages and we don't show your pages for those queries.
So what I would do there is look at the queries that are leading to the current pages and think about which of these queries you can still kind of fulfill with the new setup that you have. And if you notice that overall the important queries that you care about, the ones that you have high ROI on, where you get good traffic and it converts well, if you see that those queries can still be fulfilled with your new pages then I think you're on the right track because then you have fewer pages, you're still fulfilling the needs of the people that you can kind of interact with and those fewer pages will be much stronger. So that's kind of like all of the good things. Whereas if you can tell that with your fewer pages you're not going to be able to fulfill these queries and these queries are very important for you. Maybe these are things for products or services where you have a high return because you earn a lot of money from them and a lot of people are searching like this and suddenly you realize your new pages are not that relevant for that anymore then that's something where I might reconsider and say, well, at least for this topic we should continue to have something separate. Okay, thank you, John. Sure. All right, any other questions before we jump in? Yes. Hey, John, may I ask a question? Sure. About facet navigation, I will post it here. So I just noticed there are two ways people usually deal with it. One way is they use pagination pages. When you click a certain facet navigation, you go to a page that will list certain products and, if there are too many products, they will use pagination to list all of them. And another way is that when you scroll to the bottom of the page, they will use Ajax to load more products. And my instinct is that the Ajax approach is not as good as the original one because Google might not be aware of all the products that you have for that facet navigation.
But how does Google actually realize all the products that you have for that specific navigation? If you use pagination, will they consolidate all these pagination pages into one so they know, oh, when you click, for example, under $10, they are aware of every item in the inventory that is under $10? Probably not. So, I think there are a few things that come into play there. On the one hand, normally through the category pages we need to be able to find the individual products. That's kind of the baseline configuration for an e-commerce site, that we can find the individual products. And usually that happens through the category pages. Often it's like through paginated category pages. Often there's also cross-linking between different products. So that's kind of the baseline that we know all of these products. And then for the individual facets and filters, usually my recommendation is not to let those be indexed at all but rather to focus on things where, I mean, usually the thing with all of these facets is that you're essentially recombining the same products that you've already shown in the category pages, just on separate facet pages. And if you feel that these facet pages are pages that can stand on their own, then that might be fine to let them be indexed. Like if it's really important, like a really critical topic for your site is, I don't know, running shoes under $20 because that's your brand or whatever, then you can see that as kind of a category page and you can keep that indexed. But I would generally recommend at least having the paginated pages within these facets not be indexed because you don't gain any value from doing that. So it's not that Google will think, oh, all of these products belong in this category, we will concentrate them on the first page of the set, but rather we'll crawl all of these pages and we'll crawl a ton of pages from your website if you let us index all of these facets.
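One common way to implement the "don't let paginated facet pages be indexed" advice is a robots meta tag on those URLs; a sketch, with an invented URL pattern:

```html
<!-- Served in the <head> of facet/filter pages such as
     /shoes?color=red&page=5 that should stay out of the index -->
<meta name="robots" content="noindex">

<!-- The same directive can instead be sent as an HTTP response header,
     which is useful when it's easier to set server-side:
     X-Robots-Tag: noindex -->
```

Important category pages simply omit the tag, so they remain indexable while the recombined facet pages do not.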
And we essentially don't get much value out of like page five of this specific facet from your e-commerce site. So usually the recommendation there is either don't index any of the facets or only let those be indexed that you think are really critical for your website. That's kind of the baseline. And with regards to pagination or that kind of automatic loading, what is it, lazy loading, where you scroll to the bottom and then it loads more, those are essentially different ways of doing it. And if you use pagination, we can definitely crawl through that. So for category pages where we need to crawl the pages, pagination is probably the best approach. The lazy loading can also work to some extent, but you need to implement it in a way that Google can process it. So in particular, we load the page once and then we see what all is loaded on that page. And if your site requires that you scroll to a certain position on the page and then suddenly more content appears, then Google will not see that. So we won't be able to see any of that kind of automatically loaded content if we have to do something specific to make it load. I see. So one follow-up question is that you say for faceted pages, Google won't consolidate them into the first page of the pagination and probably won't be aware of what kind of inventory there is for a certain facet. But how about category pages? You say you recommend we index the category pages. So if we do have pagination in category pages, will Google consolidate it and know that a certain category, even if it has several pagination pages, has all the inventory that we assign to it? No, we wouldn't assign the products to a specific category. We would use the paginated category pages to find the products and we know, okay, here's a product page for, I don't know, some specific shoe or whatever, but we wouldn't kind of keep that connection between this shoe and that category page.
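The distinction drawn here, that Google renders the page once and only sees content present after that initial load, is why scroll-triggered loading is often paired with plain paginated links as a crawlable fallback. A sketch with invented URLs:

```html
<!-- Products in the initial HTML are visible to the crawler -->
<ul id="products">
  <li><a href="/products/example-shoe">Example shoe</a></li>
  <li><a href="/products/example-boot">Example boot</a></li>
</ul>

<!-- A plain link the crawler can follow to the next page of the
     category, even if users get more items via scroll-triggered
     loading instead of clicking it -->
<nav>
  <a href="/category/shoes?page=2">Next page</a>
</nav>
```

Content fetched only after a user scrolls is generally not seen by the crawler, so the `<a href>` pagination carries the crawl path while the scripted loading carries the user experience.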
So usually what happens is if someone searches for a category of product, we'll show probably the first page from the category page set. And if someone searches more for a specific product, we'll try to show the product landing page. So it's not kind of that combination of this product is on page five of the category page. Therefore, Google does something fancy with that. It's more like either the category page is relevant for the query or the product is relevant. I see. So just one very quick question. Just that you say that we should only index, like, high-level facet navigation, but for a big website, like, they have a lot of products, the high-level facet navigation is very important because category pages might not be fragmented enough to target more long-tail keywords. In that case, using regex that requires user input, it's okay, right? Because we don't want the pagination after the first page to be indexed anyway. Yeah, I think you probably mean JavaScript, not regex, but yeah, yeah. I mean, I understand what you mean, but yeah, that's perfectly fine. Like if you don't want those pages to be indexed and you use one of these patterns to automatically load the content and Google doesn't index it, that's kind of what you want. That's perfectly fine. I don't want the stuff that hasn't been loaded to be indexed. So like this page, yeah. I think that's perfectly fine, yeah. Okay, thank you. Sure, okay. Let me run through some of the submitted questions. I also see there's a really long one here in the chat, but let me run through some of the submitted ones first and then I'll get back to you there as well. I just found out about emoji domain names and I saw they can actually rank in Google search. For example, if you Google for emoji domains, I also saw you can search for things like coffee emoji near me and it would show you actual coffee shops.
So will emojis in search and web be something viable in the future, especially if we're moving into a more mobile-centric world? So I think we already moved into a mobile-centric world, so that part at least has already happened. With regards to emojis in search, there's a really interesting talk from Paul Hart that we have on our YouTube channel from, I think, last December, I don't know, sometime when we could still travel, from Mountain View, one of the webmaster conferences, where he also goes into emojis and search and how at some point we realized that people actually search a lot for emojis and we should give them usable search results. So my guess is not that you need to optimize for emojis and people searching for emojis, but essentially what we try to do is understand the emoji and then we just map that to a normal text search. So if you search for coffee, or the coffee emoji, then usually we should be able to understand that. I don't know if it's worthwhile to actually optimize for this because my feeling is while a lot of people search for emojis, it's probably not comparable to people who actually search for textual words. But try it out. I think it's always cool to try out these newer things. I guess one of the difficulties with emojis is if it ends up being a part of a voice search result for Google Assistant or some other assistant device, then it'll be hard for the device to tell you which URL to check out. But I don't know. We're trying to learn more about how we can get our site travelgay.com to rank in the local search results box at the top. At the moment, it seems to only show TripAdvisor and Yelp for keywords like gay bars in Brighton. When we know that we offer the best and most up-to-date content along those lines, what can we be doing more on our site, apart from schema markup, which we already have, to ensure that we can rank in the local search result box? I don't know.
So my feeling is you don't mean the mapped search result that we show for local queries, but kind of that bar on top where you can, I think, pick different kinds of aggregator sites as well. I don't know what all is involved there, but I'll check with the team here to see if I can figure something out maybe for next time. Yeah, this is a similar question you had like two weeks ago from someone in Poland asking a similar question. I don't know if he contacted you, but I was waiting, I've been speaking to him. But it's related to the local search and probably to do with the EU kind of regulations and stuff like that around showing other providers. Okay. It would be good to know how to get into this. Yeah. I looked into it like a long time ago and back then it was more about us trying to algorithmically recognize these sites, have a lot of content along those lines and then show that. I don't know if kind of the more niche sites would have trouble kind of being shown like that, might be a bit harder, which I could imagine might map to smaller countries. I don't know, Poland isn't necessarily small, but still. And perhaps also to kind of specific topical areas like travel gay is probably less visible overall compared to something like Yelp or TripAdvisor. But I don't know, I think it would be nice to have more of those sites visible there. So I'll definitely check. Because I work for one of the largest private medical providers in Poland. So with like 3000 doctors. So it'd be, we kind of compete against the searches on this. So it would be good to understand how we can be surfaced in this search because we actually own kind of also a ranking kind of website as well for doctors, which is similar to your kind of TripAdvisors and stuff like that. So it's like, why is one being considered above another? Yeah. Can you send me some details? I'll give you an email. I've got your email. Okay, okay. Cool. Yeah. Thanks. 
I have seen a lot of times that I got some image results in my search for a search term. How does Google identify that images are good for showing in the results? Is Google crawling images with their binary code format or just as a snapshot? I don't quite understand the second part of the question, but in general, what happens when we show multiple kinds of search results on a page, such as kind of that image box and the normal text results, sometimes the video results as well, that comes in kind of, I don't know, how should I frame it? Essentially what happens when someone types in a query on Google is we send that query to a lot of different indexes, a lot of different search systems within Google, and we try to send it out to as many of these systems as possible, on the one hand so that we get answers from them. On the other hand, so that we also get quick results, where if we have the web results come back really quickly and images take a longer time because, I don't know, somehow images are a bit slower today, then at least we have the web results that we can show. But essentially, we send it out to all of these different systems and the systems tell us how relevant the results are that they have. So the image search system might come back and say, oh, image results are kind of important for this query and here's a set of images that I have for that. And if we get the web results back and the web results say, oh, web results are super relevant for this query, then we might show the web results and not the image results. On the other hand, if we hear back from these different systems and they say, oh, image results are super important for this query and the web results are kind of okay as well, then we will almost certainly show kind of an image one box somewhere in the search results. It might be on top, if we think that's super critical, it might be somewhere in the middle, anything like that.
And the same thing happens for video and I imagine for the different other types of kind of one boxes and things that we have as well. I don't know, like maybe top stories, I don't know how all of that flows in, but essentially all of these different systems come back to that one central place and say, here are my results and here's how relevant I think they are, and based on that, we try to figure out which of these elements we should show. So that's kind of how the images flow in there. It's not that someone is manually saying, oh, we should show images for this query and not for that one. It's really these systems trying to understand how relevant the results are that they can bring back, and based on that, we try to make an automated decision. And that's also why this can change over time, where if you search for something now, then maybe we won't show images, and if we recognize that everyone is actually looking for images for this query, then over time we will start showing images in the web search results as well. And I guess regarding the second part of the question, I'm guessing you're kind of asking if Google is just taking the image as an image file or if they're looking at the contents of the image, like there's a cat on this image or not. And we do a bit of both. For the most part, we do focus on the image file and on all of the context around that image because that really helps us to understand why this image is relevant. And if we don't have a lot of information otherwise then we might try to understand what this image is about as well. It's tricky from my point of view to focus purely on what is visible in the image because it might not be why this image should be ranking in the search results. So it could be that you have a photo of a beach and you use that in a travel blog where you're like, oh, look at this fancy beach that I visited.
It could be that you have a photo of a beach and you use it on an article about environmental pollution because it's like we want to preserve our beaches. And those are very different use cases and very different ways and times that we should show this image. So that's something where purely from understanding the image's point of view, we need to understand essentially how that image is embedded within the web page. And that's something you can do with the alt text, with headings and titles on the page, kind of understanding sections of a page, better captions if you have anything like that, the file name for the image as well, all of these things play a little bit of a role in how we understand the image and its context. My website is already on HTTPS, but still missing the content security policy header whenever scanned from a tool. Does that have any effect on rankings and how does Google look into it? I have to be honest, I don't know exactly what the content security policy header is, so my guess is probably not that critical. I don't know. But in general, when it comes to HTTPS, we try to understand which of your pages should be seen as a canonical version of your content. And usually that's a matter of we have some HTTP pages, we have some HTTPS pages, and we see the HTTP pages are redirecting to the HTTPS version, and internally you link to HTTPS. So we will pick the HTTPS version as the canonical. And we will do that even if your certificate is no longer valid, even if you're missing something critical on a page, even if there's some content within the page that's not on HTTPS, where if you load it in a browser, it has kind of that warning icon on top. Even in those cases, we will probably shift to HTTPS just because all of the other signals tell us HTTPS is the right one. And when we shift our indexing to HTTPS, then essentially all of our ranking factors that apply to HTTPS kind of kick in and apply there. 
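Returning to the image-context signals for a moment, the alt text, captions, surrounding headings, and file name mentioned above might come together in markup like this; the file name and text are invented for the example:

```html
<section>
  <h2>Plastic pollution on our beaches</h2>
  <figure>
    <!-- Descriptive file name and alt text describe the image itself;
         the heading and caption supply the surrounding context -->
    <img src="/images/beach-plastic-pollution.jpg"
         alt="Plastic bottles washed up on a sandy beach">
    <figcaption>Debris collected after a single high tide.</figcaption>
  </figure>
</section>
```

The same beach photo under a travel-guide heading would read very differently, which is exactly why the embedding context matters as much as the pixels.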
So if there are things with regards to security that you could be doing better, but you're just not doing perfectly at the moment, which might be something like the content security policy header, then that's something I would definitely recommend doing. That's something that sounds like you probably need to do to make sure that you have a modern and secure website. But at least from an SEO point of view, it's probably not critical for your site's ranking. So if you, I don't know, I don't know how to frame it. If you care about your users, then probably you should make sure that you have all of this done. If you just care about SEO, then probably that's fine. My question is regarding the bounce rate of an affiliate site. It's designed in a way that we want the user to consume information and then bounce them to an affiliate site as soon as possible. In a way, we only earn when someone clicks on those affiliate links, which eventually causes a poor bounce rate. What's your take on this, since it might be a factor for rankings? Essentially, that's totally up to you. You can set up your pages like that. But what happens on our side is that we don't care so much about whether or not a site is an affiliate site, but we care about whether or not your site provides information that is relevant for us to show in the search results. So if people are following the links and going to a different place to buy something from your site, theoretically, in analytics, that might be something that is flagged as a bounce. I don't know how analytics compiles that overall. But if people are going to your site and finding the information they need and then going off to buy it somewhere, then that's essentially perfect. 
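As background on the Content-Security-Policy header that came up above: it is an HTTP response header that tells the browser which origins a page may load scripts, images, and other resources from. A minimal illustrative example, with a made-up image origin:

```http
Content-Security-Policy: default-src 'self'; script-src 'self'; img-src 'self' https://images.example.com
```

Here everything defaults to the page's own origin, with images additionally allowed from one external host; a real policy would be tailored to the resources the site actually loads.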
On the other hand, if people go to your site and they're like, there's no information here, or this is just the same information as all of the other sites, and they click on your affiliate links, then that's something that's kind of, from our point of view, not that useful. Then we might as well just show the affiliate link target as a search result and not your site. So it's not so much a matter of, is it an affiliate site driving traffic somewhere else, or is it a site that's selling things directly, or is it just an informational site? It's more a matter of, does your site fulfill what the user was looking for? And for that, it's not something that I can purely judge automatically. It's something you kind of have to look at on your own and figure out, am I really providing something unique, compelling, and of high quality that would be indexable, even if there were no affiliate links on there at all? Real-time dashboard versus processed report: real-time direct traffic is three times higher than in the processed data. I think this is about Google Analytics. I don't really have any insights there, so I can't help with that. Somehow, our 20-year-old business has created two websites with almost the same primary content. Both domains are 15 years old, same categories, same product pages, except variations like UI. On one domain, we show more textual content, curated reviews. The result is we get product page traffic on one domain and category page traffic on the other domain. If we cross-canonicalize those domains, can that help in SEO? I think, I mean, somehow over 20 years, lots of things get collected, so that's kind of normal. In general, if you combine your sites and make one strong version out of two medium-strong versions, then usually that's a good thing. Especially if you're targeting the same audience, if essentially your message and the conversions that you're trying to achieve are the same thing, then it definitely makes sense to combine things from my point of view.
On the other hand, if you're targeting completely different audiences, then maybe that's something where you kind of have to make more of a judgment call. So a common use case that we sometimes see is B2B versus B2C. So if you're selling directly to users or if you're selling directly to businesses that then sell it to users, sometimes it makes sense to keep these websites separate because you're targeting something completely different. The messaging you have on these pages might be completely different. And in those cases, it might make sense to keep it separate. On the other hand, if it's just naturally over time, you somehow collected multiple websites and they're all trying to do exactly the same thing, just different departments and different people over time trying their hand at making a website, then that's definitely something that's worth combining and saying, we will take all of the good parts of all of these websites and make one really strong website out of it, and probably you will see some improvements in search if you're able to do that. John, can I ask a question? Sorry, I just joined. I posted it too. So my question is regarding rich results, the choice of images in rich results, namely recipes. I noticed that Google uses 16:9 images there. And on many recipes, I provide three different ratios, so 16:9, 1:1, and 4:3, as Google would recommend. And I noticed that in many cases, Google is picking the 4:3 or 1:1 and cropping the top part of it, so that sometimes the product doesn't even appear in the image. So I was wondering if there's something to know about the choice of images there. I don't know. I heard that from another recipe site, but I've stopped hearing from them about it. So either they gave up or it's working now for them. But if you can send me some examples, or if you posted it in the question, maybe just add a reply to that and add some examples there, then I can pass that on to the team and we can take a look.
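For the recipe images being discussed, providing the several aspect ratios is typically done by listing multiple image URLs in the Recipe structured data; a sketch with invented URLs and values:

```json
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Example chocolate cake",
  "image": [
    "https://example.com/photos/cake-16x9.jpg",
    "https://example.com/photos/cake-4x3.jpg",
    "https://example.com/photos/cake-1x1.jpg"
  ]
}
```

Which of the supplied ratios Google actually picks for a given surface is, as the exchange above shows, up to Google's systems.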
Sure, we'll do that. Thanks, John. OK, let me run through some more of the questions here and then we can open things up for everyone as well. The request indexing tool in Search Console has stopped; any update on when it's coming back would be welcomed. I don't know when exactly it'll be back. I know the team is working on improving that, though. My rankings have dropped heavily with the May core update. So since then, I'm trying to recover from it in every way that I can by updating my posts and trying to improve my content daily. Does a webmaster in this situation need to wait for the next core update or can a result be felt without that? This is kind of a tricky question in the sense that depending on what all happened with the last core update, you can definitely see improvements over time in the sense that what happens in Search is we have all of these different factors that kind of play a role depending on the query and the site. And it might be that from one of the core updates, like one of those factors is smaller, but you can work to improve the other factors and make them stronger again. So overall, you can definitely kind of move things around a little bit to improve over time. But if that effect from that core update is so strong that you can't compensate for it with the other factors that you can work on, then it'll be hard to kind of reach that bar again. But it is something also from our side where we say that the core update isn't necessarily a sign that you're doing anything wrong, but rather that we just figured out or we kind of realized that maybe your site wasn't the most relevant for some of these queries.
So that's something where it's not always the case that we would say you could just tweak things and slightly improve things on your site, and then suddenly it will become relevant, because sometimes relevance requires a lot more than just slightly improving your posts and maybe adding some content every now and then. So I don't know if this is something where, if you're seeing a really strong change in how we were showing your site in search, it's actually a matter of just tweaking things, or if you need to kind of reconsider overall what you're doing with your website, and make sure that overall you're really, like, super relevant for these queries. And it's something where, if we were to show it to the engineering teams, for example, they would say, yes, there's a bug in our systems if we're not showing this site as the first result for those queries. So it's not a matter of how can I make my site kind of similar to the other sites that are ranking there, but really, how can I make sure that my site is, by far, the best result for these queries? And that kind of a change is really hard. So usually that's a lot more than just updating a few blog posts and adding some content every now and then.

And then: please don't take too long to upload the next video. Yeah, OK, I'll try to get it in this afternoon. Sometimes we have to wait a little bit, because other teams or other folks are also posting things to the YouTube channel, and we don't want to bombard people with videos on the same day.

A question about the discontinuation of Flash: I work on an online gaming site where we serve quite a lot of Flash games. In terms of quantity, they occupy roughly 50% of all pages of the site, but in terms of organic traffic, they only account for about 10% of our total traffic.
After December 2020, should we anticipate a complete loss of traffic only to this portion of the site, or will there be wider site repercussions if we don't take steps to remove all of those pages that serve Flash games? So I think I saw this question in one of the previous Hangouts. I don't know if it's the same question or the same person. That's fine. In general, when it comes to Flash, and I think we've stopped doing this now, previously we would take the Flash files themselves and try to understand what textual content is within these Flash files, and see that as a part of the indexable web content for those pages. That particularly makes sense for Flash websites, where the whole content is in Flash. When it comes to Flash games, however, it's not that there is textual content within the game that is relevant for indexing, but rather the text is all around the Flash game. And we can use that for indexing, even if we don't take into account anything from the Flash elements at all. And I would expect that after December 2020 (I think December 2020 is when it's removed from Chrome, or something like that), nothing changes for your website in that regard at all with regards to indexing, because we already focus on the textual content of the page, and we kind of ignore the Flash part of the page. So at most, I would expect that people maybe search differently at some point, but not that we would drop these from search or drop them in rankings.

John, can you hear me? Could it have any indirect ranking effect? Maybe users are searching for Flash games, but they're not getting the results, because Chrome isn't supporting that. Could it have some, I don't know, indirect effect on the rankings? I could definitely see people searching less for them over time, but I don't think there would be any kind of ranking effect from that.
So it's something where, even 10 years in the future, if these pages still exist and someone searches for Flash games, we will show whatever content we have for Flash games. And that would likely be pages like this, even if nobody even knows anymore what a Flash game was. So I wouldn't expect that to drop. What you might see is if people are searching for a specific game type (I forgot what the game types are called), but if they're searching for a specific type of game, where you have maybe a normal HTML5 game or an installable game app or a game for a game console, and you also have Flash games, then I could imagine that, for that generic game-type query, over time the Flash game pages will become less and less relevant, because they're less and less recommended by users and they're less useful for users. But if someone is explicitly searching for that game type plus Flash, or just Flash games in general, this is the only content we have that's relevant. So we will stick to it, even if by default people might not be able to see the Flash content. Interesting. Thank you.

When I redirect from one site to another, 301 or 302, and it contains the same content, images, internal links, is it considered duplicate content by Google in general? In theory, yes, we would consider this duplicate content, and we would use it for canonicalization. But it is not a bad thing. I think this is kind of more like: in theory, we would consider this duplicate content, because if you enter that URL in a browser, then you would end up with the same content. So if you redirect from an old domain to a new domain, if you enter the old domain, then you would end up with the same content as if you just entered the new domain. So theoretically, for our systems, that would be duplicate content. In practice, we would see this as a sign that these pages are the same, and we need to pick one of these to choose as the canonical.
And we will focus our energy on that canonical URL. So that's something that, from our point of view, is perfectly fine and normal, and not something that you need to worry about in terms of ranking. So essentially, we would probably switch things over to the new domain if you have this redirect set up.

We have a site in French, and if we just translate this content with Google Translate into another language, is this considered duplicate content? No, if you translate content, we would not consider it duplicate content, because the words are different. So, kind of just purely from a technical point of view, search engines are sometimes pretty simple, and they look at the words on the page, and they say, oh, these are different words, so the content is likely different. They don't look at the text on the page and say, oh, it says the same thing in this abstract way when I try to understand the words, therefore the content is duplicate. It's really that we focus on the words, and if the words are different, then the content is different. So if it's in French and in English, they're different words, and we will treat them as different pages.

John, may I? Sure. My question is about lazy loading. We have set up lazy loading, and when you scroll down to the place where the image is, it loads and appears on the page. Before the moment an image is on the screen, there is like a gray rectangle of a random size instead, because we don't know the image's size before we load it. So imagine you have slow mobile internet, and you are loading a page, and the text appears, and you scroll down, and you see a gray rectangle in the place where an image should be, and there is text below. And the moment the image has loaded, the text shifts down, because the image has a different height. And with Core Web Vitals in general, and CLS, which measures visual stability, in particular, could that be a problem for us? Thank you. Yes. Yes, it could.
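The shift the questioner describes happens because the placeholder has no intrinsic size, so the browser reflows the text once the real image arrives. The usual fix is to reserve the space up front, for example with explicit width/height attributes, which modern browsers use to derive the aspect ratio before the file loads. A minimal sketch, with hypothetical file names:

```html
<!-- The width/height attributes give the browser the image's aspect
     ratio before the file downloads, so the gray placeholder and the
     final image occupy the same box and the text below never shifts. -->
<img src="/photos/step-3.jpg" alt="Folding the dough"
     width="800" height="600" loading="lazy">
```

The attributes don't fix the rendered size (CSS can still scale the image); they just let the layout engine reserve a correctly proportioned slot, which is exactly the stability that CLS measures.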
That's kind of the thing that visual stability is focusing on, where if you scroll a part of a page into view and suddenly it shifts around, then that's kind of what we're trying to catch and focus on. So my general recommendation there would be to try to find a way to embed the size of the image in the lazy-loading element by default. And even if you just show the gray rectangle, if the rectangle is the same size as the image afterwards, then that results in a more stable page-load experience. OK, thanks. All right.

We're getting kind of close to the end of time, so maybe I'll just open up for more questions from any of you all. It looks like there's a lot of stuff happening in the chat, but I haven't been able to keep up there. So if there's anything critical you want me to help out with, let me know.

Hey, John, I have a question about redirects. I remember you once said that a redirect only passes link juice if the pages are a very close match. So there's a situation where we want to consolidate multiple articles into one big article. In that case, we redirect the fragmented articles to the big article. Will they pass link juice? Because the big article will be like a broader thing, but the fragmented ones will be more long-tail. Yeah, I think that's perfectly fine. So basically, you're saying this old URL is being replaced by a new one, and then we will forward the signals that we had with the old URL to the new one. That's perfectly fine, yeah. Yeah, and specifically referring to when we consolidate multiple short articles into a long one: will that pass link juice? Yeah, that's perfectly fine. That's a very common setup, that you take multiple pages and you combine them into one. Maybe you have multiple pages in the beginning, and you pick one of those pages and you make it bigger. That's essentially all normal. I see. By the same token, because I remember you once said that they have to be a very close match to pass link juice when you redirect.
In that case, it works to pass link juice. But what if I redirect an informational page to a transactional page? For example, I write a blog post about goat cheese, about what goat cheese is. And then one day I decide I don't want that page anymore, so I redirect it to the transactional page for goat cheese. Will it pass link juice? Yeah, that's also fine. I think, with regards to the content needing to be similar, it's more a matter with the rel=canonical, where you actually have content on the page. So if you set the rel=canonical from one page to another page, then that's something where, for us, it's a sign that you want these pages to be treated the same, but we need to be sure that these pages are really a part of the same cluster. And we kind of need to recognize, oh, these pages are actually equivalent, and this is the one that you would like us to show in the search results. With the redirect, it's essentially that the old page no longer exists, and if you enter the old URL, then we show you the new URL. So they're already essentially identical. So with the redirect, it's a lot less of an issue if the pages are different, or if they kind of change their focus, all of those things. That's kind of normal evolution on the web, I guess. So we can even redirect a page that has completely different content, if we think it makes sense. And it will still pass link juice? More or less, yeah, for the most part that should work. I see. Thank you so much. Sure.

I think the one exception there would be if you go off and just buy a bunch of domain names and then redirect them all to one central site; then that's something where our systems might kick in and say, oh, this looks kind of sneaky. But if you're changing things within your website, and you say, I don't want to sell basketballs anymore, I want to sell shoes, or whatever, then those are changes that you kind of decide within your site, and we generally just follow that. I see. So just one follow-up question.
I have a client doing a very weird thing. Around Black Friday, they redirect a certain informational page to the transactional page, trying to consolidate authority from the informational page onto the transactional page, in the hope that the transactional page will rank better. And then they want to remove the redirect after Black Friday. My instinct tells me that's not a good idea. I don't know. I mean, I think it's really hard with these Black Friday things, because it's just a handful of days. And theoretically, you could make some really big changes on a website just for a couple of days and then change it all back. I don't know if, at the end of the day, it's really worthwhile to do that, and whether you kind of break more things than you actually improve there. But, I don't know, especially if you're struggling with getting your Black Friday content ranking when everything else is really strong, I could imagine people resort to these kinds of weird tricks. I think from a web spam point of view, we wouldn't see that as being problematic. From a web search point of view, we wouldn't say, oh, this is a sign that the website is low quality, or anything like that. I think it's purely a technical point of view: it's easy to confuse our systems with this. It's not so much that it's a really bad idea and you should never do it, but you can confuse things. And if the end result is that, when you change things back, it takes a month for everything to be restored to the normal state, then I don't know if you've really come out ahead in a case like that. I see. Thank you so much.

I think it's also the case that just because Google follows the redirect doesn't mean that the new page will rank for everything the old page used to rank for.
So Google might follow the redirect, but if you redirect an informational page to a transactional page, it's likely that the informational page had a lot more content, and it's likely that it will rank for a lot more keywords than the new page probably will. Yeah. And sometimes these things also take a bit of time, where if we see a redirect and we start following it, then maybe it'll take a day or two to actually start seeing that effect in search. And I don't know if, for these kinds of events, where it's really a matter of being there for two, three days, that actually works. But I'm 100% certain people have been testing this for a while now, and everyone is fine-tuning their strategy for all of these Black Friday things. But I think it's just easy to also break things, so it's good to be careful.

OK, I think we've reached the end of time. I don't have a ton of time after this, so I will have to jump out this time. Maybe next time we can hang around and chat a bit more. The end of times? It's not that bad. It's only Friday. It's not the end of times. Yeah. So thank you all for coming in, and thanks for all of the questions that you submitted. Thanks for joining in and asking questions live as well. And I wish you all a great weekend, and see you all next time. Thanks, John. Bye, everyone. Bye.