All right, welcome, everyone, to today's Webmaster Central Office Hours Hangout. My name is John Mueller. I'm a Webmaster Trends Analyst here at Google in Switzerland, and part of what we do are these office-hours Hangouts, where people can jump in and ask any web-search or website-related questions that we can help out with. As always, there are some questions submitted directly, but if any of you want to get started with the first question, feel free to jump on in.

Can I start?

Sure.

Hi. So for our natural search, our site is a subdomain of another site. And when we search for our brand name, which is Accardo Zoom — a subdomain of Accardo.com — our natural search results include two site links at the moment, one of which is our own Accardo Zoom FAQs page. So that's fine, because that's linking to our relevant site. But the other site link is linking to an Accardo.com page, which is a reviewer on Accardo.com who's named themselves Zoom. And obviously, that page is ranking really highly, because it's been around since 2012. So I have no idea if we're able to do anything to manipulate or change this, because obviously it's a really bad experience for people looking for Accardo Zoom — they're getting taken to a completely different site, essentially.

So you can't really control the site links — I think that's kind of first off. But usually, if we're showing something as a site link, we would assume that it would be from the same site or from the same setup, essentially. And you're saying it's a completely separate site?

So our site URL is zoom.accardo.com, and it's picking up site links from just accardo.com. I appreciate it's a subdomain, but everything I've read suggests that subdomains are supposed to be treated as separate entities, too.

Oh, OK. So it's essentially from the main domain, not from the subdomain?

Yeah, exactly. Yeah.
So we're getting site links that take people to the main domain when the key term they're searching for is completely relevant to our site — they're searching for Accardo Zoom.

Sometimes that can happen. There's not really anything specific that you can do about that, other than focus more on the things that you have on your site. But especially with things like related business units, those kinds of setups, it can sometimes happen that we show site links from that.

What do you mean by related business units?

I mean, it sounds like the main website is on the main domain and then your part is on a subdomain?

Yeah.

Is that right?

Yeah.

Yeah, I mean, that's something where sometimes we've seen that it makes sense to show it kind of separately, but also linked in the site links. But it's really hard to say, because I don't really know your business or how you have that set up on your site. So if you want, maybe you can just drop a link in the chat here, and I can take a look at that afterwards to see if there's anything on our side that we need to change there.

OK, thank you very much. I'll drop a link there.

All right, ideally with a search query or something that I can reproduce; then I can pass that on to the team here.

Yeah, sure. Thanks very much.

Cool. Thank you.

Hey, John, I have a question for you. It's good to meet you. It's sort of similar to what Alexandra is going through. I run a site on the domain girlfriend.com. We bought the domain three years ago to build a brand, Girlfriend Collective — it's a clothing company, on the Shopify platform. Over the last three years, we've gotten a ton of really great earned media links, and we've done everything in our power to make sure that our branded terms show up in the search results. We haven't had any warnings in our webmaster tools saying that we have any penalties in terms of content or bad links.
And so I was just wondering if there were any other underlying issues that you would know of outside of that. The domain is girlfriend.com, and the query would be "girlfriend collective". It's been as high as, I think, the second page of the SERPs. But we get quite a few search queries for our own branded term, which just will not show up. My assumption is that before we bought it, it was a pretty spammy dating directory that was squatted on for a very long time. So I don't know if it was penalized because of that. But we haven't seen any warnings or anything in our webmaster tools.

Hard to say, because especially if there was really problematic content on there before, then sometimes that does take a while to settle down. But like you said, you've had it like this for a couple of years now?

Yep, yep. We've had it since 2016, when we transitioned over to the new site. We've cleaned it up and put it on a standard platform, Shopify, so it's just the storefront. And over the past three years, in terms of our media and links and everything of that nature, I think we've been above reproach. We haven't done anything that would be marked as bad in Google's eyes. So we're not really sure what's going on, and we'd love to know what we can do, or if there's some way to resubmit our site for anything.

Yeah, I don't know. So I'd need to double-check to see what exactly is happening there. But in general, if you check to make sure that there is no manual action in place, then you're kind of in a good place. The main thing I would look at — just randomly checking the settings that I see here — is to make sure that you've really cleaned up all of the old stuff that was associated with the domain. But I can double-check to see from our side if there's anything kind of sticking around there that you'd need to take care of. In general, when it comes to kind of generic terms like that, it's always a bit tricky.
But it sounds like you're not trying to rank for just "girlfriend".

No, no, we're trying to rank for the entire brand name, which is Girlfriend Collective. So there is a secondary word in there that we just won't rank for. And I think that was a concern for us — it just doesn't make sense to me. And, I mean, I love spending Google ad dollars, but at some point I would love to be able to —

Yeah, I mean, that wouldn't be related to —

Yeah, yeah, yeah, it was more of a side joke. But I would love to be able to at least show up on the first page; that makes a ton of sense. I can drop my domain and my branded term in the comments for you.

I think I found it, so —

OK, awesome. Yeah. We've disavowed, I believe, everything from before we transitioned over. So in terms of what we've done to wash our hands of what it was before, we feel like we've done it. And we know that it takes time for it to clear out — but I think three years is quite a while.

Yeah. Yeah, I'd need to take a look to see if there's anything sticking around there, because it does seem like the old domain was pretty problematic. That always makes it a little bit harder to turn it around into something more reasonable. But it feels like after a couple of years, that should be possible.

Yeah, for sure. Great, thanks for your help.

Cool.

Is there any way to follow up, or do I just join the next Webmaster Hangout and try to catch up?

Yeah, I think ideally maybe just jump in on the next one. I don't know if I'll have something specific for you — sometimes what happens with these kinds of escalations is the team will take a look and see if there is anything on their side that they need to change, and they'll make the appropriate adjustments. Sometimes that means longer-term changes; sometimes that means things that you'll see fairly quickly. But I'm happy to catch up next time.

Great, awesome. Thank you very much, John.
Sorry, just to jump back in — John, do you want my email address to follow up?

No, that's OK. That's OK.

I have dropped a ton of information in the chat.

Yeah. Really fantastic. Cool, OK. OK, thanks. Always good to have these details.

Yeah, hi. Hi, John. My name is Rolf. I'm from the Netherlands. I've got a small family-owned business called Little Rockstore, and there are 10 multi-store domains — country domains. Since the end of 2017, November 2017, our UK and US websites dropped significantly, about 40% of sales. First, we noticed that somebody scraped our website onto a different domain, and over the last one and a half years we noticed that happen like three or four times. So we got that website deleted — we had it put offline. But other than that, there's also someone who is using all the text on our website — product pages, home pages, category pages, et cetera — and uses those texts in links on other domains. When I do a search, for example, on a small phrase from one of our meta texts that is important for us, I see, for example, 25 or more links using only those phrases. And every day more come. Technically, we don't know how to solve it. But because it's only those two websites — and those two websites are English, and all those links are English — the drop in sales should be related to this. Don't you agree?

It doesn't necessarily have to be. So I think, first of all, it's important to keep in mind that especially when you have different websites for different international markets, they can rank differently across those markets, and you can see changes across those markets. In particular, if you have one English website for the US and one Dutch website for the Netherlands, for example, then they will usually rank differently. And the idea there is that we try to bring results that are relevant to the local users.
So it's certainly possible that maybe there is more competition for some of the terms that you're targeting in the US, and that means your website is ranking a little bit lower and getting a little bit less traffic in comparison to other sites. Those are kind of, I'd say, normal things. And all of the scraping and the links that you're seeing — for the large part, those are things that we have a lot of practice in ignoring.

But when I take a look in Google Search Console, I see, let's say, 50 or 60 links that are negative, so I can disavow them. But when I take a look in Search, there are, let's say, thousands of links. And I don't want them to be in the Google Search results. What can I do? And how do I really know that it's not affecting those two websites?

I don't think you can really know, because we don't list all of the factors that play a role in the ranking of individual pages. So from that point of view, it's not something where you can really have confirmation that those are not doing anything positive or negative. In general, if you see links that you don't want to be associated with, I would just disavow them — ideally disavow them by domain, so that you don't have to keep chasing those links. When they're disavowed, they will still be shown in Search Console, so it's not that they will be removed, and it's not that they will be hidden in Search. They'll still be indexed. They just won't count for or against your website.

Yeah, we're just a small company selling baby clothes, and the drop in sales, and all those negative effects on Google — it's causing a lot of headaches. So if you are able to take just a little look, that would be very nice.

Yeah, I mean, I'm happy to take a look. But from what you've described, it sounds like normal changes in ranking. That can happen.
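For reference, the domain-level disavow file described above is a plain-text file uploaded through Search Console's disavow links tool. Lines starting with `domain:` disavow every link from that host; the domains and URL below are placeholders:

```text
# Links from these sites were scraped copies of our pages;
# disavow the whole domain rather than individual URLs.
domain:spammy-scraper-example.com
domain:another-scraper-example.net

# Single URLs can also be listed, but domain-level entries
# avoid having to chase every new page that appears.
https://example.org/some-copied-page.html
```

Domain-level entries are what makes "you don't have to keep chasing those links" work: new pages on an already-disavowed host are covered automatically.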
So that's something — especially when you look at larger markets where there's more competition, there are lots of people fighting for those ranking places as well.

It's a really small niche. But I put the comments in the Webmaster Hangout chat here.

OK, cool. All right. Anyone else want to ask a question before we jump in? All right, go for it.

So the question is about one of our clients' e-commerce websites. We have faced an issue recently: when you visit the website and look at the URL of any page, at the end of the URL it shows multiple slashes instead of a single slash. Now, the problem is, whether I put multiple slashes or a single slash, it shows me the same content — there is no change at all. So does it cause any issue with the ranking? They have a lot of pages with this issue.

So what's probably happening there is, when we crawl the web pages, we find a link to those pages. Sometimes that happens if there are relative links on the page or on the website, where we keep finding a link with another slash on it. And then we keep crawling another level and another level. And if your server returns the same content, then we'll think, oh, this is OK. So on the one hand, from the crawling side, that's not so good, because we have to crawl a lot more URLs than we actually need. For most websites, it's not critical. What you can do to help fix that is to use the rel=canonical link element to tell us which URL Google should be focusing on. We might still crawl those other URLs, but we would try to focus on that kind of cleaner URL instead. Also, what I would do there is use a website crawler to test your site, to see if there is anything within the website that might be causing this — those relative links that I mentioned. Sometimes it's something fairly obvious: you have some code somewhere that just adds another slash to the URL, and it works.
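As a rough sketch of the cleanup suggested here (nothing Google provides — just an illustration): a site's templates or a crawl-audit script can normalize repeated slashes so every variant maps to one clean URL, which can then be emitted in a `<link rel="canonical" href="…">` element.

```python
import re
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url: str) -> str:
    """Collapse runs of slashes in the path so that /shop//shoes///
    and /shop/shoes/ both map to the same canonical URL."""
    parts = urlsplit(url)
    path = re.sub(r"/{2,}", "/", parts.path)
    return urlunsplit((parts.scheme, parts.netloc, path, parts.query, parts.fragment))

print(canonical_url("https://example.com/shop//shoes///"))
# → https://example.com/shop/shoes/
```

Running a check like this over the URLs a crawler discovers quickly surfaces the template that keeps appending an extra slash.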
So it doesn't cause any problems directly for people who are visiting; it just keeps creating more and more URLs.

Because they have a big website — more than 1,000 products, 50 categories, 50 brands — a big website, I think. And their rankings were pretty good in the last year, but when we saw this issue, the rankings were dropping, even though we do have canonical URLs set. We still had a lot of issues.

I don't think that would cause any ranking change. What would happen there, in the worst case, is we would crawl all of these separate URLs, and we wouldn't be able to find new URLs as quickly. But it wouldn't cause the existing URLs to rank less.

OK. Thank you, John.

Sure. All right, let me take a look at some of the submitted questions so that we get those covered as well. Let me see.

This question has been coming up for quite some time: we have a blog where we produce original content, with over 4.3 million visitors to date. An older, potentially more authoritative site has been covering many of the blog posts we put out, right after we post them, often ranking above us in Google for searches about our own content. Since it's not 100% verbatim, it's not a situation where canonical tags can solve this. However, they're pulling large chunks of content and images from our blog and reusing them. Any suggestions?

This happens to us all the time as well, so you're not alone. I think this is one of those situations where you have to balance the desire to spread the word — the information that you're providing on your blog — against solely limiting it to your blog. Especially if they're reusing the existing content that you've posted, you could probably go and talk to them and mention that this is your copyrighted content and nobody else is allowed to reuse it, to prevent them from reposting your content.
However, at the same time, that also means they wouldn't be reposting your content, and there would be less visibility for your content. So that's something where you have to balance those two sides. How you want to do that is kind of up to you; it's not something that we would directly be involved in. In general, just because content was originally produced somewhere doesn't mean that other people aren't able to talk about it, or even quote that content and rank with it separately. So it's not that we would always rank first the place where that content originally showed up. Sometimes other sites that talk about the content that was created add more value and seem to be more useful for users as well. So those two sides are sometimes a bit tricky to balance.

I'm unable to find the reason for the de-indexing of my images from Google.

Let's see. I don't know. So I have your URL here in the question, and I'll double-check to see what exactly is happening there. Offhand, I do see images from your site being indexed. So it's not the case that nothing is being indexed, but it might be that there are aspects that make it really hard for us to index some of these images. What I've sometimes seen is that it's tricky to access some images, or even to find them. For example, if you're using a kind of lazy loading that we can't really recognize, then that might be a reason why we wouldn't be able to pick those up. But it's really tricky to say. At least from what I see here, I don't see any technical reason or any kind of web-spam reason — let's put it that way — from our side where I'd say these images would be dropped for some manual reason. So my suspicion is that it's more of a technical thing with your website than anything else. But I'll double-check with the team here to see if I can find something more specific.

I have a website in a Nordic language.
When I search using the site command and Google is set to local language, I see a translate this page link. And it attempts to translate my home page from English to the Nordic language. But there's no English content on that page. When Google is set to English, the link disappears. Makes me think Google assumes my home page is in English when maybe it's not. Is there anything I can do to fix this? So I think the most important part here is for the most part, we can recognize when there are multiple languages on a page. And when we're not certain, then we'll show this translate link in the search results. So just because you see that translate link doesn't mean that we think your pages are in another language. It's just that we think maybe there's also content on these pages in a different language that we might want to pick up on. Sometimes that comes from things like vocabulary that you use on a page. So for example, if you have a page about vacation homes in Spain, then you have a lot of Spanish place names on there, which we might think are Spanish words, which probably are Spanish words. So we might show a translate link in a case like that, because we're not sure if there's something Spanish, for example, that would need to be translated into a local language. And in general, this kind of difference is not a problem. It wouldn't change anything with regards to ranking. It's really just a matter of usability for users, where we think maybe we should show this link just in case there's something there that would need to be translated. The one case where I would watch out for this is if you're doing something really fancy on your website with multiple languages, then I've sometimes seen a setup where you serve content in different languages on the same URL, depending on the perceived user's location or the perceived user's language settings. 
So, for example, if a user from the US accesses your site and you serve an English version, and a user from maybe Norway accesses your site and you serve the Norwegian version, and you do that on the same URLs — those kinds of setups are really problematic, because we only index the content of one URL. We wouldn't recognize that you serve different language versions on the same URL. And with that, what could happen is that when we crawl, we see the English version, and that's the language we think this page is in. However, when a user accesses that page, maybe from Norway, they see the Norwegian version. And then they get confused, because Google Search is showing an English version, but they actually see a Norwegian version. So again, I don't know your site; I don't know your setup. But if you're doing something fancy like this, using the same URL for multiple language versions, then that would definitely be problematic and could cause problems like this as well. If you're using just one language on one URL and you're seeing this happen, then my guess is this is just from individual words that we're recognizing on the page, and it wouldn't be a reason for any ranking change or anything else. If you want to suppress the translate link, I believe there's a notranslate meta tag that you can add to your pages. If you do that, we would also not offer to translate the page in the browser in Chrome, but we also wouldn't show the translate link in the search results.

Let's say a company provides services in states outside of the state where the company actually exists. Would the algorithm have an issue with this company writing blogs with the intent of ranking for searches related to this service in states outside of the company's physical location?

That sounds kind of complicated, but no, I don't think that would be problematic. It's pretty common for a website to have content that's relevant outside of just the immediate region where the webmaster is located.
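For reference, the notranslate meta tag mentioned in the translate-link answer above is documented by Google and goes in the page's `<head>`:

```html
<!-- Asks Google not to offer translation for this page,
     either in Chrome or via the "Translate this page"
     link in the search results. -->
<meta name="google" content="notranslate">
```

It can also be applied to individual page elements using `class="notranslate"` when only part of a page should be excluded from translation.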
So from that point of view, I don't see any problems with that.

I have a multilingual website where each language sits on a separate subdomain, and I use hreflang tags. But on some pages, the bulk of the content is only in one language — only the page templates and navigation are translated. Do I need to include all versions of those pages in my sitemaps? Will Google consider that duplicate content and downrank my site?

So we would probably recognize that as duplicate content, but duplicate content is not a reason for us to demote a website. Essentially, what we would do there is recognize that the pages are identical, or almost identical. We might index both of those URLs, but we would only pick one of them to show in the search results at any given time. So, for example, if you have English content for the US and English content for the UK, and you have exactly the same content on both, then we would try to pick one of those and show it in the search results rather than showing both of them. It's not a penalty. There's no manual action involved. There's no quality judgment on that. It's purely practical: we think it doesn't make sense to show that twice on the same search results page.

Can I ask a follow-up question, because I asked this in the chat?

Sure.

The problem is also that I think we have an issue with our crawl budget, because we have tens of millions of pages. So my question is, do we need to include even all the translated pages in our sitemaps or not? Because there are already the hreflang tags that allow Google to discover the translated versions by themselves. I don't know, because I see that Google has indexed only a small percentage of the total pages.
If the pages are on your site and they're linked within your site, then I would also include them in the sitemap file. That way we can recognize when they get updated and when they get added. So from that point of view, I wouldn't hide them from the sitemap file, but link them from within the website. If you really don't want them indexed, or you don't want them crawled, then I would make sure that they're not linked within the website, so that we don't run across them.

OK, thank you very much.

So yeah, I think especially in cases where the localized version is not significantly different from the others, it could make sense to fold those together, to say: this is the English version, and it's valid for all of these English-speaking countries. It's not that you need to make separate versions for each country individually.

OK, thank you.

Sure. Yeah, so I think that also answers the other question there.

I received this message from Google: a new issue was found — invalid value type for field "availability" — and then there's an offer.

I'm not really sure which URL this is on, but there's one mentioned in there, so I can double-check that. One of the things I would watch out for here is to make sure that you're complying with the policies regarding structured data usage. It looks like you're marking up an offer that is more like an event. So maybe it makes sense to double-check that you actually have this marked up as an event, not as a product that you're offering. It looks like some kind of exposition where people can go, and it's free. And just from the markup that you have there, it looks like you're marking it up as a product with a product offer, where it probably would be more like an event.

Talking a couple of months back about 302 redirects — that discussion was mostly focused on PageRank. I'm more interested in how Google determines the relevance of these pages.
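On the event-versus-product markup point above: a free exposition would normally be marked up as a schema.org Event rather than a Product, with the `availability` field set to one of the ItemAvailability enumeration URLs (passing a plain string there is a common cause of "invalid value type" errors). All names, dates, and URLs below are invented placeholders:

```json
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Example Free Expo",
  "startDate": "2019-06-01T10:00",
  "endDate": "2019-06-01T18:00",
  "location": {
    "@type": "Place",
    "name": "Example Hall",
    "address": "123 Example Street, Example City"
  },
  "offers": {
    "@type": "Offer",
    "price": "0",
    "priceCurrency": "EUR",
    "availability": "https://schema.org/InStock",
    "url": "https://www.example.com/expo"
  }
}
```

The nested Offer still carries the ticket price (here free), but the top-level type matches what the page is actually about.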
For example, if a blog post is ranking very well and is then 302-redirected to the home page, does Google continue to rank that page based on the last version of the blog post that it was able to crawl, or on the content of the target URL?

Essentially, if the content significantly changes — which would be the case here — then we would use the content of the new page. It doesn't matter if you're changing the content on the same URL or if you're redirecting to a different URL: when we access that URL, we see completely different content, and that's the content that we would use to determine the relevance of that URL in the search results. So, in an extreme case, say you have, I don't know, blue running shoes and red running shoes, and you redirect the blue page to the red page — it's not the case that the red page would suddenly rank for blue running shoes. Because that red page is all about red running shoes, the relevance of that page would be all about red running shoes, not blue ones. It's possible that you'd see some temporary effect there, especially when you're redirecting from one existing page to another, and it doesn't matter what type of redirect you're using. But essentially, what we use in the mid term and long term for individual URLs is the content of the URL when we access it. So if that content changes, through a redirect or through a significant change on the page itself, then that's what we will use.

It appears that, for one or more reasons, a website has suffered a major algorithmic penalty. It doesn't even come up on page one when querying for its own trademark name. There are no manual actions in Search Console. Many pages of the site are still being indexed, just not ranking. It's a large-scale Q&A site. Where would one begin with something like this?

I think this is different from the question we had before.
But essentially, what I'd watch out for here is the assumption that just because something is a trademark name or a domain name, it would automatically rank for that. So, for example, if your brand name is Best Widgets, and that's what your domain name is as well, that doesn't necessarily mean that we would always rank your site for "best widgets" — maybe there are other pages that are just as relevant for that kind of generic term as what's on your website. So that's one thing I would watch out for: don't assume that just because you call your company this, or you have this domain name, it will automatically rank for that, especially if the words in there are kind of generic and there are lots of other sites that might be just as relevant for those generic terms. It's really hard to say, though, with a general situation like this, where a website changes like this, especially without looking at the website itself. So that's kind of tricky. But we do have different algorithms that update over time; we have core algorithms that update over time. And sometimes they can change in ways where they say, well, we think for this query, maybe other things are more relevant in the meantime than we thought before. So that can certainly change. But again, just because something is a domain name doesn't necessarily mean that it will rank first for that phrase.

Our test site got indexed by Google. What's the best option to remove this test site from the Google index? Should we redirect to an error page, or set up a redirect to the live site? Is there some other option?

So I used to have a Google+ post about this, but I guess Google+ is gone, so there's not much that I can specifically point to. But essentially, when it comes to test pages, ideally you set them up in a way that Googlebot can't access them.
So things like setting up an allow list of IP addresses that can access it, or using server-side authentication to enable your testers to access the test site — that's the ideal approach, because that way you don't have to worry about what actually is on the test site; Googlebot and the other search engines won't be able to crawl and index that content. The other general approaches that we sometimes see, like blocking with robots.txt or using noindex meta tags, also work. But they have the downside that you have to remember to change your robots.txt file back when you make the test site live, or to remove the noindex meta tag when you make the test site live. That's something we see fairly often: people do a redesign, and suddenly the robots.txt file blocks everything, and their normal website is blocked as well. So that's why I recommend doing things more on the server-side level, where it's completely obvious when you accidentally push those settings live for your live site as well.

With regards to removing a test site like this, there are multiple things you could do. You could start blocking Googlebot, either by IP address or with server-side authentication, and those pages would drop out over time. You could start returning error codes like 404 or 410; that would work. You could set up redirects to your primary site if you wanted to; that would work, too. What I recommend doing to make the test site drop out as quickly as possible is to use the site removal tool in Search Console. So make sure you have the test domain verified in Search Console; then you can use the site removal tool there to suppress the whole site from the search results for a period of time — I think it's 90 days. And within those 90 days, we'll try to re-crawl things and make sure that everything has been forwarded to your main site if you've set up redirects, for example.
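As a sketch of the server-side authentication recommended above — assuming an Apache server; the paths, realm, and user are placeholders — HTTP Basic auth on the staging directory keeps Googlebot (and everyone without credentials) out, and is impossible to push live without noticing:

```apacheconf
# Require a login for the whole staging document root.
# Create the password file first with:
#   htpasswd -c /etc/apache2/.htpasswd-staging tester
<Directory "/var/www/staging">
    AuthType Basic
    AuthName "Staging - authorized testers only"
    AuthUserFile /etc/apache2/.htpasswd-staging
    Require valid-user
</Directory>
```

Unlike a forgotten robots.txt rule, this fails loudly: if the config ever ends up on the live site, every visitor immediately hits a login prompt.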
The downside to using the site removal tool is that if your test site was ranking instead of your primary site, then of course your test site would no longer be ranking. So that might be something to watch out for: is your primary website actually being shown in the search results, or is it just your test site? If only your test site is shown, then you probably want to do something more like redirecting, so that the test site can continue to be shown while sending users and search engines to the primary site. If the test site is not shown in the search results for normal queries, then suppressing it and cleaning it up immediately is probably easier.

Search Console shows some of my pages as not being mobile friendly, giving various reasons. But when I check, they look OK. Is it possible to get more details, or what should we be doing there?

So sometimes what happens is that when we test sites for being mobile friendly, we try to render those pages. And if we can't access all of the content that's required to make a page mobile friendly, then we can't immediately confirm that the page is mobile friendly. In particular, if we can't access the CSS files, then we can't confirm that the page would be mobile friendly when users access it. So ideally, don't block the CSS files from being crawled with robots.txt, of course. If you have JavaScript files that also play a role in the layout, then don't block those either. If a lot of embedded resources are required to make a page mobile friendly — a lot of CSS files, a lot of JavaScript files, a lot of server-side requests — then that could also be playing a role there. So ideally, make it as easy as possible for Googlebot to render those pages. Using the mobile-friendly test is a good way to double-check whether there is any significant blocking there.
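A robots.txt along these lines — the directory names are made up — keeps the resources needed for rendering crawlable even while other areas are blocked (in Google's robots.txt handling, the more specific rule wins, so the Allow lines override the broader Disallow):

```text
User-agent: *
# Block internal search results from being crawled.
Disallow: /search/

# A broad rule like this would break rendering on its own...
Disallow: /static/
# ...so explicitly allow the CSS and JavaScript Googlebot
# needs in order to render and evaluate the pages.
Allow: /static/css/
Allow: /static/js/
```

The mobile-friendly test will report any resources that are still blocked after a change like this.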
If you see a page that is flagged in Search Console as not being mobile friendly, but if you test it with the mobile friendly tool, it looks OK, then you should be all set. Sometimes there are also temporary issues that pop up where we temporarily can't access maybe the CSS files. But if that works with the mobile friendly test, then you should be all set. There is nothing additional that you'd need to do there. The thing that I would avoid is only looking in Chrome or only looking with a mobile device directly. Because by looking directly in Chrome or with a mobile device, you don't see when there are blocks in place that would only be affecting search engines, like robots.txt. How would Google treat a page with both a review rating and an aggregate review rating? We do long expert reviews on products, but we'd also like to add the option for users to rate the product as well. But we're concerned it might negatively impact our expert review rating. I don't know offhand what the policies are with regards to those two types. The thing to keep in mind is that the structured data should match the primary content of the page. For example, if you have a product and you have reviews, then that might be OK in a situation like this. The other thing to keep in mind is that there are only limited combinations that are shown in the search results directly. So sometimes you have a lot of different types of rich result markup on a page, but we would only show maybe one set of those, because we can't add all of those combinations and still make a reasonable-looking search result. So that might be another thing to watch out for. So I don't know offhand of a combination that wouldn't work with regards to reviews, but you could imagine situations where maybe we would show a recipe image, and there is some other content that you wanted to have shown, like maybe events or something like that, which wouldn't be possible to show in the same search result entry.
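Markup like this is usually expressed as schema.org JSON-LD. As an illustration only (the product name and rating values are made up), a single page can carry both an editorial Review and a user AggregateRating on the same Product; whether Google displays both is a separate question, as discussed here:

```python
import json

# Illustrative JSON-LD using schema.org's Product, Review, and
# AggregateRating types; all values below are hypothetical.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "review": {
        "@type": "Review",
        "author": {"@type": "Person", "name": "Staff Reviewer"},
        "reviewRating": {"@type": "Rating", "ratingValue": "4"},
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.4",
        "reviewCount": "89",
    },
}

print(json.dumps(product, indent=2))
```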
And for cases like that, you could theoretically mark up all of those different things on the same page, but we would only show one of those anyway. So it's maybe something where, if we only pick one of those and show it in the search results, then having both of those marked up is almost more work than you need to do. So that might be something to watch out for. I don't know offhand how it would interact with a review rating and an aggregate review rating in this case. I suspect we wouldn't be able to show both of those at the same time. So that might be something where you take the practical approach and say, well, they can't be shown at the same time, therefore maybe I should just focus on the one that I do care about most and make sure that that's marked up. And that can be different across different pages. So it might be that you have one page where you say, the aggregate review rating is more important for me here, or I have data for aggregate reviews, and for another page, maybe I just have a single review and just show that. I discovered an entrepreneur who co-founded a company with Thomas Edison. However, Wikipedia, Google, and other online sources have no record of his contribution to history. How would you go about resurrecting someone and adding them into Google? So I think the first approach there would be to make sure that this is actually content that's available online and to make sure that it's recognizable. So my general approach there would be to look at Wikipedia and see what you can do about making that information available there. In general, over time, we do pick up these kinds of things. So it's not that there is an immediate submit-to-Google-Knowledge-Panel button that you have anywhere, but if we recognize that things have changed or things have gotten added to recognized sources like Wikipedia and some of the other bigger sites, then that is something that we would pick up.
Is the URL parameter tool in the old Search Console still effective? Yes, it still works. The question goes on: I added a parameter that I wanted to have dropped, but it's not immediately visible in the search results. So the URL parameter handling tool does not remove URLs from the search results. It's really only there to refine the crawling of URLs. So it's not something where we would say, you add this parameter to that tool, and then suddenly all of those pages disappear from search. It's more that over time, we would crawl those URLs less, and over time, they would drop out of the search results. But there wouldn't be any immediate change there. All right, let's see. Hello. Yeah, hi. So my name is Govind. And yeah, I have a website, hightechgazette.com. And it's listed in Google News, approved by Google News. Basically, my website got a manual action. It said that my website had thin content, and they applied the manual action across all of my pages. So my question is that I have re-submitted my reconsideration request so that my website is accepted again and I can get my rankings back in Google. But it hasn't been approved yet. So I tried the URL that you have in the comment, and it doesn't seem to work. Could you please check my website? Where is the exact problem on my website? Because I have updated more than 500 articles on my website, and also deleted 1,500 articles that were not providing any value to the users. And after that, I am publishing many new articles so that I can get my website ranking again. Oh, it's OK. Yeah, I can take a look at that to see if there is something specific that is stuck on our side. But it does sound a lot like, from the content side, it was a bit tricky.
So what we sometimes see is sites that use kind of templated setups, where they scrape existing news content and lightly rewrite it. And those are all the kinds of things that we would find problematic, on the one hand for web search, and definitely also for Google News. So what I would recommend doing there is making sure that you're not just rewriting the old content, but actually making sure that when you're providing news, you're creating something new, that you're not repeating things that other sites already have, that you're essentially writing something from a new point of view. And I realize that's sometimes tricky with regards to news content, because news happens the way it happens. But it's really important that it's not content that's just pushed through a translation setup, or that's just lightly rewritten, where you use synonyms to replace words and swap things around in the sentences. It should really be unique, high-quality content that's written for your website. So that's kind of what I would watch out for there. I don't know your website offhand. So it's hard for me to say what specifically to watch out for. But that's something that we see quite a bit with certain kinds of news sites. Can you review my website? I can take a look, but I'm not on the web spam team. So I can't change that setting. What I would recommend doing is maybe posting in the Webmaster Help Forum to get advice from other peers and to really try to get as honest feedback as possible. So really explain which manual action you got, what kind of pages you used to have, what you changed, so that someone can really take an honest look at that. Because what you don't want to have happen is someone to look at it and say, oh, this looks OK, when you know exactly that there are parts of your website that are still problematic.
You really want to make sure that you get the hard feedback that you need to significantly level up and to make things much better. OK. So that's kind of the direction I would head there. OK. OK, actually it's been around one month. We found the issue of doorway pages content. So we have changed that. And we have already submitted for the review. Let's see what happens. Yeah, I mean, it's something also where, when you say, we updated 500 articles on our site, that seems kind of hard to do when you're a small team and you have a lot of content on your site. So that would worry me a little bit, where it seems like maybe you're taking shortcuts to have a large number of articles, but actually the content in these articles isn't fantastic. So that's something where I would generally tend towards saying, have less content on your site, but really make sure that the content that you do provide is as high quality as possible. If English is not your primary language, make sure that you have it proofread by someone who does speak English fairly well, or maybe post in a local language as well. That's something where we often see that actually you don't have to compete within this highly competitive English market. Maybe it makes sense to compete more in a local market, where you have fewer competitors, but your knowledge is so unique that you actually do fairly well. Do infographics have any role on our website? We are already adding infographics to our website, just after the manual action, so that we could get it back. Is there any role? Infographics are great. So if that's something that you can do to make your pages better for your users, then I would go for it. It's not the case that the web spam team would say, oh, there's an infographic, therefore I should remove all manual actions. But it's a good thing to do. I think sometimes it really makes the information on the pages a lot more visible.
We are not native in English, but yes, we can provide the information in terms of infographics. An infographic will directly tell everything to the users. Whatever you want to convey, we can convey through infographics. You can do that. I think the important part is that the text on the page is still there. So if all of the text is in a graphic, then we wouldn't be able to recognize that text. OK. So Google can't recognize the keywords in infographics. Exactly. OK. Thank you. Sure. Can I have your email ID so that I could communicate with you? Ideally not. So I try to keep it in these office hours hangouts, because it makes it a little bit easier to kind of work. OK. Thank you. Sure. Thank you. All right. Any other questions from any of you who are still here? Yes. Yeah. All right. I have a question regarding the latest-from snippet. So we have a client that is in e-commerce. And the latest-from shows random products from their site. How is it generated? How do you mean, the latest Chrome? Latest from. Like, there's a snippet that shows under the results with the latest articles or latest pages from that site. Oh, OK. I don't know exactly how you mean. Do you have an example search result that you could paste into the chat? Sure. I mean, all the pages from Sweden are like this. OK. I can take a look at that. So one thing that you probably want to watch out for is, I don't know, e-commerce sites. So I'd have to take a look. But the one thing to watch out for, especially with articles, is to make sure that you specify the date in a machine-readable way as well. So on the one hand, visible on the page, but also with structured data. I don't know offhand, with regards to the search results page here, how that would work with an e-commerce site. But if you're seeing results in or from blog posts, for example, that are shown there, that might be something where that would play a role. And this is in Swedish. Is that correct? Yes, it's a Swedish site.
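The machine-readable date mentioned here is typically the datePublished property in Article structured data, given as an ISO 8601 timestamp alongside the visible date on the page. A small illustrative sketch (page details made up):

```python
import json
from datetime import datetime, timezone

# The visible date on the page should match this machine-readable one.
published = datetime(2019, 8, 2, 9, 30, tzinfo=timezone.utc)

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example post",               # hypothetical article
    "datePublished": published.isoformat(),   # ISO 8601 timestamp
}
print(json.dumps(article))
```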
I can send a link with a screenshot. Yeah. If you have a screenshot or something that you can share so that I can see what you're seeing, that would be useful. Yeah. Cool. All right. Someone else had a question. Yes. Can I ask one more question, please? Sure. Perfect. So one of our competitors, they just got to number one. We eventually outranked them, because our website has been there for a long time and the quality is the best. So what they did is, they've been developed by a huge company. It's a worldwide company. And they've developed like a thousand websites, and they didn't rank. They link to all of these websites from each other. So like, mywebsite.com slash showcase, and you see all of the websites they've done, and they're all in the same sector and domain. They're all in health care. So they all eventually rank at number one, or have a good ranking on Google, because they're all linked to each other. And they've all been there for a long time, and these websites are considered to be authoritative. So what do we do? What do you do? I don't know. I'd have to take a look at the query that you're seeing. So are these new websites? Or are you saying they've been there for a long time? They've been there for a long time. And I did report some of them. And I think now I start to see that they started to lose backlinks. So I don't know if that did happen because now I can see they lost 400 backlinks. And I guess it's because of the reports I did. But should I do that for all of the websites? Or what's the deal here? I don't know. I don't know what exactly is happening there. So what you could do is maybe pass that on to me. So yeah.
Yeah, maybe you can post something in the chat here, and then I can pass that on to the web spam team, especially if there's something like a bigger pattern that you're seeing, where, like you mentioned, that one page lists all of the different websites. That might be something that would be useful for the web spam team to take a look at. OK, perfect. Because they do that for all of the websites they have. And it's just like crazy. Yeah. So much. I will do that. Cool. All right. Any other questions? Yeah, I got a question. I'm following up on the question about the Nordic website. OK. What we have there is, we actually did see a drop in traffic and a drop in rankings once that link appeared. Plus, there's no content in English. Nothing fancy, as you said. Nothing else that happened. And I saw that another website in my SERP got that link and dropped in rankings, too. That kind of played out well for me, but still, I dropped in rankings after that link appeared. Yeah, I mean, so from a practical point of view, usually that link is something that we show when we've already compiled the search results. So that wouldn't be related to anything from the ranking side. But I'm happy to take a look. If you can post the URL in the chat here, then I can double-check. There's, like, I wrote you actually an email, and you probably got it, but you couldn't answer, and I understand. But there's a screenshot of everything that happened there. OK. And if it's possible for you to take a look, that would be great. But I really don't want to post it, because it's my project. And another thing that happened with that website, I feel like everything broke down lately. It doesn't index images. I tried to index an image. And when I submit it to Search Console, it says it's blocked by robots.txt. But there are no blocks in robots.txt. I checked it with the robots.txt checker, and nothing comes up. So that's something else. Yeah, that's something else.
That's kind of confusing. But the URL inspection tool that you're looking at there to check the images, that's focused only on web search results. So if it's a web page, then we can fetch that. If we recognize it's an image, then we don't fetch that at all with the inspection tool. And I think it shows something like blocked by robots.txt because we essentially block those from being shown in the web search side. So it doesn't mean that we can't fetch that image. It's just that we can't use that tool for that image. OK, gotcha. So it's not a problem. It's not a bug. Yeah, it's confusing. It's not really, it's not a reason that your images would not be indexed. As long as that's not a problem, I'm not worried. Yeah, so what I would check there is, submit the landing page, the HTML page, for that image. And that's actually a logo. And then double-check the rendered view of that page. And if the image shows up there, then you're all set. Oh, OK, so I did check that, and it does show them. But regarding my first question, is it possible for me to send you the email about it? Will you be able to take a look at it? Ideally, maybe. Where did you send it? John, if you could take a look for a second. I sent it a couple of weeks ago, I think. OK, I don't know. I get a lot of emails, so. Yeah, I know, I guess that you, like, OK, I get it. I get it, but I think I sent it a couple of weeks ago. OK, I'll dig it up. I'll see what I can find. Yeah, it's my project, and it's a really competitive industry. And I don't want people looking. I saw a drop in rankings immediately after that link appeared. And when I search for my key phrases that I used to rank for, with the SERP set to English, then I'm back to my previous rankings. When it's in the Finnish language, I'm dropped. But when I turn it back to English, I'm back to my previous rankings.
So that's what made me believe that I lost rankings because of that. But I don't really have almost any English-language content on that website, like, nothing. Yeah. OK. I'll dig up the email. I will send you a follow-up right now. OK. That would be useful. Cool. Thanks a lot. Thanks. OK. Love it. Cool. All right. Wow. I have a bunch of homework this time. That's good. Let me just copy out all of this stuff from the chat so that I have those. OK. I think we're a bit over time. So let's take a break here. Thank you all for all of your feedback and questions. And I'll see what I can do to get some of those things resolved that were submitted, or to figure out what we can do differently for maybe one of the next hangouts, if you want to join back in then. Thanks again for joining. I hope this was useful. I wish you all a great weekend, and hope to see you again in one of the next ones. Thank you. Thank you. Bye, everyone. Thank you. Thanks. Thanks, John.