And we're live. Oh, wow, I don't see any text anymore either. OK, welcome, everyone, to today's Google Webmaster Central Office Hours Hangout. My name is John Mueller. I am a webmaster trends analyst here at Google in Switzerland. And part of what we do is talk with webmasters and publishers like the ones here in the Hangout, the ones that submitted a bunch of questions, sometimes in the forums, sometimes on Twitter as well. As always, if there are some new faces here, feel free to jump on in and ask the first questions. I see a lot of faces I've seen before, not necessarily old faces, but familiar ones. But if some of you here haven't had a chance to join these in a while and have a burning question, feel free to jump on in.

Because it's still the same old faces, I have a question, just regarding branding, if that's OK. So, review sites that try to kind of hijack a customer. In other words, review sites that are, let's say, example.com and example1.com, and that try to hijack a brand. What do you guys do in the case where that overlaps with a brand? In other words, a review site is trying to hijack a brand's name. Like, for instance, you were their customer and then you left them, and now they try to kind of spam that brand name with two review sites. It's something that I sent you before, that you've seen. So in this situation it's like you're helpless, and the only answer would be either you or someone at Google. And if you go to the forum, you don't want to do that, because it leads to something else and creates certain problems. So what can be done in that case? And it's not just that particular area. I'm sure it happens in the insurance area, it happens in the car area, it happens in those areas. So what can be done with situations like that?

Not really much. So essentially what's happening is someone else is ranking with your brand name. And from our point of view, we don't have anything in our search results that would say, this is the official site and only this site can rank for those names. Essentially, any site that has content on it that contains that name, or that product, or brand, or whatever, can theoretically rank in the search results. So it's not something where we would manually sort out the URLs and say, oh, that's not an official URL, therefore we won't rank it high in the search results; or, that is the official one, therefore we will remove all of these complaint sites from the search results and instead just show the official one. That's something where organic search essentially just plays out.

But he is still crowding the results after 10 years, which is a long, long time, and still nothing has been done. He's still doing it. There was a bit of calm, and now it's back again, with him doing what he's doing. Yeah. Do you want to see that again? Should I just send you that example again?

I'm happy to take a look at that. But in general, if it's just someone else ranking with a brand name, with a company name, that's kind of how the search results work, in that anyone could theoretically rank for that. But, you see, the stronger you make your site, the more you'll kind of be safe against some random person just putting the name on their site and ranking for that as well. It is sometimes tricky, though, and it sometimes takes a bit of time for a site to really be so strong that you don't have to worry about that on a day-to-day basis.
OK, and then the last question before we move on to everything else, just regarding the disavow file. The question hasn't popped up in a while, but you've talked about it before. So, say a disavow file has 1,000 links. I take my time and dig deep into that disavow file, and from the 1,000 links I find 500 that are just dead, and I clean those up. If I keep cleaning that up, is that something that you want, or will your machine learning do it on its own?

Like before, I guess we do try to figure that out on our own and take care of that for sites. But if you're seeing these things and you really want to make sure that the algorithms don't have to think twice about them, then I would just put them in the disavow file, and then you're kind of done with it.

So cleaning up dead links from a disavow file is necessary or not necessary?

I wouldn't say it's necessary. Dead links are something that falls out anyway, so that's less of a problem. But if you see problematic links to your site, and you really want to make sure they're not taken into account at all, then you can definitely put them in a disavow file.

OK, because I was asking about the current disavow file that's in there, since you had said it's a good idea to clean the links sitting in the disavow file on a regular basis. Like, if I added 1,000 links and then four days later these XYZ sites fall off, I go back into the file. I'm kind of helping you guys clean up stuff as well. Is that a good thing? Should that be done?

I'd say most sites don't need to worry about that. OK, all right.
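For reference, since the file format keeps coming up here: a disavow file is just a plain UTF-8 text file uploaded through the Search Console disavow tool, with one URL or domain entry per line and # marking comments. A minimal sketch, with made-up placeholder domains:

```text
# Disavow individual pages that link to the site
http://spam.example.com/some-page.html

# Or disavow everything from a whole domain at once
domain:link-network.example
```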
John, can I ask a quick question, kind of related to what we've been talking about before? So, you saw Dan Sharp from Screaming Frog kind of hijacked your search engine optimization PDF guide. And then I went back. Do you have any comments about why or how that happened?

I don't know the details of what specifically was happening there, so it's kind of hard to say. From our side, we try to avoid doing anything special with our URLs. So this is something where, I think, for quite some time other sites have been ranking with that specific PDF. And from our side, this isn't something that we would take to the engineers and say, hey, you need to do this or that, or we need to tweak the algorithms to make sure that our PDF always ranks number one. So this is, from my point of view, kind of interesting to watch, to see how it settles down, and not something where we would get involved in a manual way, because it's our site, and that would be kind of unfair to the rest of the web if we did something manual there.

But this was an example of your exact PDF being hosted on a third-party website. So it's kind of like a DMCA issue.

Potentially, I don't know. I don't know who at Google would submit a DMCA complaint for something like that. But it's possible.

We can blame the RIAA if you want. You guys have your own DMCA process. I forgot what it's called right now, but there's another section of Google where you can submit, right, like a DMCA complaint. I think it's a part of that.

But that's for the search results, yeah.

Yes, yes.

I mean, that's not something that we would do, or at least not the webmaster trends analysts, my team, the people working on the web search side. That's something where, if the legal team finds it on their own and they say, oh, this is a hassle, we need to take care of this, then they might do that. But it's not something that we would kind of push internally.

Right. So no comments.

I have a question for you. Yeah, go for it. OK, actually, we have two of them. The first one: we have a site that is basically about reviewing medical clinics, hospitals, medical providers generally. And at the start, while we don't have a lot of reviews, the pages we show, for example the clinics from a certain city, won't have much review content. Of course, we'll have the basic details about the providers, things like that, but not a lot of original content before more reviews kick in. So what I thought of doing was something like adding some snippets on the side of a page. Basically, we'd have 100-word snippets about the country the clinic is in, so 100 words about the country, let's say the United Kingdom. Then a 100-word snippet about the city, let's say London. Then another 100-word snippet about the kind of specialty, let's say dentistry, stomatology. And we'd mix these snippets across various pages. On each page they would be separate, but they would repeat on certain pages. For example, the page for cardiology clinics in London and the page for dental clinics in London would both have the UK part and the London part, but different snippets for cardiology and dentistry, respectively. And a cardiology page in London and one in New York would have an identical cardiology snippet, but different parts for the country and city. Is that OK, or would it be seen as duplicate content? Is it good, is it bad, is it useless?

I would be careful with just automatically doing something like that, because it quickly goes into the area of doorway pages and doorway sites. So that's something...

No, it's not the main content. It's content that is just useful for various reasons, because the site is basically about medical tourism, recommending people where they could find treatment at different prices all over the world. So it's kind of useful to tell them, OK, this is what you can find in this country, this is what this city is about, this is what this kind of treatment involves.

I'd just be careful to really make sure that the content you're putting on these pages is useful for the users and relevant for them. So not something where you're just artificially creating these pages, but where these pages really stand on their own. And sometimes that means that you can share content across these pages. That's, I think, kind of normal. If you have one product that you're selling in different variations, maybe you have different pages for that product, and they share the same description, or they share the same technical details with the exception of maybe one or two lines. I like saying widgets: you have a widget description that's shared across the pages for large widgets, blue widgets, and so on. I think that's something where, if you're doing it really well, then that's certainly an option. But I would really be careful to avoid the situation where you're just taking various databases and mixing them together, and the whole site is essentially auto-generated. So that's what I would watch out for.

So these snippets would still be somewhat boilerplate, while the main part of the page would still be the new part.

That's perfectly fine. Some boilerplate is kind of normal.

OK, and second question.
It's about special characters in different languages. I'm talking especially about the diacritic characters in Romanian, the letters with a little comma beneath them, or marks like the umlauts you also have in German and the accents you have in French. What is recommended on a local-language site, to use them or to not use them? Basically, I see all the big sites are using them, and it's theoretically the correct form to use them. But on the other side, people searching, especially from mobile devices, will often search without the special characters, because it's just easier and faster to type.

I'd use them the way that you think makes sense. So in the content, I would definitely use them normally. Usually what happens in search is that we recognize that the words are the same, and we treat them as synonyms. So that's something where we usually do a pretty good job. Sometimes you'll see slightly different search results if you search with or without the accents or the umlauts, but for the most part we get that pretty right.

What I noticed is... hello? Yes. What I noticed is that if I use the form with diacritics for a special character, and I search in Google for the normal word, just a personal search, my word won't appear as bolded. You might treat them as synonyms, but you don't bold the synonyms, and you know that the exact search phrase is bolded in the results. Do you think that might affect click-through rate and things like that? So does it make sense to use them?

It probably doesn't make such a big difference. But this is something where I would definitely talk with other people that are active in your market and see how they're seeing that as well. For the most part, though, we've gotten used to people putting all of these decorations on the characters. That's a part of the language there. So it's not something where I would say, oh, you should never mark your characters up like this, you shouldn't use the umlauts, or the diacritics, or whatever. That's essentially the proper way to write the content, so that's what I would use.

But over the long term, I think Google will move more and more in the direction of determining that better and better, and there will be no disadvantage to writing properly and using the proper characters of the language?

Yeah, I would definitely go in that direction.

So for the long term, you would use the characters in the language format, with the special characters of the language. OK. And then for URLs, use the plain Latin characters, or just skip the special character?

Whatever you want. I see a lot of people use the Latin characters without any special marks, just to make it easier to copy and paste the URLs. But whatever you want, both work.

OK, thank you a lot. Sure.

All right, let me run through some of the questions that were submitted. The first one is a topic we've seen a bunch of times: in the past, you said that Google is discounting hidden content, like click-to-expand, for ranking. Is that still true in the mobile-first world?

So with mobile-first, that's something that will be different. At the moment, it's still the case that if you have content which is loaded but not visible by default, we will see that as being less important on the page, and, for example, we'll probably not use it in the snippet. But on mobile, we'll try to use that as much as possible. That changes mostly because on mobile there just isn't that much room, so sometimes it makes sense to fold things together and have these kinds of UI elements.
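To make the distinction concrete, here is a minimal sketch of click-to-expand content that is still fully present in the HTML. The wording is invented for illustration; the point is that the text arrives with the page and is only visually collapsed, rather than being fetched from the server only when someone taps.

```html
<!-- The collapsed text is part of the initial HTML payload, so it can
     be indexed; under mobile-first indexing it gets full weight. -->
<details>
  <summary>Shipping and returns</summary>
  <p>Orders ship within two business days. Returns are free for 30 days.</p>
</details>
<!-- Content that is only fetched over the network after a tap is a
     different case: the crawler may never see it. -->
```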
A question here from Peter: how important is a good, informative 404 page? Does it carry any weight in the algorithms?

So, from a search point of view, what happens when we see a 404 result code is that we ignore all of the content on that page. Purely from an SEO point of view, the 404 page content is completely ignored; you could leave it completely empty. On the other hand, users, of course, when they land on a 404 page, want some way to still find the content they were looking for. Maybe they were navigating within your site, or coming from another site to your site, and they'd like to actually do something on your website, maybe buy something. So a good, informative 404 page makes sure that those users don't get lost. And in turn, you might have the kind of long-tail effect of people saying, well, this site helped me, they had a really good product, I will recommend it to my friends, which in turn could result in something like a link that we could pick up and use for SEO. So that might be an option there. But purely with regards to SEO, whether you put links on the 404 page or not is something we completely ignore.
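One practical detail worth pairing with this: the helpful page still has to be served with a real 404 status code, since answering 200 would turn missing URLs into soft 404s. A common way to wire that up on Apache, with an invented file name:

```apache
# Serve a helpful "not found" page while keeping the real 404 status;
# the page content is for users, since Google ignores it for ranking.
ErrorDocument 404 /not-found.html
```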
We have an Android app which is localized into several languages in Google Play. The question goes on to say that this app is listed once in Search Console, and all of the statistics come together. Is there a way to separate that out? Can I have multiple listings in Search Console?

No. At the moment, that's not possible. Essentially, if it's one app, then it's one listing; that's one entry in Search Console. So that's not something you'd be able to split out, like you might with a website, where you have different subdirectories or subdomains that you could split out.

I try not to use iframes, says Greg, but does it help Google to see that my 12 nofollow links are in a single ad set and not in 12 separate ads?

So if these are nofollow links, then we ignore them. We don't pass any PageRank, we don't pass any signals through those links. So whether it's one iframe that has these links, or those links are in one kind of div, or they're separate links on a single page, that's essentially the same for us. So that's not something where you'd need to mess around with iframes. Also, with iframes it's always a bit of a hassle when you look at how things work on mobile. So I'd try to avoid going down that direction, unless you really need to block that piece of content from being crawled, for example. Then that might be an option.

We moved our site to HTTPS a few weeks ago, and about half of the site is still showing up in the search results as HTTP, including important pages. We submitted the URLs via sitemap and Fetch as Google in Search Console, but Google still doesn't seem to want to pick up the HTTPS versions, even though the cache is showing a date well after we moved. Should I be concerned about this? Our search rankings have fluctuated a lot since the move. It seems that older sites don't handle switchovers to HTTPS as well as newer sites.

So, in general, this kind of move to HTTPS should just work out. I don't know if that was your thread, but I found a thread in the help forum where something similar was mentioned, and I passed it on to the engineers to take a look at what exactly is happening there. Sometimes what happens is that we just can't crawl a site properly. There might be technical reasons on the server side. There might be limitations in the robots.txt file where we just can't crawl as much as we would like to, to be able to recognize that this is actually a clean one-to-one shift from one protocol to another. So if that was your thread in the forum, I'll try to follow up there. If you haven't posted in the forum, that might be a good option.
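For context, a clean one-to-one shift usually means every HTTP URL answers with a 301 to the identical path on HTTPS. A minimal sketch for Apache, assuming mod_rewrite is available and using whatever hostname the site already serves:

```apache
RewriteEngine On
# Redirect every HTTP request to the identical URL on HTTPS, one-to-one,
# so crawlers can confirm this is a clean protocol switch.
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```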
Let's see. Does the use of hamburger menus on desktop sites, versus traditional menu bars, affect how Google crawls, indexes, or ranks pages on a site?

No. Essentially, that doesn't change anything. We can navigate through the site normally; we can see the internal links just as well. The only situation where it would make a difference is if the content of the menu isn't loaded by default on the page. So if you click on the hamburger icon and it does an Ajax request to pull the content and then shows the menu, that's something we probably won't notice. On the other hand, if you're just using CSS or JavaScript to turn it on and off, then we'll find those links, we'll be able to follow them, we'll be able to use them just like anything else. With regards to hidden content, that's more of an issue if there's important content that you're also hiding in the hamburger menu. So if you have a big image that's visible, and the hamburger menu actually has the textual content, that might be kind of confusing to our algorithms, and probably also to users. So that's the situation where the hidden content on desktop would play a role.
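The crawlable variant described here might look like the sketch below; the link targets are placeholders. The links ship with the page, and the click handler only flips visibility.

```html
<!-- All menu links are in the HTML at load time; the button only
     toggles visibility, so the links can be discovered and followed. -->
<button onclick="document.getElementById('menu').hidden = !document.getElementById('menu').hidden">
  Menu
</button>
<nav id="menu" hidden>
  <a href="/products/">Products</a>
  <a href="/blog/">Blog</a>
  <a href="/contact/">Contact</a>
</nav>
<!-- By contrast, a menu whose links are only fetched via an Ajax
     request after the click is the case that probably goes unnoticed. -->
```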
With regards to the interstitials update: it's been suggested that triggering IP-based pop-ups for sending people to a local version would be bad. What can we do there?

So, from our point of view, that is definitely something that you don't really need to do, in the sense that we try to send users directly to the local version already. And if people land on a version that's not really the one you'd expect them to use, then our recommendation has always been to show a subtle banner on top that says, hey, you're on the Swiss version of this website, you probably wanted the German version or the Austrian version, here's a link to go there directly. So instead of blocking the full content, make it possible for people to go there, but don't get in their way.

I have a domain which was doing well over the past three months. All of a sudden, Google made some update, and now it's not ranking so well. What can I do here? I moved my whole website to HTTPS as well, and it's still not being crawled properly, or not ranking as well as I'd want it to.

So this is probably the type of situation where I would go to the Webmaster Help forums first, to get feedback from a broader group of people, to hear from peers: what do they think about my website? Is my website really as good as I thought it was? Was I really ranking in a position that makes sense for users, where it's relevant content? Or maybe you were ranking really well, and our algorithms updated, and they figured it out and thought, oh, actually, this site isn't as awesome as we thought it was. And the important thing to keep in mind here is that most of these issues, where we change the ranking of a site based on our quality assessment, based on our ranking algorithms, are not due to technical issues on the site. So if we change the ranking of a site based on our quality evaluation, then moving to HTTPS is not really going to fix things. Similarly, just putting a responsive theme on it isn't really going to change the overall issue that our algorithms think this website is not as high quality as it could be. So it's important to take a step back, get feedback from a broad group of people, listen to the harsh feedback as well, and think about what you can do across the board to make sure that your website isn't just technically OK, but also, from a quality point of view, from an overall user point of view, the best it can be. There's no simple trick there. And sometimes it also means that a business model you used in the past doesn't make so much sense anymore.

My other suggestion would be to also use text-to-speech. If you're too lazy to read your pages over and over again, put them through text-to-speech, and if they don't sound good, change them up. So that's another one.

Yeah, that's a fantastic suggestion. That's something where you recognize this SEO writing style fairly quickly: if you had someone write it and they just included the same keywords over and over again, then when you listen to someone actually speak it, you'll probably recognize that it doesn't quite sound the way you would talk to someone, or the way you would present your product, or your site, or your business, to other people. Exactly.

Can I jump in with another question? Sure. So, remember that mobile-friendly bug, where you were mislabeling sites? There were two things that I didn't dig into, so as not to bother you guys. One is that it was only supposed to show for the site owner if the site's not mobile-friendly, right? That was part of the bug?

I think that's always been the case there.

OK. And that has been fixed, in the sense that if your site is mobile-friendly according to the mobile-friendly testing tool, then, assuming the timing is right, it would now show the correct label?

That's been fully fixed.

And who it shows to has not been changed?

Yes, exactly. It's really only for the site owner. It's meant as information, like, hey, you should fix your site. So it's not something where we're publicly blaming you for not having a mobile-friendly site. It's really just information for you: hey, we checked your site, we found it wasn't mobile-friendly, you should try the test to make sure that it's really not mobile-friendly, and fix whatever issues might come up there.

And that site-owner determination is based on whether you have added the site in Google Search Console? Yes, exactly. OK, so if it's not in Google Search Console, it is not displayed? Exactly. OK, may I have one more, a different one? All right. OK, I promise to try to be the last one.

On another site, we are trying to merge quite a lot of pages. Not because there was a penalty or anything like that, but we were not happy with the amount of content we had on them, so we're trying to mix them together. Doing that results in quite a lot of lost pages, lost URLs. In some cases we're even losing a whole subdomain, because it contains products we're no longer selling, things like that. And we are trying to redirect those pages to the pages we think are most relevant, on a case-by-case basis. But in certain cases, for example for products we are no longer selling, it will be a lot of pages, maybe hundreds, redirected to a single page that says we are no longer selling this product, we may have something else, a wishlist, things like that. So basically, how would Google see all these redirects? Would it see them as soft 404s?

We would probably see them as soft 404s, because essentially you're saying this page is no longer valid. So we'd probably see those as soft 404s, maybe not always, but if there's no replacement, essentially, then it is kind of a soft 404, right? It's something that we should, in the end, drop out of our search results. If someone is searching for that, we wouldn't knowingly send them to a page that says, oh, we don't actually have this.

OK, but could it lead to any penalty or anything like that, because we take more pages and redirect multiple sources to a single destination?

No, there's nothing bad about having soft 404s.

So worst case, they will be seen as soft 404s, and we'd lose the credit and link power for those pages, but nothing else, no risk of a penalty or anything like that. OK, because that's what would make the most sense for our readers.
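For what that can look like at the server level, here is a sketch for Apache with invented product paths. The first form is the many-to-one redirect discussed above, which may simply be counted as a soft 404; the second is the more explicit alternative of answering HTTP 410 for a page that is gone for good.

```apache
# Retired product redirected to the closest relevant page; Google may
# count this as a soft 404, which carries no penalty.
Redirect 301 /products/widget-2000 /products/widgets/

# Or state outright that the page is permanently gone (HTTP 410).
Redirect gone /products/widget-1000
```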
And on the same site, we have another page we promoted. It was a contest for bloggers, and it turned out incredibly popular, and it got a lot of backlinks. The thing is that it now has more backlinks than our home page, because we didn't ask for those links. They were natural; we didn't ask people to link to the main page or to a product page. So that page promoting our campaign is actually now by far the most popular page on the site. Could this be seen as a bad signal, because an inner page has way more links than our home page, more links than any other page?

That could be perfectly fine. I think the situation where you're doing a contest is something to watch out for, mostly with regards to whether or not these are actually clean links, in the sense of: are you perhaps exchanging something for these links? Like, do you require a link in order to take part?

No, we're not requiring anything. Basically, it's just a contest.

That's the kind of thing I would watch out for. But just having a lower-level page with more links than the home page, that happens a lot. That's completely normal. OK, thank you.

John, can I just jump in with a quick question, if that's all right? All right. Sorry, just quickly. It's to do with geo, and how Google determines the scope of which locations, which countries, it shows a particular site in. I was just wondering, because I know people in the UK recently, well, I know Dawn has, and I have, and a few other people have been reporting that we see results in the SERPs that aren't necessarily related to the country. And we get it the other way around, where we get a lot of interest in our site from other countries, but we have a country-specific TLD, .co.uk. So the question is really: how much does Google look at the content of a site, and the inbound links of a site, to get a bit more information about what that site is relevant to? Because obviously, if you've got a country-specific TLD, that's a clue for Google. If you don't, and you have a generic TLD, then in Search Console you can provide a few more clues as well. But there will still be some sites, some .co.uk sites, that are only relevant to the UK, and there will be some .co.uk sites that might be relevant to the world, or to Europe. How much does Google look at content and inbound links to make that determination?

That's hard. No, I don't have any number where I could say 72%. It's always a tricky balance, because you can have a website that's hosted locally which is relevant globally. So it's not the case that a .co.uk should never show up in Australia, or the other way around. But at the same time, showing something that's really focused on a small local service area in completely the wrong country is also a bad experience. So we have to try to balance those two aspects. This is always something where feedback and sample queries are really helpful, because the team that works on this is constantly trying to work out how to fine-tune these search results to make sure they're picking up the right local ones. But sometimes, when there are no local results, for instance, if there's a query for which you have nothing local to serve, you might show Canadian sites, you might show Australian sites, right? If you have nothing else to show for that query.

I think for us, the issue goes, again, two ways. As a user in the UK, we see things from other countries, and I think I've mailed you a few and we had a bit of a debate about it. But also, as a site owner, probably about 10% of our traffic is from countries we don't serve, which doesn't sound like very much, but that traffic seems to be very high quality. Those users try and try and try to convert, and that's a really poor user experience, both for them and for us, because we have to go back to them when they request delivery quotes and all kinds of things and say, I'm really sorry, we don't do that. And, I'm sorry, Google is showing us in the results. And then they get really upset as well, because they've spent 25 minutes on the site adding everything they want to buy to the basket. So I guess the question was more about whether there is something site owners can do. For example, if site owners have the word Europe on their page somewhere, but they're UK-only, is there something they can do to remove that? Or if they have a link profile that's very heavy in links from across the globe when perhaps they don't serve those countries, is that an area they can look at? Or is it really something that's completely out of their control?

That's pretty much out of their control, yeah. There's something similar for video, where in the video sitemap you can specify, don't list this video in these countries. But that's something that, as far as I know, is mostly there for policy reasons. The web search team has, in the past, kind of shied away from having markup like that, where you could say, oh, I don't want my website shown at all in these specific countries. I do think this is something where having examples makes a lot of sense, because that's something we can pass on to the team, and they can say, well, we can't fix all of these, because it's way too complicated, we'd need a new meta tag that webmasters could put on their site, and we wouldn't be able to trust that, because, I don't know. But those examples do help us quite a bit.

OK. Again, it's easy to give examples of where we see stuff in our country that's wrong; it's more difficult to give you examples when we're the ones constantly getting customers from other countries. I do vaguely remember that there is something in the Webmaster Guidelines; I'm sure that when you have a generic TLD, it says somewhere in there that Google does look at the content of the site and at inbound links. I'll try and dig it out and send it to you, if that helps at all.

That's more for geo-targeting, though. But I guess if people are coming to your site and you can recognize that, you could also show them a banner, and maybe work with a partner company in that other location where they're always coming from and say, hey, I'll send you people, you send me people, or we do some kind of affiliate deal or something. That might be an option.

Yeah, that's a bit of a fudge, though. I mean, it would be best if they just got search results that are relevant to them. But OK, thanks anyway. Yeah.
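That banner can stay deliberately low-key, in line with the earlier interstitial advice: don't block the page, just offer a way out. A hypothetical sketch, with the partner URL and wording invented:

```html
<!-- A dismissible notice for visitors detected outside the delivery
     area; the page underneath stays fully usable. -->
<div class="geo-banner">
  <p>We only deliver within the UK. For your region, try our partner at
     <a href="https://partner.example/">partner.example</a>.</p>
  <button onclick="this.parentElement.remove()">Dismiss</button>
</div>
```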
John, sorry to quickly go back to the mobile-friendly label. When I was doing the research on the 17th, which I think was Friday, I was able to see, for sites like AT&T and all the sites complaining, whether they had that label or not.

No.

I was. I double-checked. I personally was able to; that's my screenshot, showing AT&T.com slash internet, "your page is not mobile-friendly." That's my screenshot.

They posted it on Twitter.

No, I did my own screenshot as well.

But that's the same image that the guy from AT&T posted on Twitter.

It could be, but I actually was able to check that. Oh, for God's sake, I'm not going to argue.

All right. I mean, if we were showing it to other people, then I think we would have seen a lot more people complain about it. But I double-checked on that, because that's one of the things I was afraid of as well. OK.

Let me see, there are two questions in the chat as well. On the topic of HTTPS: are ranking fluctuations normal, and for how long?

Usually, with a move to HTTPS, you have minimal fluctuations in rankings, in the sense that we can just switch things over, especially if it's really a clean one-to-one move. If all the redirects are set up, and while crawling the site we can recognize that everything is redirecting one-to-one to the new version, that's something we can usually pick up within a couple of days. If you double-check with a site: query or the Index Status report, you'll probably see one go down and the other go up, but the overall traffic should be fairly stable; it's just sent to one or the other. If you're seeing ranking fluctuations for longer than a couple of days, or even a couple of weeks, that's something I would definitely bring to our attention, so that we can take a look at what went wrong, because that's something we try to avoid. We've worked very hard with the ranking and search teams to make sure that HTTPS migrations, especially, are as easy as possible, with as little pain as possible.

Will social link signals be lost? As far as I know, when it comes to Google Plus, for example, when we recognize these kinds of redirects, we do forward that. As far as I know, with Facebook and Twitter that also happens; I don't know for sure that it always happens, but it's something that usually takes quite a bit of time to settle down into the new state. With regards to all of these social mentions, it's important to keep in mind that Google Search doesn't take those into account anyway. So if, for example, all of your tweet counts reset, then from a Google Search point of view, that's not really a problem. But in general, these other social networks understand redirects as well; they've had a lot of time to work with sites that redirect. So that should settle down over time.

Thanks, Sean.
There was another question here: if you have multiple internal links to the same page on your website, does it get crawled multiple times? If so, how can you tell the bot to only crawl it once?

Usually, we crawl all pages multiple times anyway. So, depending on the page, we will crawl it several times a day, sometimes once a day, sometimes once a week, sometimes once a month, sometimes maybe every couple of months. And that mostly depends on how we recognize this page, how often we think we need to re-crawl it to stay up to date, and less on how many internal links you have to those pages. Obviously, if there are lots of internal links to a page, then we might assume that this is actually a pretty important page within your website and crawl it a little more frequently. But in general, it's not a problem from our side if we crawl a page too often. It's not that anything gets reset with every re-crawl; it's just that we're looking for changes, and if there are no changes, that's fine with us as well.

Yeah, hi, John. Actually, I have pinged the URL in the chat. The thing is that Google is adding my page name at the left of the title tag, and because of this, the entire title tag looks like some weird kind of title. I know that Google often adds the brand name in cases where it's not in the title tag. But here the page name is added on the left: my title starts with "20% discount," but the page name is at the extreme left, and because of this, "20% discount" ends up in the middle. Can you just tell me why Google is doing this?

I don't see what is happening. Where? Oh, the red button. The 7th result, yeah.

This is not my brand name. Even I was trying hard to find out why it is happening.

I don't know. I mean, you have both of those words in your title as well, so I'm guessing that's where that came together. Sometimes we'll get these wrong. You're welcome to submit feedback to us, or send me that example directly, and I can take a look with the titles team. But it's not the case that you can just force a title that you want to have displayed. It's always the case that we try to figure out what people are searching for, and we try to adjust the title based on that, to make sure they understand in which sense that page is relevant to their current interest. So that's where we shuffle things around there.

In such a case, what would you recommend? Should I experiment with some other title, or should I just send feedback to Google?

You can always experiment, try different titles out, and see what works best for you. Maybe a short title that doesn't need a lot of changes, or one with additional words, if that also works. Also, consider what it might look like on mobile. Maybe it looks like this on desktop, but if everyone is searching for it on mobile, then maybe that's not actually so relevant for you. So those are the directions I would head in there.

OK, I will experiment with this first. Sure.

Let me run through some more of the questions. Is it OK to use English and Hindi units in the ingredient section of recipe schema markup? I don't know. If you have examples, I would love to take a look at them, to see how that looks and how that works; I can double-check with the structured data team here. Can we use English, and Hindi written in English, in the main content for some of the words in the recipes? Sure, you can definitely do that. If that's what you think your users are looking for, or what they use to understand the content, then you can definitely do that. That's generally not a problem.
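For anyone wondering what such markup looks like in practice, here is a hypothetical schema.org Recipe snippet mixing English and Hindi units in the ingredient list. The recipe and wording are invented, and, as noted above, how rich results handle the mixed-language units is exactly the part that would need checking with the structured data team.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Masala Chai",
  "recipeIngredient": [
    "2 cups water (2 कप पानी)",
    "1 tablespoon loose black tea (1 बड़ा चम्मच चाय पत्ती)",
    "4 green cardamom pods (4 हरी इलायची)"
  ]
}
</script>
```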
When users are redirected from our non-secure site in the search results to our secure HTTPS site, they see a warning in Chrome. We don't have an SSL certificate for the domain that's being redirected. Do we need one to prevent users from seeing this message?

I'm not sure how users are being redirected to your HTTPS site. If you're redirecting them yourself from HTTP to HTTPS, then yes, you do need a certificate. If you're just serving the HTTPS version of your site but not actually redirecting people there directly, then I would still either put a certificate on it or stop serving the HTTPS version. Because if the HTTPS version is available, then we might choose to actually index your content like that, and if you don't have a valid certificate, that makes for a really bad user experience.

A question about migrating a large site from m.domain to a responsive version: more or less, what's the best way to handle the redirect process?

So I think we actually have this scenario covered in our documentation, so I would double-check there. In general, the normal process I'd recommend is to redirect from the m. version to the www version, and also to remove the alternate mobile links from the www version, so that we can crawl the main version and have the full picture there. That's essentially what I would do. With regards to Angular, which was also mentioned there: if you can't do a server-side redirect from the m. version and you need to do it in JavaScript, then usually that's fine too. What will happen, though, is that it'll be a bit harder for us to recognize the redirect, because we have to process the JavaScript first, so it might take a little bit longer. In practice, that shouldn't play a role at all, because you're just combining these pages, and the m. version was just an alternate version. So it's not that you're moving your site; you're essentially just taking out the separate mobile URLs, which is perfectly fine to do. Also maybe worth mentioning: especially with the shift to mobile-first indexing, having a responsive site makes everything a lot easier, because you don't have to worry about which URL pattern is actually being indexed. So that's a great move to make.
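A sketch of the server-side variant for Apache, assuming the hostnames m.example.com and www.example.com; each mobile URL 301s to the same path on the responsive site. Alongside this, the rel="alternate" mobile annotations would come off the www pages, as described above.

```apache
RewriteEngine On
# Send every m-dot URL to the same path on the responsive www host.
RewriteCond %{HTTP_HOST} ^m\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```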
There's a booking service shown in our Knowledge Graph panel that doesn't belong to us; it's for a service that's not offered to the public. How can we remove it? The Suggest an Edit feature for the Knowledge Graph doesn't seem to cover this case.

So one thing I would do there is make sure you're signed in to the account that has your website verified, because then we can take those suggestions from the sidebar through a different path. If we know that this is actually your site, your business, then we can trust that feedback a little bit more. If you're still not able to submit feedback specifically around that, feel free to ping me on Google Plus with the details, and I can pass it on to the team here.

Search Console says that only 12 URLs are indexed, but a site: query says there are over eight gazillion URLs. What's up with the difference?

So I guess there are different ways of looking at these counts. A site: query is generally a very, very rough approximation of what we actually know about your website, and that can be a much higher number than what we actually show in the search results. In particular, if we've crawled a lot of duplicate content, say we've run through an infinite calendar on your website and crawled a million URLs but actually kept 10 with events on them, then we might show that really high number there, because we've seen so many URLs, and maybe that's a reasonable guess. The count shown there is not meant to be accurate; it's really just a very, very rough approximation. The Search Console index count is much more exact, and even better is the sitemaps index count. With the sitemaps count, we know you care about these specific URLs, and we can tell you exactly how many of those URLs are actually indexed. So if you're really keen on figuring out how many of the URLs you care about are indexed, I would use the sitemaps count.

Our e-commerce sellers upload products with very little information, and the images are also not high quality; they're uploaded from smartphones. Will Google consider these low-quality pages?

Maybe. Maybe. That's really hard to say. Essentially, what it comes down to is that your website is your website, and we take the content that you provide on your website and treat it as your website. So if the content that you provide is taken from random people on the street who are sending you low-quality content, then that's the content we think you want to have indexed. So it's kind of up to you to say, well, I want to make sure that my website only provides high-quality content; therefore, I will work with the people that are creating the content to make sure that it's high-quality content. Obviously, sometimes that's easier said than done, so finding the right balance there is sometimes tricky. Sometimes there are ways to do that automatically; sometimes you have to do it manually, where you say, well, I trust this person to provide something useful, therefore I will always accept their submissions; and this person has always sent me really terrible stuff that's maybe off-topic or even spammy, so I will always noindex their content until someone tells me it's actually good. So there are different approaches you can take for user-generated content.

Let's see, just a couple of minutes left. Here's a question from Jarno; let me just take this one. Does Google support protocol-relative URLs?

Yes, we do support protocol-relative URLs within your web pages. Within a sitemap file, we assume that URLs should be absolute, so that the sitemap file could be hosted anywhere and we can still process it. But within your website, if you're linking to pages, if you're embedding content, then protocol-relative URLs are possible. In the past, protocol-relative URLs were recommended, especially if you were moving to HTTPS. Nowadays, people say not to use protocol-relative URLs, because if you're essentially saying the content is also available on HTTPS, you might as well be explicit and say it is on HTTPS. So I generally recommend moving from protocol-relative URLs to specifically the absolute URL.
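The difference in markup is small; a sketch with an invented host:

```html
<!-- Protocol-relative: inherits http: or https: from the page it's on. -->
<script src="//static.example.com/app.js"></script>

<!-- Explicit absolute URL, the form generally recommended today
     (and the only form accepted inside a sitemap file). -->
<script src="https://static.example.com/app.js"></script>
```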
And especially in the sitemap file, as in this question, it needs to be the absolute URL.

All right, last question. John, actually, this question was regarding the nositelinkssearchbox meta tag. I have been working with a website that has been using this tag in the head section of the page, but Google is still showing the sitelinks search box for this website in the search results. So how long does Google take to remove a sitelinks search box when we are using nositelinkssearchbox?

Usually, that takes a bit longer than normal crawling and indexing. I've seen situations in the past where it takes maybe two, three, four weeks to actually be reprocessed, because it has to apply to the whole website, and sometimes that takes a bit longer. If it's been in place for much longer than that, then send that example to me, so that I can take it up with the team and say, hey, why aren't you following this meta tag? Or maybe there's something wrong with the markup that I can let you know about. OK.

All right, let's see. One more question, from Mark in the chat. We've seen an uptick in garbage links, over 3,000 in the last several months, from people doing negative SEO toward us. Do we need to disavow these links?

So, in general, we recognize this, and we just ignore it on our side. For the most part, you don't need to do anything. If you're really worried about it, if you're seeing this in your metrics and you're saying, I'm kind of scared of these links, I don't trust Google to handle them automatically, then I would just disavow them. If they're disavowed, then they're really not processed on our side; you can sleep well at night and don't have to worry about them at all. So that's what I would do there. If you've really seen them and you're really worried about them, just disavow them. If you're just randomly seeing them and they're just a small part of the links to your site, I would leave them alone and assume that Google's algorithms will take care of them. Awesome, thanks.

All right, I need to head out; someone else needs this room as well. So I'll be closing the Hangout now. Thanks for all of your questions, thanks for all of the submissions. We have another Hangout planned, I believe, on Friday, and then, of course, the normal cycle every two weeks. So maybe we'll see you again, and maybe we'll see some other people again. Thanks again for joining, and I hope you have a great week. Thank you for hosting us. Bye, everyone. Bye.