All right, welcome to today's Webmaster Central office-hours hangout. Today we have some special guests here with us. We would do... sorry, I keep muting you. Actually, maybe we'll do it in English, because maybe not everyone here speaks German. Do you want to introduce yourselves briefly?

Yes, my name is Thierry. I do SEO at Scout24 Switzerland, for the Scout24.ch portals. And my name is Tobi. I'm head of SEO at Quik.ch, which is a Swiss publisher, probably the biggest one in Switzerland. I'm Eric. I do SEO and paid search at a small agency. I'm Sandra. I'm from Berlin, and I'm working for Stamphly. It's an old, traditional print enterprise, but we're transforming it to digital, so I'm leading all the digital stuff: marketing, sales, software, websites. I'm doing SEO. My name is Stefan. I'm SEO team lead at Scout24, working together with Thierry, and we're taking care of the AutoScout24 and ImmoScout24 websites. And of course, MotoScout24. Hi, I'm Martin. I work with John here at Google. I specialize in JavaScript rendering, JavaScript SEO, PWAs and such, so I'm here to answer questions in that direction if you have any.

Awesome. Cool. Do any of you who are live here in the Hangout want to say hi briefly? Yeah, my name is Imad. I'm from Canada. It's my first time joining here. I've got two questions, John. I've been trying to figure out SEO for the last four years, and I finally did figure it out, but I do have two questions. All right. Do any of the others want to say something? Hi, I'm Michael from Munich, and I work at a small company selling culinary events. I joined the English webmaster hangout because the next German one is too far in the future. So, hi everybody. Hello. Okay. And someone joining from Vietnam dropped a message. Okay.

All right, so I guess we can get started. Maybe we'll just start here with the room. What's on your mind? What kind of questions have you come with?

Last time, in the German hangout, I already had a question regarding image sitemaps, and if it's fine, we'll start with that. My question would be: for which websites does it make sense? Where could the added value be for webmasters? And does one still need those kinds of sitemaps today, or is Google already smart enough to find the image URLs via the source code of the respective page?

We do find them in the source code, but the image sitemap helps us a little bit to understand that it's really something that you want to have indexed. So I would see it more as a decision on your side: do you want to have your images indexed or not? For some websites, the images are not that critical, because they're just decorative or used more for design. But for others, maybe people are searching using image search, and they'd like to find something on your website. I think it's something that a lot of SEOs don't spend too much time on, which is kind of a shame, because for some types of activities, people search with images quite actively. So it's really something to think about: how do you want people to come to your website, and how might they search with images for your website?
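For reference, an image sitemap is just a normal sitemap with image entries added per page. A minimal sketch, with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <!-- The page the image appears on -->
    <loc>https://example.com/beach-trips</loc>
    <!-- One or more images you want considered for image indexing -->
    <image:image>
      <image:loc>https://example.com/images/beach-with-boat.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```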
I also have an image question. Okay. What about the size of images? Because we have discussions with designers: you have really high-resolution pictures on big screens, and then we automatically make them smaller for mobile. Does Google recognize the different sizes?

Yeah, we try to recognize the different sizes, and when we can understand that the same image is available in different sizes, we'll usually try to pick a higher-resolution image for image search, just so that we have something nice to show to people. But that doesn't mean that you need something gigantic — more like a larger, user-facing size, not something where you'd have to deliver the original files. Yeah, that would be crazy.

What about using X-Robots-Tag to noindex images? We have the case that we have multiple versions of an image, and we want to make sure that we don't get into trouble in legal cases where we must delete something. So we want to make sure that only the version we allow is in Google search. Would you recommend doing it with something like this?

I don't know if we would use that. It's a good question. Usually what I'd recommend is using robots.txt to just block the other versions — especially if you have the different sizes in different folders or with different URL patterns, so that they're easily recognizable. But I don't know if image search would process X-Robots-Tag for that, because we primarily use the X-Robots-Tag for web pages. Okay. Let's see, I think it's implemented somewhere in our case. I'll want to check. I mean, maybe it works — it's just the first time I've heard of people trying to use it like that. I know it works for other things that end up in the normal web search results, like PDFs, where you can use the X-Robots-Tag header or the canonical header. But I don't know how that would work with images.

So you would recommend robots.txt? Yeah. Okay. But robots.txt just says "don't crawl it" — does it also then say "don't index it"? Well, for images, we can't show anything if we can't crawl it. For web pages, we can guess and say, this is a title that we found from links. But for images, we can't guess based on the anchor text. You might have an alt text that says "a picture of a beach with a boat", and it's like, okay — you can't really guess what it might look like. Cool.
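To make the robots.txt approach concrete, a sketch, assuming the alternate versions live under their own paths (the paths here are invented):

```
# robots.txt — block crawling of the alternate image versions
User-agent: *
Disallow: /images/thumbs/
Disallow: /images/originals/
```

And the X-Robots-Tag variant that was asked about — which, as Martin confirms later in this session, is respected for resources — can be sent as an HTTP header, for example from Apache with mod_headers (the file pattern is illustrative):

```
# Send a noindex header for the alternate image files
<FilesMatch "-thumb\.(jpg|png)$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```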
I have a question about the Google cache. Okay. We have noticed that since we moved to React, our category pages show an error in the Google cache after about two seconds. So first they display the page, and then after two seconds, an error page. Is this a problem for Google? Our devs think this is due to our phishing protection.

I suspect it's something with the JavaScript on your side, because if the page loads initially, that means the cached version is okay. And we can run some kinds of JavaScript on the cache page if it's within the cached page. So probably that JavaScript is running and, for whatever reason, it's picking that up and turning it into an error page. Sounds like the phishing protection, then. Yeah, maybe something like that. I don't think it would be a problem for search, because if we can render the pages and get the content, that's fine. But it might still be worth looking at and trying to fix. Okay. In general, with JavaScript sites, the cache version is the static HTML version that you gave us. So if the whole page is built with JavaScript, then we might not be able to show that in the cache anymore. Okay.

Maybe we can move to some questions from you all here. John, yeah. So my question is basically about changing a page. If I have an affiliate page, I notice that if I make a minor change to the page, say add a few things to a product, I see a drop in ranking, but it comes back up. However, now I want to change my entire product, as it's no longer the best one, and I'm concerned that my page will go to the second page because of that. Does Google reevaluate the links that are coming to that page after the page is changed slightly or significantly?

Does Google reevaluate the links that are coming to that page? There's so much echo — maybe I should mute ourselves while people are speaking. Okay. In general, when you change the content of a page, we will try to reprocess and re-index the page, and sometimes that does mean that we have to reevaluate how we rank the page. But it's not the case that we would move existing pages to page two, or drop them in ranking, just because they changed. We're just trying to understand what has changed on this page and index it that way. One thing to watch out for is if you're also changing the URLs of these pages. If, instead of updating the existing page, you move it to a different location, then that does mean we have to reevaluate everything, and then you will definitely see at least temporary changes in ranking.

Can I jump in? Because I had exactly the same question. Let's say you have an established page that has been ranking well for several years, and then you decide to overhaul and improve the whole content of the page — you go from 1,000 to 2,000 or 3,000 words, add videos, make everything better. It's my understanding that a real reevaluation will take place, and I've heard from friends that they lost a lot of rankings when they did exactly that. So would your suggestion be to better add the new content step by step? Or what's your take on this — aren't you running into a reevaluation and losing rankings for some time?

I mean, usually the idea is not that we drop a page in ranking just because it gets updated. Usually it's more a matter of the new page being different, and we try to take the new content into account. So if you improve things for search — if you work with a clear structure, add headings, add some images, all of those things — then you should see it go up. It's not the case that every change we see on a site means that we will drop it in ranking until we have understood it. It should be something where, if you're making changes, you can see positive effects, and you can see negative effects, depending on what you change.

So it's not like you start from zero again? No. If it's the same URL, if it's the same page, then we start with the same basis. We say, well, there's new content here, and we have to see how we can work with this new content. And that's kind of normal — a lot of pages change.

And what's the difference if you try to do the change step by step, if you add small amounts of new content bit by bit to avoid a reevaluation? Is there any difference in the outcome at the end? No, there shouldn't be. It's not something you can really avoid, because every time we re-crawl those pages, there are small, subtle changes anyway, so we have to re-process the page every time we re-crawl it. Even if they're small changes — even if it's just the date on the page that changed — it's a new page, and we have to figure out how to rank it now. But it's not that we throw away all of the old signals and start over.
It's more that the content has changed: we can take some of the signals and keep them, and some of them are based on the content, and we have to work those out again.

Why does it drop in ranking sometimes, or often? I think that's really just a matter of us understanding the page. Sometimes we think a page isn't as relevant anymore as it once was, and sometimes we think a page is better than it was before — so that can happen too. So usually my recommendation is not to artificially slow things down, because if you're improving the content, if you're making the page better, then why would you hold that back and not show it to search engines or to users? Rather, make those changes, and if there is a temporary drop, then maybe you have to wait it out and see what happens. Maybe it's also a sign that the changes you made weren't that good for SEO. So sometimes it does make sense to do it step by step, so that you can recognize this fairly quickly — especially if you're changing a big website. If you change the whole website at once and it goes down in search, then it's a lot of work to figure out what happened and what went wrong. But if you take maybe 10 or 20 pages and change those, and you see, oh, this worked and this one didn't, then you can keep improving things until you're finished.

Continuing on that question: my problem is the theme I use on this blog. Basically, the theme shows the full content on the homepage, and then on pages one, two, three. You understand what I'm saying? I've got a blog that basically shows the whole post on the homepage, and then it trickles down to other pages. Now, if I want to change the entire theme of the website, would that have an effect on my entire ranking?

Okay, so changing the theme of the website — sure, that can have an effect on the ranking as well. Most of the time, the change you would see if you change the theme is that all of the internal links are suddenly different as well. Maybe you have a lot of internal links in the menu on the side, and with the new theme you just have a smaller set of internal links, or related topics at the bottom of the page — maybe you have more or less of that. So if you change the theme of the site, then all of those internal links can theoretically change, and that can definitely have an effect on how we crawl and index the site.

So basically the site that I described — the open-blog scenario — is that okay with Google, having that type of layout? Because right now it's ranking and I don't see a problem, but I think there's more potential for it to rank. I think for some reason Google is holding it back, because I've tested many different types of websites with the same amount of link power, and it's not getting the same results. I don't know.

I think it's always hard to say what the potential of a website is, because that depends a lot on the general niche it's active in. If it's very competitive, then even with the same website it might work really well, or it might be really hard. So that's really hard to say.

Okay. Second question, and it'll be my last, hopefully: topic variation on an affiliate website. If I'm talking about best-of kitchen stuff and then I want to talk about best-of living products, how much variation can I go for? I mean, I remember seeing Matt Cutts' blog post about health and how important that is, and I greatly appreciate that.
Like radiology — those topics are health-related, and they need to be provided to users with great integrity. I mean, if I'm putting up products for the living room that are there to help you with anxiety, would that go into health? It is an affiliate product line, but would that become a problem, and would I have an issue with that?

It's hard to say. It sounds like you're headed in that direction if you're trying to position your website as, I don't know, interior design specifically for certain kinds of health issues. And if you're headed in that direction, if you want people to take you seriously on health issues, then maybe you need to position yourself as actually being serious about them, and not just randomly providing other things as well.

I appreciate that — that's exactly what we're doing. But I wanted to make sure, before I jump into that scenario, that I'm not going to get penalized because I've put out something that's against Google's policies.

I wouldn't see that as something where we would have a manual action on a website — the traditional penalty situation doesn't sound like it applies here. The penalties mostly apply specifically to the webmaster guidelines, and I think what you're headed into is more around general quality and how we think a website is relevant for specific search results, and for that there's a very broad spectrum available. So it's not that we would penalize a website for having a certain mix of topics. Thanks, John. John, I would appreciate it if you could drop your email in the chat box; I wanted to share some websites with you. Okay.

John, can I ask a follow-up question on that, around where, as well as changing content, you're also changing a domain? I've got a client at the moment, a big retail company, who also have a services arm, and they've got two separate websites for retail and services. They want to amalgamate them both under the retail domain, effectively. The services site ranks really well, and they want to shift it, as effectively they're going under a single brand and a single domain. What does Google see as most important? Is it the content? Is it how they structure the incorporation of the services site within the retail site? Obviously we're going to be doing, you know, 301 redirects from the existing site over to the new one, et cetera. But where should we be looking to concentrate our efforts, to make sure that we don't suffer too much of a drop?

So I guess what I would focus on there, primarily, is making sure from a technical point of view that you're really putting things together in the right way. In particular, when you're merging sites like this, you should expect to see some drop in search, because we can't just take the two sites and add them up and use that as a signal — we essentially have to reprocess everything and look at the new site the way it is. So to make sure that works optimally, I would really work hard to make sure that, from a technical point of view, you're doing everything you can to redirect pages on a one-by-one level to the new versions, and that you're not blocking anything from being crawled or indexed — in the sense that if it was indexable on the old site, it should be indexable on the new site.
I would assume that from the content point of view you're probably covered, but from a technical point of view a lot of these small redirects sometimes get lost, and that makes it harder for us to really combine all of those signals.

Where you've got a site with — so these sites have got a lot of pages — can we adopt a sort of triaging approach to the redirects, going for the priority, high-level category pages first and then working our way down into the lower-level pages? I think that would make sense, but I would try to use tools as much as possible to really get the broad coverage, so you're really sure that everything is properly moved over. Okay, thanks. Cool, okay.
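As a sketch of the "one-by-one" idea: explicit per-URL 301s rather than one blanket rule to the homepage. For example, on an Apache server on the old services domain (the hostnames and paths are made up):

```
# .htaccess on the old services site — map each old URL to its
# exact counterpart on the retail domain, not just to the homepage
Redirect 301 /services/           https://retail.example.com/services/
Redirect 301 /services/plumbing   https://retail.example.com/services/plumbing
Redirect 301 /services/electrics  https://retail.example.com/services/electrics
```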
Let me take maybe a question or two that were submitted, so that they have a chance too.

If a web page mentions your brand but doesn't actually link to your website, would that count as a positive signal? I mean, we would pick that up as a mention on another website, but it's not a link. It doesn't pass any PageRank. It's not really the same thing as a link.

I've noticed our AppSpot staging versions are showing up as backlinks to our site. Do you think this is an issue? Should we disavow AppSpot.com? You don't need to disavow AppSpot.com. If that's your staging site, you don't need to disavow those links, but it does sound like we're maybe indexing your staging site, which you probably don't want to have happen — I've seen lots of weird problems come up when that happens. So I'd recommend blocking that in whatever way works best for you, which could even be blocking it by user agent, if you can do that, or blocking it with robots.txt in the worst case, so that it doesn't get indexed there.

What's the correct way to use breadcrumbs from an SEO point of view: home / category / item, or home / tag / item? Both of these could work. That's essentially up to you: how you want to structure your website, and how you want people to navigate through it.

Does the search queries data include voice queries, that is, via voice search, Google Home and the Google Assistant? I don't know. I don't think so, but I don't know — any of you guys know? Yeah. I think a lot of the voice search questions, and the queries that come through the Assistant, are probably still something where people are working out how it ends up working in the long run. It's also something on the Search Console side: we're looking into how we can show that in Search Console, but it's tricky, because we don't really have a search results page when you ask something and the Assistant gives you an answer based on a website — "this is the answer". How do you best show that in Search Console so that it's actually useful?

All right, a bunch of questions. How do I redirect pagination from an old website to a new one? Kind of similar to before: you wouldn't really need to look into the type of pagination or what's happening there, but rather try to redirect as much as possible on a one-by-one basis.

Would it be best to noindex PDFs that have the same content as the website? Maybe. I think that's up to you, whether you want to have these PDFs indexed or not. For some websites it makes sense to have PDFs indexed; for others, maybe less so. I would look into how users are using those PDFs.

Let's see, let me just grab a few more. We're currently using Google Tag Manager for structured data — is that OK or not? Kind of simplifying the question: it's OK. But what you need to keep in mind is that Tag Manager runs with JavaScript, so it only works when we can render your pages, which also means that there's sometimes a delay between when we pick up your page and when we process the JavaScript, render the pages, and are able to pull out the structured data from the Tag Manager. That's one thing to keep in mind: there's a bit of a delay there. The other thing is that you're adding quite a bit of complexity for just a few snippets of JavaScript that you're injecting into your pages, which means things can go wrong along the way, and you might not necessarily notice it when you're looking at those web pages. So I would recommend, if at all possible, trying to get the structured data directly into the pages, so that it's a lot easier to diagnose when things go wrong or break, and to fix them early on.
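For comparison, the "directly in the pages" version is just a static script block in the initial HTML, so nothing depends on JavaScript rendering. A minimal sketch with placeholder values:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example product",
  "description": "Structured data served in the initial HTML is picked up without waiting for rendering."
}
</script>
```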
Then, speakable structured data: is there a timeframe for when it's coming outside of the US? I don't know. A lot of times we test things in the US market, probably because we have a lot of people in the US, but also because maybe they like to try stuff out — and if it works out there, then usually we end up rolling it out to other countries and other languages. With speakable markup, it's maybe also a matter of being able to understand the language and being sure that the quality of the content is okay. So what I would recommend is to implement it on your pages and be ready for when it does roll out to other countries.

If we ignore manual actions, what will happen? That doesn't sound like a particularly good strategy, I think. Manual actions are taken by the webspam team when they see something significantly wrong that they feel they need to manually fix. So if you don't fix those issues, then those issues will remain, that manual action will remain, and whatever that manual action does will persist. These manual actions do expire at some point, but we're talking about an order of years rather than weeks or anything like that. So if you want to wait out a manual action for a couple of years, that's your decision, but probably not a good short-term one. I personally recommend taking these into account, taking them as feedback, fixing them, and doing the reconsideration. I mean, if your website isn't meant to be shown in search, then maybe you don't care about manual actions — if you're just making ad landing pages, then maybe that's fine.

Subdomain or subfolder? I'm not going to answer that. Sorry.

International SEO: at the moment, some of our page titles in the US, Germany and other countries have "Southern Africa" added to the end of the title tag. "Southern Africa" is shown in the search results, and it looks like people aren't clicking. We have the page titles, hreflang and international targeting set — what could be happening here? I'd recommend maybe posting this in the help forum, and ideally sending me a link — you can drop it in the Google Plus post for this event and I can take a look there. What might be happening is that, for some reason, we think that at least for part of your website, "Southern Africa" is kind of a site title, and that's why we're adding it to these pages. Usually that's more a sign that something with your site structure isn't right, where it looks to us like "Southern Africa" really is a site title — and that might be the URL structure, the internal linking, those kinds of things being something weird.

Okay. Wow, more and more questions. Maybe we'll just go back to the room for the moment, if you have anything.

We haven't quite talked about personalization. Our plan as a publisher is to personalize the whole homepage for every cookie. So every user will be targeted based on the interest profiles and data we have, and then we'll show something. In one scenario, we will hide all the articles a user has already read. So we have to do cloaking, I guess. What's your idea of how to handle Googlebot in this scenario?

So the easy part is that Googlebot doesn't keep a cookie. Okay — so that might make it a little bit easier. In general, personalization is something you can do if you want to. It's not something we would see as cloaking, especially if you've recognized that it's a repeat visitor; whether they're logged in or not, I think that's totally up to you. The important part for us is that, from a general point of view, the content is equivalent. I think this would primarily affect things like the homepage, and maybe category pages. If we still see those as category pages, as homepages for your website, then that's not something we would worry about. If the individual article pages also changed significantly — like someone clicks on an article about Christmas and they end up on an article about Easter — then that would be the kind of thing where we'd say, okay, the user is being misled. But if they're searching for sporting events and they land on your sporting events page, and that page is personalized for them, that's perfectly fine.

Yeah, the biggest problem is probably the homepage, because this can really be completely different. If we see that a user is interested in sport, then probably 80% of the stuff we show will be sport, and for another user it will be economy news or something. Yeah, I mean, that's totally up to you. That's something where you have to work with your users to find the right balance. But from our point of view, people would be going to your homepage because they're searching for your brand, or for general Swiss news, say, and they go to your homepage — and your homepage is still relevant, even if it has different content on it. I think especially for news sites, that kind of change happens naturally anyway. We crawl the page at one point and see one collection of articles; the user goes there and there's a different collection, because things have changed. That's kind of common.

Okay. John, can I ask another question? We had a massive drop in really important indexed pages, and these are highly individual and unique pages — they are our regional category pages, like wine tasting in Munich and wine tasting in Hamburg. And then I looked in Search Console and I saw that these pages were excluded and dropped from the index, because Google says they are duplicates: the submitted URL is not selected as canonical, and another one is selected instead. But the problem is that the Munich page suddenly seems to be identified as a duplicate of another city page, like Cologne. And these pages are really unique: they have individual content and individual links, and the events on them take place in those cities, so there's actually no way of confusing them. So I don't really understand what's happening there. We do have individual sitemaps for each store, we use rel=canonical and put self-referencing canonicals on each page, and the content is highly individual, so I don't understand what's happening here.
Okay, so usually this happens when we think that a part of your site is duplicated, kind of like what you saw. It's not always the case that we look at the content and say, oh, the content is exactly the same; sometimes we look at the URL structure. If we see that, say, the city name is a URL parameter, and you can enter any kind of word as this URL parameter and still get one of those cities, then we might decide that maybe this URL parameter is irrelevant — that it's not necessary for determining the content of the page. So that's one place this could be coming from. It could also be that for some cities you're showing the same content. For example, if there are cities that are right next to each other, maybe you show the same content because the events are the same for those two locations, and that might be confusing us. So what I would do is double-check that the difference between the individual locations is really clear, and that the URLs are such that when you enter an invalid city name, you show a 404 page instead of picking another city — making sure that, for us, it's really clear that these parameters, or the path, or whatever you're using for that, always lead to a unique, different page.

Yeah, this is what we do, in fact. We use descriptive, speaking URLs, and this only happened some weeks ago — before that, Google didn't seem to have a problem with these pages. And the cities are not closely connected — Munich is not really close to Hamburg — and the content on these pages is really highly different, so I don't understand how this can be confused.

Okay. Maybe you can send me some example URLs — or, if you have a thread in the help forum with the example URLs, drop that into the Google Plus comments — and I can take a look at it with the team. That is something the team does care about: figuring out where we fold things together properly and where we shouldn't, and sometimes we get that wrong. Awesome, thank you so much. Sure.
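A sketch of the "invalid city should 404" point, as an Express route in JavaScript — the route, city list and page body are invented for illustration:

```javascript
const express = require("express");
const app = express();

// Only the cities we actually have event pages for (illustrative list)
const knownCities = new Set(["muenchen", "hamburg", "koeln"]);

app.get("/events/:city", (req, res) => {
  if (!knownCities.has(req.params.city)) {
    // Unknown city: serve a real 404 instead of silently falling back
    // to another city's page, which can look like duplication to Google
    return res.status(404).send("Not found");
  }
  // Placeholder for the real page rendering
  res.send(`<h1>Wine tasting in ${req.params.city}</h1>`);
});

app.listen(3000);
```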
It's more generally a question regarding single-page applications. When do you think Google will be able to read a completely client-side rendered website with the first crawl — for example, when there are no href tags anymore in the source code — so that SEOs don't have to bother the front-end developers anymore?

Right, so we are working right now, as we speak, on getting rendering and crawling better integrated, so everything moves a little closer together. I can't give you a timeline on that; it's work in progress. We are also trying to improve rendering to begin with, because if you ask, can you render, can you index single-page applications — yes, we can do that today, and the rendering gap is not always as big as people think, so I wouldn't worry too much about it. The problem is that client-side rendering also influences how quickly we can queue things for crawling, so the crawl budget plus the rendering delay is something that is really hard to spot from the outside. You only see, oh, my page takes a day or two or three to be indexed — that doesn't necessarily mean it has a rendering issue; it might be a crawl budget issue. It's a tricky one to figure out. I've seen that a few times now, and even though it's an issue that we understand, it might take a while for us to bring things closer together. So for the time being, if you really care about frequently changing content — large sites with lots of frequently changing content — I would still recommend dynamic rendering. That's not something that's going to pop up and disappear in a month or so; we are working on improving it, it's going to take a while, and I'm not going to give you a timeline.

I think especially if you have a website that has a lot of content, you should really be using dynamic rendering, or server-side rendering in general. I also just today saw that there was a blog post from Bing a few months back where they were also saying: use dynamic rendering for things like that. So I'd really invest in that direction, and like Martin mentioned, it's not going to go away. Social media sites too — if you want to share things there, if you want people to share a URL on WhatsApp, then it also needs to be pre-rendered for that.

If you have the engineering resources, though, I would highly recommend looking into server-side rendering, or even better, hybrid rendering, because that yields user experience benefits as well: you basically ship the critical content to the user in the first go, and then you enhance it with JavaScript to bring the single-page application experience on top of it. So you get the benefits of dynamic rendering, but you also get user experience benefits, and things appear faster, especially on mobile. Thanks for the insight. But if you can't, or don't want to gamble the engineering resources, then dynamic rendering is a really good idea. You're welcome.
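Dynamic rendering here means serving bots a pre-rendered snapshot while users get the client-side app. A rough Express sketch — the bot list is simplified, and prerender() stands in for a headless-browser service such as Rendertron:

```javascript
const express = require("express");
const app = express();

// Very rough bot detection — production setups use a maintained UA list
const BOT_UA = /googlebot|bingbot|whatsapp|twitterbot/i;

app.get("*", async (req, res) => {
  if (BOT_UA.test(req.get("user-agent") || "")) {
    // Bots get a pre-rendered static snapshot of the page
    res.send(await prerender(req.url));
  } else {
    // Regular users get the normal client-side rendered app shell
    res.sendFile("index.html", { root: "dist" });
  }
});

// Stand-in so the sketch runs; a real setup would call a renderer
async function prerender(url) {
  return `<html><body>Pre-rendered snapshot of ${url}</body></html>`;
}

app.listen(3000);
```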
Let me take some more questions that were submitted, and then we can go back to the room, or to the folks in the Hangout.

Our company recently moved, and the road our building is located on is not properly mapped on Google Maps. When I try to update the listing, it changes the street from "Way" to "Road". We submitted feedback through Google Maps — is there any other way to get that fixed? You would probably need to go through Google Maps, but I don't really know everything around Google Maps, so I would perhaps post in the help forum about this. The times when I've had to get something fixed in Google Maps, it has usually been really quick, and they give a lot of feedback, so maybe it's fixed in the meantime.

How do I remove a site that's scraping my content and cloaking it to Google? Let's see — I submitted some DMCA takedowns and was asked to do it on a per-page basis, but there are so many pages. What can we do then? DMCA is probably the best approach here, and it is on a per-page basis, so there's not really something I can do to take that away from you. That's the legal process in general for situations like this.

Let's see. What's the max size of an .htaccess file? We have a lot of redirects. I think this depends on your server configuration, but you can also use things like regular expressions to simplify these files.
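As an example of collapsing many one-off rules into a pattern — a mod_rewrite sketch, with invented paths:

```
# Instead of thousands of individual redirect lines, one regex rule:
RewriteEngine On
# e.g. /artikel/12345-some-title  ->  /article/12345-some-title
RewriteRule ^artikel/(.*)$ /article/$1 [R=301,L]
```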
Would making topic categories within the blog content that mimic the core website's product categories be a good idea? Totally up to you. You can do that if you want — if that works for you, then go for it; if it confuses users, maybe pick something different.

Can you analyze the issue for my website? My website got a manual action for spammy structured data, and here are some sample URLs. What I would do here is post in the webmaster help forum about that, because sometimes the spammy structured data issues are a bit tricky to diagnose, and sometimes it takes some help from other people to figure out which direction you should be going.

I have a website that lists local businesses together with user-provided reviews. Sometimes we have really large JSON-LD snippets — over 100 kilobytes — because of all the reviews. Is there a limit on the number of items? Is it okay to just include a few of the reviews? I don't know about a limit on the number of items, but 100 kilobytes for JSON-LD snippets seems excessive. So I would look into finding ways to paginate these review pages, rather than trying to include everything on one single page.

All right, two more questions. A private membership website has a lot of unique content with limited access, but someone goes off and copies it and publishes it. What can we do? I think, first off, you probably need to look into whatever normal legal approaches you can take there, which might include the DMCA. But from a web search point of view, if we can't crawl the content, we don't know where it originally came from, so there's not really much we can do on the web search side — you'd have to resolve it amongst each other directly.

We're working with a healthcare group with multiple primary care locations. When creating content for each, we're attempting to concentrate on each location's unique services. How does Google view this in differentiating healthcare service locations for search? I think that's fine — they're different locations, and if they're unique in their own ways and you have content for that, that seems to be a good fit.

All right, I think there was a question here from one of you all that I just saw pop up briefly... let's see, it's here: the spammy structured data markup — I think we talked about that. Yeah, okay. If you have any more questions, now is your chance.

Well, John, I don't think my question was answered regarding the changing of content. For example, if I were to maintain all the existing internal link structure on the new theme, but the HTML would change — would that have an effect? Is it the HTML that affects the overall ranking, or is it the combination of the HTML and the anchor text, the link structure? It's a combination of both of them. A theme change can just be a matter of changing the design: if just the CSS changes and the HTML stays the same, then that would probably be pretty much a non-issue — probably not much would change there. But if the HTML changes as well, then you could imagine things like the headings on the page being marked up just with bold versus marked up as a heading, and those kinds of differences can play a role in how we evaluate a page. So if you're changing the HTML significantly, that could affect how we rank those pages. It could also affect them in a positive way — it's not always that things will go down; if you improve things, then that's good for you too. I think that makes sense, yeah. Cool.

Any more questions from you all here? Yes, maybe about progressive web apps. How sure are you that they will push through in the near future? What do you mean by "push through"? Maybe replace the native apps?

That's a prediction I would not want to make. It's one of those predictions where I'd end up saying something like "nobody needs more than 64KB of RAM" — that's quite a sweeping statement. I think what we see already today is that a lot of partners have fantastic successes with switching to PWAs: Starbucks, for instance, has higher engagement rates and higher conversion rates, and Flipkart had successful metrics. I don't have the exact numbers in my head, but there's a bunch of case studies showing that PWAs improve engagement and conversion rates, sometimes really significantly — I think Starbucks had a double-digit percentage increase in engagement, day over day, or something. So I would say they are already successful; we are seeing more and more partners move into them, and so far I haven't heard bad things about it. I think there are still things to be desired in terms of the integration with the operating systems. I know that we are working on that — there's the Fugu project that we announced at Chrome Dev Summit this year, where we are basically working to improve the capabilities that the web provides, trying to close the gap between native capabilities and web capabilities. But with what we've already got today, we see significant successes with the partners that are implementing it, so that's definitely helping, and I guess it's an upward trajectory. But predicting the future is bound to be risky.

Also, I had a look regarding what one of you asked about the X-Robots-Tag header on images: we do respect that. It's good to know — for resources, we do respect that. Okay, cool.

And there's a question here in the chat: how long will it take when we fix a manual action? Usually that needs to be processed manually, because it's a manual action — someone needs to take a look at it and review the changes that you've made, and usually you get a reply back within about a week or two. So if you fix these issues, I would go ahead and submit the reconsideration request, and usually within about a week or two you should have an answer: either "yes, it was okay, we will resolve this manual action" or "no, there are still issues here". Sometimes, depending on the manual action, it can also be that we see you fixed a lot of this but haven't fixed everything, so we reduce the scope of the manual action. With structured data, that might be a bit trickier.

Let's see, another one about internal links and backlinks — whether Google has the same PageRank recipe for both. I don't know how we calculate PageRank in detail, but maybe it's not something that sites really need to worry about. If you need to figure out the details of how the PageRank calculations work, then maybe you need to work at Google to get all of those details — because we don't show PageRank externally anywhere; there's no toolbar that shows it anymore. So it's kind of a theoretical question.

How can we ask for reconsideration? That's within Search Console: where you see the manual actions, you can submit reconsideration requests. It's fine to do multiple of these if you're working on a problem and trying to resolve it. On the other hand, if we see that you're just going back and forth — fixing things, getting a manual action again, moving things back and forth — then it might be that the webspam team says, okay, they're just toying with us, and ignores your reconsideration requests for a while, or stops you from submitting more of them.
So I would see this as something where you really try to fix the problem, do the reconsideration, and react based on the feedback there. I would also get help from the webmaster forum if you're not sure about what exactly you should be fixing, because with a lot of these issues, you're not the first one to run into the problem, and the people in the forums have practice with this. So I'd definitely get help.

Any last questions from you all? I have a question about the structured data called carousel. I just see it for recipes — is it planned to have it for other types? So, is it a good idea, for example, if we have a tag page for a celebrity, to add articles about the celebrity into this carousel, so that maybe we would get not just the general information about the celebrity, but could also offer articles below as a carousel?

I don't know. You're kind of making a bet that this will happen. I just see recipes, but in the documentation it's not mentioned as exclusive to recipes. I could see it being expanded over time, but I think from a practical point of view, you, as someone working on a site, would have to make that decision and say, well, I believe this will happen in the future, and we will be prepared — but you have to know that maybe it doesn't happen. It's kind of a bet that you'd have to make for your site. If it's just a minimal amount of work to add it, it might be worth doing anyway; but if it's a lot of work and Google doesn't support it at the moment, then I don't know if that's the best use of your time. Maybe start with the personal recipes of the celebrities. Yeah, that might be nice — you can rank for both: recipes, or Christmas cookies from, I don't know... Yeah, cool.
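For reference, the carousel markup in question is the ItemList structure from the structured data documentation — a sketch of a summary-page list, with placeholder URLs:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ItemList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1,
      "url": "https://example.com/articles/celebrity-interview" },
    { "@type": "ListItem", "position": 2,
      "url": "https://example.com/articles/celebrity-profile" }
  ]
}
</script>
```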
Any other questions from you all for the moment?

Hey John, since I get to ask: with the update that Google had in August and September, which trickled down into October, some sites had a ghost effect from links. I'm still seeing sites that are using the same techniques and ranking. However, they've changed their strategy a little bit, because I guess they can tell how Google picked up on the old strategy, and those sites are now using it with a slightly different technique. Is there more to expect from Google on this? Because it's not fair to sites that are ranking with proper outreach and with white hat links, as I would call them, while those are grey hat links — if they get one link, it can be equivalent to 100 links that we might have. So it's not really fair for those working on, or trying to get, 100 links to a page, when somebody just places one and beats it.

I'm not sure what specifically you're referring to, so it's kind of hard to say. In general, if you see sites doing things that are against the webmaster guidelines, feel free to send us spam reports — we really appreciate those. If you're seeing bigger-scale things that are hard to report, you can also send those to me directly and I can pass them on to the team. That's why I was asking for your email. I would just post it in the Google Plus comments, and then I can pick it up there — let me put it here too. I've seen sites that, within a month or two, went from a few hundred — less than a thousand — to a million in traffic, on an auction domain. It's ridiculous. Okay, like an expired-domain type of thing? Those are always interesting for the webspam team, so if you have some examples, they like looking at those. It's not so much a matter of us taking these examples and saying one specific site is doing something wrong, but rather taking them and saying, well, this might be a general problem, and we might be able to recognize it better, on a broad scope, with the help of some examples. That's something where we try, from a webspam point of view, not to focus too much on individual URLs, but rather to figure out what the system behind this is, so that we cover all of these cases, instead of just picking that one specific one out.

All right, let's take a break here. It's been great having you all here, and thank you for traveling over to visit us. I don't think we have any other Hangouts planned for the rest of the year, so I guess we'll see you all maybe next year. Happy holidays! Enjoy the winter — or the summer, depending on where you're located. If you have summer... that's not nice. Enjoy! I would like summer. Bye everyone, happy holidays!