Hello, and welcome to this week's JavaScript SEO Office Hours. My name is Martin Splitt. I am on the Google Search Relations team, and I am a little bit of an expert on the JavaScript rendering and indexing side of things, so I'm here to answer your questions regarding JavaScript SEO problems. You can ask your questions either in the YouTube posts that we do before these meetups, or you can join these live recordings via the link that will be posted shortly before the actual hangout in the YouTube post comments. Awesome. Thank you very much to everyone who's joining. I see that we are a small, select group today. There are not that many questions on YouTube, so we'll see how much content we have today.

Let's start with the YouTube question: "Hi Martin, from the quick look on the Frontify theme built with React.js, 2020.frontify.org, can you see any SEO issues there or recommend some improvements?" This is actually a good opportunity to give you an idea of how I would look into these things. Let me just start a screen share, because I actually haven't taken a look before, so I'll just do this now. And the question is if I can actually get that Chrome window that I just opened. No, it has to be... OK, in that case, I just clicked on the wrong thing and actually have to share an entire screen, which, sorry, an entire window. That's fine.

So I would start by looking at the theme itself. The theme itself is 2020.frontity.org. No, was it not? 2020... Actually, I'm just going to copy the link, and then we'll see what happens if I made a mistake. OK, it's a redirect thing. Yes, I want to go to that site. There we go. I must have mistyped something somewhere. Frontity, not Frontify. I don't know why I said Frontify. Aha, OK. So a very quick litmus test would be to just run a Lighthouse audit. And in this case, actually, you know what, I'm just not going to run all of these. Maybe not Progressive Web App.
We don't care that much about that. This is just to get a rough feeling for how this website generally works, how well it works, how fast it is. That looks pretty good. Now, the thing is, the SEO audits in Lighthouse are very, very basic, and they are vendor agnostic, so a lot of Google-specific stuff we can't or don't want to put into Lighthouse. But even here we see some suggestions. That's a potential mobile usability issue, where some of the tap targets are not large enough. Most of the audits pass: we do have a viewport, we do have a title, we do have a meta description. In this case, that would be a setup thing, not really a theming issue. Links have descriptive text, robots.txt is valid, images have alt attributes, hreflang, legible font sizes, avoids plugins, so it's not using Java applets or anything. The performance looks pretty good as well. First Contentful Paint could be a little faster, but that's OK. First Meaningful Paint, yeah; First CPU Idle, sure. So it could be better, but it's pretty solid. I wouldn't worry about that. So that's the very first stage of taking a look that I would do.

And then the next thing that I like to do is ask the Mobile-Friendly Test if there are any surprises, any problems with actually getting to the content. Oh, great, a CAPTCHA. Yeah, I see a car there, and I see a car there, and I don't see cars anymore. Oh, OK. Google doesn't trust my guest browsing window. No more crosswalks. Seriously? I don't think I have more buses. OK, this one. Oh, that was a cheap trick. Is that a bus or is that a truck? No, it's a FedEx, it's a truck, OK. I wish I could machine-learn these things and not have to fill them out. Are we good at some point, or are you showing me more buses? No, that's the red light. So that's... ah, that is frustrating. OK, sorry. I should have just logged in, and then I would have avoided all this funkiness. What? OK, I'm sorry. I'm a robot, apparently.
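As an aside, the SEO checks Lighthouse runs here really are that basic, and simple enough to sketch. Here is a rough approximation in a few lines of JavaScript; `checkBasicSeo` and the sample markup are made up for illustration, and Lighthouse's real audits parse the DOM properly rather than using regular expressions:

```javascript
// Rough sketch of the kind of basic checks in Lighthouse's SEO category:
// does the rendered HTML contain a title, a meta description, and a viewport?
// Illustrative only; Lighthouse's real audits parse the DOM properly.
function checkBasicSeo(html) {
  return {
    hasTitle: /<title>[^<]+<\/title>/i.test(html),
    hasMetaDescription: /<meta[^>]+name=["']description["'][^>]*>/i.test(html),
    hasViewport: /<meta[^>]+name=["']viewport["'][^>]*>/i.test(html),
  };
}

const sample = `<head>
  <title>Frontity demo</title>
  <meta name="viewport" content="width=device-width, initial-scale=1">
</head>`;

console.log(checkBasicSeo(sample));
// { hasTitle: true, hasMetaDescription: false, hasViewport: true }
```

The missing meta description here is the kind of finding that, as mentioned above, is a setup issue rather than a theming issue.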
Maybe it's because I spend too much time with Googlebot, who knows? What are you? OK, nothing. It's not a bus. Good. I think I'll cut that from the video, because that was just stupid. And here we go. Come on, come on.

But yeah, performance-wise it looks good. The screenshot looks all right to me at first glance. Mobile friendly, no loading issues, no surprises. There is something going on with some JSON issue, but we don't care; it doesn't matter for SEO. Yeah, the HTML looks pretty complete. Can I see if they have a description? I can feel the computer getting slow right now. Oof, come on. Well, there it is. We could figure out where this data-rh attribute comes from, I'm not sure, but it doesn't really matter. It looks pretty solid from an SEO perspective. It's fast. Maybe look at the tap target sizes where Lighthouse was complaining, because that could be a mobile usability issue. But besides that, I think this is an OK theme. And that's more or less what I look for: I look at the rendered HTML, I do a smoke test of the screenshot, and Lighthouse gives you a few pointers as well. So good question, but I don't think there's anything to look at specifically here. I guess the tricky bit is probably setting it up correctly with your actual content, but the theme itself seems to be fine.

Do we have questions from the audience? Any of you having a question while we're at it?

I've got a quick one, about how many resources and such load on the page, whether there are any kinds of limits that you recommend to make sure something's going to render properly. Good point. I've kind of got some arbitrary figures when I look at it: try to keep them under 50, and try to make sure they fire within 0.2 seconds. That seems to be a relatively sane target, and under that, Google tends to reliably get everything. But that's not always the case. You see pages with way more that always render successfully, and sometimes pages with fewer that don't.
So I know there's a lot of nuance to that, but I don't know if there's any kind of guidance.

The general guidance is: the fewer, the better, with the asterisk that you want to split reasonably so that caching is effective. We don't really have a hard limit or anything. It's just that the more resources there are, the more likely you are to experience that something does not load; or, if you're looking from a user's perspective, there's a chance that the network cuts out or some transmission error happens. So the fewer resources, the better, but be reasonable. It doesn't help to have everything in one huge file, because then you can't cache effectively: if one thing in this large file changes, the entire file needs to be downloaded again. Googlebot generally tries to be smart about these things by caching very aggressively. So even if one render fails, we will see that it fails, but then we have at least cached a bunch of resources already, and then we would retry, and then we would probably get the rest of the resources. If crawl budget is a huge issue, and that is usually true for websites with millions of pages, then that could be a consideration: trying to keep the number of resources as low as possible. For websites that are relatively small, it's not an issue. And in general, we try to fetch everything we can, and we use aggressive caching to mitigate the cost of a large number of resources. That's why you see websites with lots of resources doing well, as well as websites with a small number of resources. And sometimes things just go wrong, and then it doesn't really matter how many resources you have: if it's a critical resource and we're not fetching it, then we have to retry. But we are retrying, so that's at least good. That's a good question. It's a bit of a subjective thing, and a very nuanced thing, as you said.
You want to make sure that you are taking a look at your specific situation and judging based on that. There's no general formula or silver bullet that can be applied to this question. I could say: it depends.

Right, one last YouTube question, and then I'll ask you again for audience questions. Thanks, Dave, for the question. "Hello Martin. I have a new site where comments are enabled for users on each article. So users can comment on articles, OK? These comments are grouped in the HTML code within a JSON-LD Comment with different attributes like author, text, and date. I've seen that Googlebot renders these comments in the HTML view of the Mobile-Friendly Test. But I don't want them to be indexed in Search, because these are comments with no SEO value for the page. How can I manage this situation?"

Generally speaking, you don't have to. The fact that we render something... I mean, generally speaking, we index everything that is on the page. But if the comments are user-generated, we are relatively good at figuring out that this is user-generated content. And besides, it doesn't really matter: it doesn't hurt to have these comments there, and it doesn't give you any benefit either. This is one of those cases where I think you're looking at a non-problem, and when there is a non-problem, you don't have to solve anything. That's why I would advise against trying to come up with a clever way of hiding things. There are definitely ways of hiding things from Googlebot's view, but I just would not, because there is a chance that you shoot yourself in the foot by creating something that is less stable, less robust, or that you accidentally overshoot. To give you a very simple example: if these comments are fetched by a script on your website, say you have a comments.js file that actually fetches these comments from the back end and puts them into the content, you could block that comments.js in robots.txt.
And then we can't fetch the comments.js, so we can't execute that JavaScript, and so we can't actually fetch the comments. But if you play around with your robots.txt, you might accidentally end up blocking more than just that, or, I don't know, block all the JavaScript, and now your main content doesn't show up anymore. So in trying to fix something that wasn't a problem in the first place, you created an actual problem that wasn't there before. So I would advise against worrying too much about this, unless you have very, very good reasons to. If the content is very hard to distinguish from the main content or something like that, then we might not see that it's user-generated comment content. But if these are just comments as you would normally mark them up, and there is sufficient content on the page, then don't worry about it. It's more risky than useful.

All right, we have exhausted our question pool from YouTube, I think. I will check the previous JavaScript SEO Office Hours, because I know that sometimes people post into those threads after the actual Hangout. But do we have an audience question in the meantime? Search News was playing; that was loud. Do we have an audience question that we could look at today? Fair enough. Let's have a look; maybe we have something from a previous one. I think I answered this one previously, but I can answer it again: "My crawl rate is very low. I lost traffic. Is it because of that?" No. Crawl rate is not a quality indicator or a signal of something positive. It's possible that your server was responding with a 500 for some requests, or it's possible that we just have a version of the content and know that the change frequency is low, in which case, why would we crawl it again? It doesn't hurt. It's not a problem per se.

Next one: if you're running a single-page application with client-side rendering and the HTML preview shows a blank page, you want to figure out why.
So I'm assuming that you mean the rendered HTML in any of the testing tools that we provide, like the URL Inspection Tool, the Mobile-Friendly Test, or the Rich Results Test. If you see blank HTML, or an empty page's HTML that doesn't have the actual content in it: check whether you are blocking any of your resources in robots.txt, like any of the JavaScript that fetches this content. If anything wasn't loaded properly, that's something you want to look into. And last but not least, just double-check what could be the reason for this. I can't really answer that question, because it's like saying "my car doesn't start, what's happening?" It can be anything: the battery is dead, you have no fuel, it's still locked somehow, whatever. It can be so many different things, and it's the same here. You want to be a little careful with this, but basically, try to make changes and simplify things in the application until you get a positive render that includes the actual content in the rendered HTML, and then try to figure out what breaks it.

I use a technique that's called bisecting. Say I have a version here that doesn't work, and I know I had a version back there that did work, with six months in between. I jump three months back. Does it work or not? If it works at that point, that means my search radius is now only the last three months. Then, from three months back, I jump one and a half months forward. If it still works there, the break must be in the remaining stretch. I basically just iteratively jump closer to the version that breaks, and eventually you find the pair: works here, doesn't work there. Then you know which changes were made between the working version and the broken version, and that way you can figure out what could potentially be the problem.
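That bisecting is just a binary search over your version history (git even ships it built in as `git bisect`). A sketch of the core loop; `versions` and `isBroken` stand in for your deploy history and whatever check you run against the rendered HTML:

```javascript
// Binary search over an ordered version history: versions[0] is known good,
// the last version is known broken, and isBroken(v) tests one version.
// Returns the first broken version; the culprit change landed just before it.
function bisect(versions, isBroken) {
  let good = 0;                   // index of a known-good version
  let bad = versions.length - 1;  // index of a known-broken version
  while (bad - good > 1) {
    const mid = Math.floor((good + bad) / 2);
    if (isBroken(versions[mid])) {
      bad = mid;                  // breakage is at mid or earlier
    } else {
      good = mid;                 // breakage is after mid
    }
  }
  return versions[bad];
}

// Six months of releases; suppose the render broke in v4.
const history = ['v1', 'v2', 'v3', 'v4', 'v5', 'v6'];
console.log(bisect(history, (v) => v >= 'v4')); // 'v4'
```

Each iteration halves the search radius, which is why six months of changes takes only a handful of checks to narrow down.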
And this is why I'm so happy that we have the newer tools, because previously you only had a screenshot in the old fetch-and-render tool, and that didn't really tell you anything. It's just like: yeah, it's broken, I see it's blank, but that's pretty much it. Trying to figure out what specifically broke that last bit that pushed it over the edge was not that easy. It's a lot easier now. You can use tools like LocalTunnel or ngrok to put your local development version on a publicly reachable URL temporarily, so that you can try things out with the testing tools. But yeah, that's what I would do.

All right, if we don't have any further questions from the audience, then I would say this was a short office hour, more like an office half hour. Any questions from the audience?

Just a very quick one, if that's okay. When you call something like an API: if you have a noindex X-Robots-Tag header on that API, that's not an issue for rendering, is it?

I don't think that's an issue. I haven't tested this; that's a very interesting question, and I should definitely have a look at it. But normally we would fetch the API call, because it's not blocked in robots.txt, it's just noindex. Then we would get the data back and use that in rendering. I don't think that noindex on an API response makes any difference, really. That's what I would expect. I would test it, because I'm not 100% sure about this, but logically, that should be how it behaves. Good question. I like questions where I don't know the answer and actually have to test. It's good.

Awesome. In which case: thank you very, very much for joining. I'll post the thread for questions for the next office hours on the community tab at youtube.com/GoogleWebmasters. That will be on Wednesday the... ah, what day is that? Third of June or something, I think. I think it's already June, but I'm not sure. Where's my calendar? Next Wednesday is the third of June. Hey, I'm actually not bad.
So on the third of June, we'll have another JavaScript SEO Office Hours. I'll leave a thread on the community tab of our YouTube channel where you can ask your questions, and I'll also post the link to the Hangout there if you want to join the recording. Thank you so much for joining, my little audience here in the Hangouts call; it was a pleasure. Stay safe, stay healthy, have a great time. Bye bye. Bye bye, thank you. Thanks for joining. Bye.