All right. Hello and welcome to the JavaScript SEO office hours on October 28th, if I remember correctly, or if I can read the numbers on my watch correctly. In these, you have the opportunity to post questions on YouTube. If you go to youtube.com/googlewebmasters/community, you'll find different threads for the questions for the different office hours. There are the general office hours that John Mueller runs, and then there's the JavaScript SEO office hours that you're watching today. You also have the opportunity to join the recording live and ask your questions live. I'm really happy to see that I have three guests in this office hours today. Hello, how is it going? And we have a bunch of questions submitted on YouTube, so I'll go through those first. All right. Is it bad to use JavaScript for pagination? Oh, that's a lovely question, because the answer is no. It's fine. Test it. Make sure that you're not breaking it, that you're not doing something that doesn't work or that we can't pick up. But generally, JavaScript pagination is not an issue. Ludwig Larsen is asking a longer question: on pages that link to products, the anchor tags in the raw HTML are formatted with an href containing a {{url}} placeholder in double curly braces. With JavaScript enabled, that {{url}} placeholder gets replaced with a real link and is correctly rendered as an href, for instance /category1/product123. These links are followed and indexed by Google. Good. However, in Search Console, the 404 error report contains many links such as /category1/{{url}}, which obviously originate from the raw HTML. Since Google can index the JavaScript-rendered links correctly, would it somehow be a bad practice to add a disallow rule for the /{{url}} paths in robots.txt to quickly solve this issue? Well, first things first: if the links get indexed, it's not really an issue. The 404 error report is just there to highlight that something returns a 404. That can always happen. It is definitely possible that someone links to a page that doesn't exist; we crawl it, and then you see it in the 404 error report. That doesn't mean that you need to worry about it. It's not an issue that you need to address. If it annoys you, you can totally go into your robots.txt and disallow URLs with that placeholder. But it's not a problem that you need to address; it's more like an inconvenience. Eric asks a very broad question, which is SEO and Create React App, can you talk about that? Yes. Generally speaking, Create React App is mostly fine out of the box. You do want to add React Helmet to get some metadata in as well. But if you want to learn more about that, conveniently, I did talk about that in the past. If you go to our YouTube channel and search for the JavaScript SEO video series, there is an episode on React that I highly recommend you check out. Then Ricardo asks a question about something that is actually not JavaScript, so I think I'll skip that. Sorry for that, I should have marked that beforehand. Then Manoju is asking about a JavaScript error in AMP. I'm sorry, that's too broad to answer. If you have details, maybe take that question to the webmaster forum, because without any context here it's not a question I can answer.
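Going back to the robots.txt option for the {{url}} placeholder question, a minimal sketch of what such a rule could look like. Whether the braces appear literally or percent-encoded in the crawled URLs depends on the setup, so this is an assumption worth verifying against the actual entries in the 404 report before deploying:

```
# Block the raw-HTML placeholder URLs; real product URLs stay crawlable.
User-agent: *
Disallow: /*{{url}}
# Some crawl logs show the braces percent-encoded instead:
Disallow: /*%7B%7Burl%7D%7D
```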
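And to make the React Helmet suggestion for Create React App concrete, a minimal sketch; the component, data shape, and URLs are hypothetical placeholders:

```jsx
import React from 'react';
import { Helmet } from 'react-helmet';

// Hypothetical product page: Helmet manages the document head per route.
function ProductPage({ product }) {
  return (
    <>
      <Helmet>
        <title>{product.name} – Example Store</title>
        <meta name="description" content={product.summary} />
        <link rel="canonical" href={`https://example.com/products/${product.slug}`} />
      </Helmet>
      <h1>{product.name}</h1>
    </>
  );
}

export default ProductPage;
```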
Joseph is asking about the benefits of using Jamstack for a website. He specifically asks about Nuxt.js, Next.js, or Gatsby, whether there are any pitfalls in these approaches, or if I would recommend them. So generally speaking, Jamstack websites are great for websites that have some dynamic data but are not, like, super dynamic. Next.js and Nuxt.js both allow you to use server-side rendering plus hydration, which is generally a very robust and stable approach if done right. The big pitfall is that people don't test things. They're like, ah, it's server-side rendering, it'll be fine. If your server-side rendering output is garbage, or if the hydration makes it garbage, it's not going to help you. So definitely test this with the testing tools, be it Search Console, be it the Mobile-Friendly Test, be it the Rich Results Test. Just make sure that it works from our perspective, that the rendered HTML looks like something that you would want to see. Generally we do recommend server-side rendering as a way of having a more robust setup, but server-side rendering can have interesting consequences if it goes wrong. It might require a more complicated server-side configuration, and it might require a little bit of care in terms of how you are dealing with data sources, but there's nothing specific for SEO or Google Search that you need to be aware of. Then Andres asks: hi Martin, how can we add JavaScript-powered graphs to a large website so that they will not affect Core Web Vitals, will not waste our crawl budget, and will not make Googlebot treat this content as duplicate? Can you point to some typical pitfalls? Well, if you add graphs to an existing page, that's not duplicate content, that's not an issue. If the only thing that is on the page is a JavaScript-powered graph, that's probably not really that useful in terms of content and we might ignore it, but if you embed the graph into a page that has content, that's not a problem. You want to be careful with cumulative layout shift: if the graph container basically just pops in eventually, shifting everything on the page once it does, that might impact your Core Web Vitals. You also want to make sure that you're not loading a bazillion different JavaScript files just to have a graph, because that wastes crawl budget. Try to have one bundle for the graphs, or integrate the graph libraries into your bundle, so that you have maybe one API call plus the one JavaScript resource that needs to be loaded. And then also use CSS so that the containers have fixed dimensions and don't magically appear, shifting everything on the page; that should help a lot. Minimizing the amount of JavaScript is generally a good idea, too. If you can pre-render the graphs somehow, do it. If that is not feasible or is too costly, then JavaScript graphs are just fine.
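A minimal sketch of the fixed-dimensions idea for graph containers; the class name and sizes are hypothetical placeholders:

```css
/* Reserve the chart's space up front so nothing shifts when it renders. */
.graph-container {
  width: 100%;
  aspect-ratio: 16 / 9;   /* or an explicit height, e.g. height: 400px; */
  min-height: 300px;      /* fallback floor for browsers without aspect-ratio */
}
```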
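And going back to the Next.js and Nuxt.js recommendation above, a minimal sketch of a server-side rendered Next.js page that hydrates on the client; the API URL and data shape are hypothetical:

```jsx
// pages/products/[slug].js — rendered on the server, hydrated in the browser.
export async function getServerSideProps({ params }) {
  // Hypothetical API; the fetched data ends up in the initial HTML response.
  const res = await fetch(`https://api.example.com/products/${params.slug}`);
  const product = await res.json();
  return { props: { product } };
}

export default function ProductPage({ product }) {
  // This markup is in the server response, so crawlers see it before any JS runs.
  return (
    <main>
      <h1>{product.name}</h1>
      <p>{product.summary}</p>
    </main>
  );
}
```

Whatever the framework, the check is the same: view the raw server response and confirm the content is already there before JavaScript runs.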
And then Dave asked a question: I have a client, a clothing retailer, that's planning a range of internationalization features. The first step is to change the price, currency, shipping, and sizing info to match the user's location. That's very nice, and they make it swappable. That's good. I like to hear that, because I hate it with a passion if I'm in the US and everything is in, like, seven inches, and I'm like, I don't know what that means. You might as well say 20 caterpillars; that's not helping me. I want centimeters, I'm living in the metric world. So they want to do this on the client side, during hydration, which will differ from the server-side content that Googlebot might see. Besides Googlebot seeing the US version, because Googlebot usually crawls from the US, would this be an issue? Generally... oh, read more. Ah, okay, then there's a second step that we will get into in a moment as well. Generally, I would say if these two pieces of content differ, that's not much of an issue. Just be careful: if the rendering for some reason fails, like we can't fetch some resource or something, then we might end up seeing the server-side rendered content, and then you might get weird movements or even canonicalization changes, depending on whether the content looks different enough for us to say, okay, so this is the canonical version of this one, and then there's this other canonical, because it's different enough. The second step is that they plan to go for the fuller translation during hydration and also update the URL to an international one. Yeah, generally that is fine. I don't see a potential problem right away. Again, testing is key here. And I know that we have Dave here, do we have follow-up questions on that? Yeah, it was more on the second one. If I arrive from France, they plan on updating the URL to the French one and obviously doing the full translation, but they also want to use hreflang, et cetera. My concern was that Google generally advises against forcing people through a 301 or 302 redirect based on location, and I was worried that this would basically wipe everything out, because ultimately you're updating the URL. I wasn't sure whether Google would treat that as a redirect to the French version, and whether it would then never see anything other than the U.S. version, if that's where it's mostly crawling from. I mean, they're adamant that they really want to do it, and I can see it's a nice experience. Once again, they've got a little switchable thing, but that relies on cookies, which obviously aren't going to work for Googlebot. So from a user experience point of view, I can definitely see their arguments. It's a nice experience: it doesn't ask "are you from the U.S., are you from here" in a big banner, it takes you straight there but still gives you the opportunity to go where you want to. But I do worry that for Googlebot it's going to be no different, so it just gets force-redirected and only ever sees the U.S. version. The only solution I could come up with was to try to prevent that from happening for Googlebot, which seems a bit fragile. I never really liked doing that kind of thing, but would it ultimately cause an issue if that was the only solution? I'm not sure how big of a problem this might be. I would say: try it out if you can. Try to see what happens when we run into situations like that, and if that is what you expect, then don't build additional brittle bits and pieces in. I think a forced redirect for Googlebot would not be too brittle, depending on how it's implemented. Yeah, good question. No, I can't foresee any obvious problems with this, to be honest. But then again. What I'm trying to get them to do is just launch the U.K. version, then maybe roll out a country or two and see where we go from there. They'd love to know it all up front, but I guess we don't really know up front, so. Okay, thank you. All right. Yeah, I mean, staged rollouts are generally a good idea. These all-or-nothing things have a tendency to break in interesting ways.
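Since hreflang came up, a minimal sketch of how the localized URLs could reference each other; the domain, paths, and locales are hypothetical:

```html
<!-- On every locale version, list all alternates plus a default fallback. -->
<link rel="alternate" hreflang="en-us" href="https://example.com/en-us/dress" />
<link rel="alternate" hreflang="en-gb" href="https://example.com/en-gb/dress" />
<link rel="alternate" hreflang="fr-fr" href="https://example.com/fr-fr/dress" />
<link rel="alternate" hreflang="x-default" href="https://example.com/dress" />
```

Note that these annotations need to be reciprocal across all versions, and for a client-side rendered setup they should end up in the rendered HTML that Googlebot actually sees.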
Now, I've got three questions from Asaf to finish up the YouTube backlog, so to say. The first question is about a JavaScript site with a client-side rendering setup. It uses dynamic JavaScript to change the title in React, and uses the React Helmet component for the description and other social tags. Sometimes in search results you see the default title, which is what is in the initial HTML, I'm guessing, and the description looks corrupt as well. Well, I'm not sure what that means. The question is whether moving the Helmet tags higher in the head section will make things more stable: how can I improve the stability of showing the correct title? First things first, it depends on where you see these. If you're seeing them for actual search queries versus just in a site: query, you should be more or less worried. If it's for actual queries, that's something I would be a little worried about. If you're seeing it in the site: query operator, that's not really an issue, because we usually don't show the default, bad titles and descriptions when actual queries are being served. So I wouldn't worry too much about that. If you move it up, I don't think it makes that much of a difference; I don't think that we go by that. It might just be that we sometimes pick up things without rendering first, and then rendering takes a little moment until it actually kicks in, and then we update the index. Index updates take a bit, and we're talking minutes here, not hours or something. So moving it up doesn't really make that much of a difference. If you can server-side render the metadata, that's definitely a way to avoid the situation. If you can't, then again, I wouldn't worry too much about it. Another question, related to pagination: they use infinite scroll and want to add classic pagination at the bottom of the page with a pushState setup. We don't really have a tutorial on this, but basically it doesn't really matter whether you have classic pagination with pages one, two, three, four, five, and so on up to ten, or just next and previous links. The pagination with more links to different pages just allows us to discover the different pages a little quicker, whereas without that, we would have to do multiple cycles of crawling. But it doesn't really make that much of a difference. I know that for e-commerce we are working on guidance on this, but I can't give an ETA on when that documentation will be available.
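A minimal sketch of what a crawlable pushState pagination could look like, assuming the page links are real anchors that also work without JavaScript; the selector and the loader function are hypothetical:

```js
// Real <a href="/products?page=2"> links: crawlers can follow these as-is.
document.querySelectorAll('a.pagination-link').forEach((link) => {
  link.addEventListener('click', async (event) => {
    event.preventDefault(); // enhance in the browser, keep the href for crawlers
    const url = link.href;
    history.pushState({}, '', url);      // update the address bar
    await loadProductPage(url);          // hypothetical: fetch and render the page content
  });
});

// Keep back/forward buttons working after pushState navigations.
window.addEventListener('popstate', () => loadProductPage(location.href));
```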
All right, I think those were the questions from YouTube. Let me refresh, because sometimes some questions trickle in after I've started... no, okay, fantastic. In that case, let's open the floor to questions. Do you folks have questions for me? I'll have a go at one. I'm not a super expert in JavaScript. I've been recently playing with CDNs, and something I realized is that they're great for things like static resources: you get the resource closer to the user. But if you're looking at dynamic stuff, like JavaScript making API calls, I get the feeling that the CDN actually slows things down, so there's a kind of compromise, because you have to go via the CDN network. Are there any best practices? Do you do API calls on a different domain or something? So my general guidance for these kinds of things is that CDNs are, for me, strictly for static assets. Your JavaScript files, your CSS files, your images, your videos, your fonts, this kind of stuff goes onto your CDN. Also because you can cache it: these things tend to be cacheable for longer. With JavaScript bundles that's debatable, but let's say CSS doesn't change that often, probably. Fonts do not change that often. Images do not change that often. Sure, there are probably more images over time, but I rarely see an existing image being updated every week. So CDNs are really, really handy there, because they speed things up. For API calls, I would put them on a separate domain, something like api.example.com, instead of just putting them on example.com. Simply because browsers have a connection limit per domain, and Googlebot has a connection limit as well. I think for Chrome it's six connections per host. So say you load lots of images or make lots of parallel API calls: you could do, like, six of them, and then the rest would have to stall until the first ones finish and the next ones can go out. You can avoid that by splitting the API out to a separate domain. To be fair, if pretty much the only thing you fetch from the main domain is the static HTML, and all the assets come from the CDN, then you should also be fine doing your API calls from the main domain. It depends a little bit on what your setup looks like, but in general I would suggest designing the API in a way that minimizes the number of network calls that go from the browser to whatever server infrastructure you've got. That sometimes collides with clean API design; especially with a REST API you want these different resources, but you might need a bundle of resources. One way to get around this is a GraphQL API, where you basically send one request saying, I want all these things, and you get one response back with all these things. The other way would be to build what I call an API proxy or an API facade: you say, get the data for this screen, and it makes the actual internal API calls it needs to fetch all the different bits and pieces, and then gives back one response. But unless you have lots of API calls, that isn't really a problem. Then again, the CDN surely slows things down, because you shoot the request to the CDN and the CDN basically becomes a proxy, and I don't see the value of that proxy, especially for API calls that you can't cache. If it's something that you can cache, then maybe; stock prices, say, you can probably cache, because every user will more or less get the same result within a certain time window. Well, yeah, I've seen the common practice of putting a version number on static resources, so you don't have to worry about them. Exactly, you can cache them infinitely, because if they change, the file name changes. Yeah, that's a really good practice. I also found that by default, CDNs will not cache HTML resources, because they vary; they might have user-specific stuff in them. So I was thinking that HTML should be on a different domain from your static stuff. And I'm probably seeing this because Australia is on the wrong side of the world and I've got servers in the US and a CDN here. I found that HTML going via the CDN would add half a second to the basic request. Faster images, slower HTML. So I'm thinking about this idea of splitting, like you say, using subdomains and things like that. Yeah, that's a really good common practice.
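A minimal sketch of the versioned file name convention that came up just now: the hash in the name changes whenever the content does, so the CDN and browsers can safely cache the old name forever; the file names and domain are illustrative:

```html
<!-- The build step puts a content hash in the file name. -->
<link rel="stylesheet" href="https://cdn.example.com/assets/app.3f9a1c7.css">
<script defer src="https://cdn.example.com/assets/app.3f9a1c7.js"></script>
```

The CDN can then serve those files with something like `Cache-Control: public, max-age=31536000, immutable`, while the HTML itself stays on the main domain with a short or no cache lifetime.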
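And going back to the API facade idea from earlier, a minimal sketch of a screen-level endpoint that bundles several internal calls into one response; the endpoint names are hypothetical, and this assumes Node 18+ with a global fetch and Express installed:

```js
const express = require('express');
const app = express();

// One request from the browser fans out to internal services server-side.
app.get('/api/screens/product/:id', async (req, res) => {
  const { id } = req.params;
  const [product, reviews, stock] = await Promise.all([
    fetch(`http://internal/products/${id}`).then((r) => r.json()),
    fetch(`http://internal/reviews?product=${id}`).then((r) => r.json()),
    fetch(`http://internal/stock/${id}`).then((r) => r.json()),
  ]);
  res.json({ product, reviews, stock }); // single round trip for the client
});

app.listen(3000);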
And as you say, it can differ quite a bit if you're pushing non-cacheable things through those CDNs. That's a good observation. And yeah, I know how you feel. I used to work for a company that had a few customers from Australia, and we ended up basically spinning up a few instances on Amazon in Australia, just because that was the only way to make it reasonably fast without much change to our code or our infrastructure. Fun times. I think Natalia had a question as well. Yes, I have a quick question, or maybe not a quick question, we'll see. Very often when doing page speed reports, JavaScript from third parties is flagged. In particular I'm referring to a case I've seen many times in international SEO, where we use, for example, Google Analytics and also Yandex Metrica, or other third-party scripts we cannot really change and optimize. I wonder if there's any best practice to solve this speed issue, because typically it makes the results terrible, and it's always a battle to convince the developers to keep scripts like Yandex Metrica in: it typically makes, like, a 20-second difference in their page speed results. So I wonder if you have any advice, or any idea of how to implement it so it doesn't hurt there. Oh, that's a really good question, and definitely not a quick one, because it comes down to a very subtle, very delicate balance. On one hand, you want to load these things as early as possible, so that you have a good chance of catching everything that happens on the page. On the other hand, they make things slower. That's just the nature of these scripts. It doesn't matter if it's Google Analytics, Yandex Metrica, HubSpot, or Hotjar. The vendors are working really hard on making these things as fast as possible, but any JavaScript interaction, any JavaScript request, any JavaScript resource that needs to be fetched does make things a little slower, and sometimes a lot slower. So you want to load them as late as possible in the process, which contradicts what I just said, that you want to load them as early as possible to catch as much as you can. So you have to weigh it up: am I more okay with losing some data in terms of analytics and tracking, or am I more okay with losing some of my performance? In my opinion, I would load them as late as possible, and I would load them deferred, so that the entire HTML gets parsed and displayed, and the vital JavaScript, the bits that actually serve the content if you're using a client-side rendered application, gets to work and fires first, so that it actually displays the application, rather than loading the tracker first and then starting the application. What this balance looks like for you, that depends. It's a really tricky one. There is a brilliant talk by Harry Roberts about third-party JavaScript. Let me see if I can find it real quick... third-party scripts... I think that should give me a reasonable... ah, there's also an article. Okay, that's nice. The article is from 2018. I'll post it in the chat real quick, and I'll put it in the video notes for those who are watching the recording. That article talks about identifying these scripts, auditing them, and then discussing their value with people.
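A minimal sketch of the "load trackers as late as reasonable" idea: the application bundle is deferred, and the tracker only loads after the page's load event; the script URLs are illustrative:

```html
<!-- The app itself: deferred, so HTML parsing is not blocked. -->
<script defer src="/assets/app.bundle.js"></script>

<script>
  // Load the analytics script only after everything else has finished.
  window.addEventListener('load', function () {
    var tracker = document.createElement('script');
    tracker.src = 'https://analytics.example.com/tracker.js'; // illustrative URL
    tracker.async = true;
    document.head.appendChild(tracker);
  });
</script>
```

The trade-off is exactly the one described above: the later the tracker loads, the more short visits it misses.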
And what I found, which surprised me a little bit, is that sometimes you run into... oh, and here's the talk video as well... you run into third-party scripts on large sites, like five or six different analytics trackers on one page. When I worked at a company that had that situation, I just accidentally removed two, and no one ever complained. So I removed the third one and the fourth one, and no one ever complained. I would have known, because I would usually have been the one people complained to. Perfect. So that's a situation you might find elsewhere; but if you are already on top of your trackers, then that's not really a valuable avenue to go down. It's tricky. And what we found for Google... well, when I say we: at the company I worked for before I joined Google, we actually used what I would call the Google Analytics tracking API to fire our events ourselves when we needed to. We basically just wanted conversion tracking: what happens when, how do people navigate through the site, when do certain events happen, how often do they happen, and from which locations do they usually happen. So we built our own analytics tracker on top of the Google Analytics tracking API and sent those requests ourselves. It is a little tricky, and it is heavily underdocumented if you ask me, but it might be worthwhile if that's what you're using. Maybe Yandex has something similar; I'm not very familiar with their product. But besides that, just load your trackers as late as possible. And when I say as late as possible, I mean as late as is reasonable for what you need the data to look like. Because, to be fair, you always have gaps in analytics and tracking anyway; I think I read numbers saying these figures will be off by somewhere between 10 and 30 percent. So the question then becomes: knowing that there is already such a gap in the data, is it worthwhile making everyone wait for these trackers to actually load and fire? That's a question you have to answer for your specific case. Thank you. Yeah, that's a good answer. A lot to think about, many decisions to make. Mm-hmm, I know. It would have been great if there was a quick answer like yes, no, maybe. The short answer would have been "it depends", but that's not helpful.
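For the custom-tracker approach mentioned above, a minimal sketch using the Google Analytics (Universal Analytics) Measurement Protocol, which is the documented HTTP endpoint behind that tracking API; the property ID, client ID helper, and event names are placeholders:

```js
// Fire a single event hit directly at the Measurement Protocol endpoint.
function trackEvent(category, action, label) {
  const params = new URLSearchParams({
    v: '1',                  // Measurement Protocol version
    tid: 'UA-XXXXXX-Y',      // placeholder: your Universal Analytics property ID
    cid: getClientId(),      // hypothetical: a stable, anonymous client ID
    t: 'event',
    ec: category,
    ea: action,
    el: label,
  });
  // sendBeacon survives page unloads; fits fire-and-forget tracking.
  navigator.sendBeacon('https://www.google-analytics.com/collect', params);
}

trackEvent('checkout', 'purchase-completed', 'summer-sale');
```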
Awesome. Do we have other questions in the audience? Why is it always the leaf blower people? I'm not sure if you can hear it on the recording or live, but someone literally decided to start blowing leaves five minutes before we started recording. A report and update: how are the sheep? Oh, sad news. The sheep are gone. They were taken away on Monday. And it was particularly sad because it was also raining, like torrential rain, and the sheep were standing there looking a little sad, and we're like, aww, and then they were taken away. And I was like, no, don't take them away, leave them be. But on the other hand, they did a great job; the grass is really short now. So I understand that they have to move on to greener meadows, to greener pastures, I think is the term you use in that case. Which is sad, but it's good for the sheep, I think. So I have to take one for the sheep. But yeah, the sheep were fantastic. If you are not aware of what happened with the sheep, check out my Twitter feed and go back a little while, and you'll see what I mean: there were sheep here in the neighborhood, I could see them from my home office desk, and now they're gone. All right, excellent. In that case, thank you very, very much. The next JavaScript SEO office hours will be in two weeks. That'll be the... ah, me and my calendar, hold on. That'll be Wednesday the 11th. Yeah, press F to pay respects to the sheep. So on Wednesday the 11th, at 6 p.m. Central European Time, we will have the next JavaScript SEO office hours. I will post into the YouTube community thread so that you can drop your questions or join the live recording next time. Hi, Deborah. Good morning, Martin, how are you? Great, how are you doing? Do you have a question? I do have a question, and I'm like, wait, is it JavaScript only? Can I only ask JavaScript questions? These JavaScript SEO office hours are for JavaScript questions only, yes. Oh, JavaScript-only questions. Okay, well, then I really have to condense my thought. I'm thinking about programmers and programming, and my conundrum is getting programmers to start understanding SEO. That's my big concern right now: having programmers understand SEO concepts. And that's where my question is. When you're talking to programmers, coders, front-end people, back-end people, and you're trying to explain SEO to them, they're not getting it. What I mean by that is that they don't understand the Google guidelines. They don't understand why we do these things. They don't understand pagination and what it means. And my question is, how can we, as SEOs, better explain this to them? How can we better communicate with one another, so that I can get my thoughts over to a programmer in a way that makes sense to them? It's come up a lot lately, so I wanted your thoughts on that. That's a really good question, thank you very much. That's actually kind of my mission; that's the mission I'm on as a developer working in the SEO space here. My big mission is to bridge this gap in communication and also teamwork. I've done a bunch of presentations about it, but I don't think we have found the silver bullet to solve these problems. The way that I usually approach it is: depending on your developers, they might already have a certain awareness of accessibility, or performance, or usability, and then I tie into those things. If they are accessibility-aware, so they are aware that they should build things that do not by default exclude people with different abilities, then I say: consider that Googlebot, or any other crawler, is basically a user with assistive technology needs. Because the crawlers are not human. They can't really see; they can't necessarily understand the text. At first, they just need to be able to consume data that is as semantically rich as possible, and then work from there. If they are performance-aware, which most developers at this point are, because we have been talking to them about this for the last couple of years, then you can say: look, performance-wise, you want to make sure that your website is fast, because that makes users happy. And they say yes. And then you say: okay, so consider that to make users happy, you don't only have to be fast. Fast is one aspect of SEO; that's one thing that I'm looking at as an SEO as well. But we also need to make sure that the site shows up in the first place. Because in the end, good developers want to build things for people to use.
But if no one finds your thing on the internet, no one is using the thing. It doesn't matter how fast you make your website. It doesn't matter how accessible you make your website. It doesn't matter which JavaScript framework you use for your website, if no one's ever going there. So what I usually tell them is: if I were to ask you, where can you rent sheep to mow your lawn, what do you do? And they will probably say, I'll Google it, or I'll use a search engine to find it. And then you say: aha. So if you were the person building a website for sheep rental, what do you need to make sure of in order for people to actually find your website? Well, that it shows up in search engines. Yes, that. And that's exactly what I am doing, and that's what I'm really good at, and that's what I'm helping you with. You can focus on the technical side of things, but please, please, please talk to me, so that I can test the things you build and show you what you need to address if something is going wrong in terms of us not showing up in search engines. And if they get the point that, ah, this is one more requirement... because developers work with requirements. Yes, they do. Right? They get requirements like: this website has to load in 1.5 seconds on this specific connection. Our editors have to be able to upload products in the background. It has to do this, it has to do that, it has to be mobile-friendly, whatever. One more requirement should be: it has to be visible in Google Search, or in search engines in general; I'm just coming from the Google Search perspective. It should be visible and stand out enough in search engines. And then they're like, ah, okay, so that's a requirement, but I don't know how to make sure that we fulfill it. And then, conveniently, that's exactly what I can do: let's work together on making sure that we show up. Well, I love that approach. That is amazing. That usually works. That's a great one. And that's for the conscious developer who's on the high end of the spectrum. What I'm seeing in the industry right now is that the clients I'm working with are working with people to just get a website up. A lot of them are on do-it-yourself platforms, or on customized CMS systems that have only been evolving for half a year, and they're buying into it. And that's the stumbling block. So I think it's a twofold piece: we're making developers aware of this, but we also need to make clients on the small side aware of this, because they're not sure. They're building on these systems, they're putting all of this effort and money into it, and all of a sudden it's blocked, or it's not even visible at all. So these are the hurdles we're trying to get around. I'm so super psyched that you're doing this. And it's 5:30 in the morning East Coast time, in case I sound a bit off. But thank you for this, I appreciate it, and I will always listen in, because that was the best answer. Thank you, Deborah. Thank you a lot for the question, and thanks for being up so early as well. See, this is exactly why I swapped the times. In two weeks it will be a lot later, actually, I think about eight hours later, so that's a friendlier time for you. Every second week, you don't have to get up this early to get your questions in. Fantastic, and I'll sound more awake. Are you going to be posting this on Twitter, on your Twitter feed?
That's how I found it, so we get notified when these sessions are happening. You can also hop onto youtube.com/googlewebmasters/community, where all the office hours are: the ones that John does as well. There's basically a posting saying drop your questions here, and then you can drop a question. And we also post the link to this recording in those threads. Thank you so very much, have a beautiful day. Thank you very much, likewise, have a wonderful morning. Wow. So we are spanning pretty much the entire world now: we have Australia on the one end, and we have the US on the other. Wow, that's amazing. Sweet. Any further questions? Five, four, three, two, one. All right. Ladies and gentlemen, it has been a fantastic time. Thank you so much for joining the live recording, and thanks to everyone who dropped questions into the YouTube thread. We'll see each other again in two weeks. Stay safe, stay healthy, have a great time. Bye-bye. Thank you. Bye, Martin. Bye, Maria. Stop recording. Bye, Deborah.