Coming from a developer background, there are so many misconceptions and myths about SEO, ones that developers come up with or have heard, and ones that come from the SEO world as well. Where do these come from? How do these myths and legends about JavaScript get into the world?

I think a lot of it is people with very good intentions trying to provide the information they have available, and there's a gap in translation between the SEOs and the developers, how they think and what they consider. So by adopting acceptance criteria as part of my tickets when I work with devs, that lets them know very specifically what I need, instead of saying "I want you to make magic for me." When you go from "give me magic" to "hey, here's my user story, I would like to accomplish these three pieces of acceptance criteria," you can bridge the gap.

Hello and welcome to another episode of SEO Mythbusting. With me today is Jamie Alberico. Jamie, what exactly do you do in your job?

Thank you so much for having me here. I'm a technical SEO with Arrow Electronics, and that means that I am embedded with a number of dev teams across a number of projects. We try to execute these initiatives to get new features available on the site in an effective and search-friendly way, and that means a lot of times we have to have conversations about how we're using our JavaScript.

Having you here is fantastic, because then we can have a conversation about pretty much everything that you want to know from the search side as well as the web developer side. So, any questions that you have in mind, anything that pops into your mind?

Oh, so many questions. I get to poke the black box of Google here, and I have one that's absolutely burning: is JavaScript the devil?

Oh, that's a fantastic question. It might seem that way sometimes, especially when things are not going the way you want. You see the horror stories on forums or on Twitter: everything is gone.
Yeah, that's one thing on the SEO side. On the developer side it's also like, oh, it's a language that wasn't designed to be super resilient. But it actually is. And then a lot of people say, oh, it's a C-style language, and it's not really; it's a Lisp-style language. So there are a lot of misconceptions coming from both worlds and clashing here.

I don't think it is the devil. I think it has its benefits. It allows us to build really cool and fantastic stuff on the web, and be really responsive to what the user does and wants to do with our applications. It has moved the web from being a document platform towards an application platform, and I think that's fantastic. So I think we are already pushing hard on fighting this "JavaScript is the devil, and if you use JavaScript, we can't index it at all." That hasn't been true for a long time. And I think now the documentation is catching up, outlining the different bits and pieces that you should be aware of, and the features that you have to deal with that are not available.

One thing, for instance: you probably have built single-page applications, right?

Oh, yes.

Have there been problems in terms of SEO when they rolled out?

I was pretty lucky. I had a dev team who believed in SEO.

That's good. That's really good.

That was actually the big moment of my career, when I had only just gotten into technical SEO. I came and talked to one of my new developers for the first time with this very specific problem I was trying to solve, and he just paused, looked up from his keyboard, and went, "You're not snake oil." So I think we're making a lot of progress between SEOs and devs.

That is fantastic. That's a great story. So you might hear a few people in the community going, oh, should we do a single-page application? Is that risky?
One of the things that a bunch of developers are not aware of, and some SEOs are not necessarily communicating all the time, is that we are stateless. With a single-page application you have a bit of an application state, right? You know which page you're looking at and how you transition between these pages. However, when a search user clicks on a search result, they don't have this application state. They're jumping right into the page that we indexed. So we only index pages that can be jumped right into.

A lot of the JavaScript technology is making assumptions about how the user navigates to the application. As a developer, in my test I say: okay, here's my application, I click on the main navigation for this particular page, then I click on this product, and then I see it and everything works. But that might not do the trick, because you need that unique URL.

It has to be something we can get to, correct? Not using a hash URL.

And also, the server needs to be able to serve that right away. If I do this journey and then take this URL and copy and paste it into an incognito browser, I want to see the content, not the home page and not a 404 page.

So that's something that we're working on giving more guidance for: lazy loading. You probably have seen a bunch of communication about that one as well.

Yes. How do we get a rich media experience out to users, but do it in a way where, if you're on your cell phone, we keep within that very small time frame we have to get your attention?

Correct. And you want to make sure that if you have a long list of content, you don't bring everything in at once, especially on a cell phone, like dealing with 100 images.

But what about AJAX? What about using Asynchronous JavaScript and XML?

Right, right. That is perfect.
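The hash-URL caveat above can be sketched as a small check. This is an illustrative helper, with a function name and rule of my own invention, not anything Googlebot actually runs: path-based routes give the crawler a real URL it can request directly, while fragment-only routes do not, because the part after `#` is never sent to the server.

```javascript
// Decide whether a route is expressed in a way a crawler can request directly.
// The fragment (everything after '#') is never sent to the server, so a route
// that lives only in the hash cannot be fetched as its own page.
function isCrawlableRoute(urlString) {
  const url = new URL(urlString);
  const usesHashRouting = url.hash.startsWith('#/'); // SPA hash-style routes
  return !usesHashRouting;
}

console.log(isCrawlableRoute('https://example.com/products/42'));   // true
console.log(isCrawlableRoute('https://example.com/#/products/42')); // false
```

Pasting both URLs into an incognito window shows the difference: the first one asks the server for `/products/42`, the second one asks for `/` and relies on client-side JavaScript to restore the view.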
Oh, I haven't heard AJAX being used, and spelled out, in a while. I think everyone's using it, but no one's talking about it that much.

It was just like, yeah, you just load data in as you go. That's perfectly fine. We are able to do that.

Also, I often get asked about how that affects the crawl budget.

I knew that you would. Crawl budget, let's talk. So what worries you about that?

Well, say we're using AJAX on a product detail page, and we are using it to pull in a lot of pieces of content. Googlebot has requested one URL and it's gotten back nine, because each of those AJAX calls had a unique string. How do we handle that, and does it negatively impact our crawl budget?

I wouldn't say it negatively impacts your crawl budget, because crawl budget is much more complex than you might think. It's one of these things that looks super simple, but there's more than meets the eye. We're doing a bunch of caching, because we expect that content doesn't necessarily update too much. So let's say you have this product page; you make one request to the product page, and then that makes nine more requests. We don't distinguish between the CSS, the JavaScript, the images, or the API calls that get you the product details. So if you have nine calls from this one page load, then that's going to be ten against the crawl budget. But because of caching, we might have some of these in the cache already, and if we have something that is already cached, that doesn't count towards your crawl budget.

So if we were to version our AJAX calls, those could be cached instead of being called every time?

Cached, exactly.
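One way to make those API responses cache-friendly is to bake a version (or content hash) into the URL, so the same data always maps to the same URL and only a version bump produces a new, uncached one. A minimal sketch, with a made-up helper name and an `/api/...` path chosen purely for illustration:

```javascript
// Build a cacheable API URL: the same version always yields the same URL,
// so crawlers and CDNs can reuse a cached response instead of refetching.
// Bumping the version produces a new URL, invalidating the old cache entry.
function versionedUrl(path, version) {
  const url = new URL(path, 'https://example.com');
  url.searchParams.set('v', String(version));
  return url.toString();
}

console.log(versionedUrl('/api/products/42', 7));
// → "https://example.com/api/products/42?v=7"
```

Served with long-lived cache headers, such a URL lets repeated fetches of the same product data come out of the cache rather than counting against crawl capacity.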
Yes, and that's one way of working around it, if that's a possibility. The other thing is, you could also consider it not just an issue for the crawl budget but also an issue for the user, because if you're on a slow or spotty network connection, it might flake out in the middle, and then you're left with broken content. That's not a great user experience. You probably want to think about pre-rendering, or hybrid rendering, or server-side rendering, or anything in between.

And crawl budget is tricky generally, because we're trying to deal with the host load situation: what can your servers actually deal with? So we are constantly adjusting that anyway. People say, "Oh, this affected our crawl budget negatively." Not really; we just had host load issues with your server, so we adjusted it, or we had balancing issues across your entire content. So I would say it's not much of a deal, normally speaking, but I see that it's very important for people to understand, and unfortunately that's not easy.

Can we demystify Googlebot a little bit? We have this ominous, great Googlebot. But it actually goes through a series of actions: we get that initial HTML parse, we find the JavaScript and CSS that we need to make our content, then call those pieces. We know, since Google I/O, there's actually a gap between our initial parse and our HTML rendering. But I want to know more, because Googlebot follows HTML5 protocols, and there are some nuances there I didn't know about. Say you've got an iframe in your head, or a closing head script right there: that ends your head for Googlebot, and all of our lovely meta content, our hreflangs and canonicals below that, have a tendency to not exist.

That is true.
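The head-parsing pitfall just described can be illustrated with a rough lint check. This is a simplified, regex-based sketch of my own (real HTML parsing is more involved, and this is not how Googlebot works internally): it flags a document whose `<iframe>` appears in the `<head>` before the canonical link, since body-only elements like `iframe` implicitly end the head, and tags after them can be treated as body content and ignored.

```javascript
// Rough check: within the <head>, does an <iframe> appear before the
// canonical link? Elements that are only valid in <body> (like iframe)
// implicitly close the head, so tags after them may never be seen as
// head metadata.
function canonicalAtRiskInHead(html) {
  const headMatch = html.match(/<head[^>]*>([\s\S]*?)<\/head>/i);
  if (!headMatch) return false;
  const head = headMatch[1];
  const iframePos = head.search(/<iframe\b/i);
  const canonicalPos = head.search(/rel=["']canonical["']/i);
  return iframePos !== -1 && canonicalPos !== -1 && iframePos < canonicalPos;
}

const risky = '<head><iframe src="/widget"></iframe>' +
  '<link rel="canonical" href="https://example.com/p"></head><body></body>';
console.log(canonicalAtRiskInHead(risky)); // true: canonical may be lost
```

The fix is simply ordering: keep `hreflang`, canonical, and other meta tags at the top of the head, before anything that can terminate it.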
There are a bunch of things at play. When we say "Googlebot," what we actually mean, on the other side of the curtain, is a lot of moving parts. There's the crawling bit that literally takes in URLs and then fetches them from the server. When you provide the content to us, we get the raw HTML, which tells us about the CSS, the JavaScript, and the images that we need to get, and also the links in the initial HTML. Because we have that already, we have such a wealth of information that we can then go off and fetch the JavaScript and everything else we need to render later on. But we can also already use the HTML that we've got and say, oh look, there are links in here that need to be crawled. So when you have links in your initial HTML, we can go off and basically start the same process for those URLs as well. A lot of things happen in parallel, rather than one step and then the next step and then the next step. As we get the HTML, in parallel to extracting the links and crawling those, we queue the page for rendering. We can't index before we have rendered it, because a bunch of content needs to be rendered first.

But in a way, that benefits us. If we've got a single-page application, Googlebot now has the template; it just has to grab the content that fits within there.

Yeah. So wouldn't that mean that Googlebot likes these JavaScript platforms? The more content you get us quickly in the first step, the crawling step, the better, because we can then carry that information over rather than having to wait for the rendering to happen. Is pre-rendering always the best solution?

That's a tricky one. I think most of the time it is, because it has benefits for the user on top of just the crawlers, but you have to very carefully measure what you're doing there.
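The crawl-versus-render split described above can be put in a toy model. This is an illustrative sketch, not Googlebot's actual architecture: links found in the raw HTML are queued for crawling immediately, while the page itself waits in a separate render queue, which is why links that only appear after JavaScript runs miss the fast path.

```javascript
// Toy model of the crawl step: extract links from raw HTML right away and
// queue them for crawling, while the page itself waits in a render queue.
// Links that only appear after JavaScript executes would miss this fast path.
function extractLinks(html) {
  const links = [];
  const re = /<a\b[^>]*href=["']([^"']+)["']/gi;
  let m;
  while ((m = re.exec(html)) !== null) links.push(m[1]);
  return links;
}

const crawlQueue = [];
const renderQueue = [];

function processFetchedPage(url, rawHtml) {
  crawlQueue.push(...extractLinks(rawHtml)); // discovered without rendering
  renderQueue.push(url);                     // rendering (then indexing) later
}

processFetchedPage('https://example.com/', '<a href="/a">A</a> <a href="/b">B</a>');
console.log(crawlQueue); // ['/a', '/b']
```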
I think so. Giving more content over is always a great thing, but that doesn't mean you should always give us a page with a bazillion images right away, because that's just not going to be good for the users. If you're on a really old phone, and I have a pretty old phone, and you hit a page that is full of images and transitions and stuff, you're like, I can't use this website. So pre-rendering is not always a great idea. It should always be a mix: get as much crucial content in as possible, but then figure out which content you can load lazily at the end of it.

So for SEOs, that would be: we know that different queries have different intents, informational or transactional, so elements critical to that intent should really be in that initial part.

Exactly. And you might consider, if the intents are wildly different and the content is very, very different, making it into multiple pages, or at least multiple views if you're using a single-page application, so that you have an entry point for the crawler to specifically point at when it comes to surfacing the search result.

So treat it like a hub and let the users branch out from there.

Yes.

So that's where we'd use maybe our CSS toggle for visibility?

That is a possibility. Just having different URLs is always an option, especially with the History API. In a single-page application you can figure out which route to display and have the content separated between different routes, or be a little more dynamic there. We support parameters, so even if you use URL parameters, basically expose the state that is relevant to the user in the URL.

What other ways does that benefit our users? Because the ultimate goal is to make them happy.

Yeah, and that's our ultimate goal.
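"Expose the state that is relevant to the user in the URL" can be sketched like this. The state shape and function names here are made up for illustration; in a browser you would pair this with `history.pushState` so that each view gets its own shareable, crawlable URL:

```javascript
// Serialize the view state users care about (e.g. which product, which tab)
// into URL query parameters, and read it back on load. In a browser you would
// call history.pushState(null, '', stateToUrl(state)) on each navigation.
function stateToUrl(state, base = 'https://example.com/products') {
  const url = new URL(base);
  for (const [key, value] of Object.entries(state)) {
    url.searchParams.set(key, String(value));
  }
  return url.toString();
}

function urlToState(urlString) {
  return Object.fromEntries(new URL(urlString).searchParams);
}

const shareable = stateToUrl({ id: 42, tab: 'specs' });
console.log(shareable);             // https://example.com/products?id=42&tab=specs
console.log(urlToState(shareable)); // { id: '42', tab: 'specs' }
```

Because the round trip is lossless, a user (or crawler) landing cold on that URL can be shown exactly the view it encodes.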
We are the same in terms of what our goal is: we both want to serve useful information to the user as quickly as possible. The user's benefit, especially if you do hybrid rendering or server-side rendering, is that they get the content really quickly, if it's done well and it's not overloading their device, and they get to jump in right where the meaty bits are. If I'm looking for some specific thing and you give me a URL that takes me to that specific thing, I'm right there, and I'll have a great time because it's the content that I needed. And if you have performance metrics going up as well, then even if I'm on a slow phone on a really spotty network, I still get there.

Our performance metrics are based on a lot of pieces; we have a stack of technology.

That is true.

What should SEOs look for in our stack? Where should we try to identify those areas where we could have a better experience, not just for Googlebot but for our humans?

I think a bit that is oftentimes overlooked, not by SEOs but by businesses and developers, is the content part. You want to make sure that the content is what the users need and want, and that it's written in a way that helps them. But on the technology side...

Wait, so that blurb at the top people always do, where they're like, here's my hero image and then 500 words about this thing, and I'm a human who wants to buy something and there's so much stuff in the way?

Yeah. Don't do it. Or at least have two pages: have the promotional page that you want to direct marketing towards, and then, if I specifically look for your product, just give me your product. Just let me give you money.

So I think talking about performance and all the different metrics is a bit of a blend of all the things. Look at: when does my content actually arrive? When does my page become responsive?
So you look at First Contentful Paint. You look at Time to First Byte as well, though that's less important than First Contentful Paint, I would say, because it's fine if it takes a little longer when the content is then all there.

So Time to First Byte can take a bit of a hit, yeah, if we deliver the rest faster.

First Meaningful Paint, exactly. Because in the end, as a user, I don't care whether the first byte arrived quicker if I'm still looking at a blank page because JavaScript is executing or something is blocking a resource. If it arrives a little later but then it's all right there, that's fantastic. And you can get there in multiple ways. I highly recommend testing, testing, testing.

What testing tools would you recommend?

I definitely recommend Lighthouse. That's a great way. webhint is a broader approach as well. You could also use PageSpeed Insights, or the new SEO audits in Lighthouse. The Mobile-Friendly Test also gives you a bunch of information.

PageSpeed Insights, though, just looks at that full page load. And we have a bit of a gap: we have almost this futurist Lighthouse, where we want that Time to Interactive, and then we have people who adopted the older methodology, and that's how we got, you know, so much content via AJAX, because the full page loads fast but all that content is still coming.

I would recommend Lighthouse. It gives you the filmstrip view of when things are actually ready for the user to work with. So I would highly recommend looking at Lighthouse, but PageSpeed Insights gives you a good first overview, and it integrates with Lighthouse really nicely now.

Wonderful. Do you think that JavaScript and SEO can be friends now, and that developers and SEOs can also work together?

I do. I really think that, you know, if Google is a library and a web page is a book, using these JavaScript frameworks lets us make pop-up books, rich experiences to engage with.
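The paint metrics discussed above can also be observed from inside the page itself, using the standard PerformanceObserver API. A sketch with a small pure helper (`pickFirstContentfulPaint` is a name of my own) so the selection logic stays visible; the observer wiring only runs in a browser:

```javascript
// Pull the first-contentful-paint timing out of a list of paint entries.
// Returns the startTime in milliseconds, or null if FCP hasn't happened yet.
function pickFirstContentfulPaint(entries) {
  const entry = entries.find((e) => e.name === 'first-contentful-paint');
  return entry ? entry.startTime : null;
}

// In a browser, watch for paint entries as they are reported.
if (typeof window !== 'undefined' && 'PerformanceObserver' in window) {
  const observer = new PerformanceObserver((list) => {
    const fcp = pickFirstContentfulPaint(list.getEntries());
    if (fcp !== null) console.log(`FCP: ${fcp.toFixed(1)} ms`);
  });
  observer.observe({ type: 'paint', buffered: true });
}
```

Lab tools like Lighthouse report the same metric, but field measurements like this tell you what real users on slow phones and spotty networks actually experience.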
Oh, that's a fantastic analogy. I love that image. That's a beautiful one. Thank you so much.

Thank you very much.

I hope you enjoyed it, and see you next time. Have you ever wondered where on the map you should put UX and performance when you're talking about SEO? So have I. Let's find out in the next SEO Mythbusting episode.