Howdy, Moz fans. Welcome to this week's edition of Whiteboard Friday. I'm your host, SEO Sean, and this week I'm going to talk about how you can help Google crawl your website more efficiently. To start at a high level, I want to talk about your site structure, your sitemaps, and Google Search Console: why they're important and how they're all related.

So, site structure. Think of a spider. As he builds his web, he makes sure to connect every strand efficiently so he can get anywhere he needs to go to catch his prey. Your website needs to work in a similar fashion. You need a really solid structure, with interlinking between all your pages, categories, and things of that sort, so that Google can get across your site easily and efficiently, without too many disruptions or blockers that would stop them from crawling.

Your sitemaps are kind of a shopping list or a to-do list, if you will, of the URLs you want to make sure Google crawls whenever they visit your site. Now, Google isn't always going to crawl those URLs, but at least you want to make sure they see that they're there, and a sitemap is the best way to do that.

And then there's Google Search Console. Anybody who creates a website should always connect a property to it, so they can see all the information Google is willing to share with you about your site and how it's performing.

So let's take a quick deep dive into Search Console and properties. As I mentioned, you should always create that initial property for your site. You get a wealth of information out of it. Of course, the native Search Console UI has some limitations: it will only give you 1,000 rows of data. You can definitely do some filtering, regex, good stuff like that to slice and dice, but you're still limited to those 1,000 rows in the native UI.
So something I've actually been doing for the last decade or so is creating properties at the directory level to get that same information, but scoped to a specific directory, like example.com/toys. Some good stuff I've been able to do with that is connect Looker Studio and create great graphs, reports, and filters for those directories. To me, it's a lot easier to do it that way. Of course, you could probably do it with just a single property, but this gets us more information at the directory level.

Next, I want to dive into our sitemaps. As you know, a sitemap is a laundry list of URLs you want Google to see. Typically, you throw up to 50,000 URLs, if your site's that big, into a sitemap, drop it at the root, reference it in robots.txt, and go ahead and submit it in Search Console. Google will tell you they've successfully accepted and crawled it, and then you can see what the page indexing report gives you about that sitemap.

But a problem I've been having lately, especially at the site I'm working at now with millions of URLs, is that Google doesn't always accept that sitemap, at least not right away. Sometimes it's taken a couple of weeks for Google to even say, "All right, we'll accept the sitemap," and even longer to get any useful data out of it.

To help get past that issue, I now break my sitemaps into 10,000-URL pieces. It's a lot more sitemaps, but that's what your sitemap index is for: it helps Google collect all that information, bundled up nicely, and get to it. The payoff is that Google accepts those sitemaps immediately, and within a day I'm getting useful information.

Now, I like to go even further than that, and I break up my sitemaps by directory. So each sitemap, or sitemap index if it's over 50,000 URLs, contains just the URLs in that directory.
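To make that splitting concrete, here's a minimal sketch of chunking a big URL list into 10,000-URL sitemaps plus a sitemap index that bundles them up. The file-naming pattern and example.com domain are assumptions for illustration; the XML follows the standard sitemaps.org protocol.

```python
# Sketch: split a large URL list into fixed-size sitemap files plus a
# sitemap index. The sitemap-toys-N.xml naming scheme is hypothetical.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemaps(urls, base="https://example.com", chunk_size=10000):
    """Return (list_of_sitemap_xml_strings, sitemap_index_xml_string)."""
    chunks = [urls[i:i + chunk_size] for i in range(0, len(urls), chunk_size)]
    sitemaps = []
    for chunk in chunks:
        entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in chunk)
        sitemaps.append(
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            f'<urlset xmlns="{SITEMAP_NS}">\n{entries}\n</urlset>'
        )
    # The index points Google at every chunk, bundled up in one place.
    index_entries = "\n".join(
        f"  <sitemap><loc>{base}/sitemap-toys-{n}.xml</loc></sitemap>"
        for n in range(1, len(chunks) + 1)
    )
    index = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        f'<sitemapindex xmlns="{SITEMAP_NS}">\n{index_entries}\n</sitemapindex>'
    )
    return sitemaps, index
```

You'd write each sitemap and the index to files at the root, then reference only the index in robots.txt and Search Console.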
That's extremely helpful, because now, when you combine that with your property for the /toys directory, like we have in our example, I'm able to see the indexation status for just those URLs by themselves. I'm no longer forced to use the root property, with its hodgepodge of data for all your URLs. Extremely helpful, especially if I'm launching a new product line and want to make sure Google is indexing, and giving me the data for, that new toy line.

And I think it's always good practice to make sure you ping your sitemaps. Google has an API for that, so you can definitely automate the process. It's super helpful every time there's any kind of change to your content: you add URLs, remove URLs, things like that. You just want to ping Google and let them know you have a change to your sitemap.

So now that we've done all this great stuff, what do we get out of it? Well, you get tons of data, and I mean a ton of data, and as mentioned, it's super useful when you're trying to launch a new product line or diagnose what's wrong with your site. Again, we do have the 1,000-row limit per property, but when you create multiple properties, you get even more data specific to each of those properties that you can export and pull all the valuable information from.

Even cooler, Google recently rolled out their URL Inspection API. It's super helpful, because now you can actually run a script, see what the status of those URLs is, and hopefully get some good information out of it. But again, true to Google's nature, there's a limit of 2,000 API calls per day per property. However, that's per property, and you can have up to 50 Search Console properties per account, so now you can roll 100,000 URLs into that script and get data for a lot more URLs per day.
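A quick sketch of what that script could look like. The endpoint and request body below match Google's published URL Inspection API; the batching helper that spreads URLs across days to respect the per-property quota, and the way you obtain an OAuth bearer token, are assumptions you'd adapt to your own setup.

```python
# Sketch: batch URL Inspection API calls across properties so each
# property stays under its 2,000-calls-per-day quota.
import json
import urllib.request

INSPECT_ENDPOINT = (
    "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"
)
DAILY_QUOTA = 2000

def plan_batches(urls_by_property, quota=DAILY_QUOTA):
    """Map day number -> list of (property, url) calls to make that day."""
    plan = {}
    for site_url, urls in urls_by_property.items():
        for i, url in enumerate(urls):
            plan.setdefault(i // quota, []).append((site_url, url))
    return plan

def inspect(site_url, url, token):
    """One Inspection API call; token is an OAuth bearer token you supply."""
    body = json.dumps({"inspectionUrl": url, "siteUrl": site_url}).encode()
    req = urllib.request.Request(
        INSPECT_ENDPOINT,
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        result = json.loads(resp.read())
    # coverageState here is where you'd see e.g. "Crawled - currently
    # not indexed" versus "Submitted and indexed".
    return result["inspectionResult"]["indexStatusResult"]
```

With one property and 4,500 URLs, the planner spreads the calls over three days; with multiple directory-level properties, each gets its own 2,000-per-day budget, which is what makes the 100,000-URL runs possible.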
What's super awesome is that Screaming Frog has made some great changes to the tool we all love and use every day, so that you can not only connect that API but also share that limit across all your properties. So now grab those 100,000 URLs, drop them into Screaming Frog, pour some coffee, kick back, and wait for the data to pour out. Super helpful, super amazing, and it makes my job insanely easier. Now I'm able to go through and see: is it a Google thing, like "Discovered - currently not indexed" or "Crawled - currently not indexed," or are there issues with my site that explain why my URLs aren't showing in Google?

And as an added bonus, you have the Page Experience report in Search Console, which covers Core Web Vitals, mobile usability, and some other data points that you can get broken down at the directory level, making it a lot easier to diagnose and see what's going on with your site.

Hopefully you found this to be a useful Whiteboard Friday. I know these tactics have definitely helped me throughout my career in SEO, and hopefully they'll help you too. Until next time, let's keep crawling.