Hi, I'm Daniel Weisberg, search advocate at Google. Today I'll talk about how to use Search Console to learn which of your pages have been crawled and indexed by Google, and what problems were found during that process.

Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index. Googlebot processes each of the pages it crawls in order to compile a massive index of all the words it sees and their location on each page. When a user enters a query, Google's machines search the index for matching pages and return the results we believe are the most relevant to the user.

The Index Coverage report discussed in this video gives you an overview of all the pages Google indexed or tried to index on your website. To learn more about how indexing works, read through our documentation; it will help you get the most out of this video series. Check the links in the description.

Before I go into the report, I want to say a few words about when and how often you should check the Index Coverage status report. If Search Console detects a new index coverage issue on your website, you will get an email. But if an existing issue gets worse, you won't. This means you don't need to check the coverage report every day, but do keep an eye on it once in a while to make sure nothing is going from bad to worse.

When you open the Index Coverage report, the first page you see is the summary page. The default view shows indexing errors on your website, but you can click to also show Valid with warnings, Valid, and Excluded pages. In addition, you will find a checkbox to add to the main chart the number of impressions your pages got on Search. Here's what each status means, with some examples.

Errors prevent pages from being indexed. Pages with errors won't appear in Google Search results, which can mean a loss of traffic to your website. For example, you might get an error when you submitted a page containing the noindex directive.
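For context, a noindex directive is usually a robots meta tag in the page's HTML head; submitting such a page for indexing (for example, in a sitemap) is what triggers this error. A minimal illustration:

```html
<!-- In the page's <head>: tells search engines not to index this page -->
<meta name="robots" content="noindex">
```

The same directive can also be sent as an HTTP response header, `X-Robots-Tag: noindex`, which is useful for non-HTML files such as PDFs.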
Additionally, Google might find a server error, or your page might be returning a 404. All these cases would prevent the page from appearing in search results. Issues on pages you submit via sitemaps are explicitly called out, since they're most likely to be problems you should resolve.

Valid with warnings covers pages that may or may not be shown in Google Search results, depending on the issue, but where we think there is a problem you should look into. For example, Google might find pages that are indexed, though blocked by robots.txt. This is marked as a warning because we're not sure whether you intended to block the page from search results. If you do want to block the page, robots.txt is not the correct mechanism to keep it from being indexed. Instead, you should either use the noindex directive or require HTTP authentication to view the page.

Valid pages have been indexed and can be shown in search results; no need to do anything.

Excluded pages were not indexed and won't appear on Google, but either we think that is your intention or we think it's the right thing to do. For example, the page has the noindex directive (your choice), the page is a duplicate of another page (Google's choice), or the page is simply not found and returns a 404 error. For a more comprehensive list of issues, check our Help Center.

On the summary page, you should start by checking the chart to learn whether the trend of your valid pages is fairly steady. Some amount of fluctuation is natural, and if you're aware of content being published or removed, you might see that reflected here.

Let's take a look at the errors list now, as these are the most pressing issues. The table is sorted by severity and the number of affected pages, so start investigating at the top of the list. If your site doesn't have any errors, you can follow along with any of the other tabs. Click the top issue to drill down into the issue details.
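To illustrate the robots.txt point made above: robots.txt only controls crawling, not indexing, so a blocked URL can still end up indexed if other pages link to it. A hedged sketch (the path is a placeholder):

```
# robots.txt — this blocks crawling of /private/, but the URLs can
# still be indexed without content if they are linked from elsewhere:
User-agent: *
Disallow: /private/

# To keep a page out of the index instead, allow it to be crawled and
# serve it with a noindex directive, e.g. the HTTP response header:
#   X-Robots-Tag: noindex
```

Note that the two mechanisms conflict: if robots.txt blocks a page, Googlebot never fetches it and so never sees its noindex directive.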
The details page shows a distribution over time of all pages suffering from this issue and provides a list of example URLs that you can check more closely. You'll also find a link to learn more about the error in the Search Console Help Center.

In addition, you should click an example URL and inspect it. This will show all the details available for a specific page, as explained in the URL Inspection episode of this series. Try both the indexed and live versions. What you see in the Index Coverage report reflects the indexed data shown in the URL inspection, and it might differ from the live test result. Watch the URL Inspection episode to learn more about it.

Once you understand what needs to be fixed, you have two options: make the required changes yourself, or share the details with a developer who can perform code changes on your website. You can do that by grabbing a link using the Share button. The link grants access only to the current page, plus any validation history pages for this issue. It does not grant access to other pages of your property or enable the shared user to perform any actions on your property. You can revoke the link at any time by disabling sharing for this page.

After you or your developer have fixed the error on all your pages, click Validate Fix, and Google will validate your changes.

That's the Index Coverage status report. Hopefully now you understand how to read the information, prioritize your fixes, and let Google know when you've made them. Don't forget to subscribe to the Google Webmasters YouTube channel, where we'll be publishing lots of Search Console videos. Stay tuned.

Hey, Green. Hey, Blue. What's up? Oh, good. Did you watch the new episode of the Search Console Training series? No, I didn't watch the last one. You want to watch it with me? Of course I do. Let's go.