Hi, I'm Daniel Weisberg, Search Advocate at Google. And today I'll talk about how to use Search Console if you're a developer. Since the tool provides a lot of information about search optimization, developers may think it's not useful to them. But I beg to differ. In this video, I'll talk about the most useful Search Console reports to help developers build websites that are healthy, findable, and optimized for Google Search.

In summary: use the index coverage report to understand site-wide search indexing issues. Use the URL inspection tool to debug page-level search indexing issues. Use the security issues report to find and fix issues affecting your site. And use the Core Web Vitals report to make sure your website provides a great page experience to your users.

The first thing that is really important for you to know is whether Google can find and crawl your pages. Small glitches can have a massive effect when it comes to Googlebot being able to read websites. For example, sometimes we see companies accidentally adding noindex tags to entire websites, or blocking content from being crawled through an error in the robots.txt file. These issues can be easily uncovered using the index coverage report.

When you open the report, the first page you see is the summary page. The default view shows indexing errors on your website, but you can click to show valid with warnings, valid, and excluded pages. In addition, you'll find a checkbox to add to the main chart the number of impressions your pages got on Search.

Here's what each status means, with some examples. Errors prevent pages from being indexed. Pages with errors won't appear in Google, which can mean a loss of traffic to your website. For example, your page might be returning a 404 or a 500-level error. Additionally, you might get an error if you submitted a page through a sitemap but it contains a noindex directive. All these cases would prevent the page from appearing in search results.
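As an illustration (not from the video, and the comments describe a hypothetical scenario), an accidental site-wide crawl block in robots.txt often looks like this:

```
# robots.txt at the site root.
# A single overly broad rule blocks the entire site from crawling:
User-agent: *
Disallow: /        # e.g. accidentally left in after a staging deploy

# What was probably intended: block only a private section.
# User-agent: *
# Disallow: /admin/
```

Because one character in this file can hide a whole site, the index coverage report is often the first place a mistake like this shows up.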
Valid with warnings are pages that may or may not be shown on Google, depending on the issue, but we think there is a problem you should look into. For example, Google might find pages that are indexed, though blocked by robots.txt. This is marked as a warning because it's not clear if you intended to block the page from search results. If you do want to block the page, robots.txt is not the best way to keep it out of the index. Instead, you should either use the noindex directive or require authentication to see the page.

Valid pages have been indexed and can be shown on Google Search. So good job. Excluded pages were not indexed and won't appear in Google, but either we think that's your intention, or we think it's the right thing to do. For example, the page has the noindex directive (your choice), or the page is a duplicate of another page (Google's choice). You can watch the index coverage episode in this series to understand more about the issues affecting your site and learn how to validate a fix you have implemented.

To debug an issue with a specific page, for example a page Google is showing an error for in the coverage report, you should use the URL inspection tool. You can use it to learn the current index status of your pages, to test a live URL, to ask Google to crawl a specific page, and to view detailed information about the page's loaded resources and more. You can access the tool from the top bar or the sidebar, and you'll also see a little magnifying glass next to URLs in some reports. For example, if you drill down to a URL from the index coverage report, you can click to inspect it.

Once the page displays the results, you will see three different sections, all of them presenting information from Google's last crawl, or crawl attempt. If you recently made changes to the page, you might want to check if they're working as intended by clicking test live URL and comparing the live version to the indexed one.
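To make the noindex directive mentioned above concrete, here is a minimal sketch of the two common ways to keep a page out of the index while still letting it be crawled (the page must not be blocked by robots.txt, or Google can't see the directive):

```
<!-- Option 1: a meta tag in the page's <head> -->
<meta name="robots" content="noindex">
```

Option 2 is an HTTP response header, useful for non-HTML resources such as PDFs: `X-Robots-Tag: noindex`.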
In the presence on Google card, you'll get a verdict on whether or not the URL can appear in Google search results. There are two important options available for developers in this card. First, if you made a change to the page and want to request Google to re-index it, use request indexing. Second, you can click view crawled page to check the HTML version that Google indexed, along with more information on the HTTP response and loaded resources.

In the coverage section, you'll learn where the page was discovered, such as a sitemap or a referring page, when the last crawl happened and by which user agent, and whether the page is included in the Google index or another version of it was chosen as the canonical.

In the enhancements section, you'll find any structured data details, along with AMP and mobile usability warnings and errors. For example, if your page is not marked up properly with structured data, the inspection will return an error detailing the missing or wrong values. You can watch the URL inspection episode in this series to learn how to use the tool to debug and fix any issues you find.

Search Console also includes two reports that will help you optimize your site's health: security issues and Core Web Vitals. The Search Console security issues report shows warnings when Google finds that your site might have been hacked or used in ways that could potentially harm a visitor or their device. For example, a hacker might inject malicious code into your pages to redirect your users to another site, or automatically create pages on your site with nonsensical sentences filled with keywords. These are examples of website hacking. An attacker might also trick users into doing something dangerous, such as revealing confidential information or downloading malicious software. That's called social engineering. When you log in to Search Console, you'll already be notified on the overview page if you have security issues on your site.
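For instance, structured data is commonly embedded as JSON-LD in the page head; the snippet below is a hypothetical example (all names and values are made up) of the kind of markup the inspection tool validates. If a property like datePublished were malformed, the tool would flag it:

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to use Search Console as a developer",
  "image": ["https://example.com/photos/cover.jpg"],
  "datePublished": "2020-11-10T08:00:00+00:00",
  "author": { "@type": "Person", "name": "Jane Doe" }
}
```

This object would sit inside a `<script type="application/ld+json">` tag on the page.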
Clicking the alert will lead you to the security issues report, where you'll find a list of all the security issues Google found on your website. In the report, you'll find more details about the type of threat, a sample of pages affected by it, and a process for you to inform Google when you fix the issues. I recorded an episode about the security issues report with Aurora Morales, my colleague from the Trust and Safety team. Check it out for more details.

Lastly, the Core Web Vitals report shows how your pages perform based on real-world usage data, sometimes called field data. The report is based on three metrics.

LCP, or largest contentful paint, is the amount of time it takes to render the largest content element visible in the viewport, starting from when the user requests the URL. This is important because it tells the reader that the URL is actually loading.

FID, or first input delay, is the time from when a user first interacts with your page, when they click a link or tap a button, to the time when the browser responds to that interaction. This is important on pages where the user needs to do something, because this is when the page has become interactive.

CLS, or cumulative layout shift, is the amount that the page layout shifts during the loading phase. The score is rated from zero to one, where zero means no shifting and one means the most shifting. This is important because having elements shift while a user is trying to interact with the page is very annoying. I'm sure you will agree.

Log in to Search Console and navigate to the Core Web Vitals report. You'll see that the report is broken down by mobile and desktop. Open one of them to see an aggregate report with more details. By default, the chart shows trends for pages with poor performance, but you can click on need improvement and good to check their trends too.
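To make the good, needs improvement, and poor buckets concrete, here is a small Python sketch of how the documented Core Web Vitals thresholds classify a single measurement. The function name and structure are my own for illustration, not part of any Google API:

```python
# Documented "good" and "poor" boundaries for the three Core Web Vitals:
#   LCP in seconds, FID in milliseconds, CLS unitless.
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # good <= 2.5 s, poor > 4.0 s
    "FID": (100, 300),    # good <= 100 ms, poor > 300 ms
    "CLS": (0.1, 0.25),   # good <= 0.1, poor > 0.25
}

def classify(metric: str, value: float) -> str:
    """Return the report bucket for one field-data measurement."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"

print(classify("LCP", 2.1))   # a 2.1 s LCP lands in "good"
print(classify("CLS", 0.3))   # a 0.3 shift score lands in "poor"
```

In the report itself, a URL group is rated by the slowest status among its metrics, so a single poor metric is enough to pull a page into the poor bucket.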
Be aware that if a page does not have a minimum amount of reporting data for any of these metrics, it is omitted from the report, so you probably won't see all your pages. Click an issue in the table to drill down. In this report, you'll find more details about the issue, a chart showing trends, and a table with sample URLs. Check the links in the description to find out more about the Core Web Vitals report. You might also like to check out tools such as Lighthouse before you deploy changes to production.

With that, I'll end this video. It's too long already, and even if you drank coffee at the beginning, the effect is probably starting to wear off. To recap: use the index coverage report to understand site-wide search indexing issues. Use the URL inspection tool to debug page-level search indexing issues. Use the security issues report to find and fix threats affecting your site. And use the Core Web Vitals report to make sure your website provides a great page experience to your users. Don't forget to subscribe to the Google Webmasters YouTube channel to watch our upcoming Search Console videos. Stay tuned.