Hey, everyone. My name is Philip Walton. And I'm Minhaz Kazi. Today, we're going to show you how you can use free tools from Google, like Google Analytics 4, BigQuery, and Data Studio, to measure, analyze, and debug performance issues from real user data collected in the field. That's right. Today, we're going to focus specifically on the Core Web Vitals metrics, but the techniques discussed here will work for any performance metric that you can calculate in JavaScript.

OK, let's get started. The first step in understanding the performance of any website is to measure it. And when measuring performance, it's absolutely critical to get data from real users in the field. Tools like Lighthouse and WebPageTest are fantastic during development. They help you test your pages before you deploy to production. But they don't tell you what your users are actually experiencing. For that, you need real user data, which we often refer to as field data.

Google does provide tools that measure field data, for example, the Chrome User Experience Report, or CrUX, as well as PageSpeed Insights. And these tools are great for getting a high-level sense of how your site is doing. But in most cases, they're too high level to provide actionable insights. For that, you really need analytics.

And I want to really emphasize that last point and highlight why analytics is needed and what it offers that other CrUX-based tools don't. First, with analytics, when you make changes, you can see the results much faster. Google tools that expose field data generally have a 28-day aggregation period, and 28 days is just way too long to have to wait to see if your changes actually led to a real improvement. Second, analytics tools are typically the place where your company is already measuring important business metrics. So if you want to see how your site's performance affects your business, it's better to have the data in the same system.
Third, CrUX-based tools will tell you what your performance is, but they won't tell you why, or how to fix it. For example, if you see that your site has a CLS of 0.5, you know you have a problem, but you don't necessarily know what's causing it. Analytics tools let you capture additional data to help debug those types of issues.

So hopefully I've convinced you of the importance of measuring performance metrics in your analytics tool. Now let's talk about how to do that. But first, let me start by saying that if you're already using an analytics tool and it supports the Web Vitals metrics, then I definitely recommend sticking with that tool. If you're not already measuring Web Vitals and you're looking for a great free option, then I recommend Google Analytics 4.

Google Analytics 4 is the latest version of Google Analytics, and it has one specific feature that I think developers are really going to love: BigQuery Export. With BigQuery Export, you can link your Google Analytics 4 property with a BigQuery project, and then all of your event data gets automatically populated into tables that you can query with SQL. But in order to get your data into BigQuery, you have to instrument your pages to capture the data and send it to Google Analytics. So let me show you the recommended way to do that.

The easiest and most accurate way to measure the Web Vitals metrics on your site is to use the web-vitals JavaScript library. And if you take a look at the README on GitHub, you'll see instructions for how to use it with various versions of Google Analytics, including Google Analytics 4. Now, I've already added this code to a real e-commerce site, and it's been collecting data for the last few weeks. To show you what that looks like in Google Analytics, let's switch over to the GA4 real-time report. Here we are in the real-time report for my e-commerce site, and this view will let us see live event data as it comes in.
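[Editor's note: for reference, the GA4 reporting pattern from the web-vitals README looks roughly like the sketch below. Rather than calling the global `gtag()` directly, this version builds and returns the `gtag()` arguments so the payload is easy to inspect; the function name `toGtagEvent` and the exact parameter names are illustrative, not the site's actual code.]

```javascript
// Sketch of a GA4 reporting callback in the shape the web-vitals
// library passes to each metric handler: {name, value, delta, id, rating}.
function toGtagEvent({name, delta, value, id, rating}) {
  return ['event', name, {
    // Use the delta so values remain summable when a metric
    // reports more than once per page load (e.g. CLS).
    value: delta,
    // A unique ID per page load, useful for de-duplicating
    // and aggregating correctly in BigQuery later.
    metric_id: id,
    // The full current value of the metric.
    metric_value: value,
    // 'good' | 'needs-improvement' | 'poor'
    metric_rating: rating,
  }];
}

// On a real page you would wire this up with the library, e.g.:
//   import {onCLS, onFID, onLCP} from 'web-vitals';
//   const send = (metric) => gtag(...toGtagEvent(metric));
//   onCLS(send); onFID(send); onLCP(send);
// (older versions of the library name these getCLS, getFID, getLCP)
```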
Over here on the right, you can see the event report, and notice how the Core Web Vitals metrics are already showing up, along with other events like page views. If we click on the LCP event to take a closer look, you can see all of the event parameters that were sent along with that event's data. I want to highlight that with GA4, you can send basically whatever parameters you want with your events. In this case, in addition to the metric value, I'm also capturing the metric rating, whether it's good, needs improvement, or poor, as well as some other debug information, which I'll get to later. If I drill down into the metric rating param, you can see a nice breakdown of all the LCP scores that have come in over the last 30 minutes. This view is really nice after you've deployed a new version of your site to production and you want to verify that it's performing as you expect.

So now that we can see our data showing up in Google Analytics, let's set up BigQuery Export. To enable BigQuery Export, go to the Admin view in Google Analytics, and under the Property Settings section, click BigQuery Linking, then choose a BigQuery project from the list. If you don't see any projects listed here, then you probably just need to create one first. Once you've selected your project, go ahead and fill out the rest of the fields in the form, and when you're done, click Submit.

Also, before I go any further, I should mention that we've published an extensive blog post on web.dev that covers everything we're talking about in this video step by step, and we'll make sure to put a link to that post in the video description. OK, so once you've enabled BigQuery Linking, you should start to see your Google Analytics data show up in your BigQuery project within 24 hours. I actually set up BigQuery Export a few weeks ago, so you can see there's already some data here.
If you click on the Events table, it'll show the table schema, and if you want to see what the data actually looks like, you can click Preview. Here, we can see that each Analytics event is a row in the table, and as I scroll down, you can see that the Web Vitals events are here, as well as all of the event parameters that I set earlier in the code.

OK, now let's run some queries. The first query I have here will aggregate all of the Core Web Vitals event values over the last 28 days at the 75th percentile. Also, don't worry about the specific syntax I'm using in these queries; all of it will be available in the blog post that I mentioned before. OK, let's run this one. As you can see from the results, we have each metric, its value at the 75th percentile, and the number of events received for each metric. And while this data is nice, it's a nice overview of our scores, it's not that much different from what we can get in CrUX or PageSpeed Insights. So let's run another query with a few more parameters.

This next query is going to show us what our LCP scores are for the most visited pages on our site. OK, let's run it. As you can see from these results, with the exception of maybe the sign-in page, it looks like we have a bit of performance work to do, since most of these pages have a score above the recommended 2.5 seconds.

All right, now let's run another query that shows the CLS score for each page on our site, in order from worst to best. That should help us identify which pages need the most work. Now, if you remember from the results of the first query, the CLS at the 75th percentile for the whole site was actually 0. But that doesn't mean that the CLS for every individual page is also going to be 0. As you can see from the results here, some of the pages do have a CLS above the recommended threshold of 0.1. But even these per-page results are an aggregate of all users across all device types.
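[Editor's note: the per-page CLS query just described might look roughly like the sketch below, based on the public GA4 export schema. The project/dataset name and the `metric_value` parameter key are placeholders; the exact queries are in the blog post linked in the description.]

```sql
-- Sketch: p75 CLS per page, worst first.
-- `my-project.analytics_123456789` and `metric_value` are placeholders.
SELECT
  (SELECT value.string_value FROM UNNEST(event_params)
   WHERE key = 'page_location') AS page,
  APPROX_QUANTILES(
    (SELECT COALESCE(value.double_value, value.int_value)
     FROM UNNEST(event_params) WHERE key = 'metric_value'),
    100
  )[OFFSET(75)] AS cls_p75,
  COUNT(1) AS events
FROM `my-project.analytics_123456789.events_*`
WHERE event_name = 'CLS'
GROUP BY page
ORDER BY cls_p75 DESC
```

Restricting a query like this to one device type is a matter of adding a condition on the export's `device.category` column (e.g. `AND device.category = 'mobile'`).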
And I happen to know from personal experience that on this site, there are more layout shifts on mobile than there are on desktop. So let's tweak the query a bit to limit these results to just mobile devices. Yeah, as you can see, the scores on mobile are quite a bit worse than the overall scores. So it's nice that we're able to query that information and get that insight.

OK, so far these queries have given us insight into how our pages are performing against the Core Web Vitals metrics, but they're not helping us fix the problems we're seeing. I mentioned before that I was sending debug information along with each of the Core Web Vitals events, so let's see how we can use that information to help identify the root cause of these issues. This query lists the worst pages on the site by their CLS scores on mobile, and it's also grouped by the debug identifier param that I'm sending with each event. In the case of CLS, the debug param corresponds to the element on the page that shifted and contributed the most to the final CLS value for that page.

As you can see from these debug identifiers, an element that looks like it's probably a footer appears to be causing problems on many different pages. I also see this UL slide-out item on a few pages, which sounds to me like it's probably some sort of navigation menu. So yeah, this is super helpful information. Before, all I knew was that my CLS was high on these pages on mobile, but now I have an idea of which components on my site are likely causing the problem. Anyway, that's a brief introduction to querying this data. Now I'll hand it over to Minhaz to talk about visualizing it.

Thanks, Phil. Now that you've seen the kinds of insights you can get from the BigQuery data, let's take it a bit further. Suppose that, rather than a single aggregate measure over a time period, you want to understand the daily trend. We can run this query to see the daily 75th percentile values of LCP over a month.
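[Editor's note: the daily-trend query just mentioned might look roughly like this sketch against the GA4 export schema, where `event_date` is the export's built-in `YYYYMMDD` date string; the dataset name and `metric_value` key are placeholders.]

```sql
-- Sketch: daily p75 LCP over one month (placeholder names and dates).
SELECT
  PARSE_DATE('%Y%m%d', event_date) AS day,
  APPROX_QUANTILES(
    (SELECT COALESCE(value.double_value, value.int_value)
     FROM UNNEST(event_params) WHERE key = 'metric_value'),
    100
  )[OFFSET(75)] AS lcp_p75
FROM `my-project.analytics_123456789.events_*`
WHERE event_name = 'LCP'
  AND _TABLE_SUFFIX BETWEEN '20210501' AND '20210531'
GROUP BY day
ORDER BY day
```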
As Phil mentioned earlier, all the queries we're using in this video are available in the blog post linked below. When we look at the raw results, it might be a bit hard to understand what the trend is like. Plotting this data in a time series chart might give us some insights. We'll click on Explore Data and then select Explore with Data Studio. If you're not already familiar with it, Google Data Studio is a data visualization and reporting tool that is free to use. Here, we'll create a time series chart and use our metric to visualize it. Now we can see the trend of our LCP values over a month. Data Studio's Explorer mode, with the direct link from BigQuery, makes it easy to do quick ad hoc analysis for any query you're running.

However, as you get more familiar with the data, you might want to look at several charts in an interactive dashboard to get a more holistic view, and also to drill down into the data. You can create a dashboard using BigQuery's native connector in Data Studio. While this will work, for our use case it is not the most efficient solution. Due to the structure of the data from the GA4 export and the pre-processing required for the Web Vitals metrics, parts of your query will run again and again. This creates two problems: first, dashboard performance, and second, BigQuery cost.

You can use BigQuery's sandbox mode for free, and on your GCP account, you'll get the first terabyte of query processing free every month. After that, you can go with flat-rate pricing, or on-demand pricing at $5 per terabyte. For our Web Vitals analysis, unless you're using a significantly large data set or are heavily querying the data set regularly, you should be able to stay within the free limit. But if you have a high-traffic website and want to regularly monitor different metrics using a fast, interactive dashboard, we suggest pre-processing and materializing your Web Vitals data while making use of BigQuery's efficiency features like partitioning, clustering, and caching.
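[Editor's note: as a rough illustration of the partitioning and clustering features just mentioned, a materialized, flattened Web Vitals table could be defined along these lines. All table, dataset, and column names here are placeholders; the actual script is in the blog post.]

```sql
-- Sketch: materialize a flattened Web Vitals table, partitioned by date
-- and clustered by commonly filtered columns (placeholder names).
CREATE OR REPLACE TABLE `my-project.analytics_123456789.web_vitals_summary`
PARTITION BY event_date
CLUSTER BY metric_name, device_category
AS
SELECT
  PARSE_DATE('%Y%m%d', event_date) AS event_date,
  event_name AS metric_name,
  device.category AS device_category,
  (SELECT value.string_value FROM UNNEST(event_params)
   WHERE key = 'page_location') AS page_location,
  (SELECT COALESCE(value.double_value, value.int_value)
   FROM UNNEST(event_params) WHERE key = 'metric_value') AS metric_value
FROM `my-project.analytics_123456789.events_*`
WHERE event_name IN ('LCP', 'FID', 'CLS');
```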
And here's how you would do that. This script will pre-process our data and create a materialized table. In addition to using the efficiency features, the pre-processing makes the data set smaller. Here's what the materialized data looks like. You can query this materialized table directly from within BigQuery, or use it directly in Data Studio.

To make things easier, we have a prepackaged solution that will create a template dashboard for you, so that you don't have to start from scratch. To use the connector, first make sure you have materialized your data set. Then visit this link. After a one-time authorization, you can complete the configuration by providing your BigQuery table ID and the BigQuery billing project ID. This process will create a new dashboard for you, linked to your data set. You can edit, modify, and share the dashboard as you like. Once you've created a dashboard, you don't have to visit the connector link again, unless you want to create multiple dashboards from different data sets.

This dashboard has multiple sections. First, you can see the daily trends of the Core Web Vitals metrics, as well as some usage information for your website, like users and sessions. In the second tab, you get a breakdown of the metric percentiles, as well as user counts, by different usage and business criteria. For example, for FID, engaged users and returning users had a lower percentile value. You can also cross-filter in this tab; for example, you can look at only mobile users.

Things get interesting on the third page. In the page path analysis, you can pick a metric to see an overview, but you can also see a scatter plot of all page paths, with the percentile values on the y-axis and the record count on the x-axis. You can see that these are your problem areas: these page paths are places where you have higher percentile values, and you can click on them to filter and find out which debug targets you need to focus on.
Finally, the revenue section is a good example of how you can monitor your business and performance metrics in the same place. This section plots all sessions where the user made purchases, so you can compare the revenue earned against the user experience during a specific session. If you want to resolve performance issues on your site, this is a highly actionable tool that can help you pinpoint problem areas.

To make this dashboard even more useful, as advanced steps, we recommend setting up scheduled queries in BigQuery to keep the data updated, joining first-party data, like CRM data, for business insights, and reporting your code versions through Google Analytics hits to monitor the results of changes. You can find more information about these in our blog post.

Thanks, Minhaz. Those visualizations look great. I've built a lot of dashboards in the past, and I wish I'd had this connector back then; it seems like it would have saved me a lot of time.

OK, so let's wrap up by going over some of the main points we covered today. First, measuring performance with real user data is critical for understanding, debugging, and optimizing your site. Second, you get better insights when your performance metrics and your business metrics are in the same system. Third, BigQuery export of raw Google Analytics data gives you unlimited potential for in-depth custom analysis. And last, Google APIs and visualization tools give you the freedom to build your reports exactly the way you want them.

So that's all we have for you today. I hope you learned something, and I hope we inspired you to explore some of the tools we mentioned. And remember, everything we talked about today is covered in far more detail in our blog post, which is linked in the description below. Thank you for watching.