This next speaker is someone I'm fortunate to call a friend. One of the best things that came out of staying home all the time during the pandemic was the opportunity to connect with people like Noah in a different way through apps like Clubhouse. If you're in SEO and on Clubhouse, you've most likely been in one of the many welcoming and fun rooms hosted by Noah, with a very eye-catching poop emoji in the title. Noah is the product director at Two Octobers. He is a technical marketer, nicknamed the Kraken, who is happiest building SEO tools, automations, data pipelines, and communities. When not in the lab, he loves skiing, fly fishing, camping with family, and walking his dog Shadow, whom you may have heard a few times in the Clubhouse room I mentioned earlier. I've been fortunate to see a preview of what Noah is presenting today, and I promise it's an exciting one. Buckle up. Howdy, Moz fans! I am so stoked to talk with you today about Google Search Console data, because I know it's going to completely revolutionize how you do SEO. It's going to allow you to see tons more data with so much more context, and it's going to allow you to get your work done so much faster. To get started, let's look at my favorite view of space. It's called the Hubble Ultra Deep Field. This is also our oldest view of the universe. Inside, we can see tens of thousands of galaxies, all of which fit within an area the size of a baseball held about 100 meters away. Why is this relevant? Well, using the API is a lot like unleashing the power of Hubble on our search data. It's the difference between having a vague sense of what's going on and being able to see so much more with so much more clarity. To set the stage, let's jump into Search Console's performance tab so that we can understand why that tool just isn't good enough and why the API is so much better. If we perform a search for a key brand on our website, we can see a number of things, right?
The first thing we see is that we just get a thousand rows of data. Search Console only lets us see, analyze, or export a thousand rows of data at a time, which is a total bummer, especially when we know that the API delivers 25 times as much data for that exact same search. And what about context? Well, the performance tab only lets us see one dimension at a time. For me, that's just not good enough. I need to see multiple dimensions combined together in order to generate context. Alright, so the performance tab hurts. We get that, right? We also know it's super slow. It's crazy tedious. There's tons of clicking around to find information, and there's lots of context shifting. You're probably wondering, is there a better way? And I'm here to tell you, yes, there is. Let's explore that. This is the first page from a tool that we're now building that we call Explore. You can see here that we have multiple dimensions and our metrics combined at that combined level, so we can see the position for a query on a specific page. Isn't that killer? Because then we can understand which page we need to optimize in order to drive results for a specific query. All of our metrics down below are presented independently, so there's no logarithmic scale clouding context. We can filter in any number of ways along the top and right side, whether it's by question words, different steps of the buyer funnel, or by position, impressions, and clicks. How's that for context? Totally killer. Alright, so 25 times the amount of data is awesome, but where's the money? Where's the real value? Where's the meat? To understand this, we did a really fun little thought exercise. We compared the number of clicks that we could see for that one search inside Search Console against what we could get from the API. It's not about the rows, right? It's about the actual clicks that we're missing.
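As a sketch of what that API pull can look like, here's how a request body for the Search Console API's searchanalytics.query method might be built, asking for query and page together and the full 25,000 rows. The property URL and date range below are placeholder assumptions, and the real call needs OAuth credentials plus the google-api-python-client library.

```python
# A minimal sketch of pulling combined-dimension data from the
# Search Console API. Asking for several dimensions at once (query +
# page) is what gives each row the combined context the performance
# tab can't show, and rowLimit=25000 returns 25x what the UI exports.

def build_search_analytics_request(start_date, end_date,
                                   dimensions=("query", "page"),
                                   row_limit=25000, start_row=0):
    """Build the body for a searchanalytics.query() call."""
    return {
        "startDate": start_date,
        "endDate": end_date,
        "dimensions": list(dimensions),
        "rowLimit": row_limit,
        "startRow": start_row,  # bump this to page through results in 25k chunks
    }

# With an authorized service object, the call would look roughly like:
#   service.searchanalytics().query(
#       siteUrl="https://www.example.com/",  # hypothetical property URL
#       body=build_search_analytics_request("2021-01-01", "2021-03-31"),
#   ).execute()
```

Paging with `startRow` is how you go past even 25,000 rows: keep requesting until the API returns fewer rows than the limit.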
We were able to establish that there were about 7,000 clicks we couldn't see. If we multiplied that by our site's average conversion rate and by our average order value, we could see that there was about $40,000 worth of revenue hidden inside the tool. And that's just one of 285 brands on the site. Total bummer. And when we expanded that thought exercise to the bottom-of-the-funnel terms, we found that that number was much larger. You see, the API gives just an astonishing amount of data. Even better, if you grab data from the API and visualize it in tools that you build, you can have different ways of analyzing the data that match your workflows. You're not stuck inside anyone else's tool anymore. You can analyze data in ways that match the questions you're asking and the answers you need to drive performance. I think that's so powerful. One of the key ways we can accomplish this is by building multiple layers of filters in order to get to our answers super quickly. We're going to go deep into this really quickly here. Again, we can see tons and tons of context. I know this is what you're thinking, and you're probably wondering how we did it. That's why I'm here: to show you the system that we use to build Explore. We built a way to grab data out of Google Search Console's API. We then push it up to Google BigQuery, which is just this amazing database. And then we use Google Data Studio as our front-end tool in order to drive analysis, make great decisions, and have meaningful impacts on our clients' performance. All right, so how do we get the data? As we look at this, it's a great time to take a quick time-out, just to reestablish why this exact moment is so important for the future of search.
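Here's that hidden-revenue thought exercise as plain arithmetic. The conversion rate and average order value below are made-up illustrative numbers, not the client's real figures; plug in your own site's metrics.

```python
# Estimate the revenue attributable to clicks the Search Console UI
# never shows you. All inputs here are illustrative assumptions.

def hidden_revenue(missing_clicks, conversion_rate, avg_order_value):
    """Clicks hidden beyond the 1,000-row export, valued in dollars."""
    return missing_clicks * conversion_rate * avg_order_value

# ~7,000 clicks visible only via the API, at an assumed 2% conversion
# rate and a $285 average order value, is roughly $40k of revenue you
# can't see inside the tool:
estimate = hidden_revenue(7000, 0.02, 285)
```

Multiply that by hundreds of brands on a site and the case for pulling the full API data makes itself.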
Before now, only massive corporations and the top enterprise agencies had either the internal technical skills or the budget to be able to afford data warehousing at scale, which is how we store all of our search data in databases. But multiple technological advancements, both in cloud technology and in data storage pricing, have now made it possible for all of us to afford this technology. And what that means is that the playing field is about to get level. That's why I'm especially stoked about newcomers like Jepto, because they've been able to bring the pricing for data storage down so dramatically that it makes this possible for all of us. Let's get into the costs a little bit more. You see, getting data into BigQuery is where the majority of your costs are going to come from. You're going to find that, depending on which service you decide to go with (if you need to use a service at all), it's going to cost between $10 and hundreds of dollars per month per client to get your data up into BigQuery. Here's the good news: BigQuery is super, super reasonable. It's going to cost between 2 and 25 cents per client per month to store all that data. Here's the better news: Data Studio is 100% free. Isn't that amazing? OK. So you're probably wondering, what are the advantages? What are the benefits? How can I crush my competition? What do I get if I put tooling like this in place? Well, the first thing you're going to find is that it's so crazy fast. In terms of how long it takes for visualizations to load, it might take two to three seconds for a BigQuery-driven visualization to load, whereas if you're working inside Search Console, or if you're using the native Data Studio connectors, it can sometimes take minutes for blended data to load. So it's way fast. You're also able to use filters to segment your data pretty much any way you want. Here's an example of how we filter data for organic versus Google My Business queries.
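If you're rolling your own pipeline rather than paying a service, the upload step mostly amounts to flattening the API's response rows into one record per row and handing them to a BigQuery load job. This is a minimal sketch under that assumption; the dataset and table names are hypothetical.

```python
# Flatten Search Console API rows (which arrive with dimension values
# packed into a "keys" list) into flat dicts, one column per dimension
# and metric, ready for a BigQuery JSON load.

def flatten_gsc_rows(api_rows, dimensions=("query", "page"), date=None):
    """Turn searchanalytics.query() rows into BigQuery-ready records."""
    flat = []
    for row in api_rows:
        record = dict(zip(dimensions, row["keys"]))
        record.update(
            date=date,  # stamp each batch with its pull date
            clicks=row["clicks"],
            impressions=row["impressions"],
            ctr=row["ctr"],
            position=row["position"],
        )
        flat.append(record)
    return flat

# With the google-cloud-bigquery client, loading would then be roughly:
#   client.load_table_from_json(flat, "project.dataset.gsc_daily")  # hypothetical table
```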
To learn how to do this, you're going to want to learn all about calculated fields and something called CASE statements. I know: whoa, technical jargon. Keep in mind, there are tons of links at the end of the deck, and my DMs are totally open. Please reach out to me on Twitter or send me an email if I can help you in your journey, because I'm so stoked to help anybody. And of course, be patient as you're learning something new, because with exploration comes challenges. But stick with it, because once you learn how to do this, you're going to uncover whole new ways of doing analysis that you didn't even know were possible before, like position bucketing. This is something that's only possible if you put your data up into a database like BigQuery. I thought you could do this in Data Studio; I didn't even realize how wrong I was until I saw this tweet from Kyle. After I saw it, I went down a 15-hour rabbit hole, and to make a very long story short, it's not possible unless you have your data up in a database. You're going to love it once you've embraced position bucketing. You can also segment your data as branded or non-branded. You can also split up your site based on directory structure to find issues and opportunities. And you can filter by your key brands so that you can learn about and optimize specific brands or groups of brands. How do we use it? Well, at Two Octobers, we love thinking about the buyer journey, so we built tooling around that. One of the pages in the tool that we built is called the bottom-of-the-funnel page. Using this page while analyzing one of our client sites, we found that there was a massive amount of search impressions for queries that included "sale" and "for sale". We saw this and thought it was a total no-brainer to add "for sale" as a page title suffix on all of our collection pages. Okay, so what happened? Tons of clicks, position improvement, and, most importantly, tons of revenue.
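Position bucketing and branded segmentation are both just CASE-statement logic. Here's the same idea expressed in Python, as a sketch of what the BigQuery SQL or calculated field needs to do; the bucket boundaries and brand list are assumptions you'd swap for your own.

```python
def position_bucket(position):
    """Python analogue of a SQL CASE statement that buckets average
    position, e.g. to separate the top spots from striking distance."""
    if position <= 3:
        return "1-3"
    elif position <= 10:
        return "4-10"
    elif position <= 20:
        return "11-20"
    return "21+"

# Hypothetical brand names; in real use, list every brand on the site.
BRAND_TERMS = ("acme",)

def query_segment(query):
    """Branded vs. non-branded split, the same CASE-statement idea."""
    lowered = query.lower()
    return "branded" if any(b in lowered for b in BRAND_TERMS) else "non-branded"
```

In BigQuery the equivalent would be a `CASE WHEN position <= 3 THEN '1-3' ...` expression computed once at load time, so every Data Studio chart can filter on the bucket for free.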
All of this was uncovered by the tool, so it's just amazing. This is the key takeaway here: layered filtering is what helps us get from this massive amount of data down to actionable insights really, really quickly. Another page from Explore that we love is called Topic Explorer. You can see your metrics on the left, all of the search terms on the site in the middle, and all of the URLs on the right. I want to show you how powerful it is, but first I want to show you a little magic. In Data Studio, if you go from View mode into Edit mode and you select multiple tables, and then over on the right in the Edit menu you scroll down and click Apply Filter, you're going to be able to do different layers of filtering back to back to back, getting very quickly to actionable insights. All right, so with our tool, if you enter a head term, like we just did, it'll then filter our search terms and our URLs, which is awesome. We've just shrunk down our data a little bit. If I then click on one of those rows in the middle table, that's where the magic comes into play. All of my URLs shrink down even further, to five URLs for me to consider. Then, if I look at the data (my clicks, impressions, and position), I can start to understand which of these pages I might want to be my keeper page. I can also see potential cannibalization, because multiple pages are getting clicks and impressions, and I might want to consolidate them into one URL. We can also see that we might want to internally link to the keeper page. And if I then perform a SERP analysis, I can start to learn about search intent, like what type of content and what content structure Google thinks needs to be aligned with that particular search. I can then look at my five URLs again and figure out which of them has the right content structure, because, as Kane Jamison once told me, you can't brute-force intent, i.e.
your content has to match what Google is showing. So you're going to want to keep the page that aligns with that intent. You're also going to want to 301 redirect or de-optimize the other pages. De-optimizing might mean taking that head term out of the URL, the page title, or other meaningful areas of the page, like headings. We can also use the same tool to work backwards. Instead of going from terms to URLs, I can click on one of the rows of URLs on the right, and it's going to filter all of my search terms, thereby showing me content expansion opportunities, which I love. Okay, so what just happened? Before we did anything, we could see all 317,000 search terms and 15,000 URLs. And in the space of five seconds, I was able to go from all of those terms to one page that I wanted to optimize, and I was able to go from all of the pages on my site down to five, in order to then get down to the keeper page that I wanted to optimize around. So powerful. We can also build tooling around low-hanging fruit. Let's say we want to look at everything that ranks between positions 4 and 15, say, if you're wanting to get everything into the local pack. You can build a page like this that lets you see only the stuff that ranks in that position bucket, to find easy-win opportunities, right? What about FAQs and content expansion? We love this question-searches page for that very reason. It enables us to dive down based on head terms that we enter into the query dropdown, or specific pages that we might want to add an FAQ section to, or specific questions over on the right. We can filter by those, or we can, again, filter by position buckets down at the bottom to find opportunities to add FAQ sections to pages or to build freestanding FAQ content. Whoa, conditional formatting. I love this. You're going to find it so useful, especially for helping your junior staff perform well.
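Pages like the low-hanging-fruit view and the question-searches view boil down to simple row filters over the exported data. Here's a sketch of both with pandas; the position thresholds, minimum impressions, and question-word list are assumptions you'd tune for your own site.

```python
import pandas as pd

# Assumed question words; extend for your market and language.
QUESTION_WORDS = ("how", "what", "why", "where", "when", "who")

def low_hanging_fruit(df, min_pos=4, max_pos=15, min_impressions=100):
    """Rows ranking just off the top spots with real impression
    volume: the easy-win position bucket."""
    mask = df["position"].between(min_pos, max_pos) & (df["impressions"] >= min_impressions)
    return df[mask]

def question_queries(df):
    """Queries that start with a question word, for spotting FAQ
    sections and freestanding FAQ content opportunities."""
    pattern = r"^(?:%s)\b" % "|".join(QUESTION_WORDS)
    return df[df["query"].str.match(pattern)]
```

Chaining the two (question queries that also sit in positions 4 to 15) is exactly the kind of layered filtering the tool does with a couple of clicks.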
Here's a formula that we use to highlight all of our opportunities: stuff that's ranking between positions 6 and 15 and has good search impression volume. Those things together equal opportunities that just jump off the page, right? Okay, so everything I've shown you so far, you can see just by natively pulling the data down from the API into BigQuery. Well, what if you pre-process the data up in BigQuery to see even more in your search data? This is where the real opportunities, I think, get uncovered. One of the coolest things you can see is trending queries: which search terms are getting tons more impressions over the last 7, 14, 30, 60, or 90 days, or which terms are seeing major position changes or click changes. These are super important. Over on the left, you can see the underlines. If you click one of those search terms, it'll take you back to Topic Explorer so you can learn more. I think that's really powerful. Not only that, but you can also create change columns when you're pre-processing the data up in BigQuery. What that means is that you can have these different columns that are sortable, so you can sort your content based on which pages are seeing the biggest click changes, impression changes, or position changes. This is stuff that's just not natively possible, and I think it uncovers tons of insights. Think about a news site, right? Mostly, when its pages get impressions, they spike like crazy and then drop to nothing, right? Well, what if you were tracking content that was seeing the largest shrinking of impressions over time? Maybe you would want to change page titles or headings for those particular pages so that as the impressions drop, you don't drop to zero but to a much higher baseline, thereby growing traffic across your news site. Hat tip to Arnout Hellemans for that particular concept.
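Those change columns amount to joining two date windows of the same data and subtracting. The talk's pipeline does this as pre-processing in BigQuery SQL; the pandas version below just illustrates the logic, and the column names are assumptions.

```python
import pandas as pd

def change_columns(current, previous, keys=("query",)):
    """Join two periods (e.g. last 7 days vs. the 7 before) and add
    sortable *_change columns, mirroring BigQuery pre-processing.

    Uses an inner join, so only queries present in both periods get
    change values; brand-new or vanished queries need separate handling.
    """
    merged = current.merge(previous, on=list(keys), how="inner",
                           suffixes=("", "_prev"))
    for metric in ("clicks", "impressions"):
        merged[f"{metric}_change"] = merged[metric] - merged[f"{metric}_prev"]
    # Position improves when the number goes down, so flip the sign
    # so that positive position_change always means "got better".
    merged["position_change"] = merged["position_prev"] - merged["position"]
    return merged
```

Sorting the result by `impressions_change` ascending is how you'd surface the decaying content whose titles and headings are worth refreshing before it drops to zero.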
Okay, so what have we learned today? Couple things. Number one, Search Console API is the new table stakes in Search. Period. Mic drop. Not only that, but the combo of the API, BigQuery, and Google Data Studio is, in my opinion, the Goldilocks solution. And I feel like if we're not all embracing this or if you're not embracing this, you're going to get crushed by the competition. And if you do embrace it, you're going to build better content that both ranks and converts. Again, I had a blast with you today. If you have any questions, DM me on Twitter, send me an email. I'd be stoked to help you in your journey. Here are a couple really useful resources. The first is a three-part video series that we made to help you in your journey. The second is a link from Google, and the third is all about building case statements. Thanks so much. I can't wait to see you next year in real life. Ciao.