Howdy, Moz fans. We're here in Seattle for MozCon 2023, and I want to talk to you about the context explosion. When we talk about context in the frame of reference of SEO or search, we're typically talking about either implicit context, which describes the searcher (their time of day, their location, their language), or explicit context, which is the search query itself. It's explicit context I want to talk to you about today, because I think we're on the brink of a paradigm shift: an explosion in the amount of explicit context going into searches.

To start, I want to tell you about my vacation planning last year. I wanted to take my family on a vacation, and the search I wanted to do in Google was "show me vacation itineraries for a family of five with a private pool, near the beach and some restaurants, in or near Europe." What I ended up doing instead was a search like this, or variations of it, because the last 20 or 25 years of using Google have taught me, and most other people, that the interface expects you to type two to five keywords. It isn't set up for a whole sentence or two of text, and we also know that we typically won't get very good results that way.

So what normally happens is we type our two to five keywords into Google. Then, if it's a complex search like vacation planning, we open lots of those links manually, looking through them to see whether they fulfill our criteria, basically applying the rest of that explicit context ourselves. This behavior, known as post-search browsing, is a well-understood phenomenon; most of the search engines have published academic papers about it. Essentially, it's doing the second phase of the same search, but manually rather than in the search engine.
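To make that idea concrete, here's a rough sketch of post-search browsing as a filter. Everything below (the listings, the field names, the criteria) is invented for illustration; the point is that opening each result in a tab and checking it by hand is equivalent to running this filter manually:

```python
# Hypothetical results a searcher might see after a short
# "family vacation Europe" keyword query.
results = [
    {"title": "Beach villa, Algarve", "sleeps": 6, "private_pool": True, "near_beach": True},
    {"title": "City apartment, Paris", "sleeps": 4, "private_pool": False, "near_beach": False},
    {"title": "Coastal house, Crete", "sleeps": 5, "private_pool": True, "near_beach": True},
]

# The explicit context that never made it into the query box.
def fits(result):
    return result["sleeps"] >= 5 and result["private_pool"] and result["near_beach"]

# Post-search browsing: checking each link by hand is effectively
# applying this filter yourself, one tab at a time.
shortlist = [r["title"] for r in results if fits(r)]
print(shortlist)  # the Algarve and Crete listings survive the filter
```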
Then what typically happens is you refine your search in Google and repeat this process a few times until you get adequate results. So let's compare and contrast that with the new world order of ChatGPT. One of the exciting things about ChatGPT and similar interfaces is the promise of being able to refine and filter your search right there, in the same interface. You ask it something, you get some results, and then you can ask clarifying questions or ask it to filter those results. It brings the second phase of the search right into the same interface. That is really powerful, it's really compelling, and we want it. But at the same time, ChatGPT and other chat-based interfaces for search haven't had a huge impact; they're still tiny in terms of usage compared to regular old Google. The other thing about ChatGPT is that it requires a different interface, and I think that leads to users being resistant to using it.

So the question is: can we get the same benefit of bringing this second phase of the search into the search engine itself, but without the chat? The good news is that all of the ability to understand context and filter comes from the GPT part of ChatGPT rather than the chat part. GPT, of course, stands for generative pre-trained transformer, and while ChatGPT is an OpenAI product, GPTs themselves are just a type of large language model that is now out there in the world for anyone to use, and there are lots of companies already building things on GPTs. So let's compare and contrast the two worlds. GPT-4 has an estimated 100 billion neurons, and once it's trained, it stores its knowledge in the connections between those neurons.
And let's compare and contrast that with schema.org, which is the best way we have right now of representing context in the form of entities. So let's add schema.org right there. Schema.org has approximately 1,400 types of entity that it can represent. You can see that if search shifted from this sort of model for representing entities and context to something like a GPT, the magnitude of change would be absolutely massive. It would have a very profound effect on search and SEO.

As evidence that Google is moving in this direction, I want to talk about Project Magi for a second. A couple of months ago, Google did an interview with The New York Times where they talked about Project Magi, and this interview was in direct response to Bing launching its chat interface. In the interview, Google said several interesting things, including that they were going to bring AI features into their existing search engine. Google already has loads of AI features, but this was a direct response to a GPT-powered Bing launch, so it's not hard to understand that Google was talking about GPT AI features. They also talked about updating their search engine to anticipate users' needs, and here I think they were talking about anticipating the explicit context we want to add to queries. The reason they might want to anticipate that is to find ways to incentivize us, as searchers, to add more explicit context to our searches, when we've been trained over the last 25 years to type two to five keywords. If they're going to have a GPT-powered backend, they need to incentivize us to add more explicit context to our search queries. And so for my MozCon presentation I predicted that they were going to add some sort of faceted search functionality, and then a couple of months ago they launched these filter bubbles right in the main search interface.
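To make the schema.org side of that comparison concrete, here's a minimal sketch of how one of my vacation criteria could be expressed as schema.org entities in JSON-LD. The types and properties used (`LodgingBusiness`, `amenityFeature`, `LocationFeatureSpecification`, `PostalAddress`) are real schema.org terms from that ~1,400-type vocabulary, but the listing itself is a made-up example:

```python
import json

# A hypothetical accommodation listing marked up with schema.org
# entities: a fixed, predefined vocabulary of types and properties,
# in contrast to knowledge stored in a GPT's learned weights.
listing = {
    "@context": "https://schema.org",
    "@type": "LodgingBusiness",
    "name": "Example Beach Villa",
    "address": {"@type": "PostalAddress", "addressCountry": "PT"},
    "amenityFeature": [
        {
            "@type": "LocationFeatureSpecification",
            "name": "Private pool",
            "value": True,
        }
    ],
}

print(json.dumps(listing, indent=2))
```

Every piece of explicit context a page wants to convey this way has to fit one of those predefined types and properties, which is exactly why the jump to GPT-style representation is such a large change in magnitude.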
So you do a search, and if you press one of the filter buttons, it extends your search, adding more explicit context to that query. This is basically a way to incentivize users to add more explicit context where historically we've been trained not to. I think this is going to lead to an absolute explosion in the length of queries for complex searches. The existing long tail is going to start to look small, and we're going to see a huge increase in the number of searches that are entirely unique. Exactly what this means for us as SEOs is still to be determined, but it's a very exciting future. I think over the next 6 to 12 months we'll see this huge shift happen; exactly how it looks, time will tell. Thank you very much.