Hello, and welcome. My name is Shannon Kemp and I'm the Chief Digital Manager at DATAVERSITY. I'd like to thank you for joining this DATAVERSITY webinar, Death of the Dashboard, sponsored today by ThoughtSpot. Just a couple of points to get us started: due to the large number of people attending these sessions, you will be muted during the webinar. For questions, we will be collecting them via the Q&A in the bottom middle of your screen. Or if you'd like to tweet, we encourage you to share highlights or questions via Twitter using the hashtag DATAVERSITY. And if you'd like to chat with us or with each other, we certainly encourage you to do so; to do that, open the Q&A or the chat panel. Again, you will find the icons for those features in the bottom middle of your screen. And as always, we will send a follow-up email within two business days containing links to the slides, the recording of the session, and any additional information requested throughout the webinar. Now let me introduce our speakers for today, Lisa Aguilar and Nate Weaver. Lisa is responsible for helping to shape the go-to-market messaging and strategies at ThoughtSpot. Prior to ThoughtSpot, Lisa was at DataRobot, the leader in automated enterprise AI and machine learning, and at Alteryx, the leader in analytics process automation. Nate is an analytics expert with 17-plus years of experience selling and implementing solutions across multiple industries, with a key focus on supply chain optimization. With a pre-sales engineer and solutions architect background, he has delivered analytics and business intelligence solutions at ThoughtSpot. All right, so with that, I will turn it over to our speakers, Lisa and Nate, to get us started. Hello and welcome. Hi, Shannon. Hello, everyone. I'm really glad to be here with you all and looking forward to this. This is the webinar we've been wanting to give.
So excited to be here chatting with you. I'm Lisa Aguilar, and with me I have Nate. And Nate, just so you all know, has been in your seats. He understands your world of pain, so he's definitely going to be sharing his points of view along with us on this webinar. So let's get going here. Now, just a quick introduction. If you're new to ThoughtSpot, or if you've just heard of us, we are the modern cloud analytics platform. Think of us as the leading consumer-grade front end built for the modern data stack. And we are helping customers all around the world, both big and small, complete their transformation and modernization journeys, everywhere from well-known companies like T-Mobile and Capital One to smaller companies like Metromile. And we really are only as good as our customers recognize us to be, so we are completely honored to be recognized by data and analytics teams through a number of accolades and recognitions, as well as to have received honors and awards from our technology partners like Snowflake, AWS, Microsoft, and many, many more. But let's get into the meat of this. Going to do a quick question here. I know not everyone's into polls, so go ahead and give us a raised-hand emoji if this sounds like you: how many of the reports and insights that you built three months ago are obsolete today? And Nate, I know you've lived this, so why don't you share your personal experience here? Yeah, probably one of the most common questions I ask our customers as they're becoming a new ThoughtSpot customer, as I'm going through their background of having another analytics tool, is: if you could refine your reports and dashboards by the number of views, give me a count of the kind of information you might want to replicate or recreate, if you could, in a self-service environment. What we find is that when we look at it on a weekly basis, there's a pretty good amount of usage; monthly, it drops away significantly.
And then if you look at three months ago, the statistics are just terrible, right? Sometimes only 10 to 20 percent of those have even been opened or looked at. So yeah, they become obsolete, and they are nothing more than something to manage, to maintain, to govern. It's a very common, straightforward answer every single time I ask. That's right, and we're seeing zeros in the chat. It doesn't take a year or two to become obsolete. Well, let's take a look at this. COVID has actually changed everything. You know, it's been really challenging for a lot of organizations, but I don't need to tell you that. You were all in the thick of it, living it. The pressure was on. I'm sure each and every single one of you felt it: new use cases that you probably didn't even know you needed, or even imagined would be needed, had to be stood up overnight, and companies actually changed the way they looked at their data. They were looking at it to help them outmaneuver the uncertainty and course correct again and again as the circumstances changed. And what this has led to is that it has actually awakened the sleeping beast. So if you didn't feel like you were in a fire drill before, you're definitely in a fire drill now, and you're going to be in a fire drill for a lot longer. The value and impact of data teams has been felt through and through your organization, and that huge appetite for digital transformation is completely on. Data teams are now on call to help the business reassess assumptions, reevaluate scenarios, and strengthen business users' ability to sense and respond to changing conditions as they're happening. Now, on one hand, this is great: your impact has never been greater. On the other hand, the pace of insights is accelerating and the shelf life of your insights is shrinking exponentially. You know, day-old insights are no longer relevant in this instant, in-context insight world that business users are needing, wanting, and demanding to become the norm.
And I can't say it any better than Tom Mazzaferro, the CDO at Western Union. When he was presenting at an industry event to other CDOs, he basically said that this is the slowest pace that data teams are going to see for the rest of our careers. And I'm sure that you're all seeing that the uncertainty has unleashed this need for speed and forced organizations to digitize overnight. As painful as the pandemic has been, and I know many of you have probably spent late nights and weekends working, it's actually offering a powerful reset button for data teams and data leaders alike to accomplish more, faster, and to make what we thought was impossible really possible these days. Now, the months we typically took to identify priorities, build a plan, and migrate everything, and the years to execute, are no longer acceptable. This crisis has forced us to innovate, and it's forced data leaders to break, revamp, and redefine old processes. And this is especially true when it comes to technology. There are a multitude of use cases you can see in the news today where modern data leaders and teams know that real-time access to data-driven insights is a company-wide initiative, not a one-and-done. And they're transforming everything they're doing within their architecture to support this. They're leaving those dashboards behind with increasing frequency and trading in legacy BI tools for modern cloud analytics solutions, just as they did with data back in the 2000s, migrating it from heavy, clunky, legacy on-prem solutions and moving very quickly into the cloud. And they're doing this and getting not just a movement but huge wins across every single industry, even amid the unprecedented challenges and uncertainty of the past year.
For example, Walmart has completely shifted, and they're giving their frontline workers and CXOs visibility at the SKU level across billions of rows of their data with on-demand, real-time insights. Canadian Tire didn't just survive this pandemic; they actually thrived, posting quarter-over-quarter and year-over-year growth, because they had modernized and moved away from legacy on-prem dashboards. They put real-time insights in the hands of over 4,000 frontline business workers, allowing them to interact with the data themselves, ask over 75,000 questions per week, and get their own answers, so they could sense when the business was changing and adjust right on the fly as they saw customer demand shift, getting ahead of it before supply chain issues and user demand overpowered what they could bring into their stores. And Nationwide has also done this; they're reducing their bottleneck by ditching dashboards. Fannie Mae is introducing new ways where business users can just ask questions of their data, and AI translates that into SQL and gives them an intuitive answer. And Suncorp has also cut ties with dashboards; they're tapping into search and AI, empowering over a thousand business users to answer their own data questions and using the power of AI to uncover insights hidden within their data. And the travel industry, you know, which we thought was the hardest hit industry of all, actually weathered the storm by moving away from dashboards; companies were able to stand up net new use cases from proof of concept to production in hours. Now let's talk about some of these fire drills I'm sure you all were facing. How many of you need a week or maybe more to create a net new dashboard? Let's see some hand-raise emojis, some hearts, anything in the chat. Great. Let me know. Nate, I know you've experienced this one.
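As a quick aside on the Fannie Mae example above, the general idea of translating a plain-English question into SQL can be sketched in a few lines. This is purely illustrative, not ThoughtSpot's or Fannie Mae's actual engine; the table and column names (fact_sales, sales_amount, and so on) are hypothetical.

```python
# Toy sketch of natural-language-to-SQL translation: match known measure and
# dimension words in the question, then assemble a query. Real systems use far
# richer parsing; all names below are made up for illustration.

def question_to_sql(question: str) -> str:
    """Translate a very constrained English question into SQL."""
    q = question.lower()
    measures = {"sales": "SUM(sales_amount)", "orders": "COUNT(order_id)"}
    dimensions = {"region": "region", "product": "product_name"}

    measure = next(v for k, v in measures.items() if k in q)
    group_by = [v for k, v in dimensions.items() if k in q]

    sql = f"SELECT {', '.join(group_by + [measure])} FROM fact_sales"
    if group_by:
        sql += f" GROUP BY {', '.join(group_by)}"
    return sql

print(question_to_sql("What were sales by region?"))
# SELECT region, SUM(sales_amount) FROM fact_sales GROUP BY region
```

The hard part in practice is everything this sketch skips: synonyms, date phrases like "last year", and knowing which governed model each word belongs to.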
Yeah, another popular question that I often ask is: in your traditional analytics tool, how long does it take you to build out a specific request? And the problems, which we'll go through in the next couple of slides, really come down to the answer I keep getting: we can't hire enough report developers, right? Analysts are overwhelmed. They have no choice but to put additional ad hoc requests in a queue; they'll get to it when they get to it. And often it becomes seven to ten days before they can turn around a new report or a new dashboard. So that's one of the common problems that we'll talk about. Yeah, and I'm seeing some great chats here: it takes us months; it's tricky. Let's see. All right, let's take a look at this realistically. If we put this in the context of our everyday world: almost anywhere in the world, you can get a car to come to your exact spot and take you anywhere in less than three minutes. You can fly from New York to Singapore, thousands of miles, in less than 24 hours. You can get same-day or next-day delivery on literally almost anything that you want. And you can mail a package from the UK and get it door to door to the US within 48 hours. Now, all of these require perfect physical, logistical alignment across people and items to get done. So why is it that we've been able to solve these immense physical, logistical problems and get them down to mere hours, but creating a net new insight or report, which is fundamentally a software issue, still takes us weeks? We don't like waiting for packages; we'll go find and buy the item somewhere else. So why are we waiting over a full business week to get insights from our data? And in some cases, I'm seeing here in the chat, months. Yeah, I'd like to pull out one comment here by Jason. This is a great perspective: it's not always just the reports themselves taking time, it's the background objects, right?
It's the movement of data and assets from source into the database, creating new views, getting that data modeled so that it's accurate, right? Taking into account business logic. There are a lot of things that go into building the next report or next dashboard that we'll also talk about as we go into more detail. Well, let me just go into it now, right? It's because every business user is different. You know, no two marketers or finance managers are the same. They may interact with similar data, but they're coming from two very different perspectives, right? So the problems that we're trying to solve may seem simple, but they're actually quite complex behind the scenes. So what often happens is, as a request comes through, we model the data. We may build it into the basic structure of a report or a dashboard, delivering what we kind of assume is the answer to the question. But the problem becomes the next question, and the next question. Typically that means building a hierarchy for drill-down, or modeling at a different granularity that you didn't have before. It's just a constant loop. When I was a director of analytics, I had a team very similar to what you see on the screen now. The business user would bring a request. The analysts lived in the dashboard tool itself, whether it was Power BI or Tableau, or the Domo our executives had purchased. The data engineer was great on the business logic side, creating database views and making sure that everything was accurate. And then of course the DBA would control access to everything and make sure data was pulled into the correct data warehouse, et cetera. So after this request got passed from person to person across multiple meetings, unfortunately, all too often it was too late. The business had moved on. We finally got to the answer, and someone would just reply back with a thank you. Got it, thanks. And then we would see them use it one time, right? A very frustrating process.
So unfortunately, we found that static dashboards all too often treated end users like carbon copies when in fact they're not. One very complex dashboard we would deliver did not fit a one-size-fits-all approach. So anyway, let's keep going and I'll go into a little bit more detail. That's right, and the chat is lighting up; I think you've struck a chord. Chris and Adele are saying the same thing: the business user is always changing. It's a tough one. Yeah. So let's go through this a little bit, right? Whenever a business user comes to an analyst to request new content, in our world they set off a chain reaction. It would result in a business kickoff meeting where requirements were gathered, and we had to structure the environment this way simply because, again, we couldn't hire enough analysts or engineers to really keep up. And I was in a fairly unique environment where we delivered dashboards and reports to end customers: dozens of customers, and inside each customer, hundreds of users. So the requests were just nonstop. Lisa, you can go ahead and go through this slide a little bit. But what we were finding was that when we were allocating resources to this, we would have to start justifying to our executives behind the scenes where all of our time was going, because we had internal requests as well. So when we started thinking about the basic costs, how long it takes to get a new report out, and what continually delivering that cost us, the analytics department was being viewed as a loss leader; we were spending too much time delivering ad hoc requests to our customers. It was good for customer service, but it took so much time and so many resources to develop that it was hard to justify. So anyway, let's keep moving. The problem here was that our data teams were always dealing with what was mentioned earlier: publishing through a pipeline, with multiple servers being used.
We had different platforms hosting different data for individual customers, keeping everything separate. So even if we got to the right answer that a business user came to us for, it was tough to distribute that out to a wider user base. What it felt like was that each individual request was just a one-off, and it was absolutely eating away at our productivity. It felt like menial tasks to the analysts and the data engineers; they felt like they were doing the same thing over and over again and not actually delivering value. One of the most common things I always had to manage was that they wanted to deliver more business value, to see an ROI, right? To really make their mark. And unfortunately, they felt like they were just report writers, which is frustrating, but that's part of the role. Well, that's the trap we were caught in with the legacy dashboard flow, right? The same process, having to go through four or five major steps just to get something out, to get it delivered. And it was not only a time suck, it was a little deflating from an ego perspective. So again, my only real point here is that these ad hoc requests can often feel like menial tasks versus true analysis. It looks like we've got a lot of people agreeing with you in the chat here, Nate: the granularity, and the next question, and the root cause analysis, and some of the wishy-washy requirements from your business users that feed into the legacy loop. And the legacy process isn't helping. Yep, that's a great point, right? With GDPR and the DPA, there's the aspect of having to hide personally identifying data, not being able to go down to the correct granularity, or not being able to go down to a different granularity after security rules have been put in place. Of course, row-level security is critically important.
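To make the row-level security point concrete: conceptually, it is the same query filtered per user before any results come back. Here's a minimal sketch of that idea; the users, regions, and numbers are entirely hypothetical, and real implementations enforce this in the database, not application code.

```python
# Minimal illustration of row-level security: each user only ever sees the
# rows their policy allows, even when running the same query. All data and
# user-to-region mappings below are made up.

USER_REGIONS = {"nate": {"South"}, "lisa": {"South", "West"}, "ceo": None}  # None = unrestricted

ROWS = [
    {"region": "South", "sales": 120},
    {"region": "West", "sales": 200},
    {"region": "East", "sales": 90},
]

def secured_rows(user: str):
    """Return only the rows this user's row-level policy permits."""
    allowed = USER_REGIONS[user]
    if allowed is None:  # unrestricted user sees everything
        return list(ROWS)
    return [r for r in ROWS if r["region"] in allowed]

print(sum(r["sales"] for r in secured_rows("nate")))  # 120
print(sum(r["sales"] for r in secured_rows("ceo")))   # 410
```

This is why one dashboard can quietly become many: the same visual shows different totals to different users, and any drill path has to respect the policy too.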
So even when creating one dashboard that can serve a broader purpose for multiple users, once you layer in row-level security, where some people can and should have access to different levels of granularity, things get incredibly complex. I'm going to try to actually show that when I get into my demo. Okay, so let's keep moving kind of quickly here. I'm going to fly through it so we can go a little bit faster, right? This ultimately, unfortunately, means that there's a bit of tension between dynamic data and static dashboards. Data, as we all know, is not a static object; it's living, it's breathing, it's always changing. If I go back a decade, one of the things I suffered with was that we would use sample data internally to create a dashboard on the desktop, and then we would publish it and push through the entire data set. And of course, that's normal; things do change, so we would go back and model over again. But if the business went through a fundamental change, like going from a Verizon store selling dozens of phones to today's world of selling thousands of accessories, well, the same dashboard won't work if you just use the product attribute, right? Because you can't easily visualize thousands or tens of thousands of products. So of course there are product categories and hierarchies, et cetera. And as businesses go through changes, we have to keep up with that. Things are always moving. So that, again, is just the nature of ad hoc analysis: being able to ask a specific question, get a specific answer, and move on, hopefully without a tremendous amount of one-off data modeling. Okay, so here's where we really start to get into the meat, right? Fragmented tools give you fragmented insights. Dashboards have created a world of chaos in a way, even where we've got data governance policies and we're always trying to get to the single version of the truth.
But when we go back and actually ask, we find that 54% of data analysts and data leaders report that multiple sources of truth and conflicting data is probably their number one challenge. What happens here, and I'll just tell a personal story and move on, is that we spent a lot of time modeling business logic into our data models. That means we would create calculated measures in the data model itself, so that whether we were delivering a dashboard in Power BI or Tableau or Domo, in my experience, the answer should be the same. But when requests came through the support team, we would have to go back and QA some answers, and we all too often found that the analysts themselves were adding calculations into the data model in the dashboard tool itself. That is a data governance problem we had to solve: they didn't always realize that the answer was already there. They were creating a new calculation, not taking into account the weeks of data modeling we had already done for the specific rules behind the business and why things were calculated the way they were. So the obvious result was a lot of inaccurate measures propagating through our dashboards and our reports, simply because of the ad hoc requests coming in; analysts were delivering specific answers back without always realizing the bigger picture. So in my world, dashboards weren't able to keep up. The requests kept coming, and we didn't quite have enough members on the team to continue building more. And that is probably the answer I hear most from current ThoughtSpot customers when they first become a customer. I will often ask: I know that you have a very large analytics department, I know you have existing dashboards and legacy tools today. Why ThoughtSpot? Why another tool? And the answer is almost always the same: we don't have enough people to keep up.
Traditional processes just result in too many report requests. What we're looking for is true ad hoc analysis. We're looking for a way to move faster so that we can deliver data in a more agile manner. I've seen this coming through the chat a little bit; it looks very similar on your side as well. We are moving from that older waterfall methodology into a true agile approach in today's world. And so again, the answer is: we just can't keep up with our legacy dashboard tools. The business is moving too fast. All right, so once you've done all this amazing work creating a dashboard, how many of you are still struggling to get the business user to self-serve and answer their own questions? Give me a shout-out if you've felt this pain yourself. They just kind of keep hitting you up: well, can you just give me this quick answer, versus trying to figure it out themselves? Hey, it won't take you very long, can we just sneak in and do this one really quickly? You know, it's that personal favor of let me cut in line. But unfortunately, in my experience, that is where the answer would end up being wrong and someone would make a decision based on inaccurate data. And we had to put a stop to that because it was a huge problem. I'm seeing in the chat, "So true, I would cry." So you know this pain. Let's look a little bit at why this pain actually exists. Now, for 20 years, dashboards have served as the foundational element for business intelligence. They've helped leaders visualize and share valuable data across our organizations. And at their inception, they were great: the perfect vehicle for delivering key KPIs. They were great at kind of opening the data doors and getting you in; you didn't have to have a background in IT, it just made data much more accessible. Now, despite these 20 years of investment in dashboarding tools, the analytics adoption rate among business users has stagnated around 30% across the board.
And this is despite two decades of tremendous investment by organizations that use them to try to get business users to actually adopt these tools. Now, why is that? Well, the problem isn't inadequate training or time. As we can see here, we've had plenty of time, 20 years in fact, but we're still not moving the needle. Nor is it a lack of motivation from the business users; they want data, they're coming to you all for it. On the contrary, what's happening is that these traditional tools just really weren't built for them, quite frankly. According to HBR, around 84% of frontline business users report a poor experience with their analytics solutions and difficulty in accessing data and insights for themselves, and they also report needing better insight technology. What's more, even executives aren't satisfied with their current solutions: 67% of them admit to not being comfortable accessing or using data from their existing tools and resources. And this is because BI tools have always been intended to be used by you all, highly specialized data analysts. I mean, imagine being dropped into the cockpit of a Formula One car and being told, hey, all right, go ahead, race around a Grand Prix, when you've just barely gotten your driver's permit. It would be a complete disaster. You'd probably know where the gas pedal is, you'd know how the steering wheel works, but would you really know how to drive it? And that's what's happening. Expecting business users to engage with these highly complex and technical analytics tools poses the same challenges. It's a failure through no fault of their own, or your own; there just hasn't been a better solution. And in today's always-on, always-changing world, modern businesses can't afford to give their business users this kind of experience anymore. Yeah, Lisa, I'll jump back in for a second.
Yeah, what we're seeing is a complete, fundamental shift to a modern cloud analytics architecture, with the popularity of cloud data warehouses able to crunch through enormous amounts of data in less time. We're seeing fundamental shifts over to Redshift and Synapse and Snowflake and BigQuery, and these tools are able to separate out compute and storage costs. It feels like all of our customers have already moved over to this newer architecture. What they're looking for now is a way to deliver data to their end users differently, not always having to go through the same old legacy process, especially as they realize that the legacy dashboard tools can't perform even though the database can. So even with a live connection, data still has to be modeled into extracts, or it has to be aggregated, or they can't build enough hierarchies. Whatever the problem happens to be, we're seeing just a fundamental shift. So I'll go from here and talk through how ThoughtSpot can help fix this problem. And I'll do this checklist quickly before we move on. ThoughtSpot was built for a cloud-native architecture, using live query, that direct connection, to take advantage of everything you have modeled in that new cloud data warehouse or in your data lake, as an example. What we've done is completely shift away from the traditional dashboard approach into a self-service environment, and you're going to see that as I start to go through the demo now. Again, it uses live query, so anything you build and then give access to will be used in a live environment. I won't have time today, unfortunately, to go into the machine learning side of things, but I will focus on the AI-driven insights, being able to help uncover some of those hidden trends and anomalies that may exist in your data today. And I won't read through all of these; I think they'll come through.
But one of the questions that came in was about security. And of course, there are controls for personally identifying data: not exposing it in autocomplete, deciding who has access to see what, dropping out the names, right? And again, we'll go through this in a little more detail. Yeah. So this is where ThoughtSpot really is flipping that entire paradigm. ThoughtSpot is the modern cloud analytics platform. You want to think of this as a consumer-grade search interface for all of your data, for anybody, even your most non-technical business users, to be able to ask any data question and discover insights automatically. It is the fastest and easiest way to get your business users to understand how to answer their own questions. Everybody intuitively understands how to use a search bar, and that's what ThoughtSpot essentially gives you for your data, for your business users. And at the same time, it gives you flexible APIs so that you can do two things. One, you can build this into your existing ecosystem: the extensibility of the platform allows you to really integrate it within your cloud data environment, as well as push your insights into your favorite business applications or deliver them through our self-service platform. And two, the flexible APIs allow you to create interactive data apps with the search-and-AI experience through a low-code developer experience. So if you're trying to bring this out to partners or consumers or anyone outside of your firewall, you can easily integrate this consumer-grade experience. And as Nate touched upon, the architecture is completely different. This is really about giving a consumer-grade experience to all of your business users, using natural language search and AI to answer their own questions. It's unlimited insights into your data, because you're directly connected to it. You can drill down into the finest level of detail to get the exact answer that you need.
No remodeling, no re-aggregating your data: a single source of truth. And because we are a cloud-native architecture, that makes this all possible. You can take advantage of modern data speeds and volumes, and you never have to worry about optimizing connections or performance times, or dealing with any of that publishing-server middleware between you, your users, and your data. You curate a search experience across your data very quickly. And we have SearchIQ and SpotIQ. SearchIQ is the world's smartest search engine built for numbers; it can handle extremely complex queries over billions of rows, like Nate was touching upon, with enterprise-grade security and governance. And SpotIQ is the AI-driven engine that allows one-click auto-analysis and automates drill-downs to uncover anomalies and outliers hidden within your data. Lisa, if you don't mind, before you move on, I'm just going to answer a few questions that I know are going to come up during my demo. Absolutely. So as you're seeing this architecture, we are using live query. Of course we do have other alternatives, but something like 95% of our customers have moved over to a cloud data warehouse and live query because of the speed, the scale, and the complexity they're able to handle. So it's going to be live query. When we say value indexing: ThoughtSpot automatically indexes all of the data for search. Basically, this is built for self-service off of well-governed data, and I'll keep hitting on that point: it's not someone searching across all the data, it's about them searching exactly what you've given them access to. I'll focus mostly on search today; looking at the clock, I don't think I'll have time to go into much detail about SpotIQ. So Lisa, maybe we'll find a very quick four-minute YouTube clip that we can share in the chat so everybody can go back and see that as we move on. Sounds good. So why don't we jump right into it?
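As an aside on the value indexing Nate mentioned: one rough way to picture it is a one-time scan over the governed columns that builds a lookup from each value to the columns containing it, so a typed prefix can be autocompleted. This is a sketch of the concept only, not ThoughtSpot's implementation, and the table data is made up.

```python
# Hypothetical sketch of value indexing for search autocomplete: map each
# lowercase column value to the set of columns it appears in, then suggest
# values matching a typed prefix.

from collections import defaultdict

TABLE = {
    "product_type": ["Jackets", "Shoes", "Hats"],
    "region": ["South", "West", "East"],
}

def build_value_index(table):
    """Map each lowercase value to the set of columns containing it."""
    index = defaultdict(set)
    for column, values in table.items():
        for value in values:
            index[value.lower()].add(column)
    return index

def suggest(index, prefix):
    """Autocomplete: known values starting with the typed prefix."""
    prefix = prefix.lower()
    return sorted(v for v in index if v.startswith(prefix))

index = build_value_index(TABLE)
print(suggest(index, "jac"))  # ['jackets']
```

The governance point from the transcript maps onto this directly: you only index the columns a given user is entitled to see, so the suggestions themselves never leak restricted data.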
We do offer two versions of ThoughtSpot: there's Enterprise, and then there's Everywhere, which allows you to extend the search experience beyond your company walls and build extremely engaging data apps. So there are two flavors. So why don't we get right into it, Nate? Yeah, so there are a couple of questions, Lisa, that I'll let you help manage as I jump in, and a couple that I'll talk about here quickly, starting with the newest one first: "It seems to me there would have to be good metadata, including robust definitions of business terms." Yeah, I'm going to focus on that one in detail as I go through. Now, I'm going to move very quickly, so let me just go ahead and apologize ahead of time as I do this; I've got a lot to cover in about 15 minutes. I'm going to start with a blank screen. This is me searching for the very first time, and much like Google, what I'm going to do is just ask something that, I don't know, maybe is an ad hoc request. You can see that if I accidentally mistype or misspell, it's going to automatically correct it for me. In this case, yes, I'm starting out with the most simple, easy kind of keyword search to begin with: sales jackets monthly. It automatically built out a trend. At any point I can right-click and drill down, and I want to focus on this because there's no need for pre-built hierarchies; at any time you can drill anywhere, including the ability to go down to the underlying data. Now, this is a live environment. This one happens to be running on a Snowflake environment, so everything I keep doing from this point forward, again, is a live query. And you can see that when I clicked into the search bar, it actually gave me some recommendations. It actually knows who I am. It remembers my search history. It knows my data security rights for the row-level security that's been put into place.
And then it uses a bit of popularity: what are other people that interact with this data set also asking? And it starts to surface that to the top. Now I can of course keep refining this. Let's say I am responsible for the South region. I can see that, but notice it's actually using a keyword there, so instead of all of the regions you're seeing down here, what I actually wanna do is just say South versus West. And of course it will remember and retain just those two. So what you're actually seeing there is an analytical keyword. There are dozens, bordering on, I would say, 60 or 70 different keywords in the documentation that are kind of like accelerators. They are ways to make it easier for an end user to search. Top will automatically default to top 10. You could also say things like top 24, top 13, or bottom three, and it will understand that and sort automatically. Contains is great if you're used to a little bit more of a SQL-type statement, like jackets contains rain in the name, so that it only filters to products with rain in the name, et cetera. I could keep going through detail, but I'm gonna move on kind of quickly. That is the most basic, fundamental, easy search, just so that you can see I'm doing it from a search bar using keywords. What I'm gonna do is switch back over to the actual homepage where everyone would be onboarded. This is where you would land every single day. This one I call data search; this one we call content search, or searching across answers. These are metrics that can be tagged and followed. They will show and alert you as they're trending up or down. So think about your greens and your reds. There's a bit of popularity happening about what's trending. A pinboard is what we call a collection of views, or individual visualizations, kind of like you would think of a dashboard. Answers are one-off reports.
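The keyword behavior Nate describes (top defaulting to top 10, numeric limits, contains filters) can be sketched as a tiny parser. This is purely illustrative TypeScript, not ThoughtSpot's actual grammar; the real product also understands spelled-out numbers like "bottom three," which this sketch does not:

```typescript
// Hypothetical sketch of translating analytical keywords like "top 13" or
// "contains rain" into query clauses. The real keyword grammar lives in
// ThoughtSpot's documentation; this only illustrates the idea.
type Clause =
  | { kind: "limit"; direction: "top" | "bottom"; n: number }
  | { kind: "contains"; term: string };

function parseKeyword(input: string): Clause | null {
  const limit = input.match(/^(top|bottom)(?:\s+(\d+))?$/);
  if (limit) {
    // "top" on its own defaults to top 10, as described in the demo.
    return {
      kind: "limit",
      direction: limit[1] as "top" | "bottom",
      n: limit[2] ? parseInt(limit[2], 10) : 10,
    };
  }
  const contains = input.match(/^contains\s+(.+)$/);
  if (contains) return { kind: "contains", term: contains[1] };
  return null;
}
```

The point of keywords like these is that an end user gets limit and filter semantics without writing SQL.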
But what I'm gonna do now is use content search with something more like natural language. You know: what were sales by, I don't know, product type in 2020. What this allows an end user to do is search across all the content that has been built and reuse an answer rather than having to recreate it over again. You can see that when I select one, it gives some context that the analyst maybe has put in. This is just, you know, a natural-language-generated kind of descriptor. It's showing me the metrics and the attributes that are inside that particular answer. But what I'd like to point out is when I say "what were sales," it's matched that; product type, matched that. And then in this example, 2020: this answer has also kind of helped translate from "last year," so it knows some context. I can preview this. And yeah, maybe this is what I was looking for. I found it, let me go to that. It'll start by opening the pinboard it lives on, and then it will open this actual answer on its own. Again, this is a live query being sent to Snowflake right now and returned, so the data is always refreshed. It's always live. Okay, so here I am in my answer. Normally in a dashboard tool, yes, you can right-click, and you can drill down if there's a pre-built hierarchy; here, you can drill anywhere. But it's all about the next question. So we've got this new menu on the right called Explore. And what Explore allows me to do is, yes, I'm responsible for the southern region, so I can click on that and it adds a filter. If I'd like to add things, I can do that. I can search or I can just click on the hot button. Again, it remembers me. It knows that I do this demo often. So this one, again, is just using our second set of a little bit more complex visualizations.
But basically what this will allow me to do is to ask the next question and the next question, and continue to drill down to look at only the products inside that jackets category, or to add whatever else I want like I showed earlier. And it's gonna give me that answer side by side, and then again allow me to go down and look at the underlying transactions at any point. So what I've shown you already is the basics of searching on day one: through content that you've been given access to through your onboarding and your training, or by jumping over to the search bar itself to ask a specific question and get a very specific answer. Now, I asked a very simple question, and I know everyone's thinking, yeah, but it's not that easy. What about more complex questions? So I've started here with a similar thing, taken down to the daily granularity, because that's where ThoughtSpot actually started with Fortune 500 customers when we went to market seven years ago. We knew there were enough dashboard tools in the mid-market or departmental solutions, so we wanted to tackle the big customers that needed a better answer. What we needed to build was better speed, scale, and complexity in the tool, in the solution, that maybe your traditional dashboard tool just couldn't handle. So in this example, sales daily jackets, I'm gonna keep moving forward. I'm gonna add in a couple of forecasts in this example that happen to be coming from DataRobot or Databricks, or modeled wherever it happens to be. And let me go ahead and actually put these on the same trend line here so that we can see them all at the same time. What I'm doing is just simply grouping everything so they're on the same axis. And when I do this, what you'll actually see is that in a traditional world, a forecasting tool would have used, I'm gonna get rid of a couple here.
They would have used a historical trend and they would have just continued going along that seasonality. Of course, we all know that last March or April, what happened was COVID. So as COVID spiked, this whole first forecast got completely thrown out. It's trash. It means nothing anymore. So we're moving on with these other forecasts. Now, behind the scenes to drive these is COVID data. When I put this in, this happens to be coming from a live feed from Johns Hopkins to create this dataset. This query will actually take a couple of seconds, but that's because this is an enormously complex dataset being driven. And to show what that looks like behind the scenes, we have what we call the query visualizer. This is where I really wanna dive a little bit deeper, because we have so many analysts and data engineers behind the scenes today on this call. When I hit the little "i" information icon, it tells me how it got to the answer that it did. And I'm gonna go to the query visualizer. Now bear with me for a second while I zoom out. What is happening here is ThoughtSpot automatically generates a precise SQL statement, sends it off to Snowflake, returns the answer, and visualizes it in a matter of seconds. In this example, the further I zoom out, what you're actually looking at is five distinct fact tables being combined together automatically based off of the join rules that have been set in place. And you can see some are full outer joins, some are inner, some are left joins, so that the numbers are accurate. At any time, you can look at the SQL that's being generated, or you could switch over to your cloud data warehouse to see the SQL that was executed and returned. This is how you sometimes help to validate the answer. What that actually means is ThoughtSpot can handle incredibly complex schemas off of tens of billions of rows of data.
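The query visualizer behavior described here, assembling a SQL statement from declared join rules, can be sketched like this. The tables, columns, and join conditions below are invented for illustration; ThoughtSpot's actual SQL generation is far more sophisticated:

```typescript
// Sketch of generating SQL from declared join metadata, in the spirit of
// the query visualizer: join types (inner, left outer, full outer) are
// modeled as rules, and the statement is assembled automatically.
type JoinType = "INNER" | "LEFT OUTER" | "FULL OUTER";
type Join = { table: string; type: JoinType; on: string };

function buildQuery(base: string, select: string[], joins: Join[]): string {
  const joinSql = joins
    .map((j) => `${j.type} JOIN ${j.table} ON ${j.on}`)
    .join("\n");
  return `SELECT ${select.join(", ")}\nFROM ${base}\n${joinSql}`;
}

// Hypothetical tables echoing the demo: sales, a forecast, and COVID data.
const sql = buildQuery(
  "sales",
  ["sales.amount", "covid.cases"],
  [
    { table: "forecast", type: "LEFT OUTER", on: "sales.day = forecast.day" },
    { table: "covid", type: "FULL OUTER", on: "sales.day = covid.day" },
  ],
);
```

Because the join rules are metadata rather than hand-written SQL, the same search can fan out across many fact tables while keeping the numbers consistent.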
Walmart, for example, hit 70 terabytes of data just in their self-service environment a couple of years ago. Some of our other customers are getting even larger; Hulu, as an example, has an enormous dataset that they're using today. So what I wanna go into detail on next, because I can't keep up with the chat, is to show you how we do this. A worksheet is a logical middle layer that allows you to translate the data from the database: normal tables, fields, et cetera, that are joined together through primary and foreign key constraints. When we create a worksheet, this is now a UI that puts all of those attributes and measures together. It helps you to automatically hide the keys that you wouldn't want someone searching across, and it allows you to translate that data. So you can add as many descriptions as you want. You can add as many synonyms as you want. Like if I wanna allow the end user to just type "campaign," or to type "website" as one word when it's stored as two, they can search it either way. Or, I don't know, let me find another one. We love acronyms, right? So CPC would work in a search bar versus cost per click. And you can have as many of these as you want; they're comma-delimited. What you'll also see here is me modeling the data for end users. Again, this is where it becomes self-service off of well-governed data. Okay, with that said, I am going to switch over to the next side of the presentation. That was searching against data inside the enterprise. What we're also starting to see is more customers wanting to extend the data outside their walls to their suppliers, their customers, their vendors, et cetera. So that becomes an embedded environment. What ThoughtSpot allows you to do is use a developer playground to build and completely understand what you're going to deploy before you do it. So in this example, what I'll do is show that you can embed the actual search bar.
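The worksheet's comma-delimited synonyms amount to a lookup from end-user vocabulary to canonical column names. A minimal TypeScript sketch, with invented column names, assuming nothing about ThoughtSpot's internal model:

```typescript
// Sketch of a worksheet-style synonym layer: each canonical column carries
// a comma-delimited list of synonyms, and any of them resolves back to the
// canonical name at search time. Column names here are illustrative.
function buildSynonymMap(columns: Record<string, string>): Map<string, string> {
  // columns: canonical column name -> comma-delimited synonyms
  const map = new Map<string, string>();
  for (const [canonical, synonyms] of Object.entries(columns)) {
    map.set(canonical.toLowerCase(), canonical);
    for (const s of synonyms.split(",")) {
      map.set(s.trim().toLowerCase(), canonical);
    }
  }
  return map;
}

// Mirrors the demo: "CPC" works in the search bar versus "cost per click",
// and "web site" resolves the same as "website".
const synonymMap = buildSynonymMap({
  "Cost Per Click": "CPC",
  "Website": "web site, site",
});
```

This is the translation layer in miniature: the database keeps its names, and end users keep theirs.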
You can embed the entire dashboard or report if you wanted to. Again, we call them pinboards. That's a good transition if you're coming from a dashboard environment: you want to deliver that, but then enable that Explore functionality to ask the next question and the next question, and allow a non-technical end user to continue to interact with the data. You can embed the entire experience, or you can just use the API framework to build your own visualizations and just surface that data. I'll do search. When I hit "see it live," what it actually does is go over to the developer playground, and this shows me what the end user is going to see. So I'm going to move pretty quickly here, just in case maybe not everyone is wanting to embed. As I interact with the data on the left, you can see it's highlighting the JavaScript code down here. So if I hit run, it will automatically select this data set for the search bar. It already hid the data panel, so now people can just search. Or I can configure what we call an action. So as an example, in an embedded environment, I'll do action-dot, I don't know, let me do save. It's probably a good example. What this is actually going to do is, when I hit run in this environment and someone goes to search, I don't know, sales by product is a good one, and it's already auto-completing that, it has actually disabled the save button. So I can control absolutely every aspect of the environment, including something really interesting here: a hot button. This one says create segment. What this actually allows us to do is integrate or push data into a business application so that people can truly take action and do the next step after they've analyzed that data. So let's look at that. I'll show you what it looks like. This is a true embedded environment. This is ThoughtSpot behind the scenes driving it, but as you can see, it looks completely different.
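The playground flow Nate demonstrates, embedding the search bar, hiding the data panel, and disabling the save action, boils down to a configuration object that the generated JavaScript feeds to the embed. The field names below are hypothetical for illustration, not the actual Visual Embed SDK API:

```typescript
// Hypothetical sketch of the kind of configuration the developer playground
// generates: which component to embed, which data source to preselect, and
// which actions (like save) to disable. None of these field names are the
// real SDK's; they only model the behavior shown in the demo.
type EmbedKind = "search" | "pinboard" | "full-app";

interface EmbedConfig {
  kind: EmbedKind;
  dataSourceId?: string;
  hideDataPanel: boolean;
  disabledActions: string[];
}

function searchEmbedConfig(dataSourceId: string): EmbedConfig {
  return {
    kind: "search",
    dataSourceId,
    hideDataPanel: true, // end users see only the search bar
    disabledActions: ["save"], // mirrors the action-dot-save example
  };
}
```

The design point is that the host application, not the end user, decides which capabilities are exposed in the embedded experience.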
I'm going to focus on a business user here for a second. I don't know, I'll go straight to something. I'll go to web marketing and let's look at conversions. What it's going to do is open that collection of visualizations that have been given to the end user. And again, what I wanted to highlight here was that I can use this Explore feature even in an embedded environment, allowing true self-service off of that governed data. So if someone wants to only focus on social media, or if they want to add website as an example, it'll automatically continue to filter that down, answering that specific question. This is the type of question I would have gotten in my past, where someone just wanted to see something a little bit different. Or maybe we provided a complex dashboard that could have answered a broader set of questions, but the more complex dashboards become, the more confusing they become for an end user, right? So that means that adoption drops off. That's why, when we log in to look at the number of views by report or dashboard, they're really low, right? Gartner and all these other governing bodies keep reporting that user adoption is still hovering at a low 30% across the analytics environment. That's because of the dashboard conundrum that we've been discussing today. Now I'll show one more just kind of quickly. If I go over to the analyst role in this example, let's see, I'm just trying to look at the time real quickly. Sorry about that. Let's see what we have access to see.

We're running a little bit short, Nate, and there's lots of questions. And let's go ahead and just end it there. I was just gonna show how a custom action can actually push data off to Marketo or to another tool, but rather than seeing that, maybe we'll just put in a video. So let me go ahead and stop sharing. I know that I went really fast. So Lisa, are there any questions that I need to answer? Or you go ahead and I'll read through. Tons of questions.
Why don't we jump right into them. The biggest one that keeps coming up is billions of rows of data, and then the complex tables and modeling the data. So why don't you dive right into that one?

Yeah, so we have this concept called zero to search. It's something, again, I'll try and find the video and post it really quickly; you can also look it up on our YouTube channel. Basically, when you go through the UI, you would connect to your data warehouse. Let's say it's Snowflake in my example again. I would start to select the tables and the individual fields that I want to enable self-service on. It could be a database view, or it could be a collection of tables; maybe those are already modeled together into a schema. I select those. Then they show up in the ThoughtSpot UI, where I can create that worksheet. So that's where I do the translation from how data is stored in the database versus how an end user may ask for it. What language are they going to use? Do they use different business terms? Do I want to have really intelligent descriptions for the hover-over? And so that's kind of what I showed in the worksheet. Even though you see a collection of many-to-many joins and the multiple fact tables that I showed for complexity, the number of rows behind those tables can be approaching the tens of terabytes, the billions of rows. ThoughtSpot handles that extremely well. We've been able to scale to numbers that other dashboard tools just cannot touch at all. And that's probably why we're seeing a higher number of customers moving over to a live query environment, doing self-service in ThoughtSpot.

All right, then. I'm going to thank both of you for this great presentation and jump in here on behalf of the attendees. There have been so many great questions coming in, so much engagement. Just love it, y'all.
And just to answer the most commonly asked question: a reminder that I will send a follow-up email by end of day Thursday for this webinar with links to the slides and links to the recording for you. So diving in here: what must the metadata quality be like to get such a result with ThoughtSpot?

Yeah, it comes down to integration with all the other tools that are commonplace in a modern analytics tool set. So we integrate with Alation and other tools through our API framework for data governance. Many of the cloud data warehouses already have data quality either baked in or integrated at that level. So if someone is modeling, cleansing, and governing data in their cloud data warehouse, ThoughtSpot inherits all of that and would simply become the end user's viewer tool for that data. One of the comments that I saw is, well, we have 3,500 tables. Yeah, that's commonplace. What would happen is you select the tables that you would want an end user to search. You would have potentially, I don't know, a dozen or more worksheets, and a worksheet is a collection of tables. So think about that like a use case. No one would ever just allow broad access to search across the entire data warehouse. You would pick and choose your battles: which data you're giving to which department first. We call those use cases. That's always a best practice. That's what you're actually seeing on the screen today. Schneider Electric started with human resources. They had, I think it was, like 3,000 Tableau users, and their user adoption was just abysmal. So what they did was put those 3,000 users into ThoughtSpot, and their adoption has just skyrocketed now that people can ask and answer their own questions. Again, Hulu has 400 billion rows. Their number of tables, and the complexity behind the algorithms that show you the next relevant show you might like, is incredibly complex behind the scenes. But again, ThoughtSpot's able to handle that scale.
Again, I'm not gonna read all of these off the screen, but you can see them. I love it. So, how would this interact with or work alongside a data catalog?

Yeah, so like Alation or any other tool: again, through our API framework, we work with all these other enterprise-level tools so that you can see what data has been governed. A lot of times we would call that a golden record, or certified data. That certified data is what gets connected to and surfaced, or made available, for the end user to actually use. So you would maybe rarely go to new tables or new data that hasn't been governed quite yet, because, again, the complexities of the data and the business rules maybe are not fully baked yet. So best practice would be to go off that governed data.

I think we have time to slip in at least one more question here. Can you import descriptions and synonyms from the data governance tool into the worksheet?

Yes, so that's part of the API. That's something that we're actually pushing harder on in our next release, to pull everything through. So as an example, when I connect to Snowflake today, it will inherit the joins, or primary and foreign key constraints, that have been set, so you don't have to manage that twice. It will inherit all of the row-level security rules that have been baked in already, so you don't have to manage that twice. The next step is us taking it forward one more step with the data governance tools, so that any notes left behind, any descriptors, anything helpful that you've already built once and don't wanna have to build again, we will simply inherit and pass right through to the end user. And I know we're at time here, Shannon. There were a couple of questions on things here, and there are some great resources. This was the CliffsNotes version of Dashboards Are Dead. If you'd like to dive in a bit more, you can. We definitely have some kits. I saw a lot of people saying, you know, the UX looks great, the UI looks great.
And then you try and do it yourself and it's not as easy. We say challenge us, go for it. Please try it free for yourself. Set up a zero to search workshop. We have helped countless customers log in, connect to their database in less than 90 minutes, and curate a search experience against it. But don't take our word for it. Give us a challenge. Let us see how much data you have and put us to the test. I'll leave everyone with one thought, because I know we have to hang up: think about how much more value you could deliver if you're managing data and creating worksheets without having to answer every little report request that you get. If self-service can truly allow any user to ask and answer their own questions, you can focus on some of the more important questions I'm seeing over here in the chat, like working with the databases and working on the data sets. That's where you can truly provide value and really move the needle in your organization today. So like Lisa said, give it a try for yourself. You'll be blown away that it is as easy as I'm showing today. Do it with your own data.

Well, Lisa and Nate, thank you so much for this fantastic presentation. It's really been very good, and I'm so impressed with how you've been able to keep up with the chat. Not everybody can do that. That's awesome. And thanks to our attendees for being so engaged and for all the great questions. I'm afraid that is all the time we have for today. Again, just a reminder, I will send a follow-up email to everybody by end of day Thursday with links to the slides and links to the recording of this session, and I can get you all the resources they have provided here in that follow-up email as well. So thank you, everybody. Thank you, Lisa. Thank you, Nate. Thanks to ThoughtSpot for sponsoring today's webinar. Hope you all have a great day. Thanks, everybody. Thank you so much.