Hello and welcome. My name is Shannon Kemp and I'm the Chief Digital Officer of DataVersity. We'd like to thank you for joining this DataVersity webinar, Beyond the Hype: The Real Impact of AI on Business Intelligence, sponsored today by Metric Insights. Just a couple of points to get us started. Due to the large number of people attending these sessions, you will be muted during the webinar. We'll be collecting questions through the Q&A panel, and if you'd like to chat with us or with each other, we certainly encourage you to do so; just note that Zoom defaults the chat to send to the panelists only, but you can change it to everyone if you'd like to network. To find the Q&A or chat panels, click the icons in the bottom middle of your screen. And as always, we will send a follow-up email within two business days containing links to the slides, the recording of the session, and any additional information requested throughout the webinar.

Now let me introduce our speakers for today, Marius Moscovici and Mike Smithman. Marius is the CEO of Metric Insights and has over 20 years of experience in analytics and data warehousing, with roles at Oracle, Integral Results, and Linden Lab. He was co-founder and CEO of Integral Results, a leading BI consultancy that was acquired by IDEA Integration, and he also formed and led the analytics group at Linden Lab. Mike is the VP of Sales and Marketing at Metric Insights. He has over 15 years of product and marketing experience in the business intelligence industry and helped bring analytic products to market in senior roles at Seagate Software, AIM Technology, Tealeaf, Accero, and GoodData. And with that, I'll give the floor to Marius and Mike to get today's webinar started.

Thank you, Shannon. Hello, everyone, this is Marius, and thank you so much for joining us today for Beyond the Hype: The Real Impact of AI on BI. Before we start, I'm going to do something unusual, because there's already a question in the Q&A that relates to it. Stephen McDougal asked: how long do you think this trend will continue before people are no longer interested, become disillusioned, and funding runs out? I'm not going to answer that immediately, but it points to the spectrum of reactions to AI. You may be someone who is, perhaps rightfully, a little cynical about what's going on with AI and all the hype around it. Or you may be an AI accelerationist, someone watching with bated breath every day for new developments, who cannot wait for the next release and feels like it's going to change the world very, very quickly. Whether you're at either end or somewhere in the middle, there's no denying that there's a huge amount of hype out there. And if you're familiar with the Gartner Hype Cycle, I think it's hard to argue against the fact that we are at the Peak of Inflated Expectations stage. Those of you who know the Hype Cycle also know that what follows is the Trough of Disillusionment.
So the purpose of this conversation and this presentation is to cut through that noise a bit and focus on the real impact of AI: how can you navigate this in a way that neither lands you in the Trough of Disillusionment nor inflates your expectations, but lets you approach the impact of AI on BI in a sober, clear-eyed way that actually generates value? That's what we want to focus on in this presentation.

So with that, let's talk about BI in general and what the promise of AI is as it relates to BI. At its core, it's all about an unfulfilled promise that has been around business intelligence for as long as I can remember, since I started in BI 25 years ago: the promise of self-service. I can remember when the very first generation of business intelligence tools came out, Brio and BusinessObjects and those technologies, and they promised to completely revolutionize how people consume data and make everything self-service. They didn't really fulfill that promise. Then the next generation came along, more modern, easier-to-use technologies like Tableau, Looker, and Power BI, and the same kinds of promises came with them, all around delivering self-service. That has always been the promise, and yet it has generally fallen short. You don't see many organizations where business users self-serve to a large extent and no longer need help from their analysts.

So let's look a little at why that is, and to do that, let's look at the journey a BI user goes through as they try to get an answer to their question. I would posit that it's very much akin to climbing a mountain. Think of Everest or any other challenging mountain you might climb. If you're familiar with the process, you know you don't just charge up the mountain. Instead, there is a set of base camps you try to reach, and each base camp sets the stage for the climb to the next one. Often there are two, three, or four base camps you have to pass through before you make the ascent to the peak.

It's very much the same when a casual user in your organization tries to answer a question with BI. Their journey starts with a question, and they need to figure out what information asset, what report or analytic, is out there that might be useful in answering it. That's finding the right report. This might seem trivial to a BI analyst who is very familiar with the most relevant reports, or to somebody who has been in the organization a long time and knows where the go-to reports are. But put yourself in the shoes of a casual business user who is new to the organization, or new to a particular functional area within it, and it can be pretty daunting to know what the right report is.
Which of these ten reports that look at customer revenue is the one I should actually go to in order to answer this particular question? Which one is something that was created two years ago and is now obsolete, something I shouldn't touch, versus the one that actually has the right information for me? Discovering the right report: that's the first base camp you've got to reach as a user.

Then, once you've identified the right report, you've really got to understand what the report is telling you. Okay, I have a question about our sales and I found a useful sales report. Well, is it measuring sales in a way that I understand? Is it gross? Is it net? Does it include things sold through channels? All the different nuances associated with something as simple as sales, or revenue, or churn, or anything else you might measure in a business: in a large organization these things can very quickly become complex and nuanced, and you've got to understand what the data means. You need that data literacy to interpret the numbers correctly; otherwise, you can very easily draw the wrong conclusion, even when you're looking at the right report. So the second base camp you've got to reach is having good data literacy: understanding what the report is telling you, what the key KPIs in it are, how they're measured, and how they're defined.

Then, if you successfully get through that, you can really step in and do the analysis. You use that report, that visualization, to analyze the data, to dive in, and hopefully through that process you answer your question.

Now, think about this for a moment from the perspective of a business user going through this process. It's easy to realize that if at any step in this journey I, as a business user, fail to reach one of these base camps, or have difficulty, what am I going to do? I'm going to turn to a Sherpa, right? I'm going to look for the Sherpa who will guide me through the process, and that Sherpa is your analyst. The default, the fallback plan, is picking up the phone or sending an email or a Slack or Microsoft Teams message to an analyst: hey, Joe, I'm trying to answer this question, can you help me? And if it's a human analyst every time, that essentially unravels the entire self-service paradigm.

So the hope, I think the promise, of AI is to ask: can AI stand in for that analyst on this journey? Maybe not in all cases, but in a very significant percentage of cases, can it be the guide that gets the user through the necessary base camps to the summit, so that they don't need to bring in a human being? Then those analysts, who are a very scarce resource in your organization, can spend their time not answering queries about what a number means or how to get to a report, but doing the deep-dive analysis on how to move the needle on the fundamentals of the business. So with that in mind, let's look at how AI can be that virtual Sherpa at each stage: what's happening today in that regard, what works, and where the challenges are.
Probably the best place to start is with the BI tools themselves. Here I would include the Tableaus, Power BIs, and MicroStrategies, but also ThoughtSpot and Tellius and any of these tools that do the data analysis: how are they integrating AI into the experience? If you look at Explain Data from Tableau, or Data Stories, or Fabric and Copilot from Power BI, across all these different technologies they really do one of two things.

One is data storytelling, an extension of the old NLP, generation-one AI capability, where instead of looking at a chart you get a narrative describing what's going on with the data: here's the key trend line, here are the key anomalies, here's what's driving those changes. So you get a verbal narrative explaining what's behind the data, the high-level trends and the drivers. That's one area of AI.

The other solution space says: instead of a pre-built dashboard where I need to go change a bunch of filters to see values, just ask the question, and the tool will give you a visualization, or a set of visualizations, specifically tailored to the question you want to answer. This is something ThoughtSpot, and to some extent Tellius, originally innovated, but it's now present in a tool like Copilot, and in Tableau as well, where with no pre-existing visualizations, just a data model, you can ask a question and have the visualizations automatically generated for you.

We won't spend a huge amount of time on these. To some degree they're successful; to some degree, when you get into nuanced and complex data models, they fall short. Your mileage will vary: you've got to look at your particular use cases, at how nuanced the underlying metadata is, and at how well these solutions can really address those cases. But they can work effectively to help guide the user through the last, analyze stage of the process.

Of course, that is Camp 3. It assumes that I as a user already know which data set to go to, which data model or Tableau dashboard, or where in Power BI to find this information, and these capabilities are no use to me if I cannot get through base camps 1 and 2, as we talked about a moment ago. So how do I get through those other base camps? That's where a business intelligence portal comes into play, and that's what Metric Insights provides. So let's talk a little about what a business intelligence portal is. How does it get you to Camp 3, to the point where you're looking at the right analytic with the right context? Then we'll give you an example of how that actually works in practice and talk about how AI enables that part of the journey as well.

If you think about the landscape, the challenge space: on the one hand you've got your BI tools, the applications that are out there, and some data sources that may have KPIs.
You've got your data catalog sitting off to the side. And of course there's Active Directory, the AD groups integrated into your access control system within the enterprise. A BI catalog connects to all of the tools that you have, all your BI tools and applications, as well as the data with high-level KPIs that you might want to see alongside the BI tools. Then, on top of that connection, it brings in the content that is most meaningful for users. Your Tableau or Power BI environment might have thousands of reports, many of which are obsolete and some of which are redundant, but there is a core set of reporting assets in there that is incredibly useful. So you bubble those up: you have a publishing and certification workflow to ensure that content gets pushed out, so that users can separate the wheat from the chaff and know which useful content they should be accessing.

Oftentimes you also integrate with your data catalog so you can bring in things like glossary terms and lineage, the things that provide context to the information. Remember that key question: what does this dashboard tell me? What does this KPI really mean? That kind of information is often stored in your data catalog, and you bring it in and unite it with the analytics. Then, of course, you publish that for users to consume, and you govern it with your existing AD or LDAP system, so you don't have to reproduce any governance: the same security model that governs access within your existing business intelligence tools is used to govern access within the catalog as well. And then it's available for users to search and find content, whether that's on the desktop through a web browser, through a Slack or Teams application, or via email on their mobile phone, however they want to consume it. So there's a presentation layer on top.

The catalog, within this BI portal ecosystem, is intended to serve the first two base camps of the journey: to help users find the content they care about, and then to help them understand what that content means. What are the key glossary terms? What's the lineage? What gives me the assurance that I'm looking at the right report and that I understand what it's telling me, so that I can now go do the analysis to answer my question?
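To make the shape of that architecture concrete, here is a minimal Python sketch of the pattern just described: one connector per BI tool, a single aggregated catalog, and visibility governed by the user's existing AD/LDAP groups. Every name in it (Asset, Connector, Catalog, allowed_groups) is a hypothetical illustration, not Metric Insights' actual API.

```python
# Hypothetical sketch of a BI portal's connector-and-governance pattern.
from dataclasses import dataclass
from typing import Protocol

@dataclass
class Asset:
    name: str
    source: str               # e.g. "Tableau", "Power BI", "Qlik"
    url: str
    allowed_groups: set[str]  # mirrored from the BI tool's own security model

class Connector(Protocol):
    """One of these per BI tool, application, or file share."""
    def fetch_published_assets(self) -> list[Asset]: ...

class Catalog:
    def __init__(self, connectors: list[Connector]):
        self.connectors = connectors

    def assets_for(self, user_groups: set[str]) -> list[Asset]:
        # Aggregate every source, then show only what the user's existing
        # AD/LDAP groups already entitle them to: no second security model.
        visible = []
        for connector in self.connectors:
            for asset in connector.fetch_published_assets():
                if asset.allowed_groups & user_groups:
                    visible.append(asset)
        return visible
```

The design point is that last check: the portal never invents its own entitlements, it simply mirrors the security the BI tools already enforce.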
So I'm going to turn this over to Mike now. He's going to give you an example of what I just described in practice and talk a little bit about how AI can assist in that process.

Thanks, Marius. Hey, everyone. Let me jump across to Metric Insights here and give you the baseline for what Marius was just talking about; then, as he said, I'll layer on how we're thinking about AI and how it can help in this discovery process so we can get to that third camp Marius mentioned. To start, for those that haven't seen it, let's show an example of what a business intelligence catalog, or portal, might look like.

What you're seeing here on the screen is an example catalog where we're connected, as is not uncommon in a lot of organizations, to a lot of different sources of BI. In a big company that's potentially multiple BI platforms, but it's also things like spreadsheets and documents, and reporting and operational applications. For this discovery process to work, whichever portal we're putting in place, we need to make sure we can connect to every source of information, every source of BI in the organization, to start surfacing the correct content to users. So in this example, you'll see I've got content coming from Tableau, Power BI, and Qlik; I've got some spreadsheets down the bottom and some files. Each of these tiles is an asset that we have published into the catalog so that users can get access to them, knowing they're the blessed version of a particular report that they should be using. And obviously, as Marius said, this is tied into security and permissions, so any catalog should only show the content that a user should have access to at any point in time.

Let's drill into these tiles in a little more detail, because this becomes critical when we start talking about how AI helps in this process. If we take something like this sales analysis tile and I click on it: when we publish a piece of content, what we want to do is make sure we're publishing it with the necessary metadata that gives a user the understanding, the literacy, around what this particular report is. So in this case, we've got a sales analysis workbook from Tableau, and we're picking up a preview of what that Tableau workbook looks like right now. But on the right-hand side, we've got a lot of metadata that can help the user understand what this particular report is. Things like glossary terms: this report contains a number of metrics, and if I click on those, we can see definitions and calculations of what those metrics actually are, and perhaps ownership of those definitions, who's responsible for them. So that glossary Marius spoke about, we may pull in from our data catalog, wherever it's being defined. We may be tagging the dashboard with certain tags so that it's easy to find. We may have ownership on the dashboard: who do I reach out to if I have questions about this? These are all just examples. Obviously descriptions, and any other classification that might help me understand what this report is and how I should be using it. So we may have attributes coming in from our Collibra or Alation platforms that say it's for internal use only, or it contains PII data, or it's related to a particular business unit. The idea, when we publish a particular asset, is that as a user I can understand what this is and what it means, and if I drill into it, I've got the context to go and analyze that report. So now I'm in the live Tableau dashboard, I can do my analysis; I found what I was looking for and I understand it.

Oftentimes when we publish this content, and we won't get into publishing processes today, but as part of a publishing process, content will typically be certified, so a user knows it's a report they can trust, perhaps who's responsible for that certification, and when the report was last certified. So again, we're trying to instill this level of trust into the content.
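As a rough illustration of that metadata payload, here is a hedged sketch of what might travel with a published asset: glossary terms, tags, ownership, classifications pulled from a Collibra- or Alation-style catalog, and certification details. The field names are assumptions for illustration, not the product's actual schema.

```python
# Hypothetical shape of the metadata published alongside an asset.
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class GlossaryTerm:
    name: str        # e.g. "Net Sales"
    definition: str  # the agreed calculation, in plain language
    owner: str       # who is responsible for this definition

@dataclass
class PublishedAsset:
    title: str                   # e.g. "Sales Analysis"
    source: str                  # e.g. "Tableau"
    description: str
    owner: str                   # who to reach out to with questions
    tags: list[str] = field(default_factory=list)
    glossary_terms: list[GlossaryTerm] = field(default_factory=list)
    classifications: list[str] = field(default_factory=list)  # e.g. "PII"
    certified: bool = False
    certified_by: Optional[str] = None
    certified_on: Optional[date] = None
```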
So the user knows that this sales analysis is the one they should be using. Another promise that has been made for many years is having not only the definition of what's in a dashboard, but also an understanding of where the data that feeds this particular analysis is coming from. So having access to lineage at the point of consumption is important. In this case, our sales analysis dashboard is being populated with data coming from a Snowflake table at the top there, and it's actually being used out in a bunch of email distributions, as an example here. Again, we want to make that available to the user so they have some knowledge of where the data that defines the analysis they're looking at is coming from. So metadata, certification, and lineage are all important in this process.

So we publish our content, and we organize it on the left-hand side here in a way that makes sense for your business. One of the challenges today when we're looking to discover BI in the organization is, A, I've got to go to multiple tools, and B, I have to understand the organizational paradigm in each of those tools. Is it searching through sites and projects in Tableau? Is it looking through applications in Qlik? Is it looking through folders on my SharePoint site to find a particular report? Each has a different organizational paradigm, so providing a single navigational structure for your users is important too, so they can browse through the content they have access to.

Then finally, and this will transition us into the discussion around AI in a minute, we want to give them a way of easily searching through this content. Search today in a lot of portals, as it is in Metric Insights, is kind of a Google-style experience. So if I go and search for something, say I want to see everything related to sales analysis, when I run a search like that, it's doing some analysis of my search term to understand that I'm looking for sales information, and it's going out and crawling through the metadata we've captured around each of these assets to bring me back a ranked set of results that will hopefully deliver the reports I'm interested in. And similar to a Google-style experience, when I run searches I'll often get pages and pages of results; most organizations have thousands of assets out there that users potentially have access to, and you're relying on the algorithms behind the scenes to understand what you're looking for, look through that metadata, and bubble the most relevant suggestions up to the top. So it looks in the title, the description, and the tags for 'sales' and hopefully bubbles up what I'm interested in. And obviously I can go and start to filter that information by the different fields we were looking at before, so I can further refine my search, but it relies on me to drill through that content and decide what it is I want to focus on and look at.
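For intuition, the Google-style search just described might be approximated by the toy scoring sketch below, reusing the hypothetical PublishedAsset record from the earlier sketch: weight keyword hits across title, description, and tags, and return a ranked list. Real portal search is far more sophisticated; this only illustrates the metadata-crawling idea.

```python
# Toy ranked search over published-asset metadata (illustrative only).
def score(asset: PublishedAsset, terms: list[str]) -> int:
    fields = [
        (asset.title.lower(), 3),             # title matches weigh most
        (asset.description.lower(), 2),
        (" ".join(asset.tags).lower(), 1),
    ]
    return sum(w for t in terms for text, w in fields if t in text)

def search(assets: list[PublishedAsset], query: str) -> list[PublishedAsset]:
    terms = query.lower().split()             # e.g. "sales analysis"
    hits = [a for a in assets if score(a, terms) > 0]
    return sorted(hits, key=lambda a: score(a, terms), reverse=True)
```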
So that was a quick tour through the basics of what a BI catalog might be trying to achieve. And it's really this search paradigm where we see an opportunity for AI and the LLMs that are out there to have an impact on this process and really help users get to content quicker. So let's look at what we're talking about there. I'm going to tee this up with some slides, and then we'll drill into another example.

So how does AI enhance that portal experience, that catalog experience, today? As I just mentioned, the search paradigm is very Google-ish today. When I run a search, I can expect to get a lot of results; I'm probably going to focus on what's on the first page and see if there's anything there that will help me answer my questions. If I don't quickly come across an answer, I'm going to pick up the phone, as Marius touched on before, and reach out to an analyst. But oftentimes I will get some useful suggestions from the search if I know there's ultimately something out there that will help me answer my question. So search turns out to be quite a good tool for finding assets that I know, or suspect, already exist. There's that report I looked at a while ago; I can't quite remember what it was called or where it was, but if I run a search, I'll typically find it. What search isn't great at is making recommendations and suggestions about what I should be using to answer a particular business question. And that's where these AI engines can start to help us.

We've developed this concept of a concierge in our platform, and obviously it's analogous to a hotel concierge. If I'm traveling to a new city or country, I've got two options. I can get online or look at guidebooks, figure out where I want to go, what I want to see, and where I want to eat, and set myself an itinerary based on the best knowledge my searches give me. Or I can turn up at the hotel and talk to a concierge who is local to the area, probably has the most up-to-date knowledge of what's going on, and, through a back-and-forth conversation, can make suggestions and recommendations and probably create a more compelling itinerary for my time there. We want to bring that concierge-type experience to the search we were just looking at.

So how does this work? There are some key points to that concierge experience. One: rather than running one-time searches against the catalog, we want it to have more of a conversational aspect. Very often, if I'm dealing with an analyst, rather than looking for content myself I'll pick up the phone and ask a question, we'll go back and forth with some clarifying comments or statements, and eventually we'll come to a recommendation. We want to bring that kind of back-and-forth conversation to search. Two: it obviously needs to leverage all the metadata and security we spoke about before. And three: it needs to be accessible anywhere I'm working. We looked at search in the Metric Insights catalog, but if I'm on my phone, or working in Slack or Teams, I should be able to have this experience there as well.

So let's jump back to the demo and look at how this might materialize; let me get the Zoom window out of the way here. As you'd imagine, the first step is our initial question, and I'm going to look at this from two perspectives.
One is more that of the business user, but you'll also see that analysts themselves can benefit from engaging with this experience as well. So let's take a very simple example to start. Maybe I'm a business user, I'm new to the organization, I'm on the marketing team, and I want to understand the behavior of our website visitors. Today, if I ran a search for 'website visitors' in the old search paradigm, it would probably bring back a few pages of reports that I would have to drill through to figure out which one is going to show me website visitors and which one will be useful. What the LLM, the concierge, is very good at is inferring the intent behind my question: getting to the bottom of what 'website visitors' and 'behavior' ultimately mean, and making a much more targeted recommendation about which of all the potential website reports we have might help me the most. So what it returns is by no means all the marketing website reports in the system; it's making an educated suggestion about what I might be interested in. On the left-hand side it's actually made one suggestion here, and I've got access to the description and the metadata. On the right-hand side it creates this breadcrumb trail, as you'll see, of suggestions as we go through this, and I can preview those and drill into the content if it's a report I want to look at. And I can continue the conversation: what other reports do we have for the website? As we said, there could be lots of them, so I can ultimately get to those if that's something I'm interested in. But the point in this first example is that by asking a more targeted question, the concierge is very good at making a few recommendations, rather than throwing everything at me and hoping I can determine which one I should be using.

Let's take another example. I mentioned there's value to the analyst in this experience as well. We looked at lineage in the demo before, that tree diagram showing where data was coming from. Rather than doing that on a report-by-report basis, say I'm new to the organization, I'm responsible for creating marketing content, and I want to know which reports have been built off of our sales data mart. I can ask a question about the lineage there, and the concierge will go and pull back that information. In this case it's bringing me back all reports related to marketing; it's created my breadcrumb trail over here with my thumbnails and suggested the top reports that are based on the sales data mart. And I can continue that conversation and ask additional questions, as we said before. So I see Andrew here has created a lot of this content. Well, I know Andrew, I'm going to engage with him. What other reports has Andrew created? I know he's key to the business, so maybe I want to understand what other content is out there that he's created. Maybe I want to know which of those are certified; we mentioned the certification process earlier, so I can run a follow-up question there. Apologies for the banging in the background, if anyone is hearing that. And I can see which of that selection is certified and see the certification levels, or ask questions about the data itself.
Do any of these contain PII data, sensitive data? Ultimately what I'm trying to do with this conversation is get to a particular recommendation I was interested in. So I've finally got to this quarterly financial report that does contain confidential information, that was created by Andrew, and that's based on our sales data mart. You can see that rather than running a search and getting to the bottom of it myself, I can ask these questions, have this conversation, and ultimately get to a recommendation.

Let me show you one more example. Let's look for some customer journey reports. So I'll start a new thread here asking the concierge to recommend some customer journey reports. Again, it's going out to the metadata and looking at these reports, and it's also taking into account things like the popularity of reports when it makes these recommendations, ensuring the suggestions it makes are probably the most relevant to what I'm interested in. So it's come back with four reports here. Again, I can follow up: are these certified? Do they contain any confidential information? Those sorts of follow-up questions.

I can also start to ask questions about what I can do with this content. Perhaps: how do I subscribe to one of these reports? So I've gone beyond searching for content, and now I'm asking how I can do something with it. I want to receive one of these reports in my email; can I do that, and how do I go about it? It's important to point out that the things I'm asking about here are functional capabilities that exist within our portal today. I can go out to a report and subscribe to it. We can create what you're seeing here, the idea of a burst, which is a set of content that I want to send to email or Slack or wherever it might be. These are functional things that can happen in the product today, but they require knowledge on the end user's part: how do I take a set of reports, add them to a burst, set a schedule, define where I want to send it, and make sure the content is updated before I send it? That requires product knowledge. What we want to do is bring that into this concierge experience, so we can start to direct users through that process. The first step is exposing some of the help information you're seeing here: it gives me a description of how I can go about creating a burst, and I can ask follow-up questions as well, like: can I set my own schedule for a burst? So I can keep the conversation going, start to learn how I might do this, and get some direction. It's finding content, but then it's also driving more actionable functionality around that content. Today we do that by indexing our help documentation and allowing users to ask questions against it. I'm going to hand it back to Marius now, and he's going to talk a little bit about how we can start to use the concierge to actually carry out some of these actions. So throw it back to you, Marius.
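In skeletal form, the concierge loop demonstrated above might look like the following sketch, building on the hypothetical search and PublishedAsset sketches earlier: keep the conversation history, narrow the candidate set with metadata search, and let the model recommend, ask a clarifying question, or answer a how-to from indexed help snippets. Here call_llm is a stand-in for whichever model is connected; none of this is Metric Insights' actual implementation.

```python
# Hypothetical concierge turn: retrieval over metadata plus an LLM reply.
def call_llm(prompt: str) -> str:
    raise NotImplementedError("plug in your LLM of choice here")

def concierge_turn(history: list[str], question: str,
                   assets: list[PublishedAsset], help_docs: list[str]) -> str:
    # Narrow to a handful of candidates first, so the model reasons over
    # five assets rather than thousands.
    candidates = search(assets, question)[:5]
    context = "\n".join(
        f"- {a.title} ({a.source}): {a.description}" for a in candidates
    )
    prompt = (
        "You are a BI concierge. Recommend the most relevant report(s), "
        "answer how-to questions from the help snippets, or ask one "
        "clarifying question if the request is ambiguous.\n"
        f"Conversation so far:\n{chr(10).join(history)}\n"
        f"Candidate reports:\n{context}\n"
        f"Help snippets:\n{chr(10).join(help_docs[:3])}\n"
        f"User: {question}"
    )
    answer = call_llm(prompt)
    history += [f"User: {question}", f"Concierge: {answer}"]
    return answer
```

Instructing the model to ask a clarifying question when it is unsure, rather than guess, is one simple way to keep the recommendations grounded.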
Thank you, Mike. So we've talked about what's possible today as far as assisting in the journey: the ability to help you discover content and to help you ask questions about how to do things, facilitating a process where otherwise you would send an email or a Slack message, or have to go searching for that information yourself. So what are the next steps?

Well, you can imagine that as you're working through this, if you've got a particular result, you might want to say: hey, subscribe me to this content. Up here on the right-hand side I clicked a button to do exactly that. If you do, the concierge might ask: do you want to include this report in your weekly reporting digest, which you receive every Friday at 8 a.m.? It sees that you already have a distribution. Then you might say: no, I want this report sent to me through Slack on Monday mornings at 9 a.m. And if you do that, it goes in and says, okay, it will create a new distribution that goes out to you, includes this particular report, and arrives in Slack every Monday at 9 a.m. Now, if you know your way around the tool and the BI portal, you'd be able to configure this yourself. But perhaps you don't; perhaps you've come in here for the first time, or you don't want to learn a complex interface just to handle a common task like sending this report through this mechanism at a particular time. Being able to do that through an AI assistant, you can see how useful that is, and again, it makes it possible for somebody to self-serve in a way that, without this AI, would be very unlikely to happen.

Another example in this action menu: you might be looking at this sales analysis report and want to share it with another user. It's an interesting report; maybe I want someone else to see it. The chatbot asks: who do you want to share this with, and what's the message you want to include? And I can say, okay, I'm going to send this to Mike Smith, and let's ask him to check the sales numbers against the latest forecast. The concierge should then take care of that action for you: email the sales analysis to that user with a message saying, hey, John Frank asked you to check this information, so it lands right in their inbox. Again, that's facilitating something that would otherwise require me to copy and paste, grab the link, and create an email, and just making it much easier to do.

Another very common use case: you might discover a particular report and say, I find this interesting, but I really don't want to come back and check it all the time. I want to set up an alert so I receive a notification whenever the data changes here. I don't want to look at this dashboard every day; I want to look at it when the data changes in some material way. So I might say: set up an alert. In this case the concierge asks: what is the alert condition you want to set? And I can say: alert me if sales in Canada drop unexpectedly. That's what I'm looking for. The concierge can then go in and set up an alert based on how Metric Insights handles alerts.
It drafts the final alert for you: if sales fall by more than 20% against a 30-day moving average, the alert is scheduled and comes to you via email at a particular time when the condition is met. So again, I've taken what is really a multi-step process and activated it through conversation, without needing to understand a lot about the UI, where to go to do this, and the various widgets and controls involved.

Just to jump in here: obviously what we're talking about is using the concierge, the LLM, the chatbot, whatever you want to call it, to carry out actions that can already happen in the product today; as Marius said, you just need the knowledge to do them. I'm sure this is the same for a lot of products thinking about integrating AI: how do we get it to carry out functionality? I think one of the added benefits here is that in this example, when we asked it to tell us if Canada drops unexpectedly, it has the knowledge of the metadata behind that particular report: it knows it's daily information in the Tableau workbook, and therefore it can make an educated suggestion about what the alert rule might be, rather than just saying, well, if it drops below X, we'll do that. It knows it's daily information, it knows the data is updated in the morning, and therefore it could send the alert at 9 a.m., after the data is updated. All of that knowledge of the metadata feeds back into the suggestion it makes, which I can then go and edit.

Yeah, that's a great point, because a human analyst doing the same thing would do exactly that, right? They'd ask: when should I send it? What's the right criteria? Let me think about that and look at the data to land on something that would be very useful. So these are things that are coming into the product, but we wanted to give you a bit of an idea of how we're planning on extending BI portal capabilities beyond search and discovery of documentation into actions, which can have a fairly transformative effect on how users consume this. It's a very practical way of applying AI to make a user's life a lot better. And in this case, to Mike's point, as a user I might say: actually, change this to a 30% threshold, because I don't want 20%, and send it to my mobile app instead. That review-and-update process would work.
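For concreteness, the alert rule the concierge drafted, firing when the latest value falls more than 20% below the trailing 30-day moving average, reduces to a few lines. The window and threshold here are just the editable suggestions from the demo, not fixed product behavior.

```python
# Sketch of the "drops unexpectedly" rule: latest value vs. moving average.
def drops_unexpectedly(daily_sales: list[float],
                       window: int = 30,
                       threshold: float = 0.20) -> bool:
    if len(daily_sales) < window + 1:
        return False                      # not enough history to judge
    *history, today = daily_sales[-(window + 1):]
    moving_avg = sum(history) / window    # trailing 30-day average
    return today < moving_avg * (1 - threshold)

# e.g. 30 flat days at 100, then a day at 75: a 25% drop, so the alert fires.
assert drops_unexpectedly([100.0] * 30 + [75.0])
```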
So with that, you see the picture of the more practical approach we're talking about. I think there's a place for AI within the BI experience at each layer, each step, of this journey. The BI portal is the key layer where AI can help with the discovery and understanding parts of the journey, where the user tries to find the right analytic and then figure out how to interpret it correctly. And for the analysis itself, the BI tool, the report, and the capabilities those vendors are building into their tools should increasingly be able to take care of that part of the journey. So hopefully AI can help answer a lot of questions that would otherwise require human intervention. And with that, maybe we'll throw it back to you, Shannon; I see a lot of chat going on that we haven't been able to monitor, and potentially a few questions.

Yeah, thank you both for another great presentation; it's always such a joy to have you both here with us, and thanks to our attendees. To answer the most commonly asked question right away: just a reminder that I will send a follow-up email to everybody by end of day Thursday with links to the slides and links to the recording. So, diving in: how is the metadata on an asset managed and populated in Metric Insights?

I can take that. For us, it really comes from one of three places. The idea with metadata is that if it's being created somewhere else, we want to leverage it. So some of it might come from the tool itself: if we connect to Tableau and publish a Tableau workbook, we typically get the name, the description, and maybe some tags from Tableau, for example; anything we can leverage from the tool itself. Alongside that, if you have a data catalog, an Alation or a Collibra for example, there may be classifications, and I see a further question here on lineage; some of that might come through your data catalog tool, and we want to pull that in as part of the publishing process. The third thing is that metadata gets added as part of the publishing process itself. If it's not stored anywhere, then when we go through a publishing workflow in Metric Insights, people are responsible for adding that metadata at the time of publishing: maybe it's ownership, maybe it's updating descriptions, maybe it's attaching documentation to the asset. And I guess there's a fourth: some customers are managing metadata in spreadsheets out there, and those can be imported at publishing time as well. So if it exists, we'll connect to it; if it doesn't, we'll add it as part of the publishing process.

And to add to what Mike said: that lineage sometimes exists in those data governance tools. There was a follow-up point made here: hey, this assumes the metadata exists and is current and accurate. And yes, if you ingest it from the data governance tools, that is the assumption. But you don't necessarily need to ingest all of it. Oftentimes the user consuming this doesn't care about every ETL rule used to bring the data in; they just want to know, is this coming from the data warehouse? Is it coming from this table that I know is certified in the data warehouse? Something around that level. And that is possible to get from the data governance tool. It's also possible to get from some BI tools: for example, with Tableau and Power BI we can reach in directly through the APIs that exist there and get lineage from the tool itself, if you don't have that information in a data catalog.

Perfect, thank you so much. So: how are the answers and direction the concierge gives fact-checked for accuracy?

Yeah, that's a great question, because as we all know by now, the hype machine doesn't always cover the fact that these models definitely tend to hallucinate. From our perspective, first of all, the problem set we're attempting to solve on the portal side of things has, in general, less exposure to hallucinations, for a number of reasons. First, you're directing users to content. If you are 99% accurate in connecting people to the right content, that's pretty good.
That's a resource people will be happy to use: 99 times out of 100 it's going to give me the right answer. If, on the other hand, you're 99% accurate at answering data questions, you're not going to get that same level of satisfaction, because that one time out of 100 might be somebody on the senior management team asking a question and then using that number to make a strategic decision with massive impact on the business. So first of all, for content matching, the impact of hallucination is significantly lower.

As far as what we do: we have a feedback cycle, as you've seen in OpenAI and other tools, where you're able to provide feedback, so we have a loop that feeds back and can continue to improve the model and the prompting in order to increase accuracy over time. Secondly, it's designed carefully so that there's an inference layer that figures out what action you're trying to perform, and then there are tools that perform that action. As long as you can determine the user's intent in a reasonable way, you can constrain the action to make sure you're generally giving good answers; there are a number of techniques behind the scenes for that. And then, as you saw when Mike showed you, in many cases where you're asking for specific information, we give you links: if you're getting a link, it's a link to an actual report, an actual document, so it's very easy for you to verify that what the LLM told you is correct.

I think as well, we've taken an approach where we err on the side of caution: ask a follow-up rather than throwing out recommendations and guessing. So if there's doubt about what needs to be done, the concierge will come back with a clarifying question.

And I'm going to skip ahead here, because you've started touching on this already: is the LLM behind the concierge out of the box, or custom-trained?

Yeah, so today it's bring-your-own-LLM to the party. In the first release, which is in beta today, it works with OpenAI's GPT-4, either through Azure or directly through the OpenAI API. We are building the capability for you to bring your own LLM: some organizations have a highly capable LLM that they've created and deployed behind their firewall, and you'll be able to connect to that. And then later in the year, we'll ship with the capability to do some of these functions without your own LLM or an external one. Obviously, you're not going to get the same level of capability from something that can run locally as from a foundation model, but we're looking to provide some level of capability even in those cases where there isn't an LLM available.
Sounds great. Next question: how can I control how far back the lineage goes? In practice, these graphs can be so deep.

Yeah, absolutely. When we extract lineage ourselves, say through the Power BI or Tableau APIs, we're really looking at the tables and columns used in the SQL and the processing the BI tools do to get the data. So we're not going all the way back to whatever derivative sources are behind those. And if you think about the consumption model for most users, that's what they're looking for: they don't really care about all the ETL rules you used; they care about where the data is generally coming from, and whether it's a trusted source. You can bring in lineage from other sources, and if you do, then of course the depth is up to you.

This question points at a broader issue, not just with lineage: one of the challenges today with data catalogs is that organizations will catalog everything in their ecosystem, so there's so much information in these catalogs that it's hard to figure out which piece of information you need. So usually you are not bringing everything in. As a rule, you may be bringing in 10 or 20 percent: the content that's certified, the things users will actually find useful from a data literacy perspective, without overwhelming them. It's going to be glossary terms, that first tier of lineage, key certification attributes, and other custom field information, but not every attribute of every field, and not lineage that goes three or four or five layers deep, because that's just overwhelming to a user.
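Going back to that 'first tier only' idea for a moment: the depth cutoff amounts to a bounded walk upstream through the lineage graph, as in this small sketch. The edge map is an illustrative stand-in, not real extracted lineage.

```python
# Sketch: walk upstream through lineage, stopping after max_tiers hops.
from collections import deque

def upstream(lineage: dict[str, list[str]], report: str,
             max_tiers: int = 1) -> set[str]:
    seen: set[str] = set()
    queue = deque([(report, 0)])
    while queue:
        node, tier = queue.popleft()
        if tier == max_tiers:
            continue                      # don't descend past the cutoff
        for parent in lineage.get(node, []):
            if parent not in seen:
                seen.add(parent)
                queue.append((parent, tier + 1))
    return seen

# e.g. report <- warehouse table <- staging <- raw extract
lineage = {"Sales Analysis": ["WAREHOUSE.SALES"],
           "WAREHOUSE.SALES": ["STAGING.SALES"],
           "STAGING.SALES": ["RAW.EXTRACT"]}
assert upstream(lineage, "Sales Analysis") == {"WAREHOUSE.SALES"}
```

A user mostly wants that first answer, is this coming from the trusted warehouse table, without the three or four ETL hops behind it.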
Perfect. And we have about five minutes left, so I'm going to slip in as many questions as I can. As a nursing informatics student, I'm interested in exploring the relationship between AI and BI in healthcare. Is there a difference by industry?

We have a number of healthcare customers, and the thing I will say is that healthcare has some unique challenges, especially on the hospital side. There are systems there that have historically been more siloed, black-box systems, like Epic. They're starting to open up, and we're very hopeful these capabilities will become available there as well. Hospitals are also very constrained around data in general, obviously because of HIPAA and the sensitivity around PII, so they're often unable to use services out on the web, which makes these foundation models very difficult to access. That said, lots of organizations are beginning to invest in more capable open-source models, or contracting with vendors to provide foundation models behind their firewall, so those things are coming, and I think there's a lot of promise for improving things in that area. I'm not an expert in healthcare overall; there are companies doing very exciting and interesting things with AI in healthcare generally. But as far as BI goes, I think it is promising, with the caveat that healthcare organizations move a little slower, so the jury's still out on how quickly they can incorporate some of these technologies.

Makes sense. So: are there best practices for tuning the concierge LLM based on upstream source and catalog changes, as well as BI portal usage behaviors, BI data asset usage metrics, and discovery, understanding, and operational questions?

So, at least in our solution to date, we have not been fine-tuning any of the LLMs we use, because that allows you to switch them out: if your organization can't use OpenAI but you have a Llama 2 model implemented behind your firewall, you can connect to that. So we try to avoid fine-tuning, and we've accomplished that through prompting; the majority of the best practices have been around prompt engineering. We enable you to supplement the prompts we provide, so you can add some of your own rules and criteria. And that's another place where you can look at what kinds of responses you're getting: if there are places where the answer isn't what you want, maybe you want to clarify what a particular term means, and you can add that, along with additional metadata and your own FAQs. So the way to think about tuning here is that you're adding to the corpus of knowledge we ship with: the metadata we extract from the BI tools and that you add through your publishing process, plus FAQ documentation and other resources and docs, all made available to the LLM, and potentially adjusting some of the prompts we ship with to optimize the use of that information.

That gives us the flexibility, then, to work with any LLM?

Yes, that's right, any LLM behind the scenes. And of course, the more capable your LLM, the more accurate the answers you'll see, together with the prompting that you do.

Very cool stuff. Okay, I'm going to slip one more question in here. Do you support Microsoft SSRS hosted on-prem? If so, what versions? Or do you have a list of tools that you support?

Yeah, we do. Go to metricinsights.com or help.metricinsights.com and search for data sources and plugins; you'll find that list. Or reach out to us at info@metricinsights.com and we can get you that too. I don't have the exact versions off the top of my head, but we have a lot of customers doing on-prem and SSRS, so that's not an uncommon situation.

Perfect. Mike and Marius, it's so great to have you back; thank you so much for another amazing presentation.

Thanks, everyone, for your time. Thanks, everyone.

And just a reminder again to everybody: I will send a follow-up email by end of day Thursday with links to the slides and links to the recording. Thanks to all our attendees who've been so engaged in everything we do; we appreciate it. Until next time, thanks again, guys. Thank you.