Hello, and welcome. My name is Shannon Kemp, and I'm the Chief Digital Officer of DATAVERSITY. We'd like to thank you for joining this DATAVERSITY webinar, Decoding Data Quality with Data Products, sponsored today by Tamr. Just a couple of points to get us started. Due to the large number of people attending these sessions, you will be muted during the webinar. We'll be collecting questions via the Q&A panel, and if you'd like to chat with us or with each other, we certainly encourage you to do so. Just note that Zoom defaults the chat to send to panelists only, but you may absolutely change it to network with everyone. You'll find the icons for the Q&A and chat panels in the bottom middle of your screen. And as always, we will send a follow-up email within two business days containing links to the slides, the recording of the session, and any additional information requested throughout the webinar. Now, let me introduce our speakers for today, Matt Holzapfel and Nick Ferrier. Matt is the head of corporate strategy and leads Tamr's technical solutions, working closely with customers on large-scale deployments. Prior to joining Tamr, Matt held positions in strategy at Sears Holdings and strategic sourcing at Dell, where he led the development and implementation of new analytics sourcing tools to significantly lower procurement costs. Nick is a technology leader focused on building out cloud deployments and infrastructure for early-stage companies. Currently, he is the lead cloud architect at Tamr, where he leads cloud infrastructure, technical operations, and security efforts. At Tamr, he led cloud-to-cloud migration efforts, bootstrapped their SOC 2 program, and is currently focused on developing Tamr's data products, enabling customers to use data product templates to consolidate messy source data into clean, curated, analytics-ready data sets.
And with that, I'll give the floor to Matt and Nick to get today's webinar started. Hello, and thank you everyone for joining. Before we get into the content, we wanted to learn a little more about where everyone is with data products. If you wouldn't mind, we'd love for people to respond in the chat with their familiarity with data products. Is this a new concept? Are you familiar with it? Or are you off to the races and have actually implemented one? We want to see what the trends are and use them to make sure we're tailoring the content appropriately. We'll give a few more seconds for everyone to key in an answer. Well, it looks like we do have one D, multiple data products in production. That's awesome. A lot of A's, a good number of B's and C's, really all over the board, which is not terribly surprising, and it aligns very well with what we see with our customers. Typically we're working with larger enterprises, a billion dollars in revenue and above, who have a fair amount of complexity in their data ecosystems. And some last responses here: it looks like we have a nice distribution of people across every level of familiarity, which is great. We're certainly going to cover some of the broader definitional pieces around data products and hopefully get everyone on the same page on why data products are needed and what their value is. In addition, if you're starting on this journey, what are some of the best practices you can learn from others who have implemented data products and are starting to see value from them?
Just to level set, the way we define a data product, and a definition we've seen that aligns pretty well with how others in the industry think about it, is that a data product is a consumption-ready set of data that's used to help solve business challenges. A key theme you'll see throughout this presentation is really that last piece: used to solve business challenges. Understanding the purpose of the data product, what problems it is trying to solve, comes first and foremost, and then you work backwards from there. We'll get into some of the complexities that can make this challenging, but at the simplest level, a product in any sense is something that should fulfill a customer need. In this case, the end customer is typically a business stakeholder within the enterprise who's looking to use data to make a better decision, automate decision making, drive new insights, whatever it may be, and a data product can play a very key role in that. Before we get into the nitty gritty of how you actually implement a data product strategy and ultimately enable your business to work smarter and more efficiently, it would be helpful to have some context on how we see the ecosystem today. So, Nick, can you share some of that context? Yeah, if we take a step back and look at this historically, for a lot of the companies we work with, 15 or 20 years ago they had a pretty simple architecture for their data. They would have an MDM system that usually powered some of the operational applications critical to their business, and they would export views from that to what nowadays we'd call a data warehouse, but back then was probably an analytical database, which powered a lot of their BI and reporting for decision making.
And what we've seen over the last 10 or so years is that the acceleration of companies moving to the cloud, both in where they store and process their data and in adopting SaaS applications, has really exploded the complexity and the number of different sources you have to deal with, and also the number of integrations teams expect. Now we see customers that might have multiple Salesforce instances in addition to a HubSpot instance, and that's just their SaaS applications for managing their customers and their communications. Then oftentimes we'll see individual teams buying external data to try to fill in values in specific reports and to help create better views in their data warehouse. In addition, BI tools are now pretty standardized and expect basic integration with a data warehouse. You might have data science teams that expect certain views and tables to be constructed for consumption, and more tech-focused companies are already starting to build their own AI and ML models inside their organizations, expecting high-quality data products as the base for training those models, so they can then put them into existing operational use cases. It's a much more complicated landscape that really lends itself to treating data like a product, where historically it was just a loose collection of views. And if we think about what the headline from this migration to the cloud has been, and what a lot of the large cloud vendors really focus on, it's the story on the right: the explosion of data sources. You're going to have a lot more data, and a lot more tools to consume that data.
One of the things that can get a little lost in that story is that becoming data-driven, and actually getting the full value out of this new ecosystem, out of all the new sources of data and all the new and fantastic endpoints for consuming it, has a cost in the form of new requirements for data quality. With BI and reporting, the big need was for data to be clean and standardized. We need our date column to be in a consistent format; we need country codes in consistent formats so that we can look at sales by country, and so on. A lot of data tools, like Alteryx for example, were really born out of these needs: let analysts be more effective in how they clean and standardize data. As new endpoints have come to market and there are more ways people consume data, things like data science for example, the need changes. It's not enough anymore to have data that's clean and standardized. Now I need a lot of attributes, because the more attributes I have, the better model I can build, the more predictive it can be, and the more accurate its predictions. As we've moved to more automated decision making, with things like customer data platforms firing marketing events for larger-scale customer interactions, it's created the need for a single view of a customer. We need an integrated view of our customer profiles so that as we execute on those events, we're doing it accurately and effectively. Next, data apps have grown and become much more mainstream with applications like Streamlit, where people with Python skills and some knowledge of managing a database can build a simple data app and serve it either as an internal application or even an external one posted on their website.
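To make that clean-and-standardize need concrete, here is a minimal sketch of the kind of normalization an analyst tool automates: coercing mixed date strings into one format and mapping free-text country values to codes. The formats and the country mapping are illustrative assumptions, not from the webinar; a real pipeline would use a maintained reference table.

```python
from datetime import datetime

# Illustrative date formats we might encounter across sources (an assumption,
# not an exhaustive list).
DATE_FORMATS = ["%Y-%m-%d", "%m/%d/%Y", "%d %b %Y"]

# Tiny sample mapping of country spellings to ISO 3166-1 alpha-2 codes.
COUNTRY_CODES = {
    "united states": "US", "usa": "US", "u.s.": "US",
    "germany": "DE", "deutschland": "DE",
}

def standardize_date(raw: str) -> str:
    """Normalize a date string to ISO 8601 (YYYY-MM-DD)."""
    for fmt in DATE_FORMATS:
        try:
            return datetime.strptime(raw.strip(), fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue
    raise ValueError(f"Unrecognized date format: {raw!r}")

def standardize_country(raw: str) -> str:
    """Map a free-text country value to an ISO code, flagging unknowns."""
    return COUNTRY_CODES.get(raw.strip().lower(), "UNKNOWN")

print(standardize_date("03/14/2023"))      # 2023-03-14
print(standardize_country("Deutschland"))  # DE
```

The point of the sketch is only that "sales by country" breaks without this step; at enterprise scale the same normalization has to run across every source feeding the warehouse.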
One of the things this enables is putting end users much closer to the data itself, and it exposes a lot more of an organization's data. That creates the need for people to be able to fix issues as they see them. The more people who have access to data, which data apps are really driving, the more nuanced and narrow people's feedback is going to be: looking at an account that doesn't look right, a customer 360, whatever it may be. When I see an issue, I need that issue resolved quickly. And finally, I think nothing has brought this point about data quality to the forefront more than what we're seeing now with these next-generation models, things like large language models and the broader foundation models that people are using to try to build a better customer experience, whether that's a common use case like a chatbot or something else. People are putting AI and ML at the center of their customer experience. This creates a lot of pressure on all of the data quality requirements, because if garbage goes into one of these models, your reputation is potentially ruined. These are the types of things that end up in the New York Times or other media outlets: if there is an issue with an AI model as a result of bad data, that can have serious implications. So the importance of data quality has really never been higher.
I think the good news is that we've been building towards this point, so organizations aren't totally caught flat-footed, but there's certainly more work to be done, as evidenced by a lot of third-party data that's been collected on the state of data quality and data teams. One of the common threads is that data leaders just can't scale their teams quickly enough. The number of sources people are working with has exploded, like Nick outlined at the beginning, and a lot of the impact of that has fallen on data teams to fill in the gaps. If we don't have data that is accurate and trusted, well, our data team is going to fill in that gap, and that's creating heavy strain on the organization. A result of this is that decision makers aren't getting answers fast enough, and the number of data leaders who say their company is data-driven hasn't really moved, despite all of this investment. One of our customers is a chief data officer in the media and entertainment business, and his internal stakeholders are talent agents. For them, the key metric is analytic velocity and time to insight: for a question that a talent agent has, how quickly would the team be able to answer it? They evaluate various scenarios to understand how that metric is trending, because they know that if people can't get their questions answered fast enough, they're not going to use data, and so their only way of being data-driven is being able to answer those questions quickly. And then finally, just on this point about AI and its importance.
I think the hype is more than just hype at this point. Organizations are putting serious dollars behind AI and using it to stay ahead of the curve, which is awesome; there are a lot of great opportunities possible. The issue is that executives don't trust the data that's going into these systems, and so, to the earlier point, reputations are at risk. It will certainly be interesting to see how the coming months and years unfold on this front, because there is a lot of top-down pressure to use AI, but a lot of people generally feel that it could be very risky given the state of their data. Yeah, and in addition, AI is not the only technology driving some of these conversations. What we see with a lot of our customers is that, in addition to these data projects to make the company more data-driven, there's often a paired project to move to the cloud or switch clouds. For a lot of reasons, the first project many large companies take on when moving to the cloud is moving their data assets, their data warehouse, over to cloud-based resources, to take advantage of the elastic compute there and enable workloads and applications they couldn't have run before on on-prem hardware. This is going to expand even more drastically as people start to build out AI use cases and try to leverage machine learning more and more in their business operations. They're going to need high-quality data to do that, and that means these data teams almost have to solve two technology problems at the same time: they're often the ones building out the foundation of their cloud environments and cloud infrastructure, at the same time as just trying to get to the point where they have good data.
That's in addition to providing integrations for the data analysts or machine learning teams trying to consume that data to build models. This really puts a focus on, and should make you think about, how you want to manage your data and what your data product strategy is, so you can do this repeatably across multiple use cases. Which dovetails into something we've seen from a lot of customers: they start with use-case-based approaches, but the really successful ones are thinking along the lines of data products rather than one-off use cases. Yeah, just on this point about the use-case-based approach: now more than ever, this is becoming difficult for data leaders. One of our customers at a healthcare company has been under a lot of budget pressure within the organization. Effectively, he has important analytic and business initiatives that he needs to support, but funding is now much more difficult to come by. They're switching to a model where they need to connect chargebacks to the business in order to fund some of the modernization investment they want to make in the middle of their cloud journey. One of the first insights he had was that the only way he and his team are going to be successful in this new model is to operate in a way where a use case isn't a dollar of revenue for a dollar of cost. They need to make it so that a use case is something that can be reused much more broadly: if they're getting a dollar of revenue, it's not a dollar of cost but 25 cents to serve it, and then continuously decreasing.
They know that just adding more resources to the problem and trying to fund their initiatives purely by doing more use cases isn't going to be effective anymore. They need to shift away from the use-case mindset towards one where they're really thinking about how they can be most profitable as a team, which is what's driven them to think about managing their data as a product, so they can operate more as a P&L as opposed to a consulting or staffing agency doing a one-to-one mapping of use cases to people. Yeah, one of the parallels I naturally gravitate towards when thinking about this problem comes from how different SaaS application providers host their offerings. There are companies that do a traditional hosting-based solution, where they spin up a separate set of dedicated infrastructure resources for every customer they have. That's the old-school hosting model: say you're running a WordPress business, and for every new customer you onboard, you spin up a whole new WordPress application with dedicated resources and dedicated databases. That doesn't scale very well in terms of how you can leverage your staff; for every 10 new customers you might need a new employee to manage it all. Compare that to multi-tenant SaaS applications, which are the standard now, where you can scale your staff, resources, and costs sublinearly with the number of customers and requests you're serving. That's really what happens with data products as well.
When you start investing in a platform that can serve all the different use cases with the same set of resources, you can standardize a lot of these processes and start to scale your data quality efforts sublinearly with the number of requests you're getting from your stakeholders. That really makes it a very high-leverage solution for meeting your business needs. And one of the most challenging pieces, one of the biggest mindset shifts we see in the journey towards managing data as a product to ultimately get much better data quality, is the feedback management cycle. It's become common wisdom that if you're building a product of any kind, one of the first things you should do is, quote unquote, get out of the building: go get feedback, learn from customers, and continue to iterate. That's a muscle that can be challenging within data organizations, because it requires a unique set of processes and capabilities to actually manage that feedback loop, incorporate it into the data product, and ensure the data product is continuously getting better. You don't want people doing the classic move of: I see an issue with the data, I'm going to download it into Excel, make changes, get my report done, and serve that end-user need. Rather, you want good process and governance in place so that people aren't just rushing to make tweaks on the edges of a spreadsheet to solve a short-term need, but are really thinking longer term about how to improve the data on an ongoing basis. And one of the challenges with building data products in the past has been that the best practices for building them can be expensive.
Historically, if you wanted to build data products, you were looking at options like: do I just hire more data engineers and stewards to take on this new workload, put new processes in place, and improve our data operations? Or do we reduce our scope, where instead of trying to hire more people and scale out, maybe we say we're only going to manage our customer data? A lot of traditional approaches to master data management have fallen into this view: we're going to have a narrow set of data, manage it very tightly, put a lot of top-down governance on it, and it's going to be trusted and high quality, and we just hope it covers enough of what we're trying to do to be effective and successful. There's a comment in the chat that these are more like the dumb practices. It certainly can feel that way; these all seem very expensive, but they are what we often see people doing to get out of data debt challenges. Yeah, and recently we've also seen an uptick in people trying to incorporate AI to help with this, where it can make your developers and your data engineers far more effective. A lot of these LLMs have good plugins for IDEs, and if you use BigQuery, you can even enable it right in the cloud console, sharing SQL queries, asking questions, and having it generate your report views. That's a high-leverage tool that can help make your existing team more productive. We're also starting to see it used in other use cases where, if you have a good system set up, you can ask it questions about specific rows or clusters of records, like: is this a person or a company? You can then use that as a DLP tool.
Like, hey, we do not want to process individual people inside of our company data products, so if this is a person, please filter it out. You can type that directly into the system where you would normally ask a data steward or curator to manually remove that row, or ask them what the record is doing there. Another good use case is enrichment of sparse data sets. At their core, a lot of these models are built to predict the next word, and that can also be used to predict the next cell or attribute in the data. A classic example we see is someone filling in a partial address: Boston is the city, but state and country are blank. Most of the models can guess those attributes pretty easily, and tying that into your systems can be a very good tool: you're not taking up any human time, and the model can fill in a lot of those values for you. And if you have a system with a baked-in task management workflow, where tickets come in, curators or stewards manage those tickets and apply changes, not to the models but to your data sets, you can plug AI models into that to say, here is what we suggest should happen, and just have your stewards accept those changes and auto-commit them to your data. You can really save them a lot of time on manual work. AI certainly is not a silver bullet. It is providing a lot of value, as described here, but the important thing is what AI really enables within data engineering more broadly: it lets data practitioners use much more declarative logic in defining a data product, reducing the complexity of creating and managing one.
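The address-enrichment idea above can be sketched roughly as follows. This is a hypothetical illustration: `llm_complete` stands in for a real model call (stubbed here with a tiny lookup so the sketch runs), and the record fields are invented. The key pattern is asking the model only about blank fields and never overwriting values the source systems already provided.

```python
import json

def llm_complete(prompt: str) -> str:
    # Hypothetical LLM call, stubbed so this sketch is self-contained.
    # In practice this would call whatever model provider you use.
    if "Boston" in prompt:
        return json.dumps({"state": "MA", "country": "US"})
    return json.dumps({})

def enrich_record(record: dict) -> dict:
    """Ask the model to propose values for blank fields only,
    leaving source-provided values untouched."""
    blanks = [k for k, v in record.items() if not v]
    if not blanks:
        return record
    prompt = (f"Given this partial address {json.dumps(record)}, "
              f"suggest values for the missing fields {blanks} as JSON.")
    suggested = json.loads(llm_complete(prompt))
    # Keep existing values; fill blanks from the model's suggestion.
    return {k: (v or suggested.get(k, "")) for k, v in record.items()}

partial = {"city": "Boston", "state": "", "country": ""}
print(enrich_record(partial))
```

In a real steward workflow, the returned suggestions would land as a proposed change on a ticket for a human to accept, rather than being written straight to the data set.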
It's very similar to what we've seen with software engineering practices, where the initial effort was making software engineering much more predictable, with distinct, understandable functions. We're seeing something similar with data engineering, where you have distinct transformations; dbt really drove this with analytics engineering, where you have distinct models handling different parts of the system that are very reusable. What AI enables is taking that to the next level, where data engineering can be much more declarative in nature: instead of operating at the level of writing a lot of individual transformations, you can push that down to AI and focus more on the business logic of the data product itself. And if we think about where that is ultimately heading, in our view most organizations will have a data product platform that combines this more declarative approach to data engineering and data transformation with human interfaces that drive collaboration on the data itself. In most cases, the people who are expected to deliver value from data, people in sales ops or procurement analytics for example, might have some understanding of SQL and how to transform data, but their real superpower is understanding their business: understanding customers in the case of sales, or understanding suppliers and the nuances of the market in the case of procurement.
So if they can interact with their data team at the business logic level, and then give feedback on individual points within the data, it really elevates the quality of the conversation and ensures there can be good, effective collaboration. It definitely feels like we're heading in this direction, where business and data teams interact at the business logic level, and then, after a data product is delivered, manage it through human interfaces. It's very similar to tools like Zendesk: it would be impossible to imagine shipping a product without some form of feedback mechanism where customers could say, hey, I'm having this issue, and we're able to learn from it. We think the same is true of data products: in four or five years it will be unimaginable to ship a data product without a mechanism for collecting feedback and for tracking usage and adoption of that data product. That's what's represented in these consumption services, where collaboration is key. And beyond the benefits described, having a data product strategy in place, with all these tools and mechanisms for managing data products effectively, should increase the speed and accuracy of decision making and make people feel confident in the data they're using. One of the key drivers of data products, and in my opinion the most important piece, is that they break down the barriers between data teams and business teams and make it easier for these teams to collaborate on the data itself, because it's a common set of assets that is continuously improving, with clear rules of engagement for how to give feedback on the data, how it improves, and what SLAs are associated with it.
It's antithetical to the traditional approach, where data sits in a warehouse, an analyst sits in the line of business, and they do bespoke data prep to serve a use case. That can certainly help get things done, but you don't get great leverage more broadly, and the collaboration is very siloed, because you have an island of data debt, the spreadsheet on the analyst's desktop that other people might not have access to, and there isn't good built-in collaboration in a lot of these cases. Yeah, one of the other questions that often comes up when we talk to customers about why they're looking at enhancing their data is that in the last six months there's been a huge industry focus on AI: what's your story, how is AI going to disrupt your business, your customers, or your industry? So one of the questions we'd love to see everyone respond to in the poll is: where are you on your AI journey? Are you just starting to learn about the technology? Starting to use it in some POCs? Do you have it in production in non-critical projects, on the side, as you grow your expertise in how to use these models, run them, and integrate them into your business? Or are they already a core part of the business? The reason we're asking is that when we're having conversations with customers about why they're adopting data products or starting this journey, the very first thing we think they should establish is why they're doing it and what their goals are. A lot of the time that is to enable some of these things around delivering AI, or having a story around it, or they may even have a vision of where they can use it.
So yeah, a really wide range of results. It looks like most people are just starting at the beginning of the journey with AI. Looks like we've got another slide; Matt, do you want to go into it? Yeah, I appreciate the question about what we mean by AI and whether statistical predictive models qualify, because that is an important point. With a lot of these AI and machine learning technologies, people are using them in some way, shape, or form; it might be some of the, quote unquote, earlier versions. But from an organizational readiness standpoint, those examples, being able to point to statistical models that do drive decision making in an automated way, can really help get buy-in on a much broader strategy, for example using AI in a data product strategy. Financial services is a good example: a very regulated industry, with certainly a lot of risk, and it has definitely been one of the industries at the forefront of using AI and ML through the modeling it does in areas like underwriting. Being able to point to examples like that across an organization really helps with getting buy-in and budget. Above all, whether it's an AI strategy or a data product strategy specifically, starting with your why is by far the most important piece. We love it when we get on a call with a prospect or a customer and they say: we need to implement a data product for our customers because we're trying to improve the effectiveness of a CDP that we implemented. With a very clear why, we know it's going to be a very successful engagement. They're not just shopping for a stack, if you will; they have a clear business problem that they're looking to solve. So, as you get going with a data product strategy.
Some of the specific challenges we see organizations facing include how to aggregate disparate sources, and more importantly, understanding when that aggregation is going to be needed to drive meaningful outcomes. Particularly when you have data sets that change frequently, things like external data, and when there are a lot of insights in the long tail of your data, this becomes an important challenge to keep top of mind as you move towards a data product strategy. In the world of basic reporting, or trying to understand maybe one-off customers here and there, this type of problem isn't going to be particularly painful. But the more you try to use your data as a product, and use it for things like automated decision making, the more you need an answer for, one, just how big of a problem this is for you, and two, whether you have a solution and a way to solve it, because data integration is not free. Being ready for that going into your data product journey is really important in order to meet the timelines you expect. When we start talking to customers, one of the things they get a little afraid of is how big the project can feel once they start talking about their end goal, why they want to do it, and where they would love to be at an end state. What we really want to help them do is define and start with the use case for their minimum viable data product. And oftentimes what we want them to focus on early on is data integration, not system integration. The architecture for data integration can look like a lot of boxes and arrows, with a lot of moving parts, but at its core it's simple.
But at its core, it's really just trying to get all your data into one place, into a data warehouse, where you can then start to clean and aggregate the data and create views that you can point downstream applications at to drive the use case. For one of our customers, that effectively meant we would take all the data they had, whether from Shopify, their Facebook advertising, or their Google advertising, and just get it into their data warehouse. Then they used one of our customer-focused data products to clean it up and create a basic view of their customers, which they fed into their marketing tools to do segment analysis and targeted marketing. They were able to say, hey, we can create this view, and these are the segment demographics we really want to go after and target. It wasn't tied to any of their operational systems; it wasn't trying to generate recommendations on their website in real time as people check out, or to do upsells, or to manage the golden record of a customer. It was just about building the smallest use case that could be valuable to the business and answer the why. Now, if they set up that architecture well and with some forethought, doing the system integration piece next is much more of an incremental step, rather than the massive project they were afraid of when they started. Now, this is a different customer, NovaCure, that we have a lot of external material on about their data journey; there's a QR code on the slide that you can scan to find out more about what we've done with them.
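The data-integration pattern in the Shopify-and-ads example above, landing each source's raw extract in the warehouse and exposing one curated customer view for downstream tools, might look something like this minimal sketch. All table names, column names, and sample values are hypothetical, and SQLite stands in for a cloud warehouse:

```python
import sqlite3

# Land each source's raw extract in its own table, then expose one
# curated, analytics-ready customer view. SQLite stands in for the
# warehouse; every name and value here is illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_shopify_orders (email TEXT, total REAL);
    CREATE TABLE raw_facebook_ads  (email TEXT, clicks INTEGER);

    -- Curated view that marketing tools can point at for segmentation.
    -- Each source is pre-aggregated so the join cannot duplicate rows.
    CREATE VIEW customer_360 AS
        SELECT o.email,
               o.lifetime_value,
               COALESCE(a.ad_clicks, 0) AS ad_clicks
        FROM (SELECT email, SUM(total) AS lifetime_value
              FROM raw_shopify_orders GROUP BY email) o
        LEFT JOIN (SELECT email, SUM(clicks) AS ad_clicks
                   FROM raw_facebook_ads GROUP BY email) a
          ON a.email = o.email;
""")

# "Land" a few sample records from each source system.
conn.executemany("INSERT INTO raw_shopify_orders VALUES (?, ?)",
                 [("ana@example.com", 120.0), ("ana@example.com", 30.0),
                  ("bo@example.com", 55.0)])
conn.executemany("INSERT INTO raw_facebook_ads VALUES (?, ?)",
                 [("ana@example.com", 7)])

rows = conn.execute("SELECT * FROM customer_360 ORDER BY email").fetchall()
print(rows)  # → [('ana@example.com', 150.0, 7), ('bo@example.com', 55.0, 0)]
```

The point is the shape, not the tooling: one raw table per source plus a curated view that downstream applications query, so adding a source or an attribute doesn't break consumers.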
They started off just doing the data integrations, but once they had that working end to end and trusted the data, they built integrations using MuleSoft to push that data back into their SAP system to drive the operational use cases that are mission-critical to them, and that's really what we mean by system integration. That's a much higher bar to clear; you start needing to go through change approval boards and get a lot of other teams to sign off on changes. So we tend to have customers start by focusing on data integration, build up trust in the data, build the basic views they want to use, and then eventually move on to system integration, actually consuming the output of their data products in their operational systems. Another pretty common challenge is how to sort through a mix of legacy, homegrown, and modern tooling in order to improve your data KPIs across the board, from data quality to analytics and automated decision-making, and we'll share a customer story on how they managed this trade-off effectively. But first I do want to address the question in the chat on how to define a minimum viable data product, which is a great question. One of the things that's really important in defining a minimum viable data product is that it should help prove out the business goal and serve the business need, and also prove out technical goals or reduce overall technical risk. What I mean by that is, let's say you are building a customer data product where ultimately that data product is going to be used for customer segmentation.
It's useful, as part of defining the minimum viable data product, to also scope out how that customer data product could be used for different applications, such as sales territory alignment or within a CDP, because one of the things that's really critical to managing data as a product is being able to go through that full end-to-end loop of adding new attributes, adding new sources, and taking feedback so that the data product can serve multiple use cases. If the minimum viable data product only maps to one single application or use case, there's a lot of risk that you end up back where you were: you have a data asset that is high quality and good, but it serves a very narrow need, doesn't have a clear path toward serving a much broader need, and doesn't have the broader buy-in that says, if we're going to do any analytics around our customers, we're going to use this data as the foundation for it. Doing that extra work up front, just to make sure that what you're building can actually scale and be leveraged across multiple applications, is really important. And this gets at a question in the chat on defining your use case, and the contradiction between that and moving away from use cases.
I think we did somewhat present the word "use case" as a bad word, but ultimately you do need downstream applications. The problem is the one-to-one mapping of a data asset to an individual use case, which just creates a new silo; it might be an aggregated data set, but it's still a silo, and still a source of data debt. Ensuring you're not going down that path is the key part: making sure your minimum viable data product, your starting point, can serve multiple purposes and can prove out that you have a good, effective feedback loop and a way to improve that data product over time, adding sources, adding attributes, and curating those attributes. That's really what's challenging and different about a data product versus just, hey, this is a pretty good table that someone is using for a dashboard. Yeah, so back on this whole concept of the journey: this is really a journey of modernization and switching to the cloud. We've been working with Old Mutual now for, I think, a couple of years. When we first started talking to them, they didn't even have an AWS environment set up yet. They were still running an MDM system with parts of it on a mainframe, and they knew they didn't want to keep supporting that into the future. That was a huge chunk of their why: they wanted a more modern system, they knew their data wasn't great, and they thought there was a lot of value they could derive from it.
So when we started working with them, the first thing we focused on was an analytical solution that would generate a clean data set on their customers, which they then used in things like quarterly review processes and audit-style checks: hey, we are selling insurance, we have this insurance policy out, and we have a death record showing the policyholder is no longer living; can we cancel that contract and close it out, so we stop collecting premiums and close the loop on that customer journey? Once they had built up good trust in that data, they started using APIs pointing at their data product and the data warehouse at the point of consumption, building them into their business applications to search for existing customers whenever accounts were opened. That way, if there's already a record of a person, they just add an extra account to it rather than creating a whole new identity and ID, preventing a sprawl of bad data throughout their systems. And they're now in the process, almost fully done, of pointing their business applications at update and create endpoints on top of a new real-time source system that's directly tied into their data product. When you zoom out, and if you go to the next slide, this is what that architecture looks like from our perspective. They started by building a landing zone and putting exports from their existing MDM system into it; we took that through a reconciliation process, using our secret sauce of AI and ML to clean that data and make it the highest quality we could, and then put it into a real-time store they could run analytical reports off of.
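The "search for an existing customer before creating a new one" pattern just described can be sketched roughly as follows. This is a deliberately naive illustration using a single fuzzy-match score; a production reconciliation engine like the one described would use trained ML models, blocking, and human curation, and every name and threshold below is hypothetical:

```python
from difflib import SequenceMatcher
from uuid import uuid4

def normalize(name: str) -> str:
    # Cheap canonicalization: lowercase and collapse whitespace.
    return " ".join(name.lower().split())

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

class CustomerStore:
    """Toy mastered-customer store with a search-before-create API."""

    def __init__(self, threshold: float = 0.85):
        self.records = {}          # customer_id -> canonical name
        self.threshold = threshold

    def search_or_create(self, name: str) -> str:
        # Search: best fuzzy match against the existing golden records.
        best = max(self.records.items(),
                   key=lambda kv: similarity(name, kv[1]),
                   default=None)
        if best and similarity(name, best[1]) >= self.threshold:
            return best[0]         # link to the existing identity
        new_id = str(uuid4())      # miss: mint a new golden record
        self.records[new_id] = name
        return new_id

store = CustomerStore()
a = store.search_or_create("Jane Q. Smith")
b = store.search_or_create("jane q smith")   # variant spelling links to same ID
c = store.search_or_create("Robert Jones")   # genuinely new customer
print(a == b, a == c)  # → True False
```

This also mirrors the incremental path in the story: read and search APIs first (the lookup half of `search_or_create`), with full create and update wiring only once the matching is trusted.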
They then started tying in APIs, starting mainly with read and search APIs pointing at that real-time store, and eventually switched over to using the full CRUD suite and pointing their business applications at those endpoints. From there they were able to slowly move the rest of their applications off the old MDM system and onto the new one. It was a long journey over several years and several distinct phases, but they started with where they wanted to be, and we were able to increment our way there, delivering business value as soon as we could. Great. As you head down this journey, there are some questions we think are really important in shaping the direction of the market. One is around the domain-specific needs that will come up. One of the things we've noticed with customers who've implemented a data product strategy is that the nuances of the industry become increasingly important the broader the data product becomes. The more external data, for example, that you're trying to integrate into your data product, the more important it becomes that you're partnering with people who really understand the domain and are building tooling for its specific needs. Yeah, and the other thing is that as AI becomes more and more popular, almost every tech company will have something they'll call AI; they'll even say it's generative AI. If their answer is just that they have more advanced search, that usually means they used an AI model to generate embeddings and put them in an extra database, and it's a slightly more advanced search bar. Or they'll have some chatbot, and sometimes a chatbot is just trained on their doc site and is really just automated support.
We should really be looking for whether they actually have use cases where they can say, hey, we use AI in this specific way to deliver this value. It shouldn't just be a chat widget that pops up when you log in or an extra search bar added somewhere; it really should be tied into their product and have a concrete use case. Yeah, and then finally, the data products mindset, managing data as a product, and layering in AI can require some changes in skill sets, and this is something to understand going into your journey: do you have the right people and skills in place to make this successful? Really, one of the big promises of AI is that it will make it much simpler for people to be very productive with, for example, highly technical solutions, so it should ultimately simplify the skill set and reduce the diversity of skills needed; but this is something to push on in order to understand the maturity of the application you're looking to adopt. Yeah, and some core takeaways from my perspective, to wrap this up. As much as a lot of us on this call are probably very tech-heavy and technology-centered, and it's very easy to get sucked into the latest and greatest technology and spend all your time debating whether to use Snowflake or BigQuery, or AWS versus GCP, technology at the end of the day is just a tool. What really matters is the outcome you're driving and the business cases you're enabling, so really focus on the outcomes. And technology isn't the only interface that matters in tying these systems together.
At the end of the day, there are humans involved in this process. They're consuming the data in some way, shape, or form, whether that's someone at a POS system in a store looking up a customer's loyalty number to do a return, or a seller trying to get updated contact information to renew a deal. They're consuming this data, so you have to design the system, and use the technology, to make their jobs easier and enable them. Another thing we really stress to customers is: don't try to solve everything on day one. Start with something as small as possible that can deliver value to your business, and iterate from there. If you can deliver value early on, it's easier to get further investment to continue down the journey; the bigger you try to go at the start, the more prone the effort is to running late, going over budget, or just never getting finished. And the last thing: a lot of people will look at some of these slides and think, oh, that's simple, we have a team that can do that. That's a false economy in my mind. The real question is, do you want to spend your time building out these data and system integrations yourself, or figuring out how to tie them into your business applications and business processes? The latter is closer to your core business and the value your business offers than tying these systems together. We have a couple of minutes left here for Q&A. Thank you so much for this great presentation. If you have questions for either of our speakers, feel free to submit them in the Q&A panel; we've already answered a lot of questions that came in through the chat along the way. And to answer the most commonly asked question: I will send a follow-up email by end of day Thursday with links to the slides and the recording of this session. So, diving in here.
So, can you share any real-world examples of how companies have successfully implemented data products to address their specific data quality challenges? Yeah, absolutely. We did highlight some examples, such as NovaCure. Another one that comes to mind, which I think is a really good story in terms of the amount of leverage it created for the business, is one of our customers, an asset manager, specifically a venture capital firm. The firm is maybe 20 people, and the data team is maybe three or four, so efficiency was really key for them to be effective. The data team was brought on board under the basic premise that this venture capital firm primarily invests in consumer businesses, and for consumer businesses there are a lot of alternative data sources that let you understand the health of an industry as well as the health of an individual business; you can get external data that shows credit card transactions at individual storefronts. They knew that to get an edge in the market, they would need to find a way to use this data to get ahead of some of these trends and know, for example, which of their portfolio companies they should double down on, or which companies that had maybe only raised a little money they should engage with early, so that they could be a lead investor and help accelerate the growth. So the data team was brought in to build out the full set of data apps the investing team would use to understand trends in the market and then make these decisions.
And the way they did that was through a data product strategy: they put in place data products for companies, first and foremost, then for contacts, who do they know within those companies, and also for their investors, so that even as a small group they have clear linkages between the people in their firm and the people they're engaging with and have had interactions with over time. Getting the full stack in place and starting to get first insights took about six months, and once it was in place, it was just a matter of weeks until they were actually making investment decisions using this data. They found some early signals that one of their portfolio companies was starting to get a lot of traction and could take its growth to the next level with an additional round of capital. They used that insight to engage the portfolio company, which agreed with their view of the market, and it ultimately enabled them to get preferential terms on the round. I think this is a great example of a company getting a lot of leverage out of its data stack by putting a data product strategy in place, and accomplishing things that, years ago, would have been incredibly challenging for a team of that size moving that quickly in such a competitive industry. Oh, we've got some great questions coming in, but I'm afraid that is all the time we have for this session. I'll get those additional questions over to the Tamer team so they can see and get to them. Matt and Nic, thank you so much for this great presentation. Thank you. Thank you, everyone, for joining us. It was a lot of fun, and we look forward to keeping the conversation going. I love it, and thanks to our attendees for being so engaged.
And again, just as a reminder: I'll send a follow-up email by end of day Thursday with links to the recording and the slides. Thanks, everyone. Have a great day.