Here we go. Hello and welcome. My name is Shannon Kemp, and I'm the Chief Digital Officer of DataVersity. We would like to thank you for joining this DataVersity webinar, Optimizing Solution Value: Dynamic Data Quality, Governance, and MDM, sponsored today by Precisely. Just a couple of points to get us started. Due to the large number of people attending these sessions, you will be muted during the webinar. We will be collecting questions via the Q&A panel, or if you'd like to tweet, we encourage you to share your questions on Twitter using the hashtag DataVersity. And if you'd like to chat with us or with each other, we certainly encourage you to do so. Just note that Zoom defaults the chat to send to just the panelists, but you may absolutely change that to network with everyone. To find the Q&A or chat panels, click the icons in the bottom middle of your screen. And as always, we will send a follow-up email within two business days containing links to the slides, the recording of the session, and any additional information requested throughout the webinar.

Now let me introduce our speakers for today, Chuck Kane and Sue Pollock. Chuck is the Vice President of Product Management at Precisely and a 20-plus-year veteran of the IT industry, having worked across multiple industries such as commercial banking, insurance, manufacturing, government, and consulting. Chuck has led multiple data consulting practices delivering high-value client engagements leveraging IBM MDM, Initiate, and Informatica, most recently at Tata Consultancy Services before joining Precisely in 2014. Sue is the Senior Manager of Product Marketing at Precisely. She has over 15 years of experience coordinating marketing strategies that unite teams behind value-driven solutions. Sue is passionate about creating messages that are free of jargon and technobabble and easily understood by business and technical users alike. And with that, I will give the floor to Chuck and Sue to get today's webinar started.

Hello, and thank you so much. I am going to share my presentation real quick and just make sure I have the right one up. Looks good, and I believe I do? Yes. All right, great. Thank you, Shannon, and thank you to the audience for attending today's session. We have a lot to cover, and we hope that what you see and hear today is engaging and informative. So let me make sure I get on the right screen here. As introduced, I'm Chuck Kane, Vice President of Product Management for the Verify portfolio, and I'm joined by Sue Pollock, the Senior Manager of our Product Marketing. Sue and I work really well together on almost a day-to-day basis.

So, organizations rely on trusted data to make good business decisions. As they drive toward being more data-driven organizations, there are some trends we see and some key business initiatives that revolve around the transformation of customer experiences. We see organizations applying AI, artificial intelligence, to proven business cases to drive new insights and increase efficiency. We see the power of leveraging location to solve new business problems. And we also see the business making sure that a plan is in place and that the data is actually being managed as effectively as possible.
Each of these initiatives depends heavily on an integrated, clean, accurate approach to contextualizing and enriching information in order to drive maximum benefit for the organization. So data is definitely fueling, if you will, the various decisions we make today.

Now, one of the things we look at is that, historically, data problems were addressed by point solutions that solved specific data challenges. These types of solutions address data quality, master data management, and data governance, sometimes together, but often in isolation or in pockets within the organization. Point solutions come with a couple of main problems. One is that not all solutions work well together. You might not have a data quality solution that maps cleanly to your data governance solution, or a data governance solution that maps to your master data management solution. So they often work in silos and don't deliver a complete and accurate picture, and the quality of the data, and an understanding of the context around it, ends up being something of a challenge.

What we're really driving toward is more of a business value focus, where each organization's data is used as a positive earning asset to support increased revenue, accurate decision-making, and alignment around strategic initiatives. But often we see that not having a fully integrated data quality, master data management, and data governance program leads to challenges where money is left on the table. What we mean by that is that the costs of achieving results are really high, the decisions that come out of those solutions are not fully aligned to the business initiatives, and top talent and subject matter experts aren't always engaged on every one of these projects to help with the data stories or the interpretation around them.

So what I was talking about earlier is this evolved approach, a business value focus. As a whole, the industry is beginning to transform from a point-solution focus toward the value drivers of the business, not only driving data across the organization but driving it out of silos. This evolution is important because the business expects solutions with integrated capabilities that account for both agility and flexibility.

In this session, we're going to talk about various views of the business goals that our customers, and we, have seen driving the evolution of data quality, master data management, and data governance in what we consider today's business operating environments. Key areas we'll see as part of this evolution are data modernization and the key initiatives empowering customers to leverage their data as an asset. We'll also talk more about the single view of a customer, which is that intersection between data quality and master data management, along with data governance, in support of a single-view-of-the-enterprise capability. And when I say single view of the enterprise, it goes beyond just a single view of a customer, a location, or a product or service; it's really about how we tie all these things together to better drive enterprise decision-making.
Customers are also evolving their approaches around data on the move: to increase trust in the data, in data transformation projects around mergers and acquisitions, and in system-to-system data movements, all in pursuit of evolving operational value, with more automated and robust data quality solutions that increase operational efficiencies.

So in this first segment, let me make sure I get to the next slide. In this first segment, we're going to discuss how an integrated approach supports the evolving nature of data monetization. What I would like to do now is introduce our next speaker, Sue. Sue has extensive experience positioning multiple data monetization programs that leverage the intersection between data quality and data governance. So, Sue, over to you.

Thanks, Chuck. As Chuck just said, data-driven organizations are increasingly seeking to leverage their own data as an asset, to generate measurable economic benefits, or to monetize the value of their data. In a recent study, Data Professionals Speak: Trends in Data Governance and Data Quality, it was reported that improved data quality is the leading benefit of data governance initiatives. That mirrors what we're hearing from our customers: data quality that's deeply integrated with data governance delivers business data that they can find, understand, and, more importantly, trust. This enables confident decision-making, operational improvements, and taking action to comply with, or even leverage, regulatory requirements to improve the bottom line.

Looking at an example: one of our customers, an independent wealth fund, was looking for a significant improvement in the way they source, manage, and use data to better inform their investment decisions. They created an office of data to improve the effectiveness of the management of their operational data, but they quickly realized that data governance, data quality, and even data reconciliation were the key building blocks for increasing the maturity of their data program, increasing data literacy, and monetizing those data assets. We partnered with them to provide data quality and governance capabilities, allowing users to find data quickly, understand the context and quality of their data, and ensure their data is managed as a strategic asset for the organization. Additionally, partnering with them to build a measurable strategic plan, leveraging our strategic services, made sure they were achieving their goals; tracking those KPIs and goals was an integral part of their initiative. Currently, they're also exploring data observability, the capability to proactively uncover data anomalies and take action before they become costly downstream issues.

In another instance, one of our consumer packaged goods customers was struggling to find ESG data that was useful, well understood, and accurate. ESG, as you probably know, stands for the environmental, social, and governance factors that determine the sustainability and ethical impact of a company. A very hot topic, but this could describe any initiative around any regulatory requirement. We worked with this company to design an ESG governance framework that measures the availability, usability, and integrity of their sustainability data.
To accomplish this, they needed the capability to discover, catalog, measure the quality of, and then identify the supply chain of their critical ESG assets. They also needed to build an automated organizational workflow model to ensure business accountability, so ownership and execution around that data were important to track. Ultimately, we delivered cataloging and governance capabilities allowing full visibility into the context of the data, meaning the policies, stewardship, lineage, history, and data quality results around all of their sustainability data. From there, they were able to increase the accuracy and frequency of their sustainability reporting, and additionally to improve the visibility of ESG information for the leaders and decision-makers across the business, for better and more timely business benefits and insights. Chuck, back to you.

Thank you, Sue. What I would like to do now is transition to the intersection, as I was discussing earlier, of data quality and master data management, along with data governance, in support of the single view of the enterprise. This next slide is a bit of a build-out, where what we're really referring to is the initial intersection between data quality and master data management. In fact, I often view data quality and master data management as two sides of the same coin. To get to a true golden record of an entity, I first must solve the problem of applying good data quality techniques, such as matching and consolidation, to master data; these data quality rules are often defined and governed through data governance.

But we also need to start thinking about how we tie these entities together across the enterprise in support of various personas, and these personas describe the relationships an individual might have, or how they might interact with you. For example, in financial services we see many use cases where business owners also have personal accounts, and vice versa. What we want to understand is the persona and the interactions around those personas, so we can fully describe the relationship and those interactions across the organization. And the reason we want to do that is that it helps us drive out customer-centric insights. Together, we take all of this data to support and optimize the overall experience and provide insights that are both predictive and accurate. So when we think about this single view of data, we think about the single view of the enterprise, or a single view of an individual or organization.

Here we have an example of a single view of a merchant, and I'd like to talk a little more about this for financial services. As is the case with any business, especially large companies that have collected data over a long duration, say decades, we find that the value of data is locked in the various data silos I was referring to earlier. When we're able to break those silos down and start to connect them, they help us better understand the insights and the outcomes. Within this use case, we were able to unlock those silos and support a better understanding of merchants and the locations in which those merchants were actually operating.
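To make the matching-and-consolidation idea concrete, here is a minimal sketch of how records from different silos might be matched on a normalized key and survived into a single golden record. This is only an illustration; the field names, normalization rules, and survivorship logic are assumptions for the example, not Precisely's actual matching engine, which uses far richer, governed rules.

```python
import re
from collections import defaultdict

def match_key(record):
    """Build a crude match key from a normalized name plus ZIP5.
    Real MDM matching uses governed, multi-field, often probabilistic rules."""
    name = re.sub(r"[^a-z0-9]", "", record["name"].lower())
    return (name, record.get("postal_code", "")[:5])

def consolidate(records):
    """Group records by match key, then survive the first non-empty
    value per field into one golden record per entity."""
    groups = defaultdict(list)
    for rec in records:
        groups[match_key(rec)].append(rec)
    golden = []
    for recs in groups.values():
        merged = {}
        for rec in recs:
            for field, value in rec.items():
                if value and not merged.get(field):
                    merged[field] = value
        merged["source_count"] = len(recs)  # simple lineage: silos matched
        golden.append(merged)
    return golden

silo_a = [{"name": "Acme Coffee LLC", "postal_code": "10001", "phone": ""}]
silo_b = [{"name": "ACME COFFEE, L.L.C.", "postal_code": "10001-2345",
           "phone": "212-555-0100"}]
print(consolidate(silo_a + silo_b))  # one golden merchant record
```

The survive-the-best-value pattern shown here is the core idea behind a golden record; production matching layers phonetic, probabilistic, and governance-defined rules on top of it.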
By unlocking those silos, we were able to transform those data assets, or transform data into assets, that drive business decisions and unlock further sales opportunities and growth across the organization. What we're really looking at here is a solution that includes not just data quality with single-view capabilities, such as matching, but also looking at the location, understanding the specifics of that location, and enriching information based on it. Further, the solution needed to support both on-premises requirements, with a traditional data warehouse, and scalability inside the cloud. And as Sue was mentioning earlier with our Data Integrity Suite, we've been innovating and bringing technologies to the marketplace that let us run not just data profiling and data observability, but also geo-addressing and location enrichment, and drive that operational value across the entire enterprise.

In the next use case, we're going to talk about a multinational oil and gas company. This is one of the world's largest providers of chemicals, among other investments, with a very large global workforce exceeding 60,000 people. The challenge here is that MDM is vital to well operations, and to the investments the organization makes in deep-water assets, as well as to other domains of data and information. In their existing processes for handling this master data, the data wasn't linked together; there were issues understanding the critical data across multiple systems for day-to-day operations; and, as I'm sure many on this call have witnessed, it required a lot of IT support to derive value from that data, while the integrations across these systems were very inefficient.

By introducing a master data management data hub, we were able to support a lot of the business value you see here: creating that golden record for what is now up to 10 domains of data, including the vendor, location, equipment, customer, and material domains; enriching and exporting that data across systems, so you have system-to-system consistency and the same data quality, insights, and outcomes everywhere; and enabling integrations across multiple source systems, both on-prem and across the cloud. So there's a lot of value that can be delivered by a master data management solution paired with data quality, governed by the rules of the organization, to understand exactly what these things are and how they're described as part of these various views of the business relationship as a whole.

With that said, I'd like to bring it back to Sue, who will talk some more about evolving data on the move. Sue, back to you.

Thanks, Chuck. Another way we're witnessing the evolution of data quality and data governance is through the recognition of the importance of data on the move. Data on the move is really any type of data that's not considered to be in its final destination or its final state.
This is basically transitory data that's moving through multiple locations or from one format to another. Let's take a deeper look at examples of how organizations are evolving their data on the move. Data on the move spans a number of key business initiatives: data transformation and modernization projects, where data moves from legacy, on-premises, or third-party sources to the cloud or other destinations to realize greater business value; any data that may be altered while hopping from one source to another; and moving data from the mainframe to the cloud via Kafka or some type of API or web service. Another huge piece of data on the move is data related to M&A, mergers and acquisitions, which demands accurate and consistent mapping and transition of data to quickly capitalize on the business value of that M&A. Finally, data on the move may include simply monitoring and observing data quality and anomalies as data moves across your organization between systems and processes, to ensure quality and consistency. This includes monitoring from system A to system B, or system B to system C, or even from system A to system C, to get the full breadth of the picture of where your data exists.

Let me walk through a few customer use cases to share how they are approaching data on the move with a more evolved approach to data quality. Auto-Owners is a regional property and casualty insurance company that provides life, home, car, and business insurance, serving over three million policyholders in 26 operating states. The challenge they faced was within their financial claims division, related to data modernization and the movement to a large data warehouse. Their main goals were to, number one, improve the quality of their data, and number two, balance and reconcile their diverse and large data sets. Their main drivers were to modernize their data streams to support their admin system modernization and to provide more accurate and reliable data for corporate reporting and analytics. The part of our solution that was a deciding factor for Auto-Owners was having an out-of-the-box workflow within the tool that got them to the right identified data owner or stakeholder to fix data in a timely fashion, supporting that reconciliation piece. Another great outcome was that they initially thought they would have to issue two separate RFIs, one for data quality and one for reconciliation. After reviewing our solution, they determined that one tool could satisfy both cases, and they were able to consolidate the vendor selection process.

Let's continue to our second use case. This midsize health insurance company is a not-for-profit that serves over 4,000 members across their entire state. The challenge they faced was that their existing claims process resulted in missing, late, and even duplicate claims, which ultimately led to several compliance fines as well as a very poor member experience. The problem was even escalated to the CEO, who wanted to put in place the proper monitoring and validations to ensure it would not happen again. Their main business driver was to reduce compliance risk; ultimately, they wanted to strive for a zero-compliance-fine environment. That was their ultimate goal.
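As a rough illustration of the balance-and-reconcile goal described in the Auto-Owners example above, a minimal sketch might compare control totals between a source system and its target; the field names and tolerance are illustrative assumptions, not the actual product logic.

```python
def control_totals(records, amount_field="claim_amount"):
    """Compute simple control totals: row count and summed amount."""
    return {"count": len(records),
            "total": round(sum(r[amount_field] for r in records), 2)}

def reconcile(source, target, tolerance=0.01):
    """Balance source vs. target and return discrepancies that a data
    owner can act on before they surface downstream."""
    src, tgt = control_totals(source), control_totals(target)
    issues = []
    if src["count"] != tgt["count"]:
        issues.append(f"count mismatch: {src['count']} vs {tgt['count']}")
    if abs(src["total"] - tgt["total"]) > tolerance:
        issues.append(f"total mismatch: {src['total']} vs {tgt['total']}")
    return issues

source = [{"claim_amount": 120.00}, {"claim_amount": 75.50}]
target = [{"claim_amount": 120.00}]       # one claim lost in transit
print(reconcile(source, target))          # count and total both mismatch
```

Pairing a check like this with a workflow that routes each discrepancy to the identified data owner is what turns reconciliation from a weekend chore into a timely fix.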
As a result of selecting Precisely, this health insurance customer was able to monitor 100% of their claims across all of their systems and interfaces, and across all the data hubs in which their data is processed. This gave them better insight into any escalating issues caused by poor data quality within their claims process, and it drastically reduced the number of reconciliations, mostly manual reconciliations, that used to occur over the weekend. So they were able to achieve the zero-fine environment they were going for.

The next topic in the evolving nature of data quality is evolving operational value. Evolving operational value is something we see consistently: customers using our technology to quickly solve complex operational-picture challenges. These solutions can take the form of the data quality issues we've already talked about, or complex data integration and data augmentation processes that complete missing data fragments or create an "operational picture," as we've heard it described by our customers, to close the gap between perception and actual fact. DataOps is an emerging discipline in many organizations, but the constant is the demand for business value and data integrity to increase the operational value of the data assets being created. One aspect of our value contribution is enabling teams to react quicker with automation, eliminating time-consuming and error-prone processes. Precisely supports data operations, and now the evolving DataOps concepts, by simplifying the cycles needed to create data assets that are fit for purpose and fit for use immediately. The result is trusted operational data delivered with greater efficiency and a more complete operational picture.

Examples of attributes that make data more fit for use include product interchangeability matrices (what products are capable of what), asset and field attributes, and customer demand recommendations: products recommended not just by social association but by actual buying patterns, known purchased items, and compatible related items. Around supply chain management, we also see demand to consolidate vendor performance metrics and actual demand forecasts to rapidly identify supply gaps and top-performing vendors.

In the example you see in front of you, this large healthcare organization had an initial, base business need to validate addresses beyond just a mailing address. Evolving the operational processes made the data more fit for use and closed many other operational gaps. In this case, that included data remediation and data management functionality to quickly correct data integrity issues, driving the adoption of a larger solution through the correlation of multiple data sets and a more holistic operational picture. Specifically, we leveraged physician association, looking at practice attributes; built in the dimension of time, which is critical; and captured the association of physicians with their areas of specialty, the locations of their practices, and the rights doctors had to work in certain healthcare facilities. We even needed to factor in drive-time distance, which is also critical to this customer. For healthcare providers, having data traceability and a clear understanding of the validation and standardization methods is critical.
Black-box solutions and fuzzy data associations many times create undesired research efforts. One of the dominant strategic benefits Precisely technology brings is minimizing the predefined taxonomies and data processing methods that force customers to customize around, and adopt, non-optimal business practices or technology frameworks.

Our second example is around data integration and data quality issues, and we're going to talk a little bit about this. Data integration and data quality issues share many of the same characteristics and business impacts that Chuck described earlier when we were discussing single view of data; it's a different side of the same coin. But these issues can be resolved by many different methods, and the power Precisely brings to the market is a collection of best-of-breed technologies that meet individual organizational challenges without forcing your organization to adopt one technology over another.

To highlight our capabilities, we chose this complex manufacturing data integration business challenge. In this scenario there is not just one source; this is a multi-source-system issue, with a timeliness issue around operational logistics. So the complexity of taking multiple systems, in this case the product, customer, vendor, location, and employee data domains, tying them together, and providing an operational picture throughout the dimension of the transaction was one of the critical elements the customer wanted to resolve. Here we were able to incorporate advanced address validations and enrich them with geolocation intelligence that was leveraged throughout the transaction. These types of solutions incorporate the need for data consistency across multiple systems and the creation of multiple operational pictures in near real time. The approach prevents data gaps and increases operational awareness across those multiple domains. The primary benefit to the client is the focus on fit-for-use data resolution at the beginning of the operational process, with subsequent refinement of the data to mirror the operational-picture demand throughout the entire transaction. The solution goes well beyond the aggregation of metrics or analytics, mastering a few data elements, or traditional data standardization; it drives data-driven decisions throughout the entire operational chain, once again evolving it to the next level. These solutions are just one example of how Precisely delivers a complex solution using configurable, agile, and highly integrable technology assets, whether they're Precisely-branded technologies or existing client-side technologies, which we can also work with. So with that, I'm going to turn this back to Chuck to summarize our message.

Thank you, Sue. As we said at the start of this conversation, these use cases drive evolving value around insights across many different types of industries. But interestingly enough, each of them is trying to get to the same benefits: optimized business decisions; clear alignment around strategic direction as well as execution; accelerated time to value; and improved engagement of top talent on both strategic and innovative initiatives across the organization.
As Sue was saying earlier, this evolved approach is really about thinking of data integrity as an end-to-end capability. You want to integrate, which is to modernize your infrastructure for the cloud, eliminate data silos, and automate existing business processes. You want to verify, which is to validate that data and build the appropriate data governance and quality initiatives around it as a data-centric process, ensuring both the accuracy and the consistency of that data. You want to leverage location, because location is an inherent asset in your data from which you can derive far more sophisticated analytics and actionable insights based on where things are or where interactions happen. And then there's the enrichment component, which is to complement your core business data with expertly curated data sets that add an additional layer of critical context and can really help drive increased value. And when we think about data integrity as integrate, verify, locate, and enrich, it goes without saying that we also want to engage, driving a seamless, personalized, omni-channel experience where you communicate consistently across any medium at any given time, engaging at the right time, with the right message, to the right individuals or the right personas you're targeting. So with all that being said, I'd like to turn it back over to Shannon, who will help us with some of the questions.

Chuck and Sue, thank you so much for this great presentation. If you have questions for them, feel free to submit them in the Q&A portion of your screen. And to answer a commonly asked question: I will send a follow-up email by end of day Thursday with links to the slides, the recording, and anything else requested throughout. You all are being very quiet here, but let me dive in. What cloud-native technologies can Precisely work with?

Yeah, I'll take that one, and Sue, you can comment further. When we think about cloud, we're really thinking about a few aspects of cloud. One is that there are the Azures of the world, the AWSes, and GCP from Google, and each of those has its own pluses and minuses as it pertains to deployment capabilities and how organizations want to integrate and leverage each of those clouds. But there's a second facet of cloud, which is: what are we looking to achieve? Are we looking for operational value? Are we looking specifically for analytical value? Are we looking to hedge our bets between the two? In thinking about the cloud, we also think about its elastic nature, and by elastic nature we mean supporting the organization's ability to scale up as requirements demand, or to scale down and save money. Think about promotions you might run during the Super Bowl, another sporting event, or any other type of big event: you may have a large volume of data coming in on a seasonal basis, so you want that elastic nature where you can scale up to support it. But then you don't want those systems just running when you don't have those campaigns or that volume, so you want it to scale down, and you want it to do that intelligently, right?
You don't want to have to think about when to scale up or when to scale down; you need this to be an elastic, auto-scaling capability within the constraints of how you position the technology. So I think that's another facet. A third component is thinking about the various data warehouses that are out there. You might have a traditional warehouse that you built in-house, but you might be looking at leveraging a modern data warehouse like Snowflake, so you might want data processes that run native to a Snowflake environment. You may want data processes that run native to, say, a Databricks environment for analytics or other outcomes. This third facet is the ability to run next to the data, so you're not extracting the data; you're able to maintain the data in that single source. And then I think the fourth component of the cloud is enrichment. Thinking about enrichment, there are lots of different data providers out there; Precisely is also a data provider, through our Spectrum OnDemand offering. So there's an API capability that needs to support, say, the validation of an address, or data enrichment around that address, or various other facets of data enrichment, which could be name, email, phone, or other types of information validation. So there's a lot to consider when we think about the cloud from these various perspectives, including how you might approach these integrations as a hybrid capability, where you have some on-prem solutions that tap into the cloud, which is what I was describing with Spectrum OnDemand, versus cloud-native, where you're actually deploying the capabilities inside these cloud environments to support those types of operational outcomes. Hope that answers the question.
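As a loose illustration of "running next to the data," a data quality rule can be pushed down to the warehouse as SQL instead of extracting rows. The sketch below assumes the snowflake-connector-python package and illustrative table, column, and connection names; it computes an email-completeness metric entirely inside Snowflake.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Placeholder credentials and object names, for illustration only.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
    warehouse="ANALYTICS_WH", database="CRM", schema="PUBLIC",
)

# The completeness rule executes inside Snowflake; only two aggregate
# numbers travel back, never the customer rows themselves.
cur = conn.cursor()
cur.execute("""
    SELECT COUNT(*) AS total_rows,
           COUNT(email) AS rows_with_email
    FROM customers
""")
total, with_email = cur.fetchone()
print(f"email completeness: {with_email / max(total, 1):.1%}")
conn.close()
```

The same pattern applies to a Databricks environment: express the rule in the engine's own SQL or DataFrame API so the data never leaves its single source.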
Definitely. And people can certainly ask for additional insights if they'd like more information. But moving on down the line: we get this question a lot in all of our webinars, in all contexts, and it's always good to get a new perspective on it. How do you overcome the difficulty of getting buy-in from leadership for allocating resources to enrich data? Sue, do you want to start that off?

I do, I do. And I actually do want to talk a bit about our strategic services team, which we mentioned in a number of the use cases. The purpose of the strategic services team is to make sure you have an organized plan, a business case that is measured and reported back on, and a game plan, so that as you move forward you know you're meeting your goals. Now, that doesn't directly answer the question, but it is the problem we're looking at, right? Any time you're bringing in any kind of data initiative, whether it's enrichment, governance, or quality, if you don't have a strategic plan that resonates with all of the influencers up and down the organization, it's very difficult to move forward, or to move forward and keep understanding whether you're reaching the real value you need. So the point I'm going to make here is about how our strategic services team is made up; we are an unusual company in that we have our own strategic services team. We know data intimately. We have a number of people who were previously CDOs at larger organizations and who now work with our customers and prospects to make sure that, number one, we've clearly defined the goals. What are we trying to do? Are we trying to enrich? Are we trying to bring in governance? What's the purpose? What's the goal? What are the things we should be measuring so we can document that value? And our solution innately documents that value and reports on it, so you're actually making sure that business case is being realized. So the question, once again, is: how do I build a good business case? It's not as easy as it sounds, right? Just because you have a pain point doesn't mean you can go out there, say "here's a solution," and be sure you'll get that kind of buy-in. It really is about having a strategic plan, and measurement of that strategic plan, so you can make sure you're delivering on business value. The short part of the answer is that once it's done, it's a lot easier to replicate; that first one is really the hardest. So I hope that answers the question as far as making sure you're getting buy-in.

Love it. I'll leave it up to both of you to respond or add as you wish to these questions, so I'm moving on to the next one. Regarding data in motion, verification, and timeliness: how can that be done reliably with, quote unquote, real time and near real time, which can be seconds, minutes, or hours, whereas old-school daily delivery and validation doesn't have that complexity of validation timeframe?

Yeah, I can get started. I think time is becoming a dimension that's almost expected, as we see the stand-up of various centers of excellence and the business becoming more engaged in driving these initiatives, or being part of them. What we don't want is for projects to have an IT-only focus; part of this is integration into a set of user experiences. One example of a very simple user experience is address validation. Old-school delivery around address validation might be that I enter the address however it comes in, then take an extract of that data, validate it against a validation engine, and update the data in the target location. Then you might move forward a bit from that old-school batch approach to a real-time API, but on the submit: during the submission process I get feedback that says, well, maybe this isn't the right address, or maybe I get an address recommendation I could potentially use. And then there's one step even further, which is: how do you integrate that address validation into the actual field while the individual is typing?
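As a rough sketch of the validate-on-submit pattern Chuck just described, the form handler below calls a hypothetical REST validation endpoint and hands suggestions back to the user; the URL, payload, and response shape are assumptions for illustration, not Precisely's actual API.

```python
import requests  # pip install requests

# Hypothetical endpoint; a real service would also require auth.
VALIDATE_URL = "https://api.example.com/v1/addresses/validate"

def validate_on_submit(address: str) -> dict:
    """Validate a typed address at form submission; return either the
    standardized address or candidate suggestions for the user."""
    resp = requests.post(VALIDATE_URL, json={"address": address}, timeout=3)
    resp.raise_for_status()
    result = resp.json()
    if result.get("valid"):
        return {"status": "ok", "address": result["standardized"]}
    # Not deliverable as typed: surface suggestions in the UI instead
    # of pushing exception handling downstream to the back end.
    return {"status": "suggest", "candidates": result.get("suggestions", [])}
```

The typeahead step Chuck mentions next would call a similar endpoint on each debounced keystroke and pre-fill the chosen candidate, so bad addresses never enter the system at all.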
So, like when you're on Google and it makes recommendations around what you're typing, the same thing happens with address validation: as you type the address, it comes up with candidate addresses, you select from the list, and it pre-fills all the information that's necessary. That enhances the overall user experience, and it makes the validations on the back end a lot faster, because we're not building in exception logic that says, well, we got this address, but it's not the right address, and all of that. So when we think about these capabilities, it's really about the real-time and near-real-time, millisecond responses that support these various initiatives. Extending that even further, to email or phone or other types of validations, leverages the technology in a very similar fashion, but the requirement is really to streamline the user experience, and to streamline the data, so we're doing things more prescriptively with the data rather than reacting on the back end. So hopefully that answers the question. Sue, do you have any additional comments you want to make?

I was just going to jump in a little. Chuck talked about data that needs to be enriched or standardized in real time, and his answer is perfect on that; I want to address the data-in-motion part. I don't want to get too technical, but the solutions we offer have unique capabilities to measure how long it's taking something to move from one system to another. Remember that system A to system B example: we can determine that data should go from system A to system B within a certain timeframe and, if it doesn't, send an alert. We can validate that all files are received within a certain timeframe. And as data goes from system A to system B to system C, or across all of them, that's how we're looking at those things in real time, or near real time if I'm being honest with us here. We even have the capability to do quality checks on the Kafka messages as they're going across. So again, without going too deep, that's just another piece on data in motion to add to what Chuck said already.
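As a minimal sketch of that kind of in-flight check, a consumer could flag late or malformed messages as they pass through a topic. The kafka-python package, the claims topic, the message fields, and the 60-second threshold are illustrative assumptions, not the product's internals.

```python
import json
import time

from kafka import KafkaConsumer  # pip install kafka-python

MAX_LAG_SECONDS = 60  # assumed system-A-to-system-B timeliness threshold

consumer = KafkaConsumer(
    "claims",                                  # hypothetical topic name
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b),
)

for msg in consumer:
    # msg.timestamp is when the producer (system A) wrote the record,
    # in epoch milliseconds; comparing to now approximates transit time.
    lag = time.time() - msg.timestamp / 1000
    if lag > MAX_LAG_SECONDS:
        print(f"ALERT: message took {lag:.0f}s to arrive")
    claim = msg.value
    if not claim.get("claim_id"):              # simple in-flight quality rule
        print(f"ALERT: claim missing claim_id: {claim}")
```

The same pattern extends across hops: run a consumer at system B and at system C and compare timestamps to see the full A-to-B-to-C picture.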
Okay. Our community loves to network with each other; do you have user communities where people are able to learn from other users? Sue, do you have that?

Absolutely. As a matter of fact, we have a very robust community, especially in the governance world, and we're incredibly proud of it. We have customers regularly presenting their best practices and standards around what they're doing in governance and quality, and we have very robust communities for the other solutions we've talked about here as well. As a product marketing manager, I'm constantly learning from those groups about the new tips and tricks out there and the different ways we can learn from each other. So yes, it's really something we're very proud of here at Precisely.

Very nice, because networking is so important to data users. So, what is data observability?

I can take that if you want. Data observability is an extension of data profiling. When we think about data profiling, a lot of times we're really trying to understand and measure the statistics around the data, maybe looking at conformity or at patterns in that data; this is a precursor to a lot of what we consider data quality projects, and then perhaps ongoing profiling so you can better understand the trends in that data over time. Data observability takes those trends and puts them on steroids. What we're looking for with data observability is things like volume drift, or schema drift, or data values introduced into fields that are not intended for that kind of value, for instance names put into an address field, or address values put into name fields. It also means really paying attention to and understanding schema changes and schema drift, all established through a number of observations. This is an ongoing process that happens behind the scenes, and it's a bit more proactive, in the sense that as these events happen, these drifts, these observations, we're able to provide an early warning system around them. So instead of being woken up at two in the morning because of a problem with a running job, you might start to see trends that say, hey, over the last couple of days we've noticed the volume of the data has decreased; we've also noticed that the values normally in each of these fields aren't complete, or we're seeing values that don't fit the type of field they're in. You get these early warnings, these alerts, and then you can evaluate them and say, you know what, I actually want to do something, I want to take action on that. Or you could say, I want to ignore that alert and maybe set a threshold for when this type of alert should pop up again: maybe it's not enough for volume to decrease by 1,000 records when 10,000 records come in on a day-to-day basis, but if it drops by 2,000 records, if it's only 80% of the volume I expect, now I want to see it and start to understand it. The other aspect of data observability is being able to evaluate all of this on a time scale: how are these trends moving, how are we evaluating them, and what do we need to do with that data? All of those are really good insights that feed into the governance process. Think about making sure we're compliant with the rules around the engagement of that data, as well as with the data itself, applying them to the data and making sure the right operations are happening on it, so we conform to the requirements as a whole. Observability really helps us maintain and stay on top of our data, because we've heard many times, not just from analysts but from what we've witnessed in industry, that when you do data quality projects without an ongoing effort to maintain them, you see a three to five percent decay in the data. And oftentimes some of that decay goes unnoticed.
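Chuck's 80%-of-expected-volume example can be made concrete with a small sketch. The trailing-average baseline and the threshold below are illustrative assumptions, not the actual observability logic in the product.

```python
from statistics import mean

def check_volume_drift(daily_counts, today_count, threshold=0.8):
    """Alert when today's row count falls below a fraction of the
    recent baseline, e.g. under 80% of the trailing 7-day average."""
    baseline = mean(daily_counts[-7:])        # trailing 7-day average
    ratio = today_count / baseline
    if ratio < threshold:
        return f"ALERT: volume at {ratio:.0%} of expected (~{baseline:.0f}/day)"
    return f"ok: volume at {ratio:.0%} of expected"

history = [10_000, 9_800, 10_150, 10_050, 9_900, 10_100, 10_000]
print(check_volume_drift(history, 7_900))    # below 80% -> early-warning alert
print(check_volume_drift(history, 9_500))    # within threshold -> ok
```

Schema drift and misplaced-value checks follow the same shape: compute an observation per run, compare it to a learned baseline, and alert only when a tunable threshold is crossed.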
This is where observability really helps: it illuminates when those data decay issues happen, hopefully helps you stay ahead of them, and makes sure they're being managed adequately.

Perfect, thank you so much. Everyone's a little quiet today, so that's the end of the questions I have queued up. I'll give everyone a quick moment in case you want to add anything else. Again, Chuck and Sue, thank you so much. Anything else you want to add while we give everyone a moment for any additional questions?

I'm good. I just appreciate the opportunity, and I thank everyone for joining us here. Thank you so much. I'd like to say the same.

And thank you both. It's always a great presentation; I really enjoyed this one, and there have already been requests to share it with our teams, which is always a good sign. So thank you both, and thank you to all of our attendees for being engaged in everything we do. Just a reminder: I will send a follow-up email by end of day Thursday with links to the slides and the recording. Thanks, everybody. Hope you all have a great day.

Thanks, Sue. Thanks, Chuck. Hey, I thought it went well. A little silent; I was hoping for a few more questions, but we got a couple. Yeah, no worries at all. Thanks, all. All right, thank you so much. Take care. All right, bye-bye. Bye-bye.