Hello and welcome. My name is Shannon Kemp and I'm the Chief Digital Officer of DATAVERSITY, and we'd like to thank you for joining this DATAVERSITY webinar, "Achieving a Single View of Business-Critical Data with Master Data Management," sponsored today by Precisely. Just a couple of points to get us started. Due to the large number of people attending these sessions, you will be muted during the webinar. For questions, we will be collecting them via the Q&A, or if you'd like to tweet, we encourage you to share highlights or questions on Twitter using the hashtag #DATAVERSITY. If you'd like to chat with us or with each other, we certainly encourage you to do so; just note that Zoom defaults the chat to send to just the panelists, but you may absolutely change it to network with everyone. To find the Q&A or chat panels, click the icons in the bottom middle of your screen. And as always, we will send a follow-up email within two business days containing links to the slides, the recording of this session, and additional information requested throughout the webinar.

Now let me introduce our speakers for today, Sumit Nagpal and David Vega. As a Director of Sales at Precisely, Sumit is responsible for supporting customers across industries and verticals and helping them leverage data as a strategic asset. Sumit has over 25 years of experience across functional areas including sales and business development, professional services, consulting, plant operations, procurement, and engineering. David is currently a Strategy Principal at Precisely, focusing on the consumer packaged goods (CPG) food and beverage omnichannel reality. David has over two decades of CPG leadership experience across functional areas that demand trust in data, including sales, category management, distribution and retail analytics, software product management (PIM), data syndication and PLM solution development, and business process transformation. And with that, I will give the floor to Sumit and David to get today's webinar started.

Hello and welcome. Thank you very much. Good to be here. Thank you, Shannon. I hope everyone can see my screen. We can see it, Sumit. Great.

Hello and welcome, and thank you for joining the session today. Today we'll be talking about how a single view of data with MDM can support your business goals and strategy. We all know that today's digital economy runs on data, and if you're going to compete and thrive, the core business data must be right. It must be compliant, and, just as important, it must be available wherever the business needs it, faster than ever. We are talking about the core master data that describes your business: things like your products, your customers, your vendors, equipment, materials, and location data. So how do you improve the quality of your most important assets and ensure that they get to all the right systems and people quickly? With today's loosely coupled systems, digitizing processes is challenging. We all know we need data in a single spot, but how do we get there? The critical data that powers your core business processes and underpins your transformation is more dispersed than ever before. Now, for a minute, let's think about your customer data. It may reside in your ERP systems, marketing systems, e-commerce, point of sale, and several other systems.
It may be great that you are able to take and process an order in time, but if it takes forever to process a customer record or set up a new customer, you lose the customer and the experience is not efficient. And you can certainly find any number of surveys or analyst reports that talk about this. Here are some stats from Forbes, HBR, and our own data trends survey. When two-thirds of organizations say that siloed data negatively impacts their data initiatives, and almost half of newly created records have at least one critical error, it is no wonder that 84% of CEOs doubt the integrity of the data on which they make their decisions. Let's take a minute to think about that number: 84% of CEOs doubt the data on which they make decisions.

So to make better business decisions, we recommend a data hub that provides the foundation for success in today's digital economy. MDM platforms drive not only better data but also better business processes and, ultimately, better business results. MDM systems provide the key capabilities that organizations require to master, govern, and share data across the systems that run the business. This includes the capabilities needed to handle multiple data types, including master data, application data, reference data, and digital assets. But as we all know, it's not enough to just have high-quality data. You need to get that trusted data and those digital assets to the systems that power your business quickly, whether those systems are in the digital core or in the complex landscape around your core. As the data is ingested from various sources, it should be profiled for completeness, accuracy, and context. We should also have workflows to route it through a review and approval cycle. When the data meets the required data quality levels, it can then be syndicated to any downstream system on demand, in real time, or on a predefined schedule. An MDM platform must also support multiple integration styles for pulling and pushing the data, which may include API integration, web services, file-based exchange, or direct connections via a connector.

To find the right MDM solution, there are key design principles to look for. It all starts with: is it flexible enough? Can it morph to ever-changing business needs, or do I need to change my business to meet the product's requirements? Is it open and extensible, with all of the tools and features necessary to manage data? And does it solve for business complexity without making the business complex? It's never just about the product; you also have to factor in the people using that product and the process. All three need to come together to achieve the desired outcome.

Let's focus on the technology first. Flexibility across data domains (or data products) and data types, implementation styles, licensing structures, and deployment models is critical for a flexible data hub. A highly flexible data model enables you to easily model any business requirement and data structure using the UI. It needs to be extremely flexible in many dimensions. Organizations can centralize data governance activities in a single platform, and doing everything in one platform lowers the total cost of ownership. Using a multi-domain MDM platform enables cross-domain intelligence. Let me describe that: with a multi-domain MDM, organizations can create linked relationships between different data products that allow you to navigate from a supplier record to a product record to a customer record, or vice versa.
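To make that cross-domain navigation concrete, here is a minimal sketch of linked, bidirectional relationships between domain records; the structures and identifiers are hypothetical illustrations, not Precisely's actual data model:

```python
# A minimal sketch of cross-domain links in a multi-domain hub.
# Hypothetical structures and IDs, for illustration only.
from collections import defaultdict

class MasterDataGraph:
    """Stores typed, bidirectional links between records in any domain."""

    def __init__(self):
        # key: (domain, record_id) -> list of (relation, (domain, record_id))
        self._links = defaultdict(list)

    def link(self, source, relation, target):
        """Create a relationship that is navigable from both ends."""
        self._links[source].append((relation, target))
        self._links[target].append(("inverse:" + relation, source))

    def neighbors(self, record):
        return self._links[record]

hub = MasterDataGraph()
hub.link(("supplier", "SUP-100"), "supplies", ("product", "PRD-42"))
hub.link(("customer", "CUS-7"), "buys", ("product", "PRD-42"))

# Navigate supplier -> product -> customer (or the reverse) in two hops.
for relation, product in hub.neighbors(("supplier", "SUP-100")):
    for relation2, record in hub.neighbors(product):
        print(relation, product, "->", relation2, record)
```

Because each link is stored in both directions, a user can start in any domain and traverse outward, which is exactly the supplier-to-product-to-customer navigation described above.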
This cross-domain navigation enables business users to make sense of the data as well as establish relationships. Implementing an MDM provides several insights, like managing and viewing master data, application data, reference data, and metadata across domains. We can enable users to aggregate and view analytical data from many sources within the platform, so you can also bring in sales history data or social sentiment data as needed.

So does a data hub solution for managing this type of data, be it product, customer, or supplier, allow you to ingest and synchronize with any source, whether it is your ERP or your CRM? Once ingested, does it allow you to validate that incoming data to make sure it meets your data quality requirements? Once validated, the data can enter a workflow where it can be further enriched by business users, and each business user can own their own attributes. No single business user owns the entire record; everybody is focused on their specific departmental attributes. The business can also surface analytics on transactional or unstructured data in the context of the domain they are working in. So, for example, as I mentioned earlier, sales history or social sentiment data on how the product is performing in catalogs or on a website can be viewed in the MDM as well. It becomes a hub for better business, not just for better data.

Let's look at some of the core capabilities that a master data management platform should include. In the areas of data stewardship and quality, these include the ability to secure access to the right set of attributes, and the ability to track who made a change and when it was made. We talk about aggregating content from multiple sources to do a match and merge, but also to create a golden record, so that you have visibility into a complete record and can also track its lineage. An MDM platform should also allow you to integrate with multiple systems in your current landscape. It should allow you to browse or filter the data in different ways using different types of hierarchies, so the same customer can be viewed in different hierarchies: a direct customer hierarchy or an indirect customer hierarchy, for example. It should have an enterprise workflow module that allows you to route the data between various teams for review and approval, link the data to create upsell, cross-sell, and substitute relationships, and provide role-based UIs with several dashboards, so that as a business user I am prompted on where I need to go and what I need to address on a given day.

As Forrester puts it, MDM is something you grow into, not dive into right out of the gate. An extensible MDM platform gives you multiple options, or add-on modules that are domain-specific and carry domain-specific functionality, so you can take a modular approach. It allows you to right-size your solution out of the gate and grow over time. The key is not to try to do everything in your vision all at once: start small, think big, and scale fast. Once you have standardized the data, it adds value to the organization; that's a given. Next, as you dedupe the data, it adds more value, possibly for a different use case. Similarly, enriched data adds more incremental value. So as you traverse from step one to step six on your data transformation journey, you derive value at every step, capturing value throughout your data journey with an MDM platform.
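As a minimal sketch of that ingest, validate, and route pattern, with each department owning only its own attributes, consider the following; the rules, attributes, and owning teams are hypothetical examples, not any vendor's actual API:

```python
# A minimal sketch of the ingest -> validate -> route pattern.
# Rule logic, attribute names, and owners are hypothetical examples.
import re

VALIDATION_RULES = {
    "name":   lambda v: bool(v and v.strip()),
    "email":  lambda v: bool(re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", v or "")),
    "tax_id": lambda v: bool(v and len(v) == 9),
}

# Each department owns only its own attributes on the record.
ATTRIBUTE_OWNERS = {"name": "sales", "email": "marketing", "tax_id": "finance"}

def validate_and_route(record):
    """Return one work item per owning team whose attributes failed a rule."""
    tasks = {}
    for attribute, rule in VALIDATION_RULES.items():
        if not rule(record.get(attribute)):
            owner = ATTRIBUTE_OWNERS[attribute]
            tasks.setdefault(owner, []).append(attribute)
    return tasks

incoming = {"name": "Acme Corp", "email": "not-an-email", "tax_id": None}
print(validate_and_route(incoming))  # {'marketing': ['email'], 'finance': ['tax_id']}
```

Records that pass every rule would flow straight through to enrichment and syndication; the rest generate review tasks only for the teams that own the failing attributes.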
So as you standardize the master data for each data domain or data product, you now have a connected data model, or a data mesh, through multi-dimensional and multi-domain capabilities. This helps you manage and see relationships between suppliers, their products, and the customers that buy them, and traverse in any direction; you can also start anywhere. It also helps you view data in different hierarchies that are naturally persisted, view complex relationships that are accurately represented, and get optimal retrieval of complex, highly dimensional queries. It also gives you several ways to both ingest and extract the data.

A good MDM platform also has match-and-merge capability that allows you to aggregate content from different sources, whether ERP systems or other data sources, and create a golden record, and you can pick and choose the attributes on which the match and merge happens. It also has screens, or a user interface, for data stewards to look at any exceptions, make decisions, and view the lineage of how that golden record was created.

A good MDM platform should also have a flexible data model. A data model is how your data, whether product data or supplier data, and the relationships between them are represented; the metadata used to describe these objects, whether attributes or hierarchies; and how you visualize them as a business user. Organizations with a global presence also need to store data in multiple languages, so an MDM platform allows you to do that, lets you create dynamic relationships between entities and domains, and provides several out-of-the-box data models that are industry- and domain-specific.

Data quality is not a one-time event; it's a process, and an MDM platform should allow you to view data quality throughout the life cycle of an object, whether it is product, customer, or location data. It should profile the data, tell you where the gaps are, and allow you to enrich the data at every step. Business rules can be applied to help in the process, and certain business rules can be invoked based on the state of the data in the workflow. Data enrichment can be done in multiple ways: from third-party sources or from users within your organization.

A master data management platform should also allow you to keep track of who made a change and when, and to roll back to a previous version as needed. It should allow you to compare multiple versions, and the rollback should be possible for a given attribute, for a given record, or to a given date. This feature is really powerful and lets you keep track of who made a change and why.

No one user touches the entire product record. A product record, or a customer record, is made up of multiple attributes, and different teams work together to get the data across the finish line. So enterprise workflow capability allows you to route the product, customer, or supplier information so that it gets reviewed by different individuals. It allows you to assign a work item or task for review and approval, and to set reminders and escalation emails, so that at any given state, as a business user, I can look at a dashboard and find out where things stand at any point in time, and where the bottlenecks are as well. Hierarchy management is key to browsing your data in different views.
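Before turning to hierarchies, here is a minimal sketch of that attribute-level audit and rollback capability; the class and field names are hypothetical, not Precisely's actual versioning API:

```python
# A minimal sketch of attribute-level change history with rollback.
# Hypothetical structures, for illustration only.
from datetime import datetime, timezone

class AuditedRecord:
    """Keeps every attribute change with who and when, and supports rollback."""

    def __init__(self):
        self._history = {}  # attribute -> list of (timestamp, user, value)

    def set(self, attribute, value, user):
        self._history.setdefault(attribute, []).append(
            (datetime.now(timezone.utc), user, value)
        )

    def current(self, attribute):
        return self._history[attribute][-1][2]

    def rollback(self, attribute, user):
        """Restore the previous value, recorded as a new audited change."""
        previous = self._history[attribute][-2][2]
        self.set(attribute, previous, user)

record = AuditedRecord()
record.set("list_price", 19.99, user="alice")
record.set("list_price", 24.99, user="bob")
record.rollback("list_price", user="data_steward")  # undo bob's change
print(record.current("list_price"))  # 19.99
```

Note that the rollback itself is appended as a new, attributed change rather than erasing history, which preserves the who-changed-what-and-why trail described above.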
A good MDM should allow you to create any number of complex, multi-level hierarchies, whether a hierarchy for a website, financial reporting, or merchandising. These hierarchies can also have versions, and the platform should allow you to classify your records into the right buckets based on either business rules or key attributes. These hierarchies are used to make sense of the data and to quickly get to the right set of records in a given category or subcategory. Taking that concept further, there is the concept of taxonomies, which allow you to classify and structure your data and attribute content into the right buckets. A good MDM should also support inheritance: it should allow you to assign attributes at the parent level, and the child or subcategory should inherit those attributes. Ownership can also be established at any level in the taxonomy, so you should be able to define which group or user owns a given category, and they can then add and manage the attribution required for their own categories.

MDM as a data hub is designed to drive success for both the front office and the back office. Although MDM has traditionally been a back-office initiative, empowering the front office, the business, is equally important. You do that by enabling them to use the platform directly, or by giving them timely access to trusted data that business users can use. With that, I'll turn it over to my colleague David to walk you through the remainder of the presentation. David?

Thanks, Sumit. You want to go to the next page for me, or I guess I can drive on my end. One second. Okay. Yeah, sorry about that. Okay, I have it. So as much as we all love to talk about automated workflows and machine learning and bots, we know there is no technology silver bullet that can run and evolve on its own. What we have found is that successful programs, regardless of size, industry, or go-to-market strategy, have four core components in place that complement their technology investments.

The first is an actionable framework for data governance that clearly aligns data to business objectives and outcomes. Once we've identified our critical data, we want to make sure we clearly understand the business usage and ensure that our efforts directly align with the outcomes the business expects. For example, the way we approach a business outcome driven by timeliness and availability of data will vary from one driven by accuracy and completeness of data. This also allows us to build a common business language for data. One of the common pitfalls we see in programs is a failure to share a common data vocabulary. We can have the best short-term data quality, but if people aren't aligned on what that data means and how it's used across the enterprise, we have no real shot at long-term success. You'd be surprised, or maybe not, that even things like the definition of a brand or a customer are misunderstood and misapplied across an organization.

The next component is establishing an operating model for data collaboration and ensuring we have a complementary organizational structure. Once we're all focused on the most important things and are aligned on common definitions, we then have to define our data standards and business rules and ensure that we sustain the required levels of data integrity. Think of this as our playbook for data quality and governance.
Once we know the playbook, it's a lot easier to align the roles and individuals that should run the plays. A common pitfall we see in a lot of companies is trying to define the organization or data roles first and then trying to figure out how they all fit together along the various processes. These really need to be done together to ensure a scalable model. Our operating model has to be a living, breathing thing. The business and its priorities are going to continue to evolve, and our operating model for data always needs to adapt. If you build an operating model based on individuals rather than a process, it's going to break down as soon as the individuals move on.

The data harmonization framework is probably the most important component complementing technology. It's a great way to eliminate opinions and focus on the facts that make data important to the organization. It may be counterintuitive, but the key here is not to ask the business what data is important, because the truth is they probably don't know; you'll get 10 different answers from 10 different people. Instead, if we focus on the criteria that make data important to them, we can identify and constantly evaluate which data fits that criteria and always stay focused on the data that's driving organizational value and business impact.

The last component is defining a measurement model that drives adoption. At the end of the day, nothing on the page is going to matter or be supported long term unless we can measure and demonstrate value to the company. Everyone who has started a data program measures data quality in some way. What a lot of them fail to do is tie that to performance metrics that really mean something to the business. Data quality metrics are just numbers unless they have context in the things the business cares about. There's really a progression over time to fully defining the business impact measures, but they always need to be top of mind and always being worked on. To gain long-term commitment and business partnership for your data organization, you need to resist the temptation to just add new metrics that don't translate to meaningful business value. In the short term, they may make your DQ numbers look better, but they're very often not meaningful to the business.

Establishing and following a data harmonization framework was one of the components we looked at, and I want to spend a whole slide on it because it's a key aspect of how effective organizations perform, and it ensures a value-based approach to everything we do. Everything should start with defined goals and outcomes and focus on the value of the data enabling those outcomes. If we use that as the North Star of our approach, we should then follow a structured and repeatable process to identify our critical data, determine the appropriate harmonization approach, and establish a governance strategy that provides efficacy. The first step is establishing the criteria that make a data element critical and a candidate for harmonization and governance. Here we want to focus on the top criteria that make sense for the various functions or processes within the company, from business value drivers to risk avoidance, resource productivity, to process and technical considerations; really, any of the things that make sense to your organization.
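As a rough sketch of such a criteria gate, assuming hypothetical criteria names, sample elements, and a threshold:

```python
# A rough illustration of a criteria gate for critical data elements.
# The criteria, sample elements, and threshold are hypothetical examples.
CRITERIA = {
    "business_value",         # drives revenue, margin, or customer experience
    "risk_avoidance",         # regulatory, privacy, or compliance exposure
    "resource_productivity",  # wrong values cause rework and manual touches
    "process_dependency",     # required by a core business process
    "technical_dependency",   # required by integrations or syndication
}

def is_critical(met_criteria, threshold=2):
    """An element is critical if it meets enough of the agreed criteria."""
    return len(CRITERIA & set(met_criteria)) >= threshold

# A slice of a customer master: element -> criteria the business says it meets.
customer_elements = {
    "legal_name": ["business_value", "risk_avoidance", "process_dependency"],
    "fax_number": [],
    "tax_id": ["risk_avoidance", "technical_dependency"],
}

critical = [name for name, met in customer_elements.items() if is_critical(met)]
print(critical)  # ['legal_name', 'tax_id'] - the subset worth governing first
```

The point is not the code itself but the discipline: data elements earn their way onto the critical list by satisfying agreed criteria, not by opinion.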
This approach ensures that everything we focus on is adding value to the organization and that we're working on the highest-impact data, rather than data people think might be important based on some recent problems they had (you all know that one big fire we had in 2010). Once we filter our data through this gate, the next step is to identify the specific critical data elements that satisfy the decision criteria. As an example, for a typical customer master record, most organizations will maintain roughly 130 data elements, but when we apply leading-practice criteria, we quickly filter down and find there are about 30 or so that really matter and drive value to the organization.

Once we have those filters set, we move on to the next step. Now that we have our critical data elements identified and know what business value they drive, we can establish the levels at which we need to set harmonization and governance to deliver that value. These would typically be levels like enterprise, business lines, markets, product lines, or even platforms. It's critical that we harmonize and govern our data at the right levels. And finally, we need to think about what our harmonization and governance efforts need to support. Our approach and tactics can vary significantly based on what we're trying to achieve. For instance, the depth of harmonization needs to be much greater if we're supporting cross-platform syndication of the data versus reporting and analytics. Beyond syndication and reporting and analytics, we typically want to evaluate metrics, regulatory and compliance requirements, and potential system constraints.

This one's pretty short and pretty straightforward. Data harmonization strategies have to consider context, or they'll fail to deliver value and never gain meaningful followers in the organization. Context also ensures we prioritize the most important things and execute our data quality and governance efforts efficiently and effectively. This is always a great accelerator during initial implementation efforts, but importantly, it also helps support steady-state operations and expansion of the program. Context is critical to all the data being syndicated between systems, being sent to data lakes, et cetera. It is just incredible how out of context things can get; you can end up with really bad reports, or really bad data being used, because the right context was not there for the end users to see.

Lastly, we'll talk about mobilizing the organization. Change management and organizational enablement are key factors in gaining followership for your data programs. They also enable the company to fully leverage the value of your tools. The traditional change management path, which a lot of you have seen and which you see reflected in the track above, doesn't always work for data. Data is different from traditional process-centric change management in that data involves a lot of different personas and is sometimes just a piece of someone's job. A lot of times a key role may not even think of data as part of their job description, but it truly is. So when we mobilize these cross-functional teams, it's important that we also align them around the goals of the data program and clearly outline their role in the process. The roles will vary significantly depending on the program; think about stakeholders versus project team members versus the actual end user of the technology or report, or whatever the case may be.
And then similarly, when we empower the organization with new capabilities, we need to be able to point to the measures that are important to them. Everybody has a "what's in it for me" in their mind until they can see how this will benefit them or their business results. So these augmentations to the traditional change management approach can turn a perceived good project into a perceived game changer for you and the organization. And I think that's it for the presentation part; we're definitely going to have plenty of time for questions here.

Thank you, Sumit and David, for this great presentation. Just to answer the most commonly asked question: as a reminder, I will send a follow-up email for this webinar by end of day Monday with links to the slides, links to the recording, and anything else requested throughout. If you have questions, feel free to submit them in the Q&A. So, diving in here: what is your view on using external data for enrichment, for example D&B, etc.?

Was that for me? Either one. Yeah. So yes, we are able to take data not only from D&B. The key is to enrich data and not just rely on the data you already have in your systems. So whether it is D&B, Melissa, USPS, or Precisely ID, any data source can be brought in via APIs and used to enrich addresses, tax IDs, phone numbers, and emails. And we all know the benefits of having the right data when you need it, so that you can reduce errors as well as make sense of who you are dealing with.

And I would add, depending on how you are using your MDM: if your MDM is also partially serving as a PIM for product data management and syndicating data out externally, some of your inbound integrations may include third-party marketing players who are looking at your data and providing feedback on things that would drive SEO or enhance your product content. That could also be integrated back into a workflow, so that you get the recommendation in, either accept or reject it, and then flow that data back out within the environment.

And with that, what is the main difference between a PIM, product information management, and master data management? I can take that. Product information management is a specific use case for master data management. With product information management, we are more concerned about how to get a new product to market quickly and efficiently: what is the right set of attributes I need to describe a product for e-commerce or a consumer-facing print catalog? With master data for product, I'm looking at the core attributes of a product record, along with the supplier and location data that describe that product or object, irrespective of how it is merchandised. So to summarize, a PIM is a specific use case of an MDM platform. Some vendors have two separate products or solutions, but others allow you to implement both in one single platform.

Okay. So do you have any tips to encourage the business to embrace MDM and its impact for its investment? The key to success in any IT project, and specifically in MDM, is to understand that it is not a departmental solution. MDM is an enterprise initiative. It's an initiative, not a project, and MDM is a journey. Everybody is at a different level, and one is always evolving. So for success, one needs to start with the core data domains and grow over time.
Also, it requires partnership, governance, and change management. The key is also to provide the business with a platform that can support their initiatives and their requirements.

I might add to that a little bit. So, Ray, one of the other things that we do as part of strategic services at Precisely is help companies build a business case for either new technologies or business process change. MDM falls into that same camp, where we have a lot of experience with companies overhauling a go-to-market strategy, many times leveraging an MDM, but it could be other technologies as well. Really, the business case tends to be things like: how do you reduce touches? How do you improve data first-time-right? How do you improve data availability and visibility? There is definitely a business case to be made, and the statistics have been out there from Gartner and several others for years now on just how much it costs a company to find data, or to cleanse data when they find things are wrong; there is a business impact and a very substantial financial implication to that, depending on the size of the business. But the processes and the people are generally the same in terms of the positive impact an MDM can have, and it does start with how you build that business case, to which, generally speaking, we find there are financial ties that can be made showing it's worth the investment.

Very nice. And how would MDM impact the current data architecture, including a current data warehouse? I can take a start at that one. Really, this is up to the implementation within the organization. There are organizations that have replaced front-end data setup systems; think about the system that communicates with your ERP, or, frankly, if you only have an ERP today, people are just setting up that data directly in the ERP. Let's use SAP as an example, since they're the biggest one in the world. You have a lot of folks who are just taking a spreadsheet from, let's say, an email marketer or a product developer, and keying the data into the ERP directly. An MDM can be the front end of a go-to-market process: the data collection from end to end can happen within the MDM solution, and it can be outwardly syndicating data to your ERP, writing data to other databases, and also taking data in along the pathway. So it can really be used as a way to improve the speed with which you set up information about a product or a service, or in some cases it can sit in the middle of the process. It depends on the architecture and the processes involved as to whether the MDM would be an orchestration tool along the entire pathway, or sit at the beginning, or sit in the middle.

How do you account for risk in enriching data with PII values? A good MDM should allow you to secure the data not only vertically but also horizontally, which means it should allow you to restrict or grant access to the right records, but also to the right attributes. Need-to-know access and masking are all good practices to follow. So PII data can be stored in an MDM if needed for specific domains, and specific users in a given role can be granted access as needed.

Along with that, regarding privacy laws: how do you apply centralized MDM in a global organization in the context of data sovereignty requirements, where data must remain within the bounds of a country?
Yes, we run into a lot of those use cases on a daily basis. Whether it's a company operating out of Europe or Asia, certain countries have restrictions that data cannot leave the country. In those scenarios we have several decision points. We can either replicate the data, creating another instance of the MDM that ensures the data resides in a data center in that country, and then replicate the data as needed; or carve out a process, a workflow, that ensures the right people have access to the right set of records or a given data object in the specific country. So a combination of security and infrastructure options allows us to support that need.

What's a good way of explaining the difference between a system of record and an MDM solution that manages access to the same data domain? For example, HRIS is the system of record for employee data. David, do you want to take that? I'm trying to make sure I fully understand; let me reread it. Yeah: what's a good way of explaining the difference between a system of record and an MDM solution that manages the same data domain, for example, the system of record for employee data? Okay.

I think, generally speaking, it comes down to, as I was saying before, harmonization and context. Your HR data may be minimal, or it could be the opposite: the HR system could be the system of record that has a lot of different data elements. So, like the example before, where I talked about a customer record that may have 130 data elements but only 30 that are truly valuable to the business, the same kind of thing may apply here: an HR system may hold all of your employee data, a couple hundred data points about the employee, but only, let's say, 50 of those actually get sent down and used within the MDM context. So it's really about usage, and understanding that lineage is another thing a good data governance program needs: you need visibility into that lineage to understand, to your point about the system of record, where that data is being used and how it is being used. Context-wise, an MDM may be using that data for things the HR system would not use it for whatsoever. So it's about making sure that, as you pull the right data from the right place to send it to another reporting and analytics tool or whatever the case might be, the context and lineage of that data are understood. Hopefully that helps answer the question, and certainly the questioner, Michael, can add any additional thoughts.

Moving on to the next question: what if data governance has siloed data stewards for different segments rather than domains? It's difficult to build out the domain models with harmony. In this case, when you have data stewards operating in silos, we need to come up with a workflow that orchestrates the decisions from one data steward to the next, where each data steward has access to why a decision was made in the previous step, so that they can make informed business decisions. So it's possible to operate in this model using workflows and following multiple MDM implementation strategies. MDM does not need to be implemented as a data hub; there are other implementation models that can be applied, and it can also be implemented in a hybrid model. So we would like to learn more about the business needs.
Based on that, you would be able to pick the right approach that works for you. And I think the other potential answer here is that you do have data stewards, but there should most likely also be a business process owner who exists within either the domain or even within the segment. Those business process owners and the data stewards, frankly, should be having a conversation with each other to ensure the system can make things more efficient. Let's say that five steps of all the segments' processes are exactly the same, and then two steps deviate here and two steps deviate there; you have a custom part of the workflow doing that deviating work. But if there are steps that are the same, and data that is the same, it could be wasted work and wasted effort to have all of that duplication just because you're sitting in a different segment.

And how do you organize MDM data when data is shared across multiple business units with their own contextual views, definitions, or values over common data? This is a common use case, not only when you have multiple business units, but also when, as a company, you acquire multiple units or organizations running on different ERPs. Each ERP may have the same data with different levels of completeness or different data quality. This is where MDM comes into play: MDM can create a common data dictionary, keep track of which system is the source, and, by attribute, define a primary source and multiple secondary sources. As it creates a golden record, it can also keep track of which source system was used to build that golden record. And as the data is enriched, the MDM can send the data back to the business units' ERPs. So yes, it can allow you to describe the same product based on the context of the business unit, and on how a user in a business unit wants to describe that particular object. David, would you like to add anything? No, I think you got it. All right. Love it.

So how do you go about identifying a single source of truth? To identify a single source of truth: well, again, it's never at the record level. Data comes from multiple sources, and the best way is to break a record down to the attribute level and find which system or systems have that data, then set up a priority: if system A has this data and system B has this data, which one do we use, and if system A does not, can I pull in the data from system B? Use the MDM to match and merge based on the specific attributes that you define, create the golden record, and establish the single source of truth in the MDM, not in the ERP. ERPs are traditionally built for transactional data, not for managing a single source of truth.

The other thing I might add to that: my recommendation would be that the single source of truth is sort of like the unicorn. For most companies, especially large companies or legacy companies, it's probably just not tenable; it's probably not something that can actually be done. What really needs to happen is a focus on documenting and understanding the lineage of that data: understanding, to Sumit's point, where the information has come from, where it is going, and especially whether there's any risk of dual entry because you just don't have an interface from system A to system E, for example.
That lineage, and having an understanding and documentation of it (it's more critical to have the documentation of it), is really what I would consider to be the single source of truth: the actual governance tool, whether that's a data dictionary in Excel or an actual data quality or data governance tool that you buy on the market. The lineage part of this is how you get to understanding not necessarily a single source of truth, but a truth about the source. It's critical to understand that lineage and how data is used, and especially, like I said, dual maintenance, where you have system A not talking to system E and therefore somebody has to rekey data. That's okay if it's necessary, but it's not okay if there is no understanding of the fact that it's happening.

Thank you. And I think this next question is a little bit loaded; we could probably do a whole webinar on it, especially on all these silo questions. But is it possible for a company to be successful without a centralized data governance team? Yeah, I think the answer is yes, 100%. There are thousands of companies out there making millions of dollars that don't have centralized data governance teams, absolutely. I think it's a question of: at what point does the work that you and your people do, decentralized, become scattered enough that information starts to get degraded, quality starts to go down, and trust in your data, within the organization and also outside of it, starts to falter? That's really the point in time when you have to start looking at whether you need to start centralizing your processes, centralizing a team, or getting a technology, or multiple technologies, in place to start monitoring and managing your data. Because at some point, the scattering of spreadsheets, Access databases, and PowerPoint decks may be fine in a smaller organization that doesn't have a lot of products or a lot of people, but when that scale gets out of hand, you're either going to be driving significant productivity reductions, with people spending more time trying to find information than creating and driving sales, or you're going to have, like I said, a degradation of consumer trust if your information is just wrong out there in the ecosystem.

Thank you, a great answer. And just going back a little bit here in the chat, so many great questions coming in: how does MDM consolidate data sources when common keys among the different data sources are not available? So I'm going to let you take that one. Yeah, so MDM can create a composite key. If you don't have one primary key, you can use business keys: you can take the name of the manufacturer, the manufacturer ID, or the UPC number to create a business key that identifies a unique record. Yes, sometimes that becomes a challenge, because different systems and different objects may need different approaches, but with an MDM solution that allows you to map the solution to the business, and not the other way around, solving this problem is possible.

And we have indeed done whole webinars on this next question, but what would be your approach for developing data quality scores? And how would you continue to measure those scores?
There are several areas, and I'll let David also chime in here, but data quality scores can be built on the fill rate, the accuracy, and the timeliness of the data, and channel-readiness reports can be set up that look not only at how complete my data is, but at how accurate it is and whether it is in the right context for the channel that needs it. David?

Yeah, the only thing I'll add to that is that my approach is always action-oriented. Number one, if it's a critical enough data element, let's just use the most basic of data quality metrics: is it complete? If it's a data element that is critical to a process or critical to a KPI, and it needs to be 100% complete all the time, then great, that's a metric you should put in place. But if you have a field that is, let's say, conditional, where sometimes it's supposed to be filled in and sometimes it's not, you're going to have to build a data quality metric against it that accounts for the condition: if it's only filled in when this other field is filled in, then you've got to bake that whole data set together to understand those connection points and that context. What you don't want is the conditional field showing a 35% completeness hit and people scrambling, thinking there's a problem, when that's exactly what's expected, because only 35% of the records it's conditional on are actually filled in.

So it takes a little bit of time and understanding. When I want to score an attribute: number one, do I want to spend the time scoring it? Number two, what is the context, and what action will I take if the quality score is less than X percent? Another thing that needs to be considered longer term is that not all scores are created equal. Again, go back to whether that attribute, and its completeness, conformity, or accuracy, the things you want to score, measures up to something meaningful: is there a negative business impact, or a business opportunity, in having that score be greater than 80%, greater than 50%, or at 100%? Those things all have to be taken into context when you're setting up your data quality scores: what is it that you want to score, why do you want to score it that way, and, truly, the last part, what action will be taken? And there's a process: the action needs to come with who takes the action, what the process is, and whether the action needs to be governed and approved by somebody. Because even then, when you're trying to bump up data quality scores, you could temporarily hire somebody to just crank out a bunch of data to get something from 80% to 100%, but does that need to be approved? Does it need to be reviewed? You still need to make sure the process is governed and managed well to achieve the data quality score you're looking for.

Very nice. So I think we've got time for one more question: do you have a list of criteria that makes data important? Do we? Yes, we do. We actually have what we call a critical data element decision tree. It wasn't really part of this webinar, but we do have a template that we use that really looks at the definition of a critical data element.
I'll just walk through the criteria right now. One is: does it have a business impact on more than one function or more than one usage? So if the data is wrong, does it have a business impact on analytics, on actual revenue, on privacy information? Does it have an impact on operational uses, things like transportation and warehousing? If it has a business impact on more than one of those, it should probably at least be sitting in the "this might be critical to the organization" bucket. Then there's operational impact: if that data, and the reporting that comes off of it, impacts more than one team, think of a business segment, a region, a different function, sales versus marketing, the pricing team; again, if that data element hits more than one functional area, it probably also falls into the camp of being a critical data element that needs to be looked at. So business impact and operational impact are the two big areas we look at. You're looking across business units and across functions, and if that attribute is used by, and impacts, more than one of those things, most likely it's a critical data element.

But the key there is that even if we say it is a critical data element, you still have to go one level further and ask how we want to govern that data. Is it critical enough that it needs to be actively managed in the tool at the time of entry? Or is it passive? So you have active versus passive governance strategies. The passive approach would be: let me put a scorecard together; the scorecard will run once a week or once a month, and here's the action that gets taken if that data is found to be inaccurate or incomplete at that time. There's still that next level of critical, but how critical? Critical to the point of needing to check the zip code at the time of entry, like Amazon does every time we go to place an order? Or is it passive, where I can check that data once a day, once a week, once a month, whatever the case might be? Those two things work hand in hand in terms of how you set up your overall critical data element matrix and also how you govern it.

Well, David and Sumit, thank you so much for this fantastic presentation and conversation. And thanks to all our attendees for being so engaged; we had lots of great questions come in today. Just a reminder, I will send a follow-up email by end of day Monday for this webinar with links to the slides and the recording from today. David and Sumit, thank you so much. Thank you as well. Thank you. Thanks, everybody. Thank you all.