All right, hello and welcome. My name is Natalie Raymond and I am the digital program coordinator at Dataversity. We would love to thank you for joining this Dataversity webinar, Data at the Speed of Business with Data Mastering and Governance, sponsored by Informatica. Just a couple of housekeeping points to get us started. Due to the large number of people attending these sessions, you will be muted during the webinar. For questions, we will be collecting them via the Q&A panel. If you'd like to chat with us or with each other, we certainly encourage you to do so; just note that Zoom defaults the chat to send only to panelists, but you may absolutely change that in order to network with everyone. To find the Q&A or chat panels, click on the icons found in the bottom center of your screen. As always, we will send a follow-up email within two business days containing links to the slides, the recording of the session, and any additional information requested throughout the webinar. Now I'm happy to introduce our speakers for today: Jason Beard, Ryan Glazenau, Jay Hawkinson, and Taryn Stebbins. Jason is the Senior Director of Data Strategy and Governance at Informatica. With over 25 years of data industry practitioner and consulting experience, Jason is a hands-on strategist, data governance expert, and change leader with experience spanning the business and technology domains. He has led enterprise programs, transformation initiatives, and operational teams in a variety of industries, including research and educational publishing, consumer packaged goods, banking, investments, and insurance. Jason joined Informatica in 2018, where he came over as a customer using Informatica's MDM, DQ, and data governance solutions.
Ryan is a leader for the data governance solution specialist team in North America, helping to support other data management practitioners, operationalize their data strategy, and show the value of how data governance can help accelerate an organization's business transformation. Jay is the Director of Data Analytics and Insights at Valmont Industries. He is a change agent who builds high-performing technology teams to make game-changing differences in organizations. His broad-based business background has fueled change through people, processes, and technology. Using a strategic approach to resolving business challenges, Jay has maximized IT value by improving business processes, reducing IT spending by over 25%, and delivering gains in client satisfaction of up to 83%. Last but not least, Taryn is the Director of Data Strategy and Master Data Management with Fragomen. With 20 years' experience building data management frameworks that enable enterprise-wide data governance, Taryn is a highly accomplished data leader with expertise in strategic planning, data management architecture, and continuous improvement. She brings a proven ability to deliver both high-level and fine-grained insight into enterprise operations. And with that, I am happy to give the floor over to our esteemed panel to get today's webinar started.

Hello and welcome. Thank you so much, Natalie. So welcome to our panel discussion today around data at the speed of business with data mastering and governance. I appreciate the introductions, Natalie. I want to give a little bit of background about Informatica real quick. We have a new theme, "where data comes to life," and this is because Informatica is a leader in enterprise cloud data management.
We're empowering businesses to realize the transformative power of bringing their data to life, and we've created a whole new category of software, the Informatica Intelligent Data Management Cloud, which is powered by AI: an end-to-end data management platform that connects and unifies all of the data. Before we dive in, I just want to introduce myself, and then I'll pass it over to Jason as well as to Jay and Taryn. I'm Ryan Glass, the senior director for our data governance field sales here at Informatica. Jason?

Hi, everyone. Jason Beard. I'm the senior director for data strategy and governance in our advisory services practice at Informatica. Over to Jay.

Hi there. I'm Jay Hawkinson, the director of data and analytics at Valmont Industries. Valmont is a global manufacturer doing lots of stuff in all sorts of areas, especially in the irrigation and agriculture space, and infrastructure like traffic and lighting and other pieces. I came to Valmont two years ago to try to build out their data piece so that we could really go on a good data and digital transformation, and so far we've had some great success.

Hi, I'm Taryn Stebbins. I'm with Fragomen, and we are the world's leader of immigration services and supports, with over 60 locations worldwide. We leverage all of our immigration experience to offer our clients anything they need from an immigration and mobility perspective. We are currently going through a digital transformation for the firm, and we also see a need for data standardization and improvement of data quality, so we are leveraging technologies like Informatica and master data management to start to launch that initiative.

Perfect. Thank you, Taryn. Okay, this is Jason. I'm going to get us started with just a little bit of context setting around the topic for today. Data is the lifeblood of every modern business. With so many capabilities available in the data space, though,
it's easy sometimes to forget why we're doing it, and building a great platform is pointless unless you do so in the interest of directly addressing real business problems. I was just talking a little bit about some of the kinds of business issues that we commonly see from customers, particularly in the master data and data governance space: growing the business and increasing agility; making it easier to find, understand, trust, and access data for today's use in applications; an overall reduced time to value for data; and improving customer experience through data intelligence that makes your business smarter. Understanding your customers and all of their needs means having an awareness across their household, and an awareness of claims activity if you're in the insurance business: understanding your customer from many perspectives, across the entire value stream in which they operate with your business. Then there is reducing operational costs: enabling greater efficiencies, less time wasted searching for answers, exposing relevant data to your organization. A great example is one of Informatica's healthcare clients, which was able to reduce their data onboarding costs by $10 million through greater leveraging of these kinds of platform technologies. And finally, managing risk and compliance, to ensure that data-driven business initiatives are aligned not only with what you're trying to achieve as a business but with broader industry and government mandates, protecting privacy for your customers. However, we know that with high-performing organizations that are focused on data delivery, sometimes there can be... next slide, Ryan, please. There we go. Thank you. It's important to be thinking about what the right questions to ask are as we get into solving some of those business problems. It really comes down to: how do we understand the data? How is data defined? Where do we find the compliance requirements about the data, and what's actually required for us to achieve within that?
How is data classified? Being able to find information that's relevant to the topic you're talking about, cataloging that information, making it available to a broad cross-section of your business. Governing data: how do I know who is responsible for this information? We all live in corporate environments where it's sometimes hard to understand who is the owner of this information, who's really got the best understanding, who's responsible and accountable for its quality and for setting the rules around it. So governing that information ensures that we've got that kind of visibility. We're constantly mastering the data to make sure that when we have 15 versions of Jane Smith in a system, we know which of those 15 are actually the same person and which ones are not. Simple things, obviously: mastering customer and product, but also supplier and vendor, and of course location information and a whole raft of reference data that is really at the heart of a lot of what businesses rely upon to operate. And ultimately, measuring your information: understanding how that data looks quality-wise and being able to ensure that it's available and trusted as the desirable outcome. Over to you, Ryan.

I think we're actually ready for our first discussion. Where we wanted to start today, with these business imperatives we just talked about, is: what are your business imperatives? When we talk about things like faster time to value for information, or improving customer experience, what are the business problems that you are trying to solve? So let's start with you, Taryn, maybe kicking us off with some of the business imperatives around Fragomen's business that are driving your data initiative. Yes.
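The mastering step described above, deciding which of 15 "Jane Smith" records refer to the same person, can be sketched with a simple fuzzy-matching pass. This is a toy illustration only: the sample records, field names, weights, and threshold are invented, and Python's stdlib `difflib` stands in for the much richer match and survivorship rules a real MDM engine such as Informatica's would apply.

```python
# Toy sketch of record matching for master data: score candidate pairs and
# flag those that likely represent the same real-world person.
from difflib import SequenceMatcher

records = [
    {"id": 1, "name": "Jane Smith",    "email": "jsmith@example.com"},
    {"id": 2, "name": "Jane A. Smith", "email": "jsmith@example.com"},
    {"id": 3, "name": "J. Smyth",      "email": "jane.smyth@other.com"},
]

def similarity(a, b):
    """Blend a fuzzy name score with an exact-email match signal."""
    name_score = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    email_score = 1.0 if a["email"] == b["email"] else 0.0
    return 0.6 * name_score + 0.4 * email_score

def match_pairs(recs, threshold=0.75):
    """Return id pairs whose blended score clears the match threshold."""
    pairs = []
    for i in range(len(recs)):
        for j in range(i + 1, len(recs)):
            if similarity(recs[i], recs[j]) >= threshold:
                pairs.append((recs[i]["id"], recs[j]["id"]))
    return pairs

print(match_pairs(records))  # records 1 and 2 match; 3 stays separate
```

In practice the interesting design work is in the rules (which attributes to trust, how to weight them, what threshold triggers human review) rather than the loop itself, which is why this sits naturally under governance.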
So, as I mentioned at the onset, one of the key things we're trying to do is digital transformation. We see a need for managing data, especially master data, to enable digital transformation, so that it can support our growth and also increase our agility to offer more services or products. But as we've been taking this journey, we have seen the other three imperatives pop up as well. Around improving the customer experience: we were having some quality issues, with fragmented systems and fragmented processes around our current master data, and bringing that together with Informatica is helping to improve that customer experience. There was also a lot of manual effort around managing our information, our client data, and we're starting to see less of a need for people just to keep the business-as-usual client data quality up and running, so we're starting to see a reduction in operational costs. And then of course there is managing risk and compliance: we have a reputational risk that we address by improving our data quality for our customers, but we also need to make sure that we're in compliance with some of the government regulations we have as well.

Fantastic. Thank you, Taryn. And Jay, maybe just a couple of thoughts from you as well on what some of the key business initiatives driving Valmont's data initiative are.

Just about everything. When I started, it was mainly about getting access to data, and once we addressed the access problem, we ran into all the others. No one was really trusting the data that was coming out; no one knew where the data they needed to get to was.
I did a lot of work focusing on how to do it. My challenge is a little bit different, in that I'm being asked to solve all these things at the same time, while I'm trying to explain to people how to build the house and how you have to start from business imperatives. A lot of it comes down to digital transformation: being able to take all the silos we have throughout the organization and address them quicker and easier, as well as the transformation of eventually consolidating our systems into strategic platforms that go across the business, and at the same time as all that is going on, reducing the time to insights for our people across the board.

Excellent. Those are all items that both you and Taryn have mentioned, and topics that we hear about quite often. The big challenge, of course, in organizing to solve those problems is how you collaborate across your organization. Understanding what the business problem is and what you think you need to do to solve it is part of the puzzle; getting engagement across the business is the other part. So maybe, Taryn, back to you on how you approached addressing that issue.

When we kicked off the master data management project, we didn't start with the technology; we started with the processes, which echoes what Jay was mentioning for their digital transformation. Because we started with the processes, we could then understand what data is needed within those processes. We laid out what we call the life cycle of client services and focused in on a couple of those life cycle steps and the processes they contain to start our master data management journey. That helped to contain some of the scope creep, in a sense, even though it's continuing to expand, similar to Jay.
Every day that we work on this initiative and this program, lots of things keep popping up. I wouldn't call them data issues, but just, "Hey, can you fix this? Can it do this?" because people are starting to see immediately, even in production, the value of what the master data management program, along with the enabling technology from Informatica, can do.

Jay, anything to add to those ideas?

Oh, absolutely. Part of the problem we ran into is that as soon as we saw things like the data access problem, we got a whole new series of problems to solve; it's always snowballing. Collaboration for us is one of the challenges. Valmont grew with very independent, entrepreneurial spirits within all the different divisions, and when we acquired companies, we acquired them and didn't do very much with them except try to find more ways for us to work together. We certainly did not consolidate data or processes across the board. One of the things that I've been doing is working with people when they're ready. One of the important things for me is looking throughout my organization for when the organizations are asking the right questions. I can't do it big bang. I was told by the CEO, "Oh, I don't want this data governance stuff. Just use AI and ML to do it for us." Okay. So what I've been doing is working with, for example, the finance department right now: the finance department is finally asking all the right questions and realizing the challenges with quality and all the pieces. So now I have a group that I can apply that governance to, and we've been very successful very quickly by doing it that way.

Excellent.
I love that idea of meeting people where they are, really building that coalition of the willing who are ready to get in, help solve those problems, and be early adopters of some of the program methodology and techniques you're describing. And coming back to your comment about having to balance what I'll call the run and the build aspects of rolling out a program like this: as you start seeing the problems accumulate and people come to the table with issues they want resolved, you're also trying to build a broader capability, in some sense trying to change the tires of the car while you're driving down the road. So a good governance structure is important to managing all of that, and to having some sense of how you're going to prioritize the issues that come up. I'd be curious what your thoughts are on how you've approached that kind of overall governance of your programs. Who'd like to start?

Well, I can start a little bit. Similar to Jay, we've been doing what we call an informal coalition of people, to kind of paraphrase some of your words. What we've been taking to our senior leaders, our executives, is that we need to make this a little more formalized, so that we can have decisions and authority about how we remediate some of these issues and make the fixes more preventative. Again, as you keep going on these journeys with a data management program or a master data management program, you start to see these little things trickle in. We want to make sure that things are standardized, that decisions are made by a governing body with authority, and then we can make those things happen, working with our partners within technology and with operations as well. Okay.
Yeah, from my side, what we've been doing is building out that governance with the willing, but it's expanding past that very quickly as we look at a digital transformation. If you take a record like the customer record, which we're working on right now: finance doesn't own the customer record. They own part of it, of course, but they don't own the whole thing. I've been using that to build out a larger governance structure where I have steering committees within commercial, engineering, finance, and operations, and in these cases we're working together at that larger team level. I'm trying to avoid a top-level data group, only because I have an executive leadership team that's very engaged and can act in that role of giving me high-level priorities. But at that lower level I need people who are actually going to work together, and for me I can do that by having those individual steering committees work together to solve an individual piece of the digital transformation.

Yeah, that's interesting. It's sometimes the third rail of data governance: starting with those data elements and data attributes that are shared and used across multiple groups can sometimes get mired down in disagreement, and it's slow progress. But it sounds like you've found a good way to really hone in on some of the fields, some of the pieces of data, that will make a difference to multiple parts of the business, and you've got a good way for them to work together, so kudos to that.

Just to add to what Jay was saying: from a top-down perspective we just want to have decisions made, but we do have a level of governance, what we call working groups, that spans those cross-functional teams. Similar to what Jay was saying, take the customer.
Some people kind of feel like they own the definition of the customer, but what they really own is their own fit-for-purpose needs. We are developing what we call client data working groups, so that we bring all the cross-functional owners of that client record together to make sure we have appropriate standardization, and so we can bring it up to a level of decision making.

Fantastic. It sounds like you've both done a great job getting the people organized, which is sometimes the hardest part of these programs, and having a way to identify those priorities, move things forward, and find some real value, so congratulations on that success. I think at this point we're going to flip to our next slide and talk a little bit about some of the challenges to governing master data. Ryan, over to you.

Yeah, real quick, I wanted to circle back on one of the things... Oh sure, please. ...Jay, that you said, when you talked about working with people when they're ready or when they're willing. Something I've seen organizations also be successful with is having your own little PR system in place that keeps people educated even though they might not be actively participating, because that's going to help accelerate them: they're able to see the successes these governance organizations are bringing to the table, and that makes them much more willing and wanting to participate down the road. So I really liked that.

And that's part of what we'll call our data literacy program: publishing the stuff that we're doing, having people see what's going on in the other parts, and the successes. To that point, it's been very successful for what I'm doing, as well as the work we're doing on our data science and more innovative technologies. Yeah, absolutely. Thank you for that.
But let's pivot real quick into the challenges to governing master data. We're going to talk about three key topics here, and really why it's hard to govern; again, we'll have Taryn and Jay discuss this. The first one is going to be around identifying unknown data sources. It really starts at the source, and data governance should be there to help you. Think about what you want that to cost your organization: do you want your data mastering team spending all of this time searching for data, or improving its accuracy? There's obviously an opportunity cost to the time being spent here. So the first challenge as part of the discussion is this: identifying data sources is often the starting point for data governance, so how does identifying data sources and data types create this challenge for your organization? Taryn, Jay, one of you want to take that first question? Where were you seeing teams spend a lot of time hunting for data, and less time being productive? I know, Jay, you talked about access being one of the first things you were focused on.

Absolutely, this was one of our core challenges. In the past we spent over 85% of all our data science and analytics initiatives just finding the data and knowing how to use it. It slowed down value attainment, and in all honesty the business teams quickly got bored with it and just went back to the old manual processes, so everybody's time got wasted. We had a huge risk of incorrect identification, and a lot of rework being done within my own team, because they would do something, give it to the business, and the business would say, "Yeah, that's wrong," and that would be about the level of the conversation we had. So we found that this is an important aspect of what we need to do.
For us, we have to do it, and we have to do it right, to reduce the risk and get the fruits that we need at the AI and ML level. The way I've been doing a lot of this is by building it in as part of our muscle: every time we start a new data project, we start with identifying the data sources and typing them, using a data catalog solution to bring this information all together, so we look for it and find it once, and then share it with everybody.

For us, in our master data management program, identifying the data sources was our initial challenge. Like I described earlier, we laid out what we call our life cycle of client services and associated the key technologies that supported it, and that's how we identified our data sources. It was a challenge at the beginning, because we were going back to the system of record and working with the team members who represented that system of record, but they didn't know how to provide that data. So we had to turn to our analytics team to get to our key data sets, to identify our data sources so that we could profile them appropriately and understand what the state of the data was, and it improved significantly once we got past that point. But I agree with Jay, from past experience as well as at my current firm: when you have data sets that are trusted, kind of that source of truth, that helps everything go easier, faster, and quicker. There are opportunities we see for improvement across certain systems of record, to make sure that the key data sets coming from those systems of record are trusted, so that we can continue to mature our data and our analytics as well.

Yeah, and this is something that we see with organizations: they just want to understand what data they even have out there.
And when you're able to identify that data, scan it, and bring in the lineage, it's just this big aha moment that organizations are having, where they're able to see where data is coming from and going to, and the data quality, Taryn, that you mentioned. So it definitely is a great starting point for people coming in to identify that data.

Ryan, it's a challenge for us too. I know we're specifically speaking to master data, but we're leveraging some of the other capabilities within the software as a service around data cataloging and metadata collection, so that we can see, beyond just the master data, where else data is residing and where it is sourced from, and start to deploy standardization, like authoritative data sources for this information as well.

Absolutely. Let's go on to the second challenge here. Jason?

Yeah, happy to kick us off here. The second challenge, one that I know we can all relate to, has to do with improving low trust in data accuracy. We talked before about what the point of governance is. One part is increased availability and visibility, but certainly it also has to be creating trust in information. We all know that so much time is still spent trying to establish that trust at an individual level. At the end of the day, do you really trust the number you get out of a system, or do you always go back and verify it? So we're trying to shorten that cycle to where people can really believe in the information they're seeing. Some interesting numbers here: only 27% of data owners completely trust their data. That's from an IDC survey, and I would argue that that 27% probably don't always know as much as they think they know about their data. The second point here: trust in data degrades as it moves further away from the origin. Absolutely true.
We all know the kinds of stories: I believe what's in my ERP system, I believe what's in my CRM system, I believe what's in my manufacturing system. But when it goes to that data warehouse or that data lake, or it goes into the black box of the corporate data platform, I start to lose confidence; I don't know what happened to that data on its way to those places. Creating that visibility and creating that trust is a really important piece of it. Ultimately, those source systems can't operate in as flexible and integrated a way as modern data provisioning requires, so we have to establish that trust in order to get the analytical value out of what we're doing. And finally, rigid, manual, documentation-based approaches do not scale. There are thousands of data sources; Informatica has customers with tens of thousands of data sources and assets that are being scanned. There's more data out there than humans can ever keep up with, so it's really important that there be a process in place, and visibility to create that trust, that doesn't rely on poor data stewards sitting there cranking out Excel spreadsheets day after day, or updating them based on changes in the data environment. Let's move to the next slide and talk a little bit about this key question: how do we improve low trust in data accuracy? Jay, would you like to kick us off and maybe talk a little bit about how you've approached the concept of trust, and creating that trust within your business user population?

Sure. About two years ago when I started, data access, as I said, was the big problem, and as soon as I gave people access, quality and trust became the issue. Part of the problem was that forever we did things by consolidating them in Excel spreadsheets, so people trusted the result because they trusted the person who was doing it.
Well, I'm trying to get rid of that person; well, not get rid of them, but I want them doing something else, not manually transforming this data. And this became a big issue for people to see. We've done this in a number of ways. First of all, we do have some core problems that we have to solve, and we're doing that with the work we're doing in master data and data quality, and by getting people to understand what bad data really means: it's not just accuracy, it's not just missing values, it's a whole set of criteria they have to work with. So we're using data quality, and we use the data lineage quite a bit to show people where that data is coming from and where it's being transformed. That works really well for our analysts and our data scientists; for our leaders and managers, not so much, because they have no idea what's on the other end of that box. But we get around that with the stuff we're talking about now: getting at the problems that they and their people were having when they were transforming that data, even manually, and targeting those in our data quality initiatives.

Excellent. And I love that idea about really focusing on the different personas and populations that you have to appeal to there; it's not one-size-fits-all, so I think that's a really powerful point. Taryn, can you talk a little bit about your experience with trying to create trust in your data within your data governance program?

Yes. When we were beginning our master data management journey, we did the typical profiling of the data to assess the quality of what we were about to bring into our MDM tool. And it was really interesting how team members who had been working within their systems of record thought, "Oh, we've been keeping on top of this, we've really been improving the quality," and then they saw the outcome.
"Oh, you know, we didn't." There were limitations at the system level on putting controls into place to prevent people from going past a screen without filling in a field, or making sure things are truly mandatory beyond the asterisk that says "this is a mandatory field." So people are starting to use that language now: we need to improve the data quality, and not only do we need to look at completeness, we also need to look at the accuracy of this information. People are starting to use that vocabulary, and one of the next phases of our program is to educate across the firm as well, so that it's not just the core team within certain processes; it's a language that's used across the entire firm.

And I think that's where everyone's trying to get to, for sure. Listening to both of you talk about trust in data, it occurs to me that "data" is often used loosely to describe both the data itself and, of course, the reports that represent that data. What I find is that there's just as much, if not more, confusion at the reporting layer as there is at the data layer. What am I looking at? Do I trust it? Do I understand what it's telling me in this report? Even if I believed the underlying data was correct, is this representation of it accurate? So governing reporting and analytical outputs is a key piece of this as well. Can you talk a little bit about your experience there?

Absolutely, you have it right on. We came from an environment that used to allow everyone to do all the reporting they wanted, however they wanted to, and we ran into the exact situation you'd expect: we had over 4,000 reports, and no one knew what was right, what was wrong, or how it was done.
We tried to fix the problem by moving to an SSRS environment, whereupon they of course lifted and shifted everything that was in the old environment into an SSRS report and recreated the same thing in a new place. So that's one of the areas we're focusing on quite a bit: how we're having that communication, and how we have to tie everything we're doing on the analytics platform into things like the data catalog. We have to share this information so that everyone sees and can understand where the pieces are, and try to document all the important pieces. A good point that was brought up before is that you can only do so much. There is so much data coming at us, and you have to really pay attention to the most important elements, not everything all at once. We've been finding that very effective at getting people to really focus on the thing that really changes.

The biggest challenge I have is getting people to stop thinking about it as magic. Somehow people think that data quality is just something that magically happens. "Oh, it's an ERP system, it's going to be good quality." Yeah, no, that ain't the case. It's a literacy conversation: getting them to understand that the asset is important, and why it's important that a currency has to be "USD" or "US" but not both at the same time, and what that implies for the rest of it. That is something at the forefront now for people to truly understand on the analytical side; they understand how hard it is for them to get good analysis if they haven't done the work they needed to in the earlier parts. Yeah.
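The kind of data quality rule Jay alludes to, catching inconsistent currency codes like "USD" versus "US" before they reach analytics, can be sketched as a small standardization check. The field name, allow-list, and alias table below are illustrative assumptions, not drawn from any actual Valmont or Informatica rule configuration.

```python
# Minimal sketch of a data quality rule: standardize a currency field and
# report every row where a fix was applied or no fix was possible.
VALID_CURRENCIES = {"USD", "EUR", "GBP", "JPY"}           # illustrative allow-list
ALIASES = {"US": "USD", "US$": "USD", "$": "USD", "EURO": "EUR"}

def standardize_currency(value):
    """Return (clean_value, issue); issue is None when no fix was needed."""
    code = value.strip().upper()
    if code in VALID_CURRENCIES:
        return code, None
    if code in ALIASES:
        return ALIASES[code], f"non-standard code {value!r} mapped to {ALIASES[code]}"
    return None, f"unrecognized currency code {value!r}"

def profile(rows, field="currency"):
    """Apply the rule to every row in place and collect the issues found."""
    issues = []
    for i, row in enumerate(rows):
        clean, issue = standardize_currency(row[field])
        if issue:
            issues.append((i, issue))
        row[field] = clean
    return issues

orders = [{"currency": "USD"}, {"currency": "us"}, {"currency": "Pesos"}]
print(profile(orders))
```

The point of the panel's "not magic" comment is that rules like this have to be written down and agreed on by the business; the code itself is the easy part.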
Okay, I like what you said there, Jay, that data quality is not magic, and this is something I was just talking about with a panel last week in New York: how important it is to fix the root cause, which is typically the business user, right? They're the ones putting the data into the system. It's fixing the root cause, not just diagnosing the symptom on the back end, which would be that magic you're talking about. There are lots of things we can do. One of our challenges was that the ERP systems never had the lockdown they should have: required fields and rules that should have been applied in the first place. So there is work that has to happen, and the good thing is, as you start analyzing this, as you start looking at the data, figuring out the business rules, and defining what good data is versus bad data, you figure out what has to happen. You have to build your roadmap both technically and for those end users: what do you do to fix it, and how do you keep it clean as you go into the future? So Jay, I wanted to tell you, when you're talking about the reports, we used to call that the Wild Wild West. When it comes to the approach we've taken with the master data management project, and with data management practices in general, we always start with the business term. The name should give an indication of what it is, right? The definition should give the rest of the information. I think Jason was asking earlier whether you've ever had challenges where people think they're looking at one data element or attribute and it's something completely different, and this came up in some conversations with our leadership: oh, I asked a team for the XYZ metric and I got it, then I looked at a report and it was different from what I got, and they said, well, there are two definitions.
Right. So one of the key things, especially since we're a global firm with lots of people, is to educate them that there's something about a name, something about how we name something, and to make sure that's standard. That way the name is long-lasting, unambiguous, and meaningful, and people can look at it and tell what it is, then look at the definition to verify it. That's what we start with for master data, along with what Jay commented on: what are the business rules for those data elements that we can then plug into our technology, and also into our operations, to improve the quality. Absolutely. This conversation reminds me a little bit of my past life. Before I came to Informatica, I was an Informatica customer like Ryan, so I lived in the trenches of the kind of stuff that Jay and Taryn are talking about: trying to resolve these kinds of endemic corporate problems that have woven themselves into behaviors over many years and decades. How do you solve the problem of changing the use of terminology around a piece of master data, something as simple as customer or product, and always create that context of awareness when you're showing it on a report, so that people know what flavor of it you're talking about? As an example, in my past life I led a governance program at a major publisher. In that context, we would talk about customers, but they could be many different things: sometimes they were partners, sometimes they were individual purchasers, and sometimes they were actually partners providing content into our product lines. So understanding the context in which you're talking about a piece of master data is important, as is differentiating it.
Do you have any examples like that, where you've had to go a level down within some of those key master data concepts to really tease that out at a greater level of granularity, and how have you enforced that in your environment? Absolutely. People say, oh, that's master data, that's master data, and no, you're describing event data, right? So when we do our socialization of the program and the initiative, we give everybody a background on what master data is and what master data is not, so they can keep it in mind, promoting literacy around that concept. You can make sure: hey, it's slow-changing, it doesn't change a lot. If it's something that's happening every minute of the day, or daily, that's most likely event data, which we can categorize appropriately into the right subject area or domain of data. Absolutely. Yeah, great. Let's go ahead and move on to our next topic, which is really more around enabling responsible data use. Taryn, you mentioned something about this earlier on, and I think it really resonates with the first bullet up there: 33% of customers believe that personal data is being used responsibly. You talked about how you have a reputation that you need to uphold as an organization. So when we look at enabling responsible data use, again, this is really the last key challenge, but it plays a huge part in enforcing policies and standards. Do your data consumers question whether the data is accurate and reliable, and whether they have the appropriate access to it?
So, I think with this program they're seeing that we're starting to scale up and mature the reliability and the accuracy of the data. Our data consumers right now are systems; I think in the next phase we're going to be adding our analytics space, so they can trust that, of course, they're already pulling from their systems of record and it should be pushing through. They are definitely starting to see the value of improving the completeness and the accuracy of the data. Our data privacy team sees, from a privacy perspective, what it's classified as, and we work hand in hand. It goes back to the last subject we were talking about: it starts at the business term. They can see the business term, they can see the definition, they can understand the profile of the information, and they can say, yeah, that's sensitive PII, or it's general PII, and so on. So we're able to have a conversation more at the data level and not so much the system level, and knowing that the data does tie to systems, and that you can have that relationship between the two, makes it easier to govern, to control, and to be responsible with the data. From our side, we're sort of the flip: our data consumer is more on the analytics side than on the system side right now. I see us going down the flip side of that journey; I'm helping out the other folks doing the integration side of it, and it all feeds the same thing. But for us there is always that question of accuracy and reliability, mainly because in the past it didn't matter to them. Their numbers were what their numbers were; they believed them because they were being funneled through a person who would say, yeah, that's right, or no, that's not right.
But as we get more and more automated, there are more and more questions about the accuracy, as I stated before, but also the reliability: getting that data into their hands at the same time every day, all the time, becomes much more of something that has to happen. As for enabling responsible data use, most of the time we have not had too much of a problem in this space, mainly because we don't have very much data that could be misused. We don't have that much personal data, because for manufacturers it's mostly data that is company-focused. But we have seen it, and we have educated people on what you can and cannot do. We had a marketing team who wanted to send a blast to everybody who ever bought from us, or ever saw our website, on this one thing. And we said, yeah, that would get us fined for spam, thank you very much. But it's been a rare thing. The data is actually perceived as a potential liability for us. One of the challenges is the amount of data that we have, and more to the point, the relevance of the data, if you will. I was looking at one of our systems yesterday that has accounts payable data going back to 2001, and I started having the conversation with the people involved, saying, really, what exactly are you going to do if somebody owes us money from 2001? And the CEO is very much a proponent of shrinking our data to exactly what is necessary to get the job done. That helps with all of this: it reduces your risk and makes a lot of things a lot simpler. But it's a balancing act between having enough information to get the job done on a regular basis and not having so much information that it becomes a swamp. And have you seen, in making sure that people are using the right data and enabling that responsibility in data usage...
Do they see that as creating a roadblock when it comes to overall data sharing or data democratization goals? It's very important, from a data sharing perspective, that we make sure we have well-defined data sets to be shared, and that they're coming from the golden source, or the authoritative source, for that information. That's one of the next journeys we're going to be taking from our program perspective: starting to make sure that what is being used in reporting and analytics, which is very good, high-quality data, reconciles to those golden sources or authoritative sources of information. That way, if they are doing any sort of analytics or reporting, or using it for decision making, they can trust and have confidence in that information. Alongside that, we make sure that sharing between systems, with our system being more of a hub where we're stewarding and governing the information to be sent to other systems that provide our services and products, means that information as well is high quality, accurate, and complete. Jay, I don't know if you have anything else to add, or I'm happy to move on to the next topic here. I think we go to the next. Perfect. One of the things that I really liked out of this conversation, and again what I'm hearing from the customers I talk to, is how imperative it is that these initiatives are tied to business outcomes. Again, I was at a CDO roundtable last week, and someone mentioned that they've never seen a successful data initiative that was not tied to a business outcome, and I think that's absolutely accurate. That's what yields the adoption these programs are looking for.
So, overall, with the right capabilities, you're able to reduce that complexity, you can solve multiple challenges, and you're able to reach those goals for greater productivity and efficiency, and again those improved business outcomes, such as better customer research and planning, being able to lower risk, and being able to meet the organization's overall policy compliance goals. So, what capabilities are missing today that are stopping you from achieving those goals? What I'm really curious about is, when you're thinking about those capabilities you're missing today, what are you evolving to next in your data strategy? You've both talked a lot about what you've been able to achieve so far, but when you look ahead to 2023 and beyond, what's next, and what capabilities do you need in order to get there? I'll speak to that, because we are laying out our roadmap as a team right now. The near-term next step is data cataloging: in order to manage our data effectively, we need to understand where our data is and make sure we have high standards and high quality. The next thing we're going to be doing is enabling data quality as well. We currently have existing technologies that do a great job, but that's not very scalable, so we're going to be leveraging Informatica's data quality capabilities for that. After that, we're going to be looking into reference data management. That's another tenet for improving data quality: making sure that everyone leverages enterprise versions of reference data. I think somebody mentioned earlier countries and country codes. That's something that, when you're doing a master data management project, you uncover really quickly: whether people are using standardized country codes.
Or whether they have two versions of Great Britain, with both a Great Britain and a United Kingdom in a single data set for a single data element. So that's where we see our next phase for this year and into next year, and reference data management is critical for us to keep our data quality high for the services that we offer, along with the master data as well. For us, a lot of the pieces we have to work on come down to data sharing. And this is the challenge on my side: knowing the expectations of the business so that we don't try to eat the whole elephant all at the same time. That's not a technical piece that I need, but it's one of the things that is interesting within my organization as we look at the digital transformation overall. I just got a note from the CFO asking me to stop all the other initiatives: we want to spend this year cleaning our data and getting to the highest quality so we can all move to SAP next year. So right now I'm looking at my note and saying, okay, how do I tell him what is possible and what isn't possible? But in the normal scheme of things for us, a lot of the work has to be done on how we do the data sharing across those organizations and how we use the catalog more effectively. Originally we decided to start with the data catalog, mainly because I thought I could build it over time, as we took on new analytical and data projects, and start getting it all populated. But now data quality is taking the forefront as something that I have to solve, and as I'm doing that, some of the integration side of things is also becoming very, very important for us; everything is moving toward how we can get that AI and ML capability as fast as possible.
The trick I need to pull off, and again this is more on my side than anybody else's, is training: getting the business to understand and to focus, so that, hey, we can do this work in the AI and ML space, we can get the data quality where it needs to be, but pick one area of it. Then let's do what's necessary from a cataloging perspective and a quality perspective, get the lineage in place so people can trust it, get the data marketplace working so people can extract the data they need, and make those all work together, even on that smaller piece, because when we take all those smaller pieces and add them up, now we've got something worth talking about. Yeah, and I think that's great. As you're starting to build out the data strategy, and Taryn talked about the roadmap, are there new lines of business that you're looking to align to in this coming year? Absolutely. From a master data management perspective, we started with B2B, and I keep referring to what we call the client services life cycle, where we focused on the key pieces of that life cycle that would give us the biggest bang for our buck in the first rollout. What we're also doing this year is expanding our master data management space to go across that entire life cycle, from the starting point of leads and opportunities, where we have technologies like Salesforce, to contracts, all the way to billing, which we've already been doing with this current deployment and initiative. Then we're also going down to the B2C level to help support, initially, those individuals that we provide services to. But I did want to bring this up, because it seems like this is one of Jay's hot topics: what's been asked of us next is, how do we standardize people and their access management?
Not that MDM is going to be the access management tool, but to be able to have that central point of access management across our systems as well. So that's one of the other things on the radar, but the great thing is that, from a technology perspective, this technology is fairly agile; we just have to make sure that the capabilities it does offer are the right solution for the business needs as well. Yeah, that makes sense. And Jay, earlier you mentioned, on the collaboration piece, when they're ready and they're willing. Are you seeing more people now in 2023 at Valmont wanting to come in and participate? In fact, I have to beat them off with a stick at this point in time, only because I can't do as much as they want us to do. But that's where the focus becomes important, and, back to the point we made earlier on business impact, doing things that fit within the model of what's going to be most impactful to the business. Yes, as people are seeing the successes we're having, even in the little scopes we've been working at, more and more people are finally saying, aha. They understand it, even things like cataloging, where at first it was, why do I have to fill this out, I know what it is. When we go and do it, they realize: I'm never going to have to answer this question again; I'm only going to have to change this when it changes. It becomes a hit, because a lot of the questions we're asking today, we asked three years ago. And now we've got one place for everyone to see it, across the life cycle, and across everyone from the data scientists all the way up to the leaders and managers. Okay.
Yeah, I like that you brought up that the same questions are being asked today as they were three years ago, and that made me think of the slide that Jason spoke about earlier in this webinar. Those are questions that were commonly seen, and questions that we're still hearing three or four years later from customers, and I don't think those questions are necessarily going to change. What changes is how we go about getting those questions answered and what capabilities we have to help speed that up. So, I want to wrap this up as we're coming to time and give a little bit of Informatica's point of view on how you go about solving this. We believe that you need a set of capabilities that help to accelerate this data journey, and this includes a solution that has AI at its core, able to drive AI-powered recommendations and automation to help accelerate data acquisition across your organization. It needs to be a solution that empowers all of the users, catering to all the different personas within the organization, from the least technical user to the most technical user. And finally, it needs to be a solution that can scale with your business. We talked about what's next on the horizon for both Jay and Taryn, and Taryn mentioned data cataloging: with a platform, you have different levers to push and pull on as the program matures, and it's able to scale as a part of that.
And we really see that without an enterprise platform that can provide these capabilities, you end up with disparate, bespoke solutions that require an endless amount of integration and maintenance, and that slows down the delivery of quality data to the users in your organization. I also want to put up a call to action here, so I'm going to leave this up for a second, because we have a QR code on here that you can scan. We have Informatica World coming up in May. I can't believe we're already there, but May 8th through 11th, Informatica World in Las Vegas. We're also going to have Taryn and Jay joining us there, so you'll be able to see Taryn's panel on how to achieve rapid, tangible results, and Jay is also going to be hosting a customer breakout in the data strategy and trends track. So really exciting; I'm sure those are going to fill up quickly, as registration has already begun. Also, Informatica just came out with our CDO survey, so again there's a QR code on here, and I'll leave it up for you to be able to download it. Our CDO Insights 2023 survey covers some of the challenges that we talked about today, as well as how data strategy is evolving. With that, we are at time. First and foremost, I want to thank Taryn and I want to thank Jay for joining us today. It's been great; I really appreciated the interactive conversation and the thought leadership that each one of you brought into the conversation. And thank you for helping to co-host this with me; I really appreciated hearing about your background and the things you're seeing through the advisory services part of Informatica as you're working with customers.
And for all of you who attended today, I really hope you were able to pick up a few good tips. We're absolutely happy to continue this conversation in any shape or form, and you can reach us on social media or catch our YouTube videos that are out there discussing this. So with that, again, thank you to each and every one of you who have taken time out of your schedules to join us today. Thanks, everyone, have a great day. Thank you. Thank you. Thank you, everyone, for that great presentation. I think there are just a couple of Q&A questions, if you have time, Ryan and Jason. One from Hamid: do you coordinate with the Azure Purview catalog? I'm sorry, would you be able to repeat that? Do you coordinate with the Azure Purview catalog? So, do we integrate with Azure Purview? No, we do not. Apologies, I think that is the only question left in the Q&A. Thank you, everyone, for attending. That's all we have time for. Just to remind everyone, the recordings will be posted on DATAVERSITY.net, along with the slides from this presentation, within two business days, and Shannon will send out a follow-up email to let you know about this, along with any other links or requested information. Thank you to Informatica for sponsoring today's webinar, and thank you to everyone who attended. Hope you all have a great day. Until next time. Thank you.