Welcome. My name is Shannon Kemp and I'm the Chief Digital Officer of DATAVERSITY. We'd like to thank you for joining the most recent webinar in the DATAVERSITY Monthly Series, Elevating Enterprise Data Literacy with Dr. Wendy Lynch. This series is held the first Thursday of every month, and today Wendy will be joined by Mark Horseman to discuss Baby Steps: A Successful Kickoff of Literacy. Just a couple of points to get us started. Due to the large number of people that attend these sessions, you will be muted during the webinar. If you'd like to chat with us or with each other, we certainly encourage you to do so, and just to note, Zoom defaults the chat to send to just the panelists, but you may easily change that to network with everyone. For questions, we'll be collecting them through the Q&A panel, and to find the chat and the Q&A panels, you can click those icons in the bottom middle of your screen to activate those features. And as always, we will send a follow-up email within two business days containing links to the slides, the recording of the session, and any additional information requested throughout the webinar. Now let me introduce to you our guest speaker. Mark is the Data Evangelist for DATAVERSITY. From his early days as an intern, Mark's trajectory led him to ascend the ranks, culminating in his role as a manager at prominent organizations including the Alberta Motor Association and the Northern Alberta Institute of Technology. Through these diverse roles, he has etched a path as a stalwart in data quality, master data management, and data governance. Ever eager to evolve in the dynamic data landscape, Mark has consistently sought knowledge on the latest trends in information management. And let me introduce to you the speaker of our series, Dr. Wendy Lynch. Wendy is the founder of AnalyticTranslator.com and Lynch Consulting. For over 35 years, she has converted complex analytics into business value. At heart, she is a sense maker and a translator. 
A consultant to numerous Fortune 100 companies, her current work focuses on the application of big data solutions in human capital management. In 2022, she was awarded the Bill Whitmer Leadership Award for her sustained contributions to the science of corporate health. As a research scientist working in the business world, Wendy has learned to straddle commercial and academic goals, translating analytic results into market success. Through this experience, she has created her new book, Become an Analytic Translator, and an online course. And with that, I'll give the floor to Wendy and to Mark to get this presentation started. Hello and welcome. Thank you so much, Shannon. I am happy to be here, and I'll speak for Mark: he is happy to be here too, at least so far, because we will be having a friendly, quote-unquote, debate slash discussion about how to kick off data literacy and what the important elements are. For those of you who have been here before, welcome back. And for those who are here for the first time, welcome. So this will be a series of questions that we will answer and then either agree or not about where we think this should go. So why don't I go ahead and kick off. Thanks again for being here with me, Mark. The very first question is: what is the very first step that you think companies need to take to promote literacy? So I've done this a few times now, and coming in fresh to an organization, I always try to get an understanding of what folks' pain points are and to create an issue log. I always want to understand people's literacy level based on what they're struggling with. What about you, Wendy? I actually like that. I hadn't heard that response before. And I'm just going to go off track a little bit, be a little bit radical. Because what I would like to see is that each job has a line of sight between what they do and the overall corporate objectives. I mean, we should have this anyway. 
But what metric would indicate that the person is contributing to success? And if you want to take it even further, what I suggest is that we consider pay for performance. Now, all of you are going to say, well, what does compensation have to do with data? Well, you know, people pay attention to the things that they get evaluated on and that they are paid for. So if you tell somebody that they are going to be paid based on metric X, like, let's say, percent of calls returned within a certain amount of time, they are going to care about the data. They're going to care about the meaning of the data. They're going to care about the quality of the data. And they will tell you whether it's accurate or not. So that's sort of where I am in thinking about creating momentum. Now, I know that sometimes people really want you to start with an assessment, let's say. How do you feel about assessments as a first step? Well, I largely despise assessments, actually. Assessments need to be meaningful and based on qualitative things. If we were to assess somebody's data literacy and give them a data literacy score of 7 out of 10, that doesn't really tell somebody a lot, and it doesn't really feel like it means a lot. I would always rather gauge literacy on what people struggle with, whether that's real or perceived. Oh, I like that. Because I think you're right, there are a lot of things that aren't necessarily real that keep people from engaging as much with the data. And I have to say, I'm not going to debate you on this one, because I agree that assessments very often are either too granular, like the one that we reviewed earlier in the year that has 15 abilities and six levels of those abilities, which I'm not sure how helpful that is overall, or they're too judgmental: "So Mark, you're only a three." It just doesn't feel good to have a score like that. 
So I've been wondering whether we ought to be assessing willingness to learn and openness to new methods or new metrics. I think that might be helpful, and also maybe make the assessments more homegrown and relevant to each exact setting. But I agree with you, I think test anxiety is real. And whether the issues are real or perceived, there are issues. So question number two: what steps are you already seeing that companies are taking to contribute to literacy? So I've been thinking about this a lot lately, and I've had a few discussions in this vein. And what I currently see is almost a literacy by data immersion: folks want to see data to back up claims, and it forces verticals in an organization to provide data that backs up their decision making. The more people use data to predict the future and describe the past, the more folks will need to be more and more literate. It creates demand for literacy. The danger is when you have folks who engage and believe they're literate, but sometimes they are not. Yikes. So I like that, but it doesn't sound very structured. Do we need to be more structured in how we approach this? I don't think it has to be structured. It always depends on the culture of the organization. Whenever you come to these webinars, people say you've got to line up with the culture, and it really does depend on the culture of the organization. You get things like data mesh becoming popular because federated models work in large organizations. And these federated models require some sort of literacy to function. Like I said earlier, you get this literacy by data immersion because you've got a federated data governance model that forces stewards to be more literate. Wait, let me interrupt though. For those of us who are not data architects, what is the mesh? Tell us what mesh is. 
Well, basically it's this federated model for data governance. In large organizations where you've got several enterprise systems running the same kind of data, or you're located across the globe or across several different locations, you have this need to federate your governance out. So instead of a single person who's in charge of data governance, you've got data stewards in certain departments or verticals within the organization, and very similar groups across different offices, and you get this federated model. And that federated model requires those people to become more and more literate to fulfill the role of a steward. Okay. I think I follow that. What do you see companies doing, Wendy? Um, so what I see most often is there are people who have been assigned this issue or who have taken it on themselves. And they're almost always people in the middle of the organization. So they're trying to influence both upward and downward from their role. They're trying to get executives to see this as important and be champions for it. And then they're also trying to implement some sort of a process in the teams around them. But you know, either way, there is a certain amount of pushback and apathy that seems to exist. And unfortunately, it sometimes takes a crisis, like a result that was delivered that's completely wrong, or a realization that they're not making decisions based on data. And so somebody announces that there's a problem, and then they hope to fix it by putting in literacy. So I'm not sure that this all has to be structured either. But I do think that each organization, each part of an organization, has to own some of its own part of this. So our next question is: who really has to be literate, as we're thinking about implementing? So I get the popular answer here. Everyone needs to be literate. 
But I guess the nature and level of literacy (I guess "type of literacy" is kind of a goofy phrase) is different depending on the role. Right. So you really think everyone? Well, yeah, I do. I remember having a discussion with an executive recently. They would often ask for a specific analysis or a new report to be delivered, and they would be surprised at how long it takes. But having a base understanding of the flow of data from the transactional system into the analysis in their hands would help them understand what they're asking for. The other challenge I've faced at organizations is when folks at all levels don't ask (and this is a bad term, don't say this) literate questions. That is, they ask a question of the data, or of the results of an analysis, and the data isn't fit for purpose. As a silly example, pretend we're Netflix. Here we see customers aren't renewing their subscription. Correlating factors are X and Y, as shown here. And then somebody in the room says, oh, customers aren't renewing because of Zed. Well, Zed, really? So if you don't think it's everyone, Wendy, who do you think it is? Well, I think it's, you know, like you said, the popular answer. I mean, it would be lovely, wouldn't it? If everybody became more like us, you know, became more like the data nerds. But I have to say, I don't believe it. It's not going to happen. It's not even going to come close to that. So I think that we're sometimes asking the wrong question. And when you're actually pointing out that there are people who aren't going to, quote, unquote, be literate, that's sort of the problem that we have. We're a nation of people who are phobic about numbers and data and statistics. So that's a problem. And we're also a nation of people who don't like to be thought of as less than other people. So I think it's tough when data-oriented people tell everybody who's not that they need to be more like them. 
And so I will start already, on only question number three, saying we have to think about a different group of people that facilitates a better level of communication between people who are and are not data oriented. So we'll continue on that stream, I'm sure, as we move ahead. And so if it's everyone, if you'd like it to be everyone, how literate do all of those everyones have to be? I like to stick with just the basic definition of literacy here: just enough to be able to read, write, and argue with data. I need somebody to do that. And I hate that definition, because does anyone know what it means? What does it mean? What does it actually mean if I say that someone is not a data person? Yeah, we just need folks who are holding a report in hand, discussing its meaning, to be able to accurately reflect what that content means. So when you've got your decision makers at an organization sitting at a boardroom table, if they're not having that discussion in a useful way, and are not able to read, comprehend, and argue with that content, then how in the world can we expect good decisions to come out of that decision-making body? Well, I like that. Of course, again, I love the idea, but what if the C-suite doesn't want to do it that way? Then you replace them. I'm kidding. I'm kidding. I think that's the secret answer. We all wish we could say the folks in the C-suite are there because of a deep connection to the business. These are smart folks who know the business. It's not about what they want or don't want to do, but how to connect literacy to their daily lives as a CEO, CFO, COO, CIO, whatever. What do you think about executive literacy, Wendy? Well, I'm going to trust the data, trust the data. The best indication that I've seen is that only about a third of the C-suite is highly data literate. We are talking about two-thirds of them not being able to really grasp high levels of analytic output. 
If we say that everyone has to be literate, we're actually saying that we want top-level executives to take time for a course, to take time for remedial training. I mean, Mark, you can't really believe that that is possible. So what about instead having translators who are the right-hand person to leadership, so that they are a trusted source of information for the people who don't want to admit they can't really follow? I mean, maybe that's a different direction we could go. So, since we're talking about executives and leaders, say more about what leaders' responsibilities are in data literacy efforts, even if they're not particularly literate themselves. Right. And this is where the culture comes in. I was just listening to Tom Redman this morning talking a little bit about cultural change. You can't come in as a cultural change agent; you come in being the change you want to see, showing the change within the nature of the culture in a way that works, and fitting in with the culture. So as a leader, as a data leader, it's incumbent upon everyone to encourage folks to take the time to learn and to promote iterative improvements. And, you know, we can't accidentally call people illiterate. What we really want to do is, you know, just do the things that we want to see people doing across the organization. So as leaders, whether that's the C-suite or senior executives, vice presidents, associate vice presidents, directors, senior directors, even managers, the ask all the way up and down the chain is: take the time to learn, show that you're taking the time to learn, and improve your overall knowledge. That goes for everything, too, not just data literacy. There are lots of things that folks can learn as it relates to data and interpreting graphs and visualizing content, and there's tons of learning out there that people can interact with. And executives should be displaying that change and encouraging that behavior in folks. 
So "be the change you want to see" is the one thing that I'd like people to take away from that. Yeah, I guess I agree with you that they have a role to play in terms of modeling. I will say that part of me feels like you've set an awfully low bar: kind of show a little bit of interest and don't insult anybody by calling them illiterate. So while we can both agree that promoting learning and being nice is probably a great role for everybody, we should also, if there is some modeling to do, consider trusting each other, trusting a valuable person who is helping to interpret the data, and embracing new roles and the levels that people are at. Because if we spend our time just talking about how disruptive it is that people are illiterate, rather than trusting people who can help bring everybody up a level, I think we're missing an opportunity there. Right. Yeah. So as we think about companies that are starting out now, or who are in the midst of bringing on a new literacy effort, are there things that give you hope that the next generation is actually going to start out more literate when they join the work setting? Yeah, and I'm kind of keeping an eye on chat too, and I love all the chat about data fluency that we're seeing down here, which is wonderful. What I see in people's private lives that gives me hope is that, more than ever before, and more each day, people play with and consume data. Whether it's your fitness device, like your Apple Watch or Fitbit, your smart fridge, or your favorite online shopping site recommending products, people are reacting to, playing with, tracking, and managing data as part of their non-work lives, their real lives. This gives people a stronger foundation for being able to read and understand data in all facets of life. Five years ago, if I generated a graph of rings showing the completeness of someone's tasks, it would have taken me effort to explain what closing a ring meant. 
Now it's part of everyday life for a lot of folks. Wendy, what gives you hope? I think similar things. I think that we're starting to have data be more a part of what we see on a day-to-day basis. I agree on the health trackers and mood trackers, but also we're much more likely to look at real estate on a site that tells you what the averages are and what the expected values are. We look at websites on weather, and we understand some of the statistics about weather because it's something that we all pay attention to every day. I agree we're being exposed to more. Now, whether that translates or not, I'm not sure. But on the flip side, what worries you about the next generation and literacy? If we're talking a little bit about weather, what worries me sometimes is when the weather person is wrong and we get several feet of snow up here in Canada and maybe I don't have my winter tires on or my sandbags in my car yet. But besides that, more seriously, it's the nature of social media and the existence and exploitation of people who are identified as being persuadables. That whole situation is fascinating to follow, but also a little bit scary. If a person interacted with this news article and this news article and this news article, and shared this meme and that meme and that meme, some group out there has figured out that if we target them with article H, which is made-up fake news, then there's a percentage chance that they're going to believe it. Wendy, what worries you? That's bad enough. I'm worried about how little people are being encouraged to think for themselves. I'll have you comment too. A comment just went through on the chat, while you were talking, about how people getting inundated with personal data may not be moving us ahead. It may just tune them out, so there's that side of what we thought was hopeful. But it is worrisome to me that it's too easy now for an influencer, who could be a very brilliant, wonderful, capable expert. 
We are now becoming accustomed to them telling us what we should think. Rather than being data- or evidence-driven, we are opinion-driven. That worries me: even if the data are there, we're looking to somebody else to interpret them. That really goes to what we saw in chat there. Inundating individuals with data, and influencers are awful for that. They just drone and drone and drone. But it almost reminds me of ye olden times when (and this is still true today) buzzwords would just make us tune out. I was guilty of that just a few minutes ago when I was droning on about data mesh. It really is a jargony buzzword. The more those things are repeated, the more people tune out. But if you're throwing data at people all the time, then does anybody really care anymore? Maybe in some cases, where people are taking an active interest in that particular dataset. I know folks who are just absolutely enamored with their fitness watches, tracking their calories, their heart rate, their move points, their stand points; those are all fascinating things that people interact with. But yeah, on the flip side, if my smart fridge is telling me I'm eating too much yogurt, then that's kind of, yeah, whatever. Maybe I don't care. How much "help," quote-unquote, do we really want in terms of all of that information? Yeah, I think what we're both saying is that it's a double-edged sword. It's great to have more available, but how do we best use it, and how do we interact with it in the best way? Our next question is: how do we help people? We're going back to your comment earlier, which was that there are people who don't ask things in a literate way, and then the people who are trying to provide them with information don't quite know what to provide, or feel like they have to update it or correct it. So how do we help people do that? Yeah, and our jobs as data professionals, I think, are changing a little bit. 
We're not data professionals just to stand up there showing a visualization of data or to help make data-driven decisions. Our job is to teach those things as we present them. When I hear something like, "I don't understand what this graph is telling me," that's a trigger that that visualization is not working for that person, that we need to visualize or present that content differently, or teach the visualization. And preferably something like this happens during the development of a report, where we can focus-group and review and talk about these kinds of things as we develop them. So what does that process look like? I hear what you're saying that it should trigger a different response, but what does that look like? Take the time to tease out the meaning. We can't be, sorry, we can be the people that don't understand the business driver. That's okay. We can ask those questions. Once we've heard the requirements or the interpretation, we can restate these things in a literate way, much like teaching kids how to pronounce new words. I've got four kids, and one child that doesn't always speak super clearly, a little bit of a speech impediment. But when teaching kids to pronounce new words, you don't just correct them and say, no, it's pronounced this way, not that way. You just carry on using the word correctly and let the learner bridge the gap personally. It's a much more positive method than walking into the C-suite with a graph of some flavor and saying, you're wrong, or you're not literate because you don't understand this. We always have to look for the positive way to reinforce that learning. Wendy, what do you think? Mostly I agree with you that the more people understand the context, and the more they see examples that work for them, the better they will be. And if the priorities are things that matter to them personally, then I think they get better at understanding the data. 
And if we can tell a story with the information, that's even better, so that they can... How do you tell that story? How does that happen? Yeah, so we have to realize that people don't have the vocabulary to ask questions the way an analyst would. They don't have the vocabulary to say what type of analysis, what sorts of data, what kinds of covariates, what sorts of controls, all of those things. They don't have that. So if we can transform what we're saying into a description of what really matters to them, we can have a discovery process that is way more in-depth and extensive. And I don't mean time-consuming, because it can be just a few minutes. That is way more effective than a ticket system where they submit one-way requests, because we can't expect non-analytic people to be able to articulate that ticket request the right way. So we want to have an interaction with people where they can speak their language and we can speak our language, but we understand each other, rather than asking them to become better. It becomes a collaboration at that point. You don't want to submit a ticket into the ether, especially if you don't have any kind of SLA process such that somebody actually gets a response within a certain time. If they send a ticket into the ether and nobody gets back to them, I think that's probably even worse than not having that collaborative process that you've identified there, Wendy. Right. And I think, well, I know, because I train people to ask questions in a way that gets to what's really important to the business, and we can train a team member on every team to have that discovery process and be able to tease out what people are really trying to accomplish, rather than trying to narrow it into a particular ticket or request. Right. So who's going to be the least receptive, do you think? You know, something I've heard a lot of in my career is, well, this is the way we've always done it. 
And I'm sure that might be something that everybody on the call has heard tons of. That's the way I've always done it. Yes. Yes. Yeah. And some folks feel like any sort of change or training will be challenging, or will change the way they do things. It can happen to any group. Usually it's your long-dedicated-service employees that you hear that from, but it can really happen at any point. This is where a change management structure comes in. We need to be positive about how data literacy is a force for good and demonstrate that. We'll earn champions and supporters for our efforts, and things will begin to fall into place. And so, "aboot." I heard "aboot" from our northern neighbors. My Canadianisms are coming through on the mic. Yeah. "Aboot." And so tell me more about that change management process, how that gets incorporated into literacy specifically. Yeah. So I do love talking about change management processes. And there are lots of methods out there, like ADKAR; and ACRITY, one of the folks who has spoken at our DGIQ conference for DATAVERSITY, has an anchor method as well. But it's really all about having that change management process in place, where you can identify champions, identify who your submariners or detractors are, and actually go through an implementation in a positive way, where the champions convert the detractors, so that everybody kind of comes up together, and you do the things that celebrate your wins and reinforce the learning and the change. So there are all these change management books and processes out there, and I recommend all of them. They're all fantastic things to read and consume as leaders. And what we're doing is leading people down a journey to become more data literate and to think in terms of data. So that's where I land on that, Wendy. Okay. All right. 
So I hear what you're saying, and I think there are a lot of things we can do to make that change less scary or disruptive, but sometimes it really will be scary and disruptive. And that's why I was thinking, when we were talking about assessments, do we need to be trying to understand how to create more openness to these kinds of changes? And what does that look like? And is that part of that modeling that we said leaders need to do, that we are going to embrace it because it's taking us somewhere better? And then it also makes me think that, for the people who are least receptive to training, I agree with you that the "we've always done it this way" attitude is always a difficult one, and so is "I already know all of that." But it doesn't always mean that the old-timers are old. It could just be young people who've been there for three years. So resistance won't necessarily be age-related or tenure-related, although it could be. But I think change is scary. Well, yeah. I always forget the exact title of this book, but it's the Who Moved My Cheese book. I think that might actually be the title. It's got the mouse, and it's like, my cheese used to always be here, and now it's somewhere else. Who moved my cheese? And how distressing that is for the mouse. He's got to hunt down his cheese now. And we make people feel like that sometimes, if we're not careful and don't manage a change process in a way that makes sense. Yeah. And I think, in addition to the change management, we also pay too little attention to group dynamics. And some of those group dynamics are, for instance: I'm on my team, which might be the business team or marketing team; you're on your team, which is the data and analytics team. We are considered separate; we're in separate divisions. And maybe we've had difficult interactions in the past. I mean, I'm sure you've seen situations where there's real animosity between the teams, real distrust. 
And so training can't fix that. We have to build a different way of interacting with each other that makes it a lot more comfortable and that appreciates each other. So our last question, before we open it up to other people's comments and questions, is: what events are moving us toward greater literacy that we can actually leverage? People are hit with data in their personal lives, like I said before, like never before. Current events and news reporting hit us more than they ever have before. When we see a political poll in the news, and that poll turns out to be correct or incorrect, that's an opportunity for us to think about why and to teach about the data that goes into something like that. What is a random sample? What does that mean? What does it mean to be 49%, plus or minus 3%, 19 times out of 20? There's a statistical background to that. And those are great data discussions that we can have. Yeah, though I know sometimes, if the polls are wrong over and over and over again, everyone's just like, well, we can't trust them. And maybe you're right; that's part of it. There is no random sample anymore, and there is no sample for anybody to get anymore, because we can't simply call people's landlines and hope that they'll answer. Exactly. In the days before answering machines, you kind of had to pick up. But it does make me think about some great examples. I read this case study where they were trying to get people interested in data. And it was a really cool approach, where they took songs from people who are in the top 40. I think the example was a Beyoncé song and a Taylor Swift song. And they decided to create an indicator of narcissism based on how many times the artists said "me" and/or "I" in their lyrics, so that they could compare. They just did something different, and this was for teenagers, to get them interested in a very different way, which I thought was kind of cool. But I digress; that was kind of a weird little example. 
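As an aside on that "49%, plus or minus 3%, 19 times out of 20" phrasing: the arithmetic behind a poll's margin of error can be sketched in a few lines. This is an illustration added for this write-up, not something shown in the webinar; the sample size of 1,000 respondents is an assumed, typical national-poll value.

```python
import math

def poll_margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Half-width of an approximate confidence interval for a proportion.

    p: observed proportion (e.g. 0.49 for 49% support)
    n: number of respondents
    z: z-score; 1.96 corresponds to 95% confidence, the
       "19 times out of 20" in the poll's fine print.
    """
    return z * math.sqrt(p * (1 - p) / n)

# A poll of roughly 1,000 respondents at 49% support gives about +/- 3%.
moe = poll_margin_of_error(p=0.49, n=1000)
print(f"49% +/- {moe * 100:.1f}%, 19 times out of 20")
```

The point for the literacy discussion: the margin shrinks only with the square root of the sample size, so a poll four times larger is merely twice as precise, which is why most public polls settle around that plus-or-minus-3% figure.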
But so, what else is moving us toward greater literacy that we can leverage? If we are going to talk about Taylor Swift: if she can increase the excitement in the NFL, and if she can also increase excitement in data literacy, then I think that'd go a long way to helping us out. I know if all of her followers became data geeks, it would be a big advantage. Maybe we get her to write a song called "By the Numbers" or something. I don't know. So, silliness aside, internal to organizations, when forecasts are right or wrong, we can have those same discussions. When we recommend a right or wrong product to a customer, these micro events enable us to have some data-related discussion at the time the event occurs. Wendy, what do you think is moving us forward? Well, I'm going to push back on that one: we have to be very careful that we're having that discussion in a safe way. Yeah. I mean, we can point fingers, and that's what ends up happening a lot. Well, it's your fault because you didn't use the data. Well, it's your fault because the data weren't right, or whatever. So we have to be careful about that. In terms of moving us forward, I'm going to open a little Pandora's box. I'm intrigued by how AI may, and I repeat may, bring data insights to more people and make them more accessible. Part of that is the democratization that we heard about, of systems that can do queries using just natural language. The other thing is that if the AI system is interacting in a way that it can explain things, we have to realize that it will not get fatigued explaining the same thing over and over and over again, perhaps even learning how best to present it to certain people, without any judgment. You've got me on my soapbox here, Wendy. Yes, but yeah, I guess my question is, where are you with AI? Is it going to help us, hurt us, or just confuse us a lot? 
I do like how you talk about getting tired of explaining things over and over and over again. I have a personal rule: I explain something the first time, that's good. I explain something the second time, I'm a little bit irritated, but whatever, I get it. If I have to explain something a third time... Exactly, and AI won't get fatigued. You're right, AI doesn't fatigue. AI doesn't fatigue. Well, we don't know yet, but I don't think it does. Unless it actually gains sentience, which it hasn't yet; so for any believers out there, it's not actually self-aware. But if it were self-aware, could you imagine an AI that was just tired of explaining things? Like, no, I can't do this anymore, and then ChatGPT shuts down or something. Exactly. I think the nature of AI is fascinating right now, and if you go back to our questions of what gives you hope and what worries you, AI does both for me. I think it does so much to bring context and meaning in a way that normal folks can't. The danger is that if you've got normal folks using these tools, interacting with AI in an area where they're not a subject matter expert, how can they accurately gauge the effectiveness or realness of an answer? It's the Dunning-Kruger effect, and these tools are the fastest way to it. That scares me. At the same time, if you've got an AI that you've trained, adding your organization's context to the model, then it could be a boon. But you've got to meet the AI halfway; it can only do so much. Next best word, based on the 173 million things it learned off the internet, isn't necessarily going to yield the response that you're hoping for or teach the thing that you're hoping for. You got it. So for those of us who don't know, back me up on the Dunning-Kruger; what are you talking about?
Well, the Dunning-Kruger effect is where somebody lacks competence in a subject area, so when they're reviewing something from that subject area, they don't necessarily know that they're making a mistake, or whether something is of high value or not. So if you're asking AI to do something for you in an area outside your subject matter expertise, you'll get an answer that looks great. But is it good? That's not necessarily something we can evaluate if we're not subject matter experts. I think this was called the Gell-Mann amnesia effect with newspapers. If you knew something about a news story and you were reading it in the newspaper, you could spot errors or problems with the way the story was reported. But then you would go right to the very next story and believe everything it said, because you weren't close to that story. So it's a very similar effect: your lack of expertise in an area keeps you from accurately judging content in that area, if that makes any sense. Yeah, so you're applying that not just to people, but even to AI systems that have training in very siloed, different areas. That's where you get garbage out of your LLM, because it knows context, it knows how to string words together, it knows the millions of things it's pulled off the internet. It does a good job sometimes, but sometimes it does an awful job, and it doesn't know that it's done an awful job. It doesn't have the ability to actually understand what it's spitting out. It's just spitting out the next best word. Yeah, so give me the last word. Is AI going to help us on literacy, hurt us on literacy, or is it sort of a mixed bag? Maybe. I think there's a lot of power there for organizations that implement an LLM with care and attention. It's the folks who believe it's a magic bullet and just drop it in, without that care and attention, without that literacy we were talking about all day today.
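[Editor's note: Mark's "next best word" characterization can be made concrete with a toy bigram model. This is a deliberately crude illustration, not how production LLMs work, but the point carries: the model chains statistically likely words with no understanding of whether the result is true.]

```python
# A toy "next best word" generator: count which word follows which
# in a tiny corpus, then always emit the most frequent follower.
from collections import Counter, defaultdict

corpus = (
    "the data is clean the data is complete "
    "the report is wrong the report is late"
).split()

# Tally followers for every word.
followers = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    followers[prev][nxt] += 1

def generate(start: str, length: int = 5) -> str:
    """Chain the most common follower repeatedly, with no notion of truth."""
    word, out = start, [start]
    for _ in range(length):
        if word not in followers:
            break  # dead end: this word never precedes anything
        word = followers[word].most_common(1)[0][0]  # "next best word"
        out.append(word)
    return " ".join(out)

print(generate("the"))  # fluent-looking output, zero understanding
```

The output reads fluently because the word statistics are real; whether the data actually is clean is something the model cannot know, which is exactly the evaluation gap Mark describes.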
Then you're going to have problems if you just drop it in, and you're going to do well if you respect it. Got it. All right, well, that is the end of our specific questions. Shannon, you can let us know what issues are percolating out there as people listen to us having our debate, let's say. Thank you both so much for this discussion. It's so good. Just to answer the most commonly asked question here: a reminder that I will send a follow-up email by end of day Monday with links to the slides and the recording. This question came in early, but diving in here: how do you reconcile the extremes, data practitioners and non-data practitioners who have to work together, and bring them up to perform better? Yeah, well, I will give you my answer based on what I've been doing the last six months. We have to train analytic translators who have sufficient knowledge in each of those areas, and train them to communicate in ways that enhance collaboration, enhance respect, and build a truly cohesive set of teams. Because we are facing real difficulties, and the more complex all of these systems get, the worse it gets. The failure rate is huge: 70% of all dashboards don't get used, and 80% of all analytic projects end up either being redone or not used. So we've got to bridge this gap. I've seen that in the trenches too, Wendy. Here's our Power BI repository of 500 or 1,000 reports. Here are the top five, and the top three of those get read all the time. The other two in the top five are barely touched, and then there are still another 1,000 reports after that. It's kind of funny. I think, personally, my biggest success as a data professional has been being able to speak and communicate in business terms. So really, that's that translator piece you keep talking about. It really is about bridging the gap and being someone who can speak to both sides of the data universe.
We have to bring business words to the business folks and data words to the data folks. And not judge that one side is better or worse. Exactly, they're just different. They are coming at it from different perspectives, and we get too much in our own little silos and get irritated. Some of that is not taking the time, I think, to have a dialogue, and making presumptions, in both directions, that you should know what I mean. Yeah, exactly. The C-suite got there because they're smart folks who care about the business and have a deep connection to it. Who are we to say that they don't understand something? That's completely silly if you stop and think about it. We've all thought that at one point: like, why can't you understand this graph? Come on, you're the chief operating officer. We've all had that conversation in our heads, but it's just fraught with that judginess we have to get rid of. It's just different. Yeah, and on top of that, you have different personalities in these groups, different orientations toward solutions, different ways that they perceive success. So these are big, big differences that we have to figure out how to appreciate in each other rather than get irritated by. So what else have we got, Shannon? That's a great question: what are the best programs for literacy? Is there a best one yet? I want to hear your answer, Mark. That's a great question. We spent a bit of time today talking about cultural fit, and really, you can't just grab a program like a template off the street. It's not like showing up to your favorite talk show and everybody gets a literacy program under their chair. Congratulations, you get a literacy program! You get a literacy program! It's always going to be tied to the culture of the organization, and a literacy program can only ever take hold if it attaches itself to how an organization runs and what an organization cares about.
So it takes some care from the person leading the charge for a data literacy program to really attach it to how an organization functions. Easier said than done, I apologize for that; there's a lot of work that goes into something like that. I've drunk a lot of coffee in a lot of offices, enough to know that fitting something to a culture is a challenge, but really, that's the way to approach it, I think. Wendy? Well, it's interesting, because there are certainly a lot of groups out there, when you Google it, who say they have a one-size-fits-all education platform, and there are groups who seem to get a lot of traction. I don't know whether they work because an organization is just trying to check the box and say they did it, or whether some are truly superior to others. I've seen many, but I don't have one where I just say, oh my gosh, that was it. And I do think that, of the people we've talked to over this past year since launching this series, the ones who say they're doing well are the ones who have taken a homegrown approach to helping people understand the data that matter to that particular job or business. So it makes me apologize too; I wish there were a straightforward this-is-the-answer, but so far I have not seen it. We've got about five minutes left, so I'm going to try and slip in a couple more questions here. Okay. How can we manage the resistance to data literacy in high-level managers or leads? Yeah, I think there is a real fear that is natural for accomplished professionals in any area. We as adults do not like to look silly. We are used to being competent, and the higher up we are, the more competent we are used to being. So we have to acknowledge that there is resistance to taking on something with a really steep learning curve, especially for those who are already a little bit numbers-phobic. It can be a real problem. And I will go back to my broken record.
If you hire a strong data translator, a strong analytic translator who knows how to communicate and really knows how to facilitate the right kinds of discussions, you can start to bring anybody into the fold a little bit at a time, so that they can digest the information available to them, presented in a way that makes sense to them. But forcing them to get on board with a very structured, abstract data literacy program? I don't think that's going to work. What do you think, Mark? Yeah, this is again where I like to talk a little bit about change management. When you've got detractors out there, if we're lucky enough to work at medium to large organizations, chances are you're going to have some champions too. And having those champions demonstrate how they're benefiting from something like a data literacy program turns those doubters into believers slowly over time. It takes time; you're never going to flip that switch instantly. But building up that goodwill through culture, through showing success, and through having that champion, or champions if you're so lucky, is helpful. Agreed. One last one, or are we winding down, Shannon? We've got less than two minutes, so I need your elevator pitch on this last question, which probably could be a whole webinar. I'm just going to put you both to the test here: how should we measure the success of data literacy? The number of people who are accessing, using, and understanding the data that are first and foremost most important in their organization. That is a fantastic answer. I agree with Shannon; we could do a whole webinar on that. I'm just going to cop out and say ditto, Wendy. Okay, well, then that gets us in the elevator. That's true. Bring us to the top floor. Very nice. Oh, I love it. Well, thank you again so much to both of you for this great discussion. I really appreciate it. And thanks to all of our attendees for being so engaged in everything we do.
That is all the time we have for this webinar. Again, just a reminder, I will send a follow-up email by end of day Monday to all registrants with links to the slides and the recording of the discussion. I hope you all have a great day. Thank you. Thanks, y'all. Thank you, everybody. Bye-bye.