Hello and welcome. My name is Shannon Kemp, and I'm the Chief Digital Officer of DataVersity. We'd like to thank you for joining the most recent webinar in the DataVersity monthly series Elevating Enterprise Data Literacy with Dr. Wendy Lynch. The series is held the first Thursday of every month, and today Wendy will discuss data integrity and how literacy impacts data collection.

Just a couple of points to get us started. Due to the large number of people that attend these sessions, you will be muted during the webinar. If you'd like to chat with us or with each other, we certainly encourage you to do so; just note that Zoom defaults the chat to send only to the panelists, and you may absolutely switch that to network with each other. For questions, we'll be collecting them in the Q&A section. To find the chat and Q&A panels, just click those icons at the bottom of your screen to activate those features. And as always, we will send a follow-up email within two business days containing links to the slides, the recording of the session, and any additional information requested throughout the webinar.

Now let me introduce the speaker for our series, Dr. Wendy Lynch. Wendy is the founder of analytic-translator.com and Lynch Consulting. For over 35 years, she has converted complex analytics into business value. At heart, she is a sense-maker and a translator, and a consultant to numerous Fortune 100 companies. Her current work focuses on the application of big data solutions in human capital management. In 2022, she was awarded the Bill Whitmer Leadership Award for her sustained contributions to the science of corporate health. As a research scientist working in the business world, Dr. Lynch has learned to straddle commercial and academic goals, translating analytic results into market success. Through that experience, she has created her new book, Become an Analytic Translator, and an online course. We hope you check that out. And with that, I'll give the floor to Wendy to get this presentation started. Wendy, hello and welcome.

Thank you so much, and I'm so happy to be here. Thanks for that introduction. To everyone who's joining us for the first time, welcome; for anybody who's here from last time, welcome back. We are going to cover quite a bit of territory today as we talk about data literacy and its relationship to data integrity. And because I'm celebrating the six-month anniversary of starting this whole series, we will have some flashbacks to things that we covered over the past six months.

So I want to talk about the questions that we ask ourselves, and maybe the questions we should be asking ourselves, about data literacy. The first one is: do we ignore or fail to leverage the ways that employees already interact with data? I'm going to go back in time, and I'd love to hear if there is anybody else on the call who has used punch cards. When I started out in programming, we used punch cards. That is how we got data into a computer. And there were people whose job it was, like the women pictured here, to punch information onto those cards in order to enter data into the computer. So you had to very intentionally put things into the computer, and every little bit was typed onto cards. And I see other people here were in Fortran class too; we thought we were very advanced when we moved to Pascal. So these were things that we did.
And because it was so rudimentary, much of the time we collected data on paper with handwritten responses. Then we would hire a data entry team, people who were very accurate typists, and if it really mattered, we double-entered the same information to ensure accuracy, so that when we got our information into the computer, it was right. My master's thesis was analyzed using punch cards, and because you didn't store anything online, those punch cards were saved for years in my closet, because you didn't want to lose data. So way back when, it was very hard to get data into a computer.

Today, I feel like we leave data behind like the dust cloud following Pigpen. Data comes out of us in so many different ways, and our digital footprints are only getting bigger and brighter. We leave data behind in so many places: the things we buy, the things we review, anything we withdraw, a phone call, an email, a text, what we search for, what movies we watch, how we look at different pieces of information, how cameras look at us, the reports we download, the PDFs we get, the prospects we put in Salesforce, what locations the phone tracks, how many steps are on the Fitbit, the scores we enter, what groups we belong to. All of these things are data creation. Unless we're off the grid, we are all data creators, and the data trails along behind us. So when we think about the data we've created and what those data look like, can we trust the information we're leaving behind?

If we think about data integrity, we can ask ourselves: how are we impacting data quality? I have worked many times in corporate health, where we're interested in helping people be healthier, so we'll send out surveys and ask every employee to fill one out because we want to know how to help people. Well, we ask you about the sweets you eat, whether you vape, how much alcohol you drink, and what your weight is. If you think about it, how many of you have fudged that just a little bit? How many of you have taken a few glasses of wine off the list or a few pounds off the weight? We affect the quality of data because, for the most part, respondents don't report accurately; they want to look healthier than they are.

We also choose whether to participate. We all get asked at the end of a service, or when we receive a product, to please answer a survey and rate things. Unless we are extremely excited about a product or really upset about one, most of us do not answer those surveys. On top of that, in political surveys, nobody wants to answer anymore. The reason polls are so terrible is that over the past 20 to 30 years, fewer and fewer people are answering them. The reason the polls are so bad is us, because we choose not to give our input.

We also decide what we're going to reveal and how accurate it is. Your next co-worker is going to be hired in part based on what they put on their resume, and yet Business Insider did a survey in which three quarters of people admitted they fudged how many years of experience they had, their skills, their education. So we affect quality in what we decide to put out into the data set about what we're doing.

Now, I could have any of you who are golfers raise your hand.
Just give a little shout out, because you know that if you play golf pretty regularly, in order to participate in any kind of competition you have to have a handicap. The way you get a handicap is that you enter your scores on an official website, and players enter their own scores on the honor system. What happens is that not everybody puts in all their scores. Some people, they're called sandbaggers, only put in their bad scores, so that when they're in the tournament, oops, they seem to win all the time because they're playing so well compared to normal. On the other hand, you might have people who really, really want to play on the A team, so they only put in their low scores. But then when they go into competition, they lose all the time because, oops, they're playing worse. So we choose how accurate our data are going to be: we forget to fill something out, we're reluctant to answer certain questions, we'd like to make things not look as bad as they are, and in this case we might intentionally manipulate the data that we put into a system. So we are all data creators. We all influence how good or bad the information is.

And we're not only data creators; we're data evaluators all the time. We might not ask the question explicitly, how do I decide whether I believe this, but we are making that decision all day, every day. We decide which product we're going to choose, and many of us look at the reviews. I know I do. And I looked into this just recently as I was preparing this talk. On Amazon, you're not allowed to leave a review if you haven't bought the product, and you can't leave one if somebody paid you. But that's not the case on Google. It turns out you can buy 55 five-star reviews for your product for $259 using a service like this. So we're deciding whether we believe the reviews, and they may or may not be real reviews.

We're also deciding whether our performance reviews are fair. For example, I worked at a company where the leadership didn't like how many fives there were on the one-to-five scale. They didn't want as many people up in the fours and fives; they wanted a bigger distribution. So they wanted everything pulled to the left, so that it looked like most people were a three, a few were ones, and very few were fives. And the employees were told, you know, your rating has gone down, but don't worry, it's not going to affect how we treat you or anything else. So employees had to decide: is this going to have an effect on me, just because they changed how they decided to use those data?

We also decide based on how we understand the data. I was on a jury maybe 20 years ago, and it was a terrible case. It was a rape case, but they had found a DNA match at the site where the crime happened, and it matched the suspect. The scientists told us that the DNA could not match more than one person in 10 billion. That made most of us on the jury feel confident, except one person who didn't believe in DNA and said, how can they know? It seemed like hocus-pocus to her. She didn't want to convict someone if she didn't understand and believe that data, which is a different topic because it's more about scientific literacy than data literacy. But we make decisions all the time based on what we believe.

So we can't ignore that people already use and interact with data. But we also need to ask ourselves if we're using overly technical language.
I was looking at curricula as I was preparing for this, and this is typical of what a curriculum looks like. We need to have people understand data quality, statistics, and some visualization, and specifically on data quality they had topics like internal and external validity, inter-rater reliability, test-retest reliability, all of these concepts that those of us trained in the scientific method and in data literacy understand. Data scientists understand this about data, but do we really have to have everybody understand all of these terms? Can't we say that data integrity is just, can we trust the data? Can't we just have people think about how they decide to believe something? Whether or not it could be wrong, whether it might have a bias in a single direction, whether we could influence that, or who does influence it, how that might affect the way we use the information, and who it might harm most. We don't have to use terms like inter-rater reliability; we can just ask really good questions and help people understand what's behind them.

Besides that overly technical language, does the way we talk about data literacy create an adversarial or derogatory dynamic? Many of you saw, if you joined us in March, that when we think about these things, this is kind of how it feels sometimes: the illiterates are coming to use our data, and we mustn't let them in. We get protective. We want people to know how to use the information, we want to protect how consistent it is, and we want to make sure that people aren't misusing it. And so we develop this us-versus-them kind of mentality. But I want to remind us who these "illiterates" actually are. Is it just a few people that we need to help? Well, no: 80% of employees have low data literacy. Three quarters of business decision makers have low literacy. Almost 70% of the C-suite, meaning the CFO, the COO, the CEO, have low data literacy. We are talking about the majority of people. So if we're saying Katie, bar the door, we are trying to keep the majority away from the source of data, trying to force them to learn more, and we don't want to let them in until they know what inter-rater reliability is. That doesn't move us toward talking to people in a way that avoids this adversarial relationship.

Oh, and there's a lot of finger pointing in articles about data literacy. They love to point out which industries are worse and which kinds of roles are worse, and they often point at human resources as the people who aren't performing. So if this is the majority of people, do we really want to insist that it's their job to learn about our field? The answer seems to be yes, a lot of the time. When we look at the journal articles and the management and leadership positioning papers, they tell us that 90% of business leaders believe that data literacy is critical. They tell us that data literacy is set to be the second language of business. They tell us that people should be able to do statistics, ideally interpret charts, and try to work with analytics. They have really high expectations. But the question is, how much do these folks understand, and in what industries do we want to force them to understand? Even doctors are having trouble with the amazing developments in AI, where they can use AI to help diagnose and to help choose treatments.
But a doctor who was trained 20 years ago doesn't understand digital twins and all of these wonderful capabilities. So now there is a call in those fields, and this headline really caught my attention: the next generation of clinicians all have to be data scientists. In some ways that's crazy. So every doctor, every CEO, every psychologist, every engineer, every HR director, every employee needs to become a data scientist? Are we asking every data scientist to become a doctor? Are we asking every data scientist to become an engineer? Are we asking the right people the right questions? And for those of you who were here in April, you know that I do have questions.

Do we assume interest and aptitude? Do we assume that everybody feels this way: oh, please, can I have more math? Please, can I spend my days looking at computer output? I don't believe that is the case. When we really look at how people feel and their aptitude around math, I want to remind us that a third of Americans don't know that a quarter of a pie is the same as 25%. More than half say they would rather smile and nod than admit they don't understand data or statistics. And one in five can't understand the simple math of a bank statement and relies on friends and family. So what are we assuming? Does it look a little more like this? Is there a certain amount of dread we are exposing people to, because we're assuming they all really, really want to become better at data?

You'll also recall, if you were with me in March, that I started to wonder whether data literacy should be separate from other things, and we talked about two other areas. One I called business literacy. What that means is that strategic alignment exists, which just says that everybody in the organization is pulling their oars in the same direction toward the key priorities of the business; we have alignment of people and processes toward the main goals of the organization. And as you would imagine, again the business journals and the leadership journals say that you have to help your employees understand your main strategy. They have to understand why they do what they do, the reason being that when people are aligned on strategy, companies grow revenue faster and are more profitable. So of course this is becoming a really big deal. But these folks are not immune from finger pointing any more than the folks touting data literacy. They say, oh, only 13% of frontline managers can name their company's top three priorities, and according to a study by PricewaterhouseCoopers, 93% of employees could not identify their company's strategy even from a multiple-choice list. So here we are thinking, wow, these things are really important, but again, the majority of people are not strategically aligned.

Next, I looked at people literacy, the emotional and social awareness that goes with being empathetic and understanding how to move culture. Once again, the business journals, the leadership journals, and the management journals highlight how important emotional intelligence is: that emotional intelligence is more important to career success than how smart someone is, and that half of HR leaders say they'll be hiring managers based on their emotional intelligence. Even more interesting, for companies that were trying to carry out a significant digital transformation to move things forward,
the success of those efforts depended on the ability of the group to empathetically move people forward and help them understand why it was happening and how it affected them. It had to do with how emotionally in tune they were, how people literate they were. And once again, in the finger pointing category, people majoring in science and business are identified as significantly lower on the scale of empathy than people in social sciences such as HR. So on the one hand, we're pointing at HR for not being data literate; on the other hand, we're pointing at data folks as not being as literate emotionally. And we spend our time focusing on one and not the other.

So what we concluded in March is that what we want is high literacy for all employees: we want data literacy to be high across all employees, people literacy to be high across all employees, and business literacy to be high across all employees. That might be what we want, but what we actually have is that less than 25% are data literate, less than 25% are highly people literate, and less than 25% are highly business literate. So if we are thinking about how we help people move from low to high literacy, shouldn't we be helping everyone get better in many directions? Rather than having one group in charge of data literacy, one group in charge of strategic alignment, and one group in charge of people literacy, can we not start to figure out how to align all of these things together? You might have one person with amazing data skills, decent people skills, and decent business skills, and another person who's really high in people literacy but low in the others. What if we have different combinations across different people? What we talked about in that episode was that maybe we ought to be thinking about how to highlight one kind of literacy within the context of the others. So rather than treating data literacy as a solitary solution that will fix our data-driven problems by making everybody highly literate and turning them into experts, why don't we put it into an integrated context: admit that data literacy will depend on strategic and social components, that realistically there are people with varying strengths, and that we may not have everyone become expert at everything, but we highlight people's strengths and then fortify any major weaknesses. So we were starting to think of this as a holistic answer.

We also should ask ourselves whether we focus enough on topics that are relevant to people. How do we make this relevant either to something that's important in their life or something that's important in their business? So I'm going to recap what we talked about in June for about five minutes here, where we were trying to describe governance in laypeople's language. What we talked about was that in modern society, we already use, depend on, compare with, decide with, and celebrate data. Just like we said at the beginning, we're all data creators and we're all data evaluators. And we pointed out that we take some data governance issues for granted: we can depend on things like time, temperature, dates, distances, speeds, dollars, weights, and scores, because standards are agreed upon in most cases worldwide, except for possibly the metric system and how we don't follow it. We rely on consistency. We also make comparisons based on that.
We know that it's colder or warmer, that my team won, that salaries will be different, that we've lost or gained money, that our health has improved, that we can choose a product, because we generally believe the data are collected and the standards are applied consistently, so we can compare things. We even make data-driven decisions about the future. It's going to rain, so I'm going to be outside Saturday, not Sunday. Traffic is backed up, so I'm going to go a different direction. I'm going to make decisions about when I make a purchase or when I'm going to travel. I may make decisions based on what we know astrologically, or astronomically, whichever word it is, based on what's going to happen with the planets and stars. We do those things because we generally believe the sources make good predictions, which helps us trust the things we base decisions on.

So my point was that we ask people to become more literate because that will help them make better decisions, decisions that are valid, accurate, and reliable. But people make those decisions already; whether they're great decisions or not, they're making them already. The analogy we used was about nutritional literacy. Even if you have low nutritional literacy, it doesn't mean that you don't eat. You still eat, whether or not you know much about what you're eating. You just make different choices than somebody who is highly nutritionally literate.

So what we talked about was that there is a parallel set of responsibilities. On the governance side, official entities ensure safety; they determine the consistency of metrics, definitions, processes, and transparency. Then the consumer has a responsibility, should they choose it, for how they use the information and the ways they understand it: do they know the difference between calories, sodium, and sugars, how those things affect them as a person, and how they might make their choices?

In governance, whether we are talking about food or data, we have collection, and we'll get into this a little more today: information about where it came from. We have information about how it's transferred: is it transferred safely, is privacy maintained, is it in the right kind of format, whether that's refrigerated fish or encrypted information in transit? Then it's transformed into something we can use, into the product that the consumer will eventually use. And the cumulative information from top to bottom is how we make better decisions about the product that we get.

On the consumer side, their role is certainly at the end, where they're just going to eat, because that's when it arrives on their plate or in their lunchbox. And it arrives in a format that we have chosen, essentially: are we looking at graphs or reading a report when it comes to information, or is it coming in a salad or in a burger? Lastly, we look at the preparation and manipulation. If we cook, then we actually move up that chain; if we go out and buy a product already completed, then we simply consume. That preparation may be whether we used logistic regression or truncated the tail end of the data distribution, the things that we did, or whether we fried it or grilled it or boiled it. So when we think about food or data, more literacy helps us further up.
If we know more about how to prepare food, that gives us more choices in the different ways we might get it and the ways we're going to consume it. So more literacy sits higher up in that food chain. Governance covers all five of the top steps, all the way down to consumption, whereas consumers for the most part only cover the bottom ones, unless they have, let's say, hunted and gathered their own food, gone out and shot a deer, in which case they cover all of the steps. It was much more common years ago that we had more of a role at the top, but today we have more of a role at the bottom. So as we think about how literacy helps us make better choices on the consumption end, whether it's data or food, the more literacy we have, the better information we have. And if we move somebody from being simply a consumer, where I'm just going to eat whatever you give me, up to selecting and preparing based on what they know, then we're asking them to become more literate. We're asking them to become more informed.

I wanted to extend this particular analogy, because I thought it was an interesting way to talk about data collection and food collection, and I found some surprising things. If a person has low nutritional literacy, they still eat; they just may choose differently. The same goes for a person who has low data literacy: they still consume information; they just may use it and choose it differently. But let's look at the top of the food chain, where we're talking about the choices being made around governance. The question you might ask is, can we trust the information that we are consuming? Why might we question it? What I'm going to do is ask questions about the data associated with those food sources, which is kind of fascinating.

First of all, are the labels on our food accurate? Can we look at the label and know what we're getting? It was very interesting: in 2018, researchers did some genetic studies on fish, and they found that 25% of fish served in restaurants is not what the menu said. Some of the worst examples: 55% of sea bass on the menu wasn't sea bass. 42% of snapper was not snapper. 47% of sushi was mislabeled, so whatever you were asking for wasn't what you got. 100% of Dover sole was actually walleye. And 0% of Chilean sea bass comes from Chile. This may affect what you decide to choose. If a consumer knows this is inaccurate, as long as the fish tastes good, do they really care? Or do they feel like they're getting ripped off or that there is something wrong with it? Probably more relevant, almost all salmon labeled wild-caught is actually farmed. It very often says it's wild caught, but it's not. So if you care from a climate perspective or a humane-sourcing perspective, this may matter a lot to you, and it may determine how you make your choices. But if you don't know, you won't know whether you are making a choice about that at all. Oh, and one interesting one: suddenly in the 1980s everyone was eating orange roughy, and I had never heard of it before. It turned out that its original name was slimehead. They changed the label, and then sales went up. So in terms of labeling, there are some other factors at work.

As we think about labels, let's also look at beef. This was another interesting thing. There are laws about country-of-origin labeling, but somehow lobbyists have made it so that beef and pork do not have to comply with that.
So it turns out 75% of beef consumed in the US that says Product of the USA actually comes from Australia, New Zealand, or Uruguay, but because it's packaged here, it's allowed to be called a product of the US. Additionally, 100% of South American beef is allowed to be labeled grass-fed and organic, even if it's not, through some treaty or trade agreement they have. And lastly, if they do genetic tests, just like they do on fish: if you're buying a specialty meat, something unusual like bison, 35% of the time it's wrong. 18% of ground meat from local butchers has more than one species in it, and even in grocery stores, 6% has more than one species. These are things I just happened to find out as I was looking into how we treat food: in what cases do we actually know the sources, and in what cases do we not, yet the product can still be labeled that way? We use data all the time. We think about data all the time. We just don't call it that.

Besides the naming, I was interested in the numbers. How accurate are the numbers? For example, what about the weight of the product you're buying? Does that seem like a big deal? Texas found that 4% of grocery scales were inaccurate, so you might ask for a half pound of ham and only get three-eighths of a pound. That didn't seem too bad. But meat sellers are allowed to add broth, so the chicken broth added to your raw chicken means you can buy a pound of chicken that's actually only three-quarters of a pound of chicken. Sellers have an incentive to list a higher weight because they make more money.

In terms of calories, what about those labels? The FDA only holds companies accountable for being within 20%. So if it says 400, it may really be 480; if it says 800, it may be closer to 960, because they're allowed that buffer. And at restaurants, most tests indicate that when they list calories, they list them artificially low. That's because restaurants have an incentive to make you feel better about eating their meals, so they put the number lower.

These are pieces of information based on data, where we have to evaluate at all times what we're seeing, how much we believe it, and whether we know it's biased. If we know they've added broth to everything, then we know that if something's bigger it probably has more chicken in it, but it probably also has more broth in it. If we go to a restaurant, we know the meal probably has more calories than it says, but if one dish says 2,000 and another says 1,000, we can still be fairly sure the 2,000-calorie one has more.

How do we make these decisions, and how do we help people make them? To me it's not about definitions and different kinds of technical terms; it's about asking basic, thought-provoking questions, asking people to really think about how they decide. Who has an incentive to be truthful? Who has an incentive to exaggerate, and who has an incentive to make a number lower than it is? What are the most likely sources of inaccuracy in the data you're using in your work? Are they accurate, and if not, what do we think is causing that? How does information about these data sources shift our thinking? Can we make sense of it in a way that helps us make a decision? Would the information influence your choice? Does the fact that 4% of scales in Texas aren't right make you less likely to weigh something there? Perhaps not.
Does the fact that 75% of beef labeled US isn't from the US, or the fact that salmon says wild caught when it's not, affect your choice? And who might these inaccuracies impact the most? So as we think about data integrity, we are thinking about trust. We're thinking about whether we trust the sources, and we want people to think about what they want to know and what else they might want to know.

The last question I will pose is one that all of you have probably heard me talk about in the past, and that is: are we mistaking data literacy for the ultimate goal, rather than a process by which we reach other goals? Do we really want everybody to be highly data literate? Or do we want people to behave in ways where they use timely information, notice problems and opportunities, ask better questions, make more informed decisions, and everybody has an opportunity to understand the data they're using, without forcing everyone to become highly skilled at statistics and analytics, which often seems to be the perceived goal?

So I'll conclude with my bias, which is that all of us have strengths. Some may be analytic experts, like so many of you who have joined me today. Some may be business experts who don't have a lot of literacy in people or data. And some people are quite good at everything without being the full expert in any one area. Those people can be translators, the go-betweens, rather than asking everyone who's an analytic expert to become a business and people expert, and asking everyone who's a business expert to become an analyst with high emotional intelligence. If we make use of everybody's strengths, then collectively we can move farther in data literacy, have more empathy, and be more strategically aligned, in ways that encourage people: rather than focusing on their weaknesses, we leverage their strengths.

So that is my take on data integrity and the ways we should perhaps be thinking about it, or at least the questions we ought to be asking ourselves as we move forward in this effort to help people use information. And I welcome any questions you have, as we have a few more minutes. So Shannon, do you have a couple of questions for me?

We do, Wendy. Thank you so much for another great presentation. Just to answer the most commonly asked questions: a reminder that I will send a follow-up email by end of day Monday for this webinar with links to the slides, links to the recording, and anything else requested throughout. If you have any questions for Wendy, feel free to put them in the Q&A portion. So diving in here, you know, we've heard this before, Wendy, as we've started and launched this series: quote-unquote data literacy is such a terrible label. Here we are calling the business folks whose support we seek illiterate. Can folks stop using this demeaning and insulting label?

Yes. This will be a topic later in the year, where we're going to try to explore other labels. It's interesting, because I think so many of us really dislike, and I wasn't going to go further than that, really dislike that label. I am hopeful that as we become more holistic, we start to talk about it in a different way. But think about why, and Shannon, you and I were in a discussion about this at the beginning: why did we label it data literacy in the first place? It's because it's a term where people now understand what you're talking about.
So yes, I am hopeful, and I totally agree with you. I would rather not call it literacy, and we'll be exploring some of the terms that are being used now in other places, because there are a lot of folks who are worried about it. So, very good question.

Indeed. So, how do we measure that we are making progress with data literacy?

That's a really, really good question. You know, we did a session, I don't remember now which month it was, maybe April, on assessments of literacy. In many ways, those are baseline assessments; in some cases they can also measure how much more literate a person has become. The one assessment we dove into is called Databilities, which I think covers 15 different skills on a one-to-six scale. So there are metrics, if you specifically want an attribute called data literacy. However, my bias, and again it's a bias, would be that we find a way to see if people are more comfortable using information: whether they use information, whether that's because they went and got it themselves or because a translator helped them understand it when they were making decisions. Could we assess whether or not people are grasping the information that is available to them and making use of it when they make decisions? That to me would be the real measure, not the information they can regurgitate onto a test. Now, don't get me wrong, those assessments can be very helpful at the beginning to understand where you're starting. But I'm not sure, as I said, that data literacy is the goal. Data literacy is one step toward people being capable of using information, whether that's because they become more literate or because they work with people they trust who are more literate.

Makes a lot of sense. So, would you agree that the level of data literacy needed is based on one's role and responsibility in the organization?

The required amount is, I'm guessing, what they mean. I think it isn't necessarily by role; it should be tailored to the way people will be using information when they're making decisions. If somebody is at a very basic level but needs to understand how they are performing compared to their peers, or whether they are doing the things the bosses want them to accomplish, it's important for them to understand the metrics that reflect that. If, on the other hand, they need to be making predictive kinds of decisions, then they either need to understand how predictions happen and how accurate they are or aren't, or they need to work with a translator who can help them understand in layperson's language.

Perfect. And we've got about seven minutes left, so time for a couple more questions. How should we define the term, quote-unquote, data in today's age?

That's a really good question. I think you saw my list at the beginning: anything that provides some record of an event, an amount, or an opinion. More and more, we're leaving behind those trails of information, so I don't think we can limit it to the things we used to think of in data collection, something numeric that you could quantify. So much more now is in natural language, and so much more now is in sentiment-type perceptions: negative versus positive, enthusiastic versus not. So I believe that our definitions need to get broader, but I can't give you an exact definition off the tip of my tongue.

I love it. So, when does data literacy have a negative impact on insights?
I think it depends on how it's defined, because the place where I could see it going the wrong way is that sometimes the perfect gets in the way of the good. If we train people, let's say, that they have to have 100% confidence in the reliability of a certain metric, or that it needs to have properties so reliable, so valid, so consistent that most real-world metrics can't achieve them, then it could lead to a situation where people are afraid to use any data, because they're afraid they're going to get challenged on it or that it's not good enough. We have to understand what the criteria are for making a decision and what the consequences are, because in business we don't always have to have perfect information; in fact, we rarely have perfect information. Decision makers just want to be confident that they're making a better decision than they would have without the information. Sometimes ordinal is fine: if we're pretty sure that directionally we're right, then they will go there. So while I believe that understanding more, asking good questions, and really knowing the limitations of data are good things, we don't want to train people to the point where they are frozen, or reluctant to make use of data because they've been taught that some data sources are too terrible and they're afraid to use them. So, good question, because I could see it going that direction in the wrong cases.

Indeed. So, one more question here, I think we've got a little less than three minutes: is data literacy domain-specific? I'm stumbling over my words here.

Yes. Rather than calling it domain-specific, what I would prefer to say is that we need to really focus on relevance. What we learn should have a context related to what we are working on or what we are familiar with, and it needs to be relevant in such a way that we understand how it may affect some of the choices we make, or at least have us hypothetically thinking about what choices we might make. You might say that for people who are making decisions about well-being, let's say decisions with the potential for catastrophic errors of some kind, you would have higher standards about what people need to know. If, on the other hand, the consequences are not dire, say somebody put pepperoni on a pizza instead of mushrooms, it may matter less, though I'll probably get people saying, yeah, but I'm allergic to mushrooms. If we think about the consequences of the choices, that should probably be how we decide the context of what we are teaching people.

Very, very nice. And Wendy, congratulations on six months of webinars in the series. It's so exciting. I'm so grateful to have partnered with you on these webinars. Your presentations are so fantastic, and I'll send you the chat, because you can see from the chat that everyone's been enjoying it, so thank you so much. Thank you, everybody, for being so engaged in everything we do, but that is all the time that we have for this webinar today. And just a reminder again, I'll send a follow-up email to everybody by end of day Monday with links to the slides and the recording from today's presentation. Thanks, everybody. Thank you, Wendy.

Thank you. See you next month.

See you next month.