Hello and welcome. My name is Shannon Kemp and I'm the Chief Digital Officer of DATAVERSITY. We would like to thank you for joining the most recent webinar in the DATAVERSITY Monthly Series, Elevating Enterprise Data Literacy with Dr. Wendy Lynch. This series is held the first Thursday of every month. Today, Wendy will be joined by Jane Crofts, the founder and CEO of Data to the People, and Heather Wilson, business analyst at Evergreen Health, to discuss data literacy assessments. Just a couple of points to get us started. Due to the large number of people attending these sessions, you will be muted during the webinar. If you would like to chat with us or with each other, we certainly encourage you to do so. Just note that Zoom defaults the chat to send only to the panelists, but you may absolutely switch that to network with everyone. For questions, we will be collecting them through the Q&A section, and to find the chat and Q&A panels, you may click the icons at the bottom of your screen to activate those features. And as always, we will send a follow-up email within two business days containing links to the slides, the recording of this session, and any additional information requested throughout the webinar. Now, let me introduce our panelists for today. Jane Crofts is the originator of Data Abilities and founder of Data to the People. Drawing on her background in business analysis, supply chain re-engineering and process improvement, Jane brings a pioneering approach to data literacy. She is a data literacy evangelist based in Australia. Thank you, Jane, so much for getting up so early for this. Jane works as a consultant, author, teacher and advocate of data literacy all over the world. Jane is a member of the Data Literacy Project's Advisory Board, a group of respected visionaries brought together to further the future of data literacy globally. 
Heather has a passion for democratizing data and works to make data more accessible to the employees of Evergreen Health for better patient care. And let me introduce the speaker for the series, Dr. Wendy Lynch. Wendy is the founder of analytictranslator.com and Lynch Consulting. For over 35 years, she has converted complex analytics into business value. At heart, she is a sense maker and translator, a consultant to numerous Fortune 100 companies. Her current work focuses on the application of big data solutions in human capital management. In 2022, she was awarded the Bill Whitmer Leadership Award for her sustained contributions to the science of corporate health. As a research scientist working in the business world, Dr. Wendy Lynch has learned to straddle commercial and academic goals, translating analytic results into market success. Through this experience, she has created her book Become an Analytic Translator and an online course. And with that, I will turn the floor over to Wendy, Jane and Heather to start the webinar. Hello, and welcome. Thank you, Shannon. I'm always impressed. I could never speak that quickly and so clearly, so thank you for that introduction. Welcome to everyone who is joining for the first time, and welcome back to those of you who have been here before. We have a real treat today. Both of our guests have amazing stories and wonderful information to share. Before I get to that, I am going to remind us where we've been. This is the fourth in a series, hard to believe. And I want to remind all of the listeners of the ground that we've covered that gets us to this content around assessment. 
So when we think about assessment, we have to start with the original idea of data literacy. The whole idea of data literacy, I think, reflects a concern on the part of organizations and educators that, for the most part, a lot of people are unaware of the data resources that are available, or they're not using the data that they know about, or they're misusing or misinterpreting what they do have and not using tools that might be available. So we'll hear a variety of different definitions today from our guests. In general, what we hear is that data literacy is the ability to read, write and communicate with data. Depending on who is talking about it, it also includes a whole variety of other skills that range from collection to interpretation to actually analyzing data. And that breadth of capabilities is reflected in a whole suite of advertisements about why data literacy is so critical. It is at the top of the list for many organizations because of results like this: one study showed that companies with the highest levels of literacy, what they called mastery, actually measured 70% higher revenue per person. We see studies like this in journal after journal, management journals, highlighting the importance of data literacy, which leads us to numbers like this: 90% of business leaders believe that literacy is going to be critical to their success. So we can ask the question, why assess data literacy? We'll review some of the things that we have covered. First of all, we want to inform stakeholders and create a baseline. We want to do this for a variety of reasons, not just because measurement is important, but also because there is misunderstanding about literacy levels. Currently, leaders overestimate literacy: in one survey of leadership, roughly 70 to 75% of leaders thought that most or all of their workers were data literate. 
Yet when you survey the individual workers, we find that only one in five is really confident in their data skills. One study even said that fewer than 10% could be rated as having high literacy. So are we making assumptions that we need to correct? Let's find out exactly who is literate rather than assuming that most people are. We also want to gauge interest and openness toward this idea of literacy. I find that most of the proponents of literacy are people who love data. They love it. They think anybody could be data literate, and they think, wow, everyone's going to be so excited to have a chance to learn about data. But what we covered a few months ago is that the statistics show a lot of people don't love data, and they find themselves dreading, or even having anxiety attacks about, this idea of having to learn about data. So we need to have empathy and understand what it is that people are open to. We also want to prioritize which skills we want people to learn. Last month we talked about a whole variety of different types of skills, everything from being aware all the way up to interpreting and analyzing data. So if we have this stair step of skills, where are people now, and what skills do we want to prioritize? We also want to prioritize the right people. Who should we target? We may know what skills people need, but which people need which skills? We've explored the idea of whether it's realistic: is everybody supposed to become literate? Is everyone supposed to become highly literate? Is it for every organization and for every employee? If we have people who are at the lowest level of literacy, what kind of information can they take advantage of? Do we need to get a good portion of people up to a moderate level of literacy, knowing how to do basic queries and manipulation? What portion of people have to become highly literate, knowing how to do modeling and sophisticated types of advanced analytics? 
These are questions that everybody seems to be grappling with as they think about what we are going to do and how we are going to become literate. We have to ask which skills; we have to ask which people. And then we also need to ask: how well are we doing, and can we track progress? If we are setting a course for certain skills and certain people, can we measure it over time and demonstrate change? Are these metrics things that change in an obvious way? So how are we going to do this? In the past six weeks or so I've been looking at a variety of different assessments that are out there. And just as the definitions are all over the place, so are the assessments. I'll explain a few of the types that I've seen, and then we'll hear from our experts on the call today about the way that they've taken it on. One of the most basic levels is just to assess how many people are using the tools that are available to them. How many people are signing into dashboards? How many people are using Excel, or various other tools? You can use that as a benchmark. Then we have outcomes like usage rates, where they may report the rate at which people use the different skills. The next level, I would say, is self-assessment. Self-assessments include things like: how comfortable or familiar are you? What do you need to do your job better? The other type of self-assessment is: how good are you at certain things? You'll hear from Jane today about a variety of different skills that her measure self-assesses. Now, the results for self-assessments usually come back as proportions of people who fall into each category. Each of these metrics divides people into certain categories, certain types of people, and they usually have fun names, rather than saying completely dorky and illiterate. They might say that the person is initially curious, or that they're interested. 
So, we divide people up into personas. The next level of assessment, which is quite interesting, is actual tests, where we have a multiple-choice set of items to see whether somebody can correctly identify certain concepts or certain ideas. Those tests might measure whether you can do calculations, whether you can read a table, whether you can interpret a graph, and you are given multiple-choice answers to see whether or not you can do those correctly. And I have to tell you, Shannon, that I wondered whether you would want to fire me, because on some of them I could not answer the questions because I couldn't figure out what they were really asking. So, I may be, according to some of these assessments, quite illiterate. And lastly, there are tests that have you demonstrate certain skills: which of these choices represents the right command or the right code in order to achieve something? What we get there is an actual score. Wendy is a C minus and she doesn't score well, or some other score like that. So what this gives us is a broad variety of options, which is what brings us to this whole topic in the first place. What skills are we measuring? Who are we trying to target? How good do we need them to be, and what are we actually measuring? Is it their use of something, their self-concept about how good they are, their ability to answer questions? I will finish up this preview by reviewing what DATAVERSITY found over the winter when they did focus groups about data literacy. One of the questions they asked was about barriers. So we'll finish up with this as a review: if we're thinking about data literacy, and we're thinking about trying to improve it and assess it, the barriers that they brought up the most were: I don't know that my people have time. I don't know that we have money to invest. I'm not sure that I can get buy-in from people across the organization. 
I don't know who's going to own this approach. I don't know whether this fits in with all of our other educational and training goals. And I don't know whether it should run for a set period of time or whether it goes on forever. So all of these questions lead us to where we are today, which is to hear from two people who have two very different experiences. We'll start with Heather, whose experience is from the inside of an organization, starting from scratch. These are the two folks you will hear from. You heard their introductions, and we'll also put the bios in the review. And just to let you know, my questions for both Heather and Jane were: what was their initial interest in data literacy? What were the goals that they set for the assessments that they used? How did they decide what they would assess? And what did they learn in the process? So Heather, why don't you take it from here? Thank you so much, Wendy. I'm really excited to be here and talk to you all. Just to let you know, we started this a while ago, so I'm going to let you in on the buildup to the assessment. About three years ago, the analytics and reporting team was not the place you would go to get trainings at Evergreen Health, but it was very clear to us that something else was needed. I developed dashboards and had created a training environment in our BI tool, and the utilization was almost nothing. We had to backpedal on self-service analytics. Reports would go out and people were clearly frustrated with basic Excel functions. We also wanted to increase dashboard utilization, and we were curious about how we could support the organization in basic reporting skills. As I said, we were a team of analysts, we weren't trainers, but we had to change our mindset around that. And this is when we started to look at our organization through the lens of data literacy. 
We knew if we were going to increase literacy, we needed to actively work towards impacting culture and skill, and we needed a way to measure our success. We were able to implement our first literacy initiative fairly quickly and started a data literacy resource group in April of 2021. The group has since been rebranded and is now called the analytics squad, complete with fire emoji. I just want to bring that up because, for our organization, the assessment was not the very first step in initiating a literacy program. The resource group actually helped us get a closer view of what our organization wanted, what they needed, and what they were hungry for before we even launched the assessment. It also helped us find our data champions. We wanted to develop more formal trainings for the organization, and by this point we had learned a lot more about data literacy initiatives and knew that a data literacy assessment was a best practice for starting a formal data literacy training initiative. We wanted to do our own. We had support from our CIO, but we still needed more leadership support across the board and didn't really have a budget for an assessment. So in the summer of 2022, we started building the Evergreen data engagement survey, and we launched it in September 2022 as the first step towards building a formal set of data trainings that we would call our learning pathways. I was the primary person building the assessment, with support from other teams who more traditionally sent out surveys to our organization. And I just want to bring up that the goal of the assessment is something you should ask yourself about again and again if you're building your own. It is so easy to get caught up in all the different assessments and all the different goals of assessments out there, and I think bringing ourselves back to the goal of our assessment was really key. 
I still remember a conversation where I was showing the director of analytics and our data governance leader examples from assessments all over the place, and our director of analytics stopped me and said, what is our goal here? It really was an aha moment in deciding the questions I needed to include and the ones I didn't. The goal with this assessment was really to create data literacy learning pathways successfully, and to make sure we had information to give leadership supporting how many people would benefit from these learnings. We did want to measure data literacy maturity somewhat, but it was really secondary to determining the needs of our organization. We also knew, from some of the things that Wendy mentioned about all the assessments out there, that we didn't want the assessment to be scary to people. We wanted it to be accessible, like everything else we would do in our data literacy initiative. We learned a lot from the Data Literacy Project, and I loved taking their data literacy assessment for myself. But I knew I didn't want questions about specifics in data, like central tendency or outliers, for people who had a wide range of data literacy and data experiences. We changed the name of our assessment from data literacy to data engagement; we wanted to know how people interacted with data and how important they thought it was. So we came down to three main types of questions for the assessment. What are people's roles in data, which would help us determine what learning pathways they would take and also demonstrate the need for data literacy training. What are people's experiences with data, which would help us measure data literacy and engagement. And what systems are people working with, to help us develop trainings around those. 
While we were developing the assessment, we had also just launched the Evergreen glossary and were trying to normalize standard language and define terms across the organization. So we started our assessment by defining terms for people to represent their experience with data. Here are the terms that we used. They're pretty familiar terms, and we used definitions that are pretty close to what you would find if you looked up these words in the dictionary. This is what one of our experience questions looks like. This wasn't used to determine learning pathways; it was more to understand how people would self-report about their experience with data. And I'm really interested to see what this will look like when we send out our second assessment. This question, from the Qlik report The Human Impact of Data Literacy, is one of my favorite assessment questions I've found anywhere. The report has other great questions too, and the link to it will definitely be with these resources when you receive them. We developed the data roles style question for our organization, and as you can see, it shows how employees interact with data, with 75% reading and 65% reading and interpreting; our numbers were actually even higher than this. Here's how we adapted that question for our data engagement survey. As you can see, we not only used this question to get a picture of people's roles in data, we also categorized the different ways people interact with data to fit into the learning pathways we wanted to develop. We assigned points for each answer that counted towards a category, and as an effort to engage people in the survey, when we sent this out, anyone who answered with their name, which was optional since you could also take the survey anonymously, would receive their results and their highest suggested learning pathway. 
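To make the mechanics concrete, the pathway-scoring idea Heather describes could be sketched roughly like this. The answer options, pathway names and point values below are invented for illustration; the real survey's categories are Evergreen's own.

```python
# Hypothetical sketch: each survey answer counts as a point toward one
# or more learning pathways, and the respondent's highest-scoring
# pathway is suggested back to them. All names here are illustrative.
from collections import Counter

# Map each answer option to the pathway(s) it counts toward (assumed).
ANSWER_TO_PATHWAYS = {
    "I build dashboards": ["Data Visualization"],
    "I enter data into systems": ["Data Entry & Quality"],
    "I read reports to make decisions": ["Reading & Interpreting Data"],
    "I pull my own numbers in Excel": ["Excel Skills", "Reading & Interpreting Data"],
}

def suggest_pathway(selected_answers):
    """Tally a point per matching answer and return the top pathway."""
    points = Counter()
    for answer in selected_answers:
        for pathway in ANSWER_TO_PATHWAYS.get(answer, []):
            points[pathway] += 1
    if not points:
        return None
    # most_common(1) returns the highest-scoring (pathway, count) pair.
    return points.most_common(1)[0][0]

print(suggest_pathway([
    "I pull my own numbers in Excel",
    "I read reports to make decisions",
]))  # "Reading & Interpreting Data" (2 points vs 1 for Excel Skills)
```

The design choice worth noting is that one answer can feed several pathways, so the suggestion reflects the overall pattern of someone's data work rather than any single response.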
This question really unlocked how we built some of the best questions in our survey. We used this format to ask about dashboards, data visualizations, data entry, data collection, metrics, Excel and data quality. Here are some other questions that we also included. We had a few extra questions about Excel because it's used so widely across our organization. I really loved just asking, would you benefit from further Excel training? It was a yes/no question, and a good portion of the people who responded to our survey answered yes, and some of those weren't even people who use Excel in their daily work. As for what we learned along the way, here's some information we got from the actual results of the survey. About 6% of the people who answered said that they read, or read and interpret, data. And just to give you a better picture of that, about 10% of our workforce ended up answering the survey, which our quality team told me was a pretty good rate of success. They said they would benefit from our Excel training. 48% of the people who answered with their job title were mid-level management or executives, and we were pretty happy with that because we worked pretty hard to get engagement from those areas. 33% of the people were patient facing, and 18% were in another admin role. And as you can see, for some of our roles questions, we also figured out what people were saying about the tools they had access to. 34% of people considered dashboards an integral part of their job. This question only populated if a person said that they did use dashboards, so obviously we have some work to do on raising that. 33% of the people are confident in processes around data quality in their department, and we definitely hope to raise that. 
Something I realized after the assessment is that I wish I had put in more questions about designing dashboards and data visualizations, for more creative people who might find some interest in data from a visual perspective. But to be fair, we also did not have very many people who design those things answering our survey. So that was more of the big-picture learning that happened along the way of developing a data literacy program, and also the assessment. Unfortunately, I can't remember who said this, and I would love to give credit, but there was a webinar I attended at one point where someone who had started a data literacy training said something that really stuck with me: whatever you create will be better than what you had before, which was nothing. You'll never develop the perfect assessment or the perfect literacy program. Analysts are used to working on iterative projects, and it's okay to add on to it later. I'll probably throw out some questions from my assessment, but I have a few solid ones I know I can include next year to measure against. And I'll probably try out a couple of new ones to see what happens with people's answers. Also, learning about the people at your organization is the most important thing; it's what an assessment is. You're trying to understand other people at your organization. To shift culture at your organization, you have to connect to people, and you have to create visibility for your data experts. It has been so great to get to know other people in the organization and watch at least some of them get excited about data. We encourage people to ask questions. We've changed the way people see our analytics team. We want people to be curious. We want to encourage people to work with us and not judge them for the things that they don't know. 
We try to be approachable, and we ask people directly what they'd like to learn about, and then we teach them about it. Don't be afraid to endorse your projects. And if someone does ask to learn about something specific, make sure you make that available for them. Being a data literacy teacher is a lot of grassroots marketing. The last thing I wanted to tell you is to be a data literacy learner in order to be a data literacy teacher, which I know sounds a little cheesy even when I say it out loud. But there are definitely things that I learned along the way that I found myself wishing I knew sooner. Wendy actually asked me about this before the presentation, and being the amazing analytic translator that she is, it really had me look back on all the things I wished I had already known, or whether I wanted to do things differently. But really, I think learning along the way, learning about data literacy continually, has actually made what we are developing better. I'm a data literacy learner, and so is anyone who comes in and learns something from me. I'm open to their feedback, and I'm flexible to change things as I go. We've taught people so much in the past couple of years, and people are not afraid to ask us for help with things. So I really value being a learner alongside the other people who are learning this. And that's pretty much it. I just put our learning pathways at the end in case anyone was curious about all of the pathways we're developing. Thank you so much, Heather. I can't think of a better way to showcase how a data literacy program gets started than to have somebody walk through the starts and stops and the origins and the decisions that you made. So I really appreciate that you were willing to share this. And I'm guessing there are a lot more people, not just me, who appreciate learning about all of this. 
So now we're going to change directions, and I will hand it over to Jane to talk about her learnings in providing assessments to people across the globe. Thank you very much. I'm Jane Crofts, founder of Data to the People. Our mission is to build and nurture lifelong data literacy across the globe. My interest in data literacy came from my consulting experience in business intelligence, where I was constantly witnessing a disconnect between those who could speak data and those who couldn't, and I wanted to do something about it. It actually took me some time to put a name to the issue I was seeing. I called it a tension or a disconnect, but it was more than that. It was about being able to communicate. It was a language. This was in 2017, when the term data literacy was beginning to poke its head out and gain traction. So we had a situation where this buzzword was starting to gain traction, and businesses were talking about the importance of building enterprise-wide data literacy. But if you poked under the hood and dug a little deeper (I'm jumping well ahead here), there was no agreed definition. There was no agreed test or standard. And in the absence of an agreed definition, test or standard, I thought, oh, I'll have a go at this. My quest had begun. I needed to understand what this term really meant, how it could be measured, who was doing something about it, and who were the authorities in this space. It didn't take me long to discover this was in fact very fertile ground. There was a lot of talk, but not the same degree of substance. So I built Data Abilities. During my quest to discover what data literacy really meant, I came across a study conducted in 2015 by a group of researchers at Dalhousie University in Canada. They published a paper, Strategies and Best Practices for Data Literacy Education: Knowledge Synthesis Report, and it was an incredible piece of work. 
It was a very detailed compilation and analysis of 15 years' worth of work from both academic and private sectors, all with the intent of coming to a clear view of the building blocks of data literacy. Can you... we lost your sound there for a minute, Jane. I'm not sure if there's a network issue; maybe try closing anything that's not Zoom, and let me know. Okay, so we're coming in and out. Let me just triple check where we're going with this. I've got pretty much everything shut other than what we were looking at. You're sounding good now. Fabulous, I will come right back to it. So where were we? The study covered more than 90 resources across the globe, and it resulted in this matrix. The matrix, and the 75 pages that support it, outlined the key functions of this broad term of data literacy, and it gave me the bones of the data literacy framework, which became Data Abilities. I spent a number of years in the public service in Australia, and one thing I grew to love in the public service was a good old competency framework. These frameworks break down every job across every function into discrete blocks of competency and behavior. They give precise examples of what is expected in each role, and also give you a definitive example of what would be expected in the next or higher role. It was all laid out on the table. If you wanted a promotion from job A to job B, you needed to demonstrate that you ticked all the boxes on the competency framework for your existing role. But more importantly, you needed to show how you were already on your way to ticking boxes for the next role, preferably having ticked a few along the way. Data Abilities is the product of the Dalhousie University research combined with my love of a competency framework. It takes the 23 functions identified in the knowledge synthesis report and breaks these down into six levels of progression, so that individuals can see, I'm at level one here, and level two here, and so on. 
It allows individuals to see definitively what skill and behavior they need to demonstrate to reach the next level. Since its release in 2018, Data Abilities has had a few enhancements. Our original framework was structured around 15 competencies, and after early client engagements in 2019-2020, we completed a detailed review and expanded the framework to 18 competencies. We now also have the framework and associated tools available in both English and French. One thing that hasn't changed, though, is the six levels of progression. I won't go through each of these individually, but I wanted to show the complete set of competencies and how they sit within the four domains of data foundations, reading, writing and comprehension, to give you a sense of the breadth of the framework. Data Abilities is accompanied by a range of tools and methodologies, providing our clients with a clear pathway to developing data literacy across their business. The first phase, measure, is our opportunity to understand the current environment. The organizational assessment, which is conducted through individual self-assessments, provides us with the ability to look at the current state of data literacy at a competency level, by the functional, structural and geographical segments that best reflect the business. Our map phase is about understanding what good looks like for different job functions or job families across the organization. There are competency profiles for different roles that outline the desired future state, which we can then use to undertake a gap analysis. By comparing the current to the future state, we know exactly which groups of employees need assistance in which competencies, and equally which groups are already very strong in particular competencies, so we can celebrate and showcase their skills. 
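The measure-then-map gap analysis just described can be sketched in a few lines. The competency names and level numbers below are invented for illustration; the real framework has 18 competencies across six levels of progression.

```python
# Minimal sketch of a competency gap analysis: compare a cohort's
# current (self-assessed) levels against the target profile for their
# job family, and report where they fall short. Names are illustrative.

def competency_gaps(current, target):
    """Return {competency: shortfall} wherever current falls below target."""
    return {
        comp: target[comp] - current.get(comp, 0)
        for comp in target
        if current.get(comp, 0) < target[comp]
    }

# Assumed example: one cohort's measured state vs its desired profile.
current_state  = {"Data Discovery": 2, "Data Visualization": 4, "Data Ethics": 1}
target_profile = {"Data Discovery": 3, "Data Visualization": 4, "Data Ethics": 3}

print(competency_gaps(current_state, target_profile))
# {'Data Discovery': 1, 'Data Ethics': 2}
```

Competencies where the gap is zero or negative drop out of the result, which mirrors the point above: those are the groups whose strengths you can celebrate rather than train.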
This data-driven approach to developing data literacy gives us a very clear roadmap for areas for development, which we're then able to support with competency toolkits and resources contextualized to our clients' needs. Finally, we reassess using the exact same measurement tool as the first phase to measure improvements and refocus training efforts and resources. We continue to get great feedback about the validity of the tool and the insight it's able to offer regarding the state of data literacy within the organization. To date, we've had over 28,000 individuals go through the data abilities assessment. This is across both organizational clients and also our global data literacy benchmark study. The global data literacy benchmark was developed in response to the demand we've had for benchmarking or comparisons between client results and their peers or a related industry group. We've published two cycles of the benchmark and a third will be released by the end of this calendar year. As part of the assessment, we work with clients to help them create the right environment for the study within their organization. Not surprisingly, the importance of having strong, clear, consistent communications about who, what, when and why of the assessment is critical to getting initial engagement with employees. But beyond that, keeping those communications going, maintaining constant presence whilst the assessment is underway, has been key to boosting participation rates, which leads to a more robust analysis of the organization. What we've been very interested in is the very quick shift from, okay, let's do the assessment to, so what's next? And I think the speed of this progression has been a surprise to some of our clients too. This is why it's so important that the assessment is only part of a comprehensive, well planned data literacy program. 
It's so important to capitalize on the employee interest that comes from the initial assessment and follow up very quickly with the next phase. That next phase looks different across organizations. Some might take a very targeted approach, working team by team on specific competencies. Other organizations offer a buffet, if you will, of tools and resources, allowing employees to self-select and complete them independently. Whatever that next phase is, it has to be ready to go whilst interest is still at its peak. At this point, I'm going to thank everyone for their time, end my presentation, and hand back to Wendy for what I'm sure is going to be an interesting Q&A session. Yes. Thanks, Jane. Thanks, Heather. A couple of questions from me, and then we'll get to the questions from the audience. Jane, does your assessment (and I think I've seen the answer to this) provide a... what is the scoring when I get an answer back about my people as an organization? Is there a label you use to describe people, similar to the way Heather had novices all the way up to wise users? Certainly. We measure in six levels of progression, level one through to level six. What we found was that level one, level two, three, four, five, six didn't resonate with anyone outside of our team who were doing the analysis and reporting. So as part of the global data literacy benchmark, we introduced three cohorts, which were aggregations of the levels one through six: the curious, which was levels one to two; the confident, levels three to four; and the coaches, which was levels five to six. We found that those three terms resonated much more strongly with users, and so we've adopted them when we're talking about particular cohorts. But we don't tend to focus on the scores.
What we're really looking for is that gap assessment, where we can say a particular cohort of employees has been identified as needing higher levels of competency in a given area, and we're finding that lacking at the moment. But when we do discuss scores and the results of particular assessments, we alternate between those individual levels of one through six and the curious, the confident and the coaches. Great. Thank you. I wanted to clarify that so people got a sense of what they get back. So, if I could ask each of you: thinking about organizations that have not yet done assessments, and maybe have not even figured out what their journey is going to look like in terms of building literacy, what are a couple of questions you would have them ask themselves or their organizations to get started? Or, if not questions, a couple of considerations that you think are imperative as they're getting started, especially since, Heather, you went through this over the past three years, and, Jane, you do this with organizations all the time. What are the two or three most important considerations or questions as they get started? I'll start with you, Heather. And if you're not off mute, I can't hear you. Oh yeah, I'm actually just thinking about it. I think that we really needed to ask what was working for people already and what wasn't. We had a pretty successful data governance program at Evergreen Health that we also put together, and it was really successful, even coming from a pretty non-invasive place, similar to how this was built.
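The level-to-cohort aggregation Jane describes, levels one to two as the curious, three to four as the confident, five to six as the coaches, could be sketched like this. The individual level data shown is a hypothetical example:

```python
# Sketch of mapping the six levels of progression onto the three cohorts:
# curious (levels 1-2), confident (3-4), coaches (5-6).
from collections import Counter


def cohort(level: int) -> str:
    """Return the cohort name for a given level of progression."""
    if not 1 <= level <= 6:
        raise ValueError("levels run from 1 to 6")
    return ("curious", "confident", "coaches")[(level - 1) // 2]


# Aggregate a set of individual assessment results (hypothetical data)
# into cohort counts for reporting.
levels = [1, 2, 2, 3, 4, 4, 5, 6]
counts = Counter(cohort(lvl) for lvl in levels)
print(counts)  # cohort sizes across the hypothetical results
```

The same function supports alternating between the two vocabularies: individual reports can show the raw level, while organization-level reporting rolls up to the three cohort labels.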
It was really helpful to work with our data governance team; data quality had some great input too. Just asking other people for their input on the needs of the organization was really important, as was getting insight from people who had built other types of training. We were also able to access the clinical systems team, who do a lot of our training on the systems that we use, and talk to them about common challenges in the organization. So I think those were really important people to ask questions of, and great questions to ask. So it's: figure out what's already working in the organization, and learn from the folks who are doing something similar or related about how they go about it. Yes. Okay, that's great. Anything else that you would have people think about as they're getting started? Partially because it was such a common topic in any data literacy conversation that I was part of, I think how to get leadership engaged is really important to think about right from the beginning, even if you are starting in a more grassroots sort of way. We really have tried to work on getting some visibility and making sure that leadership knew what we were working on, especially some of our bigger projects along the way, just making sure they knew what was coming. That was really helpful, especially with the assessment itself. Yeah, I think leadership buy-in is going to be critical for everybody when it comes to this. So, Jane, how would you answer that question of the first few things they should consider or questions they should ask themselves? Thank you. One of the most important things that I talk about quite a lot is that this is not a one-size-fits-all approach, and I think we've seen that in Heather's presentation, and hopefully you picked it up in some of my comments.
I think it is entirely wrong to think that every person inside the organization needs to have the same set of competencies and be at the same level of competency as everyone else; that's just not how we function, and indeed, if every person were entirely capable in the same competencies and to the same degree, we would be a very, very boring population. So it's very important to understand that, and to really appreciate that different roles within the organization are going to have different needs. Once we get to the crux of that, we're able to build and deliver programs that are relevant to the employees. I think Heather mentioned my favorite word, scary. Sorry, Wendy, that was you in your introductory comments. If we throw people into a classroom for two days and tell them we're going to take them through the A to Z of data literacy competencies and they're going to come out at level two, that's a very scary and very irrelevant concept to the majority of people in the organization. What we need to do is understand the functions that are happening within the organization. I saw a comment earlier about the data lifecycle, or the data journey: understanding how each person in the organization connects with or has interactions with the data journey, and then understanding what competencies are needed to support that function. It's very, very important to me and to my team that we get as close to the person, as close to the individual, as we possibly can. Otherwise, we're building irrelevant, boring, perhaps too intense programs that are really missing the mark of what we're ideally striving for, which is to help every person do their job with confidence and a level of comfort in the skills required to do that job. Great.
So I think that is very helpful, given that you have created a framework that looks quite structured, with abilities that are quite detailed and levels that are quite specific. And yet your advocacy is to help people in an individualized way, not to teach every single person according to a one-size-fits-all structure. Absolutely correct. Got it. So, Heather, how do you make it a little less daunting? I don't think people could get the same effect as when I saw the little emoji; the analytics lit has a little fire emoji on it. So how do you help people in the analytics squad feel a little more connected and less intimidated? And I'm wary of asking about that because it's not directly related to the assessment. Well, it did inform some of the things we did with the assessment, but it's been a really amazing way for people across the organization, in totally different roles, to connect. It is people who at least have an understanding that data is important, but I've had people say that it's like a safe space for talking about data problems. There's a lot of transparency in that group, and it definitely wasn't always that way, but I think there's some willingness on my end to be humble and not pretend I know everything, for one thing, and to give people the opportunity to try things out that they might not be willing to do in other spaces. And we actually do monthly surveys, not really assessments, more of a feedback survey, just asking people what they got from the group that month, plus a few other questions about what they've generally been doing with data in the month. People give feedback in that survey and we respond to it; when people ask for specific group topics, we do them. So it's been a really great learning process and a way to understand interactions with data across the organization.
Yeah, it sounds like it's been really important to Evergreen Health to make this as non-threatening as possible, and you've succeeded in getting more people to be open to it by taking that approach. I definitely think it's still a work in progress, but we have seen some success and definitely made some progress, and you do see people mature in their data literacy in that space. It has also helped us try out some things that we could then develop into more formal data training. Great. So I'm going to field a few of the questions in the time that we have left. Jane, one question is: do you see any differences in how data abilities come out depending on whether it's a public or a private organization? The answer is yes, but more importantly, I guess, we see great variances across industries. We have found that industries performing regulatory or very compliance-based functions, such as public administration, financial services, insurance, defense, and certainly health, tend to be more, I won't say higher scoring, but more advanced in their thinking about the need for data literacy across the organization. Over time, that will shift and we'll start to see other industries really pick up the conversation and move into this space, but at the moment, it is very much those regulatory and compliance-based organizations that are engaging in that conversation. And just by engaging in the conversation, as I'm sure you'd appreciate, by having that language circulating through the organization, they're already head and shoulders above their peers. A lot of what we talk about is really about connecting the dots for the individual between, you know, these scary terms that fall within the data literacy framework and their jobs.
So if they've already started to break down that barrier between using scary terms and understanding what it is they're actually doing on a day-to-day basis, and the types of skills they need to do those roles, then they're already going to have a higher level of competency to demonstrate, or to declare, as part of the assessments. Yeah, it must be really interesting to go into new industries, especially industries that aren't necessarily as data rich or data oriented, to try and understand what that whole trajectory is going to look like. So, a question, Heather; there was a request, and I don't know the answer. Are you able to share the survey that you used to do the assessment? Our organization has decided not to share the whole survey, but I'd be happy to talk through some of the examples that I shared. I just added my email to the chat if anyone wanted a couple more examples or wanted to talk about developing their own survey a little bit more; I'd be happy to do that. And maybe, since we do a follow-up that Shannon shares, if you wanted to compile the ones that your company is open to sharing, we could just do it there too. Great. Okay, that sounds awesome. Let me see if we have time. I guess we might have time for maybe one more question. How often are you thinking about retesting, or how often do you recommend, Jane, that people retest? I'll start with Heather. Just because there can be a little bit of fatigue about things like this, and we do have a couple of other organizational assessments, our plan currently is to do it annually. Though I do want to say that we are in the process of thinking through some more focused assessments for different areas of the business as well, which might come at other points in the year. Okay. Great. What about you, Jane? What do you recommend, or is this another category of one size doesn't fit all?
I would say this is definitely a candidate for one size does not fit all. Let's say we've done an assessment in January, and for whatever reason there's been very little follow-up in terms of, you know, what's next: the program hasn't been implemented, or it's only been implemented with a small group. There's no point going back to the organization until there has been considered activity toward the development of specific competencies. So if you've done an assessment but you haven't done any of the work of developing skills and competencies to fill the identified gaps, then I would say there is no point in reassessing. Ideally, I'd love to see some sort of assessment happening on a 12-month or maybe 18-month basis. But again, it is very much determined by the degree of effort or focus that's gone into the development program, because if we're assessing for the sake of assessing, then we're wasting people's time, and I agree with Heather about the fatigue that exists within organizations for these types of activities. But if we are making a considered effort to build particular skills and competencies within particular groups, then we would want to follow up to see whether we're being successful, and 12 months would be a great interval for that. Great. Well, I think we are at time, Shannon. I don't know if you have wrap-up comments, but again, my thanks to Heather and Jane for their wonderful contributions today. I know I learned a lot from each of you. So, Shannon, I'll let you finish us up. Thank you. Thank you so much, and likewise, thanks to Heather and Jane for joining us today, and thanks to all our attendees for being so engaged in everything we do. Again, just a reminder: I will send a follow-up email by end of day Monday for this webinar with links to the slides, the recording, and the additional information requested. Thanks, everybody. I hope you all have a great day. Thank you.
Thanks, everyone. Thanks, Wendy. Thanks, Heather. Thanks, Jane. Thank you.