Hello, my name is Dana Peterman. I'm the Digital and Distinctive Projects Manager at UCLA Library, and this is...

Hello, I am Sharon Schaefer. I am the Head of Library Search and Assessment, also from the Digital Library. We're assessing the effectiveness of a university library strategic initiative to foster data-informed decision-making.

We are in the process of instilling a culture of assessment that encourages data-informed decision-making. We've been using a three-pronged approach that relies on engagement, learning, and change: spreading the message and creating buy-in. Our group, ACT, the Assessment for Change Team, was supported by library leaders in using public venues such as all-staff library meetings. We also met with individual departments and divisions to reframe assessment as a positive term, addressing head-on any fears of punitive associations with the word. We educated people to understand the value of assessment for the organization, both as a means of growth and as potential resource capture.

Two-way communication ran through our education and workshops. As we educated our colleagues about assessment methodologies, we listened to them too, doing needs assessments along the way that made assessment our own.

We developed a tool we call Data Lake, created in Confluence, which is sort of a wiki. Data Lake acts as a planning platform and as an abstracting-and-indexing repository, an inventory of data, tools for assessment, and locally created reports. Most importantly, Data Lake serves to centralize assessment and signal its importance to the organization.

But we didn't know how it was really going. How far down the road had we gotten in creating a culture of assessment? To find out, we needed a model that would allow us to assess our assessment. Because data-informed decision-making was our goal, the logical choice was to view this through a data science maturity model. So what is a maturity model?
Well, simply put, a maturity model is an aid for an enterprise to understand its current and target states in a particular competency. Think of a maturity model as a roadmap, as Dana said. For the library, we wanted to look at our competency in using data-informed decision-making.

We had a need, a challenge, and a solution. Our need: what practices should we measure? For example, the practice of alignment, defined as following organizational values and goals. Our challenge: a literature review revealed many types of maturity models, but not one was a good fit. Most maturity models we found use levels of competency based on the consistent application of behaviors that range from least data-based to most data-driven. But the models did not use examples that crossed over well into the library setting. In addition, the connection to data-driven decision-making shortchanges what libraries value more, which is data-informed decision-making. Our solution: we created a maturity model for data-informed decision-making in the library. We'll look at the maturity model we came up with on an upcoming slide.

We used our maturity model to look at multiple sources of data in order to create a larger picture of our place on the path to a culture of assessment. We created a survey to capture the thoughts of the organization as a whole. We interviewed library leaders to get an impression, based on their experiences, of where we are in developing a culture of assessment. We looked over the products that have been created over the past two years: plans for assessments, and the reports showing the outcomes and recommendations of assessments. We judged their quality using two simple rubrics. As we reviewed assessment plans and reports, we recorded our impressions about their quality, both from the forms themselves and from direct observations and consultations as we worked with library colleagues. And we sent out a survey to all library staff.
The survey questions measured where we are on the maturity model for data-informed decision-making. Let's go ahead, Dana, and switch to the maturity model that we created and used.

Here is the maturity model. It has three columns. Column one is the practice being measured; remember the example practice of alignment that I mentioned. Column two is the definition: if we scroll down to alignment, the practice we're measuring, column two reads "follows organizational values and goals," the description of alignment. Column three is the associated survey questions used to measure the practice, and you can see a question there that was on the survey. A scale of one to five was used in many of the survey questions, where one is strongly agree and five is strongly disagree. Dana, if we could come back to the slide.

This chart shows the maturity model practices and where their associated survey responses skewed. Responses could skew left, toward strongly agree, which means we're a bit further along on our maturity model roadmap. Responses could be symmetrical, in the middle column, which is a sort of meh: a score of three on a one-to-five scale, think of it like a C. Or responses could skew right, toward strongly disagree, which means we're not as far along on the maturity model roadmap as we'd like to be. Our recommendation is to concentrate our efforts on those practices whose responses skewed to the right.

So let's dive a little deeper. As we looked at the questions, what we tended to find was that people liked their units better than they liked the library as a whole; responses trended toward a more positive feeling about one's own unit. Just taking a look at the data here, you can see that the questions with positive responses included "UCLA Library workers in my unit ask questions of data to guide their work towards fulfilling unit goals."
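The left/middle/right skew classification just described can be sketched in code. This is an illustrative sketch, not the team's actual analysis: the practice names and responses below are invented, and the mean is used as a simple proxy for where responses cluster on the one-to-five scale.

```python
from statistics import mean

def classify_skew(responses, low=2.5, high=3.5):
    """Classify Likert responses (1 = strongly agree, 5 = strongly disagree).

    A mean below `low` skews left (agreement: further along the roadmap),
    above `high` skews right (disagreement: a priority for improvement),
    otherwise it is roughly symmetrical around the midpoint of 3.
    """
    m = mean(responses)
    if m < low:
        return "skews left (further along)"
    if m > high:
        return "skews right (concentrate efforts here)"
    return "symmetrical (middling)"

# Hypothetical responses for two practices, on the survey's 1-5 scale.
practices = {
    "alignment": [1, 2, 2, 1, 3, 2],         # mostly agree
    "resource allocation": [4, 5, 4, 3, 5],  # mostly disagree
}
for practice, responses in practices.items():
    print(practice, "->", classify_skew(responses))
```

The cut points of 2.5 and 3.5 are arbitrary choices for the sketch; any real analysis would set them, or use a proper distributional skew measure, based on the survey's actual response counts.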
A question with a more negative response was "UCLA Library allocates resources by making data-informed decisions." So you can see, by these blue and yellow columns, what tended to be positive and what tended to be negative as we did a quick analysis of those things that were of value.

In the all-library-staff survey, we also asked an open-ended text question: do you have any suggestions for prioritizing areas of assessment in the UCLA Library over the next two years? The responses were organized into categories and subcategories, which were then crosswalked to priority areas of assessment for the UCLA Library to concentrate on. Note one trend in the responses: a desire for leadership. Staff are looking for a prioritization of assessment projects, and for the rationale supporting decisions.

We interviewed six library leaders from different divisions, with widely different perspectives, and asked them two basic questions about how data-informed decision-making had changed over the past two years. One interesting quote had to do with transparency, which I thought I should highlight here: "After years spent in academia, it's clear that transparency outside the library means obvious and simplistic. Transparency has value in contexts where the library is not trusted. It's not clear to me that the more we try to be transparent, the more we are trusted."

We did a review of assessments that were conducted or planned by library staff, using a rubric with two parameters: did they understand what they wanted to do in that assessment, and did they identify appropriate data for what they wanted to do? We also observed staff filling out the assessment forms. Our findings: not too bad. But we did note that a lot of assistance was needed to clearly state an assessment, and users also had trouble identifying appropriate data.
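The two-parameter rubric review might be tallied along these lines. This is a minimal sketch with hypothetical plan records and invented names, not the actual review instrument.

```python
# Hypothetical records from a two-parameter rubric review of assessment
# plans: was the assessment clearly stated, and was appropriate data
# identified? (Plan names and values are invented for illustration.)
plans = [
    {"plan": "Service desk study",   "clear_statement": True,  "appropriate_data": True},
    {"plan": "Collections review",   "clear_statement": False, "appropriate_data": True},
    {"plan": "Instruction feedback", "clear_statement": True,  "appropriate_data": False},
]

total = len(plans)
clear = sum(p["clear_statement"] for p in plans)
data_ok = sum(p["appropriate_data"] for p in plans)

print(f"Clearly stated assessment: {clear}/{total}")
print(f"Appropriate data identified: {data_ok}/{total}")
```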
We also took a look at reports, the reports that show recommendations or outcomes based on an assessment. What we found is that people tended not to do as well on reports as they did on assessments. There was often a confusion of data with report, and often a weak connection to any specific assessment goal.

Here are the findings from the staff survey. On our roadmap to foster data-informed decision-making, we need to concentrate on improving these characteristics here. We heard a great deal about decision-making in the library as a whole: make those decisions more transparent, especially by supplying appropriate rationale for decisions. There was also the trend that unit-level actions were viewed more positively than all-library-level actions; a recommendation would be to continue our centralization efforts.

When we interviewed the library leaders, we categorized all of our findings among our particular practices. What we found was that the primary concern of the library leaders was the idea of collaboration, campus-wide and UC system-wide, primarily in the area of readiness, and even as to resource allocation and transparency. It had a lot to do with data: being able to analyze data, being able to use data, and being able to have data right on hand at the point of need.

Our findings: we need to continue our work with library staff so that they learn how to specify an assessment clearly, gather and use appropriate data, communicate outcomes or recommendations, understand the definitions of data, report, and assessment, and support harmonization efforts surrounding data-informed decision-making.

So we welcome all your questions. Our names are Dana Peterman and Sharon Schaefer; contact us here or on the web.