Good morning, and welcome to this week's edition of NCompass Live. I am your host, Christa Porter, here at the Nebraska Library Commission. NCompass Live is the Commission's weekly webinar series where we cover a variety of topics that may be of interest to libraries. The show is broadcast live every Wednesday morning at 10 a.m. Central Time. But if you're unable to join us on Wednesdays, that's okay. We do record the show every week, and it is then posted on our website for you to watch later at your convenience. At the end of today's show, I will show you where you can access our recordings. Both the live show and the archives are free and open to anyone to watch. So please do share with your friends, family, neighbors, colleagues, anyone you think might be interested in any of the topics we have on the show. We do a mixture of things here on NCompass Live: book reviews, interviews, mini training sessions, demos of services and products. Really, our only criterion is that it's something for libraries: something libraries are doing, something we think libraries should be or could be doing, new services and products we want to share. Here at the Nebraska Library Commission, we are the state agency for all types of libraries in the state. So you will find things on our show for public libraries, academics, K-12, museums, corrections; if it's library related, we will have something that covers it. Some of our shows are presented by Nebraska Library Commission staff about things we are doing here at the Commission, but we do bring in guest speakers, and that is what we are doing this morning. On the line with us is Sara Goek. Is that how you pronounce it correctly? I should have asked. No? Okay, sorry, Goek. And she is just a little north of me; I'm here in Lincoln, Nebraska.
She's a little northeast; she's up in Chicago at ACRL, the Association of College and Research Libraries, and she's going to talk to us about Project Outcome for academic libraries. A couple of years ago, I believe it was in 2017, we did a show about Project Outcome for public libraries, the original program, and we now have a version of it specific to academics. So I'll just hand it over to you, Sara, to go ahead and tell us all about it. Okay, thank you, Christa, and thanks for the invitation to present today. So my name is Sara Goek. I am a Mellon/ACLS Public Fellow and Program Manager at ACRL, and today's webinar is Outcome Measurement Made Easy with Project Outcome for Academic Libraries. If you haven't had a chance already, everything I'll be talking about today is in the Project Outcome toolkit. The website is acrl.projectoutcome.org. It should be quick and easy for you to sign up, but if you experience any problems in the registration process, just get in touch with us through acrl.projectoutcome.org and we can get that sorted out for you. So out of curiosity, I'd like to start by asking those of you who are attending live whether you were familiar with Project Outcome before you decided to attend today's webinar. You have a little raise-hand icon in the GoToWebinar screen; if you would just click on that and let us know. We have lots of hands going up. Anybody else? I can actually see how many. Yeah, about half of the people who are logged in live right now have heard of it before. Okay, great. Well, whether you have or not, hopefully you'll get something useful out of today's session to take away with you and to incorporate into your work. So today's webinar is really about sharing how Project Outcome can help you measure learning outcomes, introducing you to what is in the toolkit for you to use, and also sharing some advice about how to put the data that you collect to work for improving the work you do in your library.
And I want to make sure I answer any questions throughout the session as well. So just to give you an overview, this is what we'll be covering. I will pause at a few points during the session to take questions. If you have any questions, please type them in the chat or, as Christa said, you're welcome to turn on your microphone and use that as well. So there will be plenty of opportunities for you to ask questions throughout today's session. So this is what the home page looks like if you're visiting the site for the first time. Just to make sure we're all on the same page, Project Outcome for Academic Libraries is a free toolkit for academic and research libraries and LIS students. It's designed to help academic libraries understand and share the impact of essential library services and programs. It provides simple surveys and an easy-to-use process for measuring and analyzing outcomes. It also provides libraries with resources and training support to apply results and confidently advocate for the library's future. And one thing I want to emphasize is that this toolkit is really for everyone; it's not only for people with assessment in their job titles or descriptions. So hopefully, whatever your role is within the library, you'll find something here that is useful to you. ACRL's Project Outcome for Academic Libraries just launched this past April. It's still pretty new, and it's based on a model developed and tested by the Public Library Association, which has been in the field since 2015. So we're really building on their knowledge and experience. PLA currently has over 200,000 responses in the system; as you can see, we're just over 1,200 at this point. So we'll get there, but this is still a pretty new tool. So to start to set some context for what Project Outcome does: it's about measuring impact. Traditionally, libraries have collected a lot of data around usage.
They have also relied on things like intuition or the trust that the community has placed in libraries. But in this day and age, that's really not enough; there's constantly a need for more data and more evidence to show the impact of the library. The challenge, of course, is that this takes time, expertise, and resources. And often there can be a lack of understanding around what really matters to measure and why. These are some of the challenges that Project Outcome is meant to help you address. So what we're talking about with the data that you collect in this toolkit is outcome measurement. Outcomes, for the purposes of Project Outcome, are defined as a specific benefit of a library program or service, which can be quantitative or qualitative. And outcomes are expressed as changes an individual perceives in themselves. It answers the question: what good did we do? Or in other words, how have the learners or users in the library been changed as a result of our interactions? Overall, outcomes should be meaningful, achievable, observable, and actionable. So that is what Project Outcome is intended to measure. Those other, more traditional metrics that you may already be collecting, you can still keep doing those. We're trying to push the envelope a little bit and get at the learning and the benefits of library programs and services. So this is really one piece of the assessment puzzle. Needs assessments, satisfaction surveys or other ways of looking at patron satisfaction, outputs: those are all still things that you can do, but outcomes are a really key part in terms of looking at the benefit and the learning that happens in the library. And this is where Project Outcome comes in. So the place that we usually start in workshops and presentations is here, with the desired end result. What do you want to be able to do? What sort of data do you need to serve a particular need within your library?
And we've seen from PLA's evidence from public libraries, as well as some early evidence from ACRL, that this data can be put to use in all sorts of ways. It can help you get funding, both internally and externally. It can help with programming decisions, making improvements or adding new programs. It can help develop partnerships, again across the library or with external partners, maybe on your campus or with other libraries or institutions outside. And it can help with advocacy in general for the value and the impact of the library. So one thing we really want to challenge people to do is to think about this from the start: the Project Outcome model isn't just about collecting data for the sake of data, but about taking action with that data and making improvements to library programs and services. Okay, so what are you actually measuring? Again, we know that within libraries assessment matters. We know that talking about learning matters, being able to talk about the value and the impact of the library. What we're doing with Project Outcome is providing a consistent and convenient way to measure those things, so that rather than duplicating effort at every single institution, we're providing some standardized tools that you can then adapt to your specific needs. So within the Project Outcome toolkit, we provide quick and simple surveys, an easy-to-use survey management portal, ready-made and customizable data reports, interactive data dashboards that allow you to visualize that data, benchmarks so you can see how you're doing compared to your peers, resources and training materials, and a peer discussion board. And this combination of ready-to-go tools and easy-to-use processes for measuring and analyzing outcomes can really help library staff save time and energy in terms of planning data collection. And that leaves more time for you to actually do something with that data, to make decisions and to make changes.
And to reiterate, all of these tools and resources are available on the site, and they're all free for academic and research libraries and LIS students. You don't have to be an ACRL or ALA member to use these tools. So the public library Project Outcome model had seven surveys, and when ACRL decided to adapt this model for academic libraries, we also decided to keep that number consistent. A task force was appointed to develop these new surveys and had to choose seven topics, which is hard given the range of things that libraries do: narrowing it down to seven topics that could cover as many broad programs and services as possible. These are the seven survey topic areas that we landed on in the end, and they are meant to cut across the broadest possible range of library roles, services, programs, and also users. So they're not only about administering surveys to undergraduate students; they can also be used for graduate students or faculty, or even public events as well. So for anyone that comes into your library, hopefully there is something here that could be relevant to the sort of interaction that you have with them. These surveys were developed by a task force of librarians, so they were developed by the field, for the field. They were field tested extensively last year by 54 different institutions across the country, and we got feedback from all of those, which went into the final surveys that are available now in the toolkit. For each of these surveys, you're measuring four outcomes: knowledge, confidence, application or behavior change, and awareness. So the surveys have questions that get at those four outcomes. In addition, there are two open-ended questions on each of the surveys that ask what patrons liked most and what the library can do to improve. The language varies slightly from survey to survey. So that's the standard model, the standard outcomes, to give you an idea of what that looks like.
This is the instruction survey; it's the most popular one that we've seen so far. You can see the words in bold there, where the survey questions align with those four outcomes, and then the two open-ended questions as well. So in addition to the standardized questions that are set for each of the survey topics, you can also add up to three custom questions based on your particular needs at the time. It's also important to point out that when you're administering these surveys, you set the context, so that when you send a survey to users, or when you hand it out at the end of a session, for example, they know what it is you're asking them about. You're setting that context; you're saying, please fill out this survey about today's instruction session, for example. So even though the questions sometimes can seem a little bit general, users will be aware of what it is that you're asking them to evaluate and report on. So are there any questions at this point before I get into more details about what is in the toolkit? Yeah, let's see. Yes, if anybody has any questions, go ahead and use the questions section of your GoToWebinar interface, or if you want to use your microphone, just say you have a microphone and you prefer to use that. Nobody has typed in anything yet, but I am keeping an eye on it. So whenever you do think of anything, just go right ahead and type it in there and I'll grab it for Sara as appropriate. I just want to say I was glad to hear that there are options for personalizing some of the questions on these surveys. I think that's very important, because something some people do have concerns about with these kinds of pre-made surveys or outcome measurement tools is, how can I make sure that it really does relate to what I'm doing in my library? Yes, yeah. Customization is very important. Okay, well, not seeing any questions at this point.
We'll go full steam ahead and take another pause in a few minutes if anyone thinks of anything as we go along. Great. Okay, so in terms of the toolkit itself, again, this is the homepage you'll see when you go to log in. Everything I'm talking about is there. Again, to reiterate, if you work in a library or are an LIS student, you have free, full access to everything. That is really like 99% of our users; if you fall into one of those other groups, there are a few caveats, but for the most part, everyone has access to all the resources, discussion, and survey tools that we are talking about. So the survey tools: for those seven survey topics, there are two types of surveys available for each of them, immediate surveys and follow-up surveys. Immediate surveys are designed to be used, as the name suggests, immediately after completion of a program or service. They can give you an instant snapshot that you can use to make improvements or changes and use for reporting and advocacy. The follow-up surveys, we say four to eight weeks later, but it's really up to you. They're designed to look at adoption rather than immediate learning. So with the immediate survey, you're asking, what did you learn today? With the follow-up survey, you're asking, remember that program that you did a month ago? Have you used what you learned? They can be really useful for informing internal planning, measuring progress toward strategic goals, and just providing further evidence for advocacy. We also provide outcome measurement guidelines, which are in the resources in the toolkit, and these are intended to help you look beyond Project Outcome. So once you get started doing outcome measurement, you might want to do more: use other methods, look more at long-term impact. There are resources there to help you do that. To give you a sense of what that looks like, this is an immediate survey.
The four quantitative questions are all based on a Likert scale, and you see those open-ended questions as well. They don't take a whole lot of time to administer. All you have to do is create the survey before your program and then either share a paper version or send out a URL afterwards. So pretty straightforward and easy to administer. The follow-up surveys have a slightly different format. They are five questions in a yes/no format rather than on a Likert scale, with those same two types of open-ended questions, and again they're intended to be administered after the fact, after some time has elapsed. They do take a little bit more time in planning, because you have to have a way to get in touch with your patrons later. The Project Outcome surveys are all anonymous and confidential, so even if you're using both the immediate and follow-up surveys, we usually recommend that you collect contact information separately during the program. We'd ask that you have a contact sheet where you might collect email addresses, for example, and then plan to send out a link later on to administer the follow-up survey. The open-ended responses we've seen being particularly valuable to libraries. This is a snapshot of the sorts of comments that you'll see, particularly comments that highlight outcomes. The same outcomes that are asked about in the quantitative questions will pop up in the open-ended questions; patrons have been prompted to think about those things, and so they come up again even when they're writing freehand. In green on the left are responses to the question, what did you like most about this program or service? These are actual responses from field testing of the instruction survey, and the most commonly used word there was "resources": students liked the resources that they were being introduced to in the instruction sessions.
And in purple on the right are responses to the question, what else could the library do to help you succeed in your classes? The most commonly used words here overall were "more" and "nothing". You see that reflected in the comments: students are asking for more resources, more hours, just more of what the library is already doing. Or they're saying, it's great, you're already doing everything I need, I don't need anything else, which is nice to hear sometimes too. The outcome measurement guidelines, again, are designed to help you expand beyond the tools offered in Project Outcome once you've gotten started, to develop your own outcome measures. These could be additional questions, custom questions that you're adding to the surveys, as well as your own completely new tools to implement other data collection methods. There's some advice, for example, on focus groups, on measuring outcome data over time, and on developing strategies for working with partners on outcome measurement projects. These could be, again, partners within your institution or across others. And as with all of the resources that we provide in the toolkit, if there are other things that you have found helpful and you think your peers would also find helpful, you can share them in the peer discussion board or send us an email. Or if there are things that aren't there that you would like advice on, send us that feedback as well. We can always revise and adapt these; we want them to be useful to you. So you get to everything from this homepage. When you log in, this is what you'll see. This is the account homepage, and it's the portal to all aspects of the toolkit. I will point out, where the green arrow is pointing at the top, that account link. You'll want to check that when you log in. We did pre-populate the toolkit with lists of institutions and libraries, but the library data isn't 100% complete, so you might need to check whether the libraries listed at your institution are correct.
If they're not, you can edit or add libraries at your institution yourself, just from that account management page. You'll also be able to see from that page any other users at your institution that are registered. The resources are all here. Again, if there's something that you feel is missing or that you think we should add, definitely let us know. You do have to be registered to get to these resources, but they can be revised and updated constantly. Okay, on to the more exciting parts of the toolkit. This is the survey management tool. This is where you'll go to create and access all of your surveys. You customize your survey's program information, including program name and date. That can be important, because if you run the same type of program frequently and you use the same name for it, you can then collate all of that data by program name. You can also track program attendance and survey response rates. If you are administering paper surveys, you go here to enter the survey responses, and you can get to raw data from the survey management tool as well. There's a quick and easy wizard that helps you set up a survey; it walks you through all the steps and information that you need to add. You can't delete individual survey responses, but you can delete entire surveys. So if you just want to try out creating a survey, you can do that and delete it later on if you're just familiarizing yourself with the tool. So the data dashboards are the really fun part here. This is where, once you've gone to the trouble of administering a survey and entering your data, you get to do cool things with the results. There are a bunch of different dashboards that give you your data in different ways to review and analyze. All of them are interactive, and all of them you can save either as a screenshot or print to a PDF to insert into other documents or reports that you might be working on.
So what you see here at the top left, the overview dashboard, shows your aggregate average scores across all survey types and across all outcomes. Each survey topic and outcome shows comparisons of your institution's average score versus others in your Carnegie class and versus the national average overall, which is all other users in the system. So if you're at a community college, your Carnegie class comparison is other community colleges, versus all Project Outcome users nationwide. The detail dashboard at the bottom is where you'll see data for each individual survey question, and it's organized by survey. You can also filter your results. All of these dashboards have filters at the top, so you can look at just the immediate or follow-up surveys, or filter by outcome, by topic, by program name and date. Over on the right is the matrix dashboard. This one sometimes scares people a little bit; it looks very complicated, but it is just showing the relationship between the survey topics and the outcomes. And again, since these are interactive, if you mouse over some of these, it will give you more detail about what it is that you are looking at. In addition, there's a mapping functionality: if you have multiple library locations at your institution, for example, and you've entered addresses for them, they will show up on a map. And there is a page that shows you some basic information about your institution, so if you're not sure what Carnegie class you're in, for example, you can take a look there. Okay, as well as the data dashboards, there's a report builder. There are two types of reports. The easiest and most straightforward is the summary report; that's at the top. You can actually create a summary report directly from the survey management tool. It's usually just two pages, quick and easy to share, and gives you a summary of the quantitative results from your survey. You can also add the open-ended responses to the bottom of the report.
And it just gives you that snapshot if you need the big takeaway from a particular survey that you've administered. You can also go through the process of creating a custom report if there are particular things you want to focus on or filters that you want to apply. You get to the report builder from within the data dashboards, and again, like the survey management tool, it's got a handy little wizard that walks you through the process. So, overall benefits of the toolkit: the short and simple surveys mean higher response rates. One thing that we have seen is that the more questions there are, the less likely people are to start the survey or to complete it. It can give you good snapshot data that can help you make immediate improvements and changes. The open-ended comments can be really, really valuable; sometimes things come up that you wouldn't have anticipated or maybe weren't even asking about in the first place. The standardized outcome measures mean you don't have to write new questions and new surveys every single time. They also mean that you get these aggregate national benchmarks, which can be useful in terms of seeing how you're doing compared to your peers. There are ready-made reports and data visualizations that do a lot of the hard work for you; you don't have to fight with Excel for hours on end. You can work at your own pace, and pick and choose surveys based on programs, capacity, learning objectives, based on what you need. Start small and go from there. And as you get more familiar and your needs develop, you can customize a lot as well: add context, add custom questions, and create reports that highlight the information that you want to highlight for your purposes. Okay, so that's a lot of information about the functionality of the toolkit. Are there any questions at this point about how it works or what you can do in the tool? Let's see, does anybody have any questions?
Go ahead and type them into the questions section of your GoToWebinar interface. Remember, if you do have a microphone, just type that you have a microphone and would like to be unmuted, and I can unmute you on my side. I did see some people join us after the show started, so if you weren't aware, you can go ahead and do that. Just type in the questions and I'll grab them for you. That was a lot, and actually I thought it was kind of cool that there were so many things in there. I think somebody just posted the same thing: no question, but this is so cool and helpful. Well, that's great. Always good to hear that too. Yeah, I'm looking at all the different things that you can evaluate and focus on, and I'm just like, yes, I need to do that, I want to look at that one and that one. It is meant to be pretty easy to use as well. So I would encourage you all just to sign in and click around and see what's there. It's, I think, pretty intuitive. Yeah, and I think it's also great that it's just something free for everyone to use. Yes. You don't have to be an ACRL member or anything like that, or ALA; it's out there for everybody. But I also did notice that you mentioned a little while ago that the comparisons don't only include libraries similar to yours; you're bringing in any and all information you can gather? No, so the benchmarks are against aggregate data that's in the system. Ah, okay, so that's what I was getting confused about, I guess. Yeah, so if we have 100 community colleges using Project Outcome for Academic Libraries, then one particular one is benchmarking against those 99 others that are similar to it, but, yeah. Yes, yeah. And it will get more useful over time. Obviously, there's not a ton of data in the system just yet, but in the future, hopefully. It just went live with its final version when?
In April. This year, okay, yeah, two months ago. It's just getting started, definitely. All right, it doesn't look like anybody has typed in any questions at the moment, so let's just go ahead; type in anything you think of and we'll grab your questions. Okay, we will go ahead. So I want to walk you through a few steps in the outcome measurement process that can help you implement this toolkit at your library. One of the resources that we have available on the site is this roadmap, and this is specific to Project Outcome. It has general things that you'll want to do to set the context, but also the things that you can do within the toolkit: for example, what resources you might want to look at, when and how you can customize the surveys, and what you can do within the tool. So this can be useful if you just want to walk through the steps of actually administering a survey and using the results within the toolkit. But one thing this is maybe a little bit misleading on, or doesn't quite get to, is the fact that outcome measurement is really an iterative process. It shouldn't just be a linear thing. Maybe the first time through, you're just administering one survey for one type of program, and you want to see how it goes. But once you get to the end, once you've reviewed your results and done something with that data, think about what you did before you administer the next survey or set your next goal. So really think about it as a learning process for you within your library, as well as a data collection process, if that makes sense. As you go through all of these steps, there are a lot of resources in the toolkit that can help you in each of these areas: everything from choosing the right survey to data collection methods and informed consent.
If you are worried about talking to patrons or to your institutional review board, we have some advice on that, and then on analyzing that data and putting it to use for advocacy as well. There are resources that can help you with all of those things. So I especially want to share a few case studies. Were there any more questions before I get into those? Actually, we did have some come in. Okay, great. Let's see here. So first up, someone just wanted to confirm that the aggregate benchmark data is for similar institutions. It's both. It's other institutions within your Carnegie class and then all institutions nationwide. So you get two benchmarks; look at both. Yes. Okay, next: it looks very interesting, but we're very concerned about security and confidentiality at our institution. Without any specific concerns I could list at the moment, do you have any response about the security and confidentiality of the platform? Well, first of all, we do not collect any personal data. All the survey data that you are collecting is anonymous and confidential; we specifically say, do not collect personally identifiable information within the system. We are also complying with all the new GDPR requirements around users of the system as well. There are more resources here, for example, on informed consent and protecting privacy, that have more guidance. If that is something that you are specifically concerned about, I would suggest that you look at those. Great. That's very important, definitely. And then someone wants to know, is it possible to create a visualization using data collected outside of the survey tools? No. They're bringing in other information they've collected, I guess. No, the dashboards visualize the data that is collected in the toolkit. Right. So you'd have to somehow get that information into there.
No; if you're using different questions, you can't. It would corrupt the data if you were to enter it into these existing surveys. So it's just a visualization of what is available in the tool. Right, right. Looks like that's everything for the moment. As you think of questions, go ahead and type them in. Okay, thanks. So, as I mentioned, we had 54 institutions participate in field testing the initial versions of the surveys, and I want to share some examples of what they did, what they were evaluating, and how they put that data to use, for a couple of different types of surveys. The space survey was an interesting one. The task force thought it was really important to include space as a topic, but it was difficult to get at the learning outcomes of space. What we saw from the field testing phase is that one of the primary uses of the survey was to look at how students are using group study rooms for their learning and study within the library. So, examples from three different types of libraries. At Iowa State University, one of the task force members, Greg Davis, participated and administered the space survey, again looking at group study rooms. Theirs is quite a large library, and they were in the middle of planning for renovations, so they were looking at how students were using the study rooms and the value of the study rooms. These are the results; this was before the visualizations were available in the toolkit, so these were visualized using Tableau. But you can see that overall the results are fairly positive, with a lot of agree and strongly agree, in green there. And they used these results, the quantitative results and the comments, to support the decision to include more group study rooms in their renovation process. They also found that using these surveys helped them build partnerships across the library, to engage more people in the assessment process.
Often assessment can be concentrated in one person, sometimes even. And so this helped them get more library staff engaged. I think they're going through the renovation process this summer, and those new study rooms and the facilities provided with them were informed by their results from these surveys. At Nevada State College, another task force member, Tiffany Garrett, participated and used the survey to look at space. Their space is kind of unusual: they do have a physical library, but they do not have physical collections. They also were looking at students' use of group study rooms. Again, fairly positive results overall; you can see the average scores there for each of the quantitative questions. But they did a couple of things with this. One was to look at that last question, the awareness outcome, and realize that it was the lowest score of the four. And so they said, hmm, I wonder why that is, and realized that they don't have a whole lot of signage in their library. So one of the things they've done, having seen that result, is to experiment with using the screens in the group study rooms when they're not in use to play a looping slide show about other library resources and services. So students that are using the group study room might see information about the other things they can do, other library resources and services available to them. Another thing they realized as a result of these surveys was that students were reserving the group study rooms not for group study but for individual study. The reason for that, they said in their open-ended comments, was because the group study rooms are quieter than the rest of the library. The library is positioned on the second floor of the campus building. On the first floor, I believe, there's a student common area, and a lot of noise was coming up the stairs into the library space.
And so students were reserving those group study rooms to have a quieter place to study. So they took that evidence and applied for funding to get additional soundproofing for the group study rooms themselves. They also applied for funding to put in more of an entryway and barrier into the library to help block some of that sound. So it was really actionable evidence that they were able to take and implement changes as a result. To give you a sense, this is what that same data looks like visualized in the summary report. Another example, from a community college: Central Piedmont Community College in North Carolina also used the space survey to look at group study rooms. As a result, it led to changes in their group study room policies. It's also informing their process of designing a new library: looking at how students are using the study rooms, what technology or other facilities in the rooms they find most useful, for example, and helping to design a new space that will meet those needs. Okay, one of the other surveys is the library technology survey, and we saw field testers using this survey to assess a few different things, particularly equipment checkouts that were available from the library and also shared technology within the library itself. So Iowa State, again: this was one of the surveys they implemented in the field testing phase. Not quite as positive results for technology as for space, but nonetheless these supported decision-making actions, including seeking and receiving additional funding for more laptops to check out and more software to be available on the public computers in the library. Central Piedmont Community College as well used this to look at how students are using the circulating laptops and the benefits they got from that.
And overall, this example I quite like, because there was one comment that came up that they didn't anticipate: a student said that there were problems with how Adobe Acrobat was integrating with the browser Google Chrome. It just wasn't working properly. And obviously the staff didn't realize that, because they're not using the same laptops that the students check out. So this really alerted them to the problem. They were able to take it to their IT person and get it fixed. It was a really quick, easy fix, but without having that comment, without asking students how the technology is working for them, they wouldn't have known. So it's just a really small thing, but a good piece of actionable evidence that came out of these surveys. A couple of these case studies are available in the resources in the toolkit, and we'll be adding more as we start to see more evidence of how people are using the surveys and the benefits of that. So, any more questions? Let's see. Yes, we have one here that just came in. The data that is compiled from the surveys: do the reports stay in the dashboard for a long period of time, to be used later at a budget or department meeting, for example? So basically, how long does it keep the data that's been entered? The data stays there forever. Yeah. Great. Unless you delete it yourself for some reason. Yes, right. There's no expiration on any of that. No, no. Even if you've closed the survey, the data is still there. Oh, the data is still there. You would have to actually delete the survey for the data to go away, but we don't do that for any reason. So the data is always there for whenever you need it. And then over time as well, that means you can compile all those results.
And if you decide later that you want to look at the data in a different way, say six months from now you realize, wait, we should have focused on something else, you can create a different visualization of it. Yeah, date is one of the filters available in the dashboard. So you can filter by date, look at last semester and then look at this semester, whatever you need. Cool. And we have another question that just came in. Once you've created the data, how do students and faculty access it? Is it a URL you give out or post? How does that work? So all Project Outcome users, any library staff, can sign up for the toolkit, and they share basically an institutional account. So you'll see activity at your institution if there are multiple people signed up at a particular institution; in the survey management tool, it will tell you who created a particular survey, for example. If you want to share with people outside the library, you can't, unfortunately, at this point just send a link to a visualization, but you can take a screenshot, print to PDF, or create a report. There are a few different options for ways you could share those visualizations, but unfortunately not a live link, the way it currently works. Okay, great. Any other questions right now? Let's continue. I'll keep an eye on the questions. Okay, thanks. We're pretty close to the end now anyway, but there's one final piece that's important to emphasize. As I mentioned at the beginning, it's really important to think from the start about how you want to use these results. This is not about just collecting data for the sake of data, but getting actionable evidence. So there are a few things you can think about. Program improvements: since you get those results immediately, you can make immediate changes. For example, I've been running a series of workshops on the toolkit, so every time, we do a survey at the end, look at the results, and say, okay, what do we need to change?
Are there any questions that we should address up front the next time? So it gives you some evidence that you can put to use right away. Also, in the longer term, maybe you're looking at how a particular program went this semester and then making changes the next semester. Strategic planning: you can relate these outcome measurement goals to your strategic goals or your institutional initiatives. These are, I think, more long-term, focused on key areas of service, things that your library is really trying to impact at an institutional level, and the programs and services that can help you achieve those goals. Communication and advocacy: using the summary reports and the visualizations to help you share clear and concise messages about the library's impact on patrons. And you can also tailor what you're sharing for different stakeholders. Maybe it's your library administration, maybe it's your college or university administrators, faculty, students, and so on. We have seen that encouraging a whole lot of people to be engaged in the outcome measurement process has helped with building partnerships, whether within the library, across campus, or even with the wider community, with your peers in higher education or with your local communities. And of course, everyone wants more funding, so having these results that you can showcase can help you in funding bids to expand support for library programs and services, or to apply for grants from external funders. Okay, so that is about the end of what I have. I want to make sure, though, that I have addressed any other questions you have about how you're going to implement the toolkit or how it can be used at your library. So if you have any final questions before we wrap up, please share those in the chat. If you think of something after today's session, again, the peer discussion board is available to all registered users. We keep an eye on that.
So if there's a technical question that we as staff need to answer, we can answer it there, and then others will be able to see that response as well. Or you can send us an email at acrl@projectoutcome.org. I will just pause for a moment to let you type any questions you might have. Yeah, great. If anyone has any questions about Project Outcome for Academic Libraries, get them into the questions section now. Someone does have a question, not necessarily related to it. She wants to know if you are the person to contact if another group wants to sponsor a webinar like this one. Yes. My email address is on the ACRL website, or if you send an email to the Project Outcome email address, I can answer from there as well. I don't usually put my personal email on these presentations, because for any technical questions it's better to just email the Project Outcome address, as one of my colleagues might be faster about getting to it than I will be. Right. But you could use that to get in touch as well if you want to have her do something specifically for your university. Yes. And we're offering in-person workshops all over the country over the next 12 months or so. There's a list on the events page on the website if you want to see if there's one near you, or to sponsor another one. Yeah. So, any other questions before we wrap up the presentation? And there are good ways to keep track of what they are doing: Facebook and Twitter. I've got links for following them on there. Anything else you really want to ask Sarah while we have her here? Now's your chance. Yeah. Otherwise, of course, you can reach out to her at any time once you start using Project Outcome. I would want to know from anyone here, and you can do the hand raising again: I know some of you said you are familiar with it, but has anybody who's on the show today actually used it yet?
Have you gotten in there and started creating your information or any surveys? You can use the hand raising feature in GoToWebinar, or type in any comments you have about it. I guess not. Hey, we might have a bunch of new users then. Great. Yeah. Summer is probably not the best time to be starting, but hopefully at the start of the fall semester we'll start seeing more. Sure. And like I said, this show will be recorded and everyone will have access to it and to all the other resources on the ACRL Project Outcome website. Maybe that might be something to do, I'm just thinking about it as you mention people are just getting started now: come back in a year or so, have you back on again to do an update on how things have been going? Sure, I'd be happy to. There may be updates and changes to the interface and the system by then, so we'll see about that. All right. It doesn't look like anybody's got any desperate questions they're typing in right now, so I think we will wrap it up for today. Thank you everyone for attending. Thank you so much, Sarah; this is great information. Like I said, we had done a previous session about Project Outcome when it first came out for public libraries, and I'm glad now that academics have a version of this. I suppose I could have tried to use the public one, but I know from working in one that things can be very different from public to academic, of course. Yes. Well, thank you for having me. I believe Krista is going to send out a survey link, of course, given the topic of today's presentation. I think you had to see that coming. But thank you all for participating today, and do keep in touch and let me know if you have any other questions. Awesome. Great. All right. I'm going to pull presenter control over to my screen now. There we go. All right. I'll wrap it up for today's show. This is the session page for today's show on our Encompass Live website, which I'll go back to here.
The archive, the recording, will be available here. This is our main website for the show. These are the upcoming sessions, but here is where the archive will be. And I've already started working on the page for today's show, and this is where it will be. Once GoToWebinar has processed this, I'll upload it to YouTube, and then there will be a link here to the recording. I've already added a link here that goes to the presentation, so if you do want to access and look at the slides that Sarah had today, they're up there via the Nebraska Library Commission SlideShare account. As soon as the recording is ready, I will email everyone who attended today and everyone who registered for this morning's show, letting you know when it is ready for you to look at. It will also be posted to our various social media: we have a Twitter account and Facebook, all the usual places, and our mailing list we have here for Nebraska libraries as well. While I'm here on the archive page, I will just show you our archives. You can see we've got a search feature here at the top, so if you are looking for any other topics we've had on the show, you can search the full archives or just the most recent 12 months if you want to. 2019 is the 11th year of Encompass Live, so we have a lot of shows; the archives are really full here. But we are librarians, so we save everything for historical and archival purposes, and our old shows will always be on here. If you go back far here, you can see everything has a date. So when you are looking at our archives and watching an older show, pay attention to the date when it was originally broadcast. Some of the services and products might not exist anymore, some links may have changed, and the way a service works might have changed since the original broadcast. So just pay attention. Some of the shows will last forever; they'll always be of interest.
But pay attention to the date when it was originally shown, just in case, when you are viewing our archives. Encompass Live also has a Facebook page, and I've got links from our main page and from each of our session pages. I've got our site over here. So if you are a big user of Facebook, give us a like over there. We do post reminders (here's your reminder to log into today's show), and we announce any of our previous shows there, letting people know when last week's recording was available. So give us a like over there and you'll keep up with what we're doing on Encompass Live. I hope you'll join us for next week's show as well, which will be a very Nebraska-centric show but might be of interest to other states too: the Golden Sower Award. This is Nebraska's children's choice literary award; the kids in Nebraska choose their own award winners for this. We will have Sally Snyder, who is our Coordinator of Children's and Young Adult Library Services here at the commission, and Kathy Schultz, who is the committee chair for the Golden Sower Award, with us next week to talk about it. The awards have been announced for this year already, a few months ago, but this will let you know more about that and what's coming up for the next go-round of the Golden Sower Award. So please do join us for next week's show, and any of the other shows we have here on the schedule. I've got July and August up here, and dates for September I'm getting confirmed; you'll see more things being added to the page as I get them all finalized. So that will wrap it up for today's show. Thank you, everyone, for being here with us this morning on Encompass Live, and hopefully we'll see you in a future show. Bye-bye. Thanks. Bye.