 Welcome, and thank you for joining us for today's TechSoup for Libraries webinar, Measuring Program Outcomes, a Toolkit for Small Libraries. My name is Crystal, and I'll be your host. Today we're joined by two guests who will talk about how small libraries can leverage easy-to-use resources from the Public Library Association to measure outcomes and use outcome data for advocacy and to take action. But before we begin, I have just a few announcements to share. Today we'll be using the ReadyTalk platform for our meeting, and please use the chat in the lower left corner to send questions and comments to the presenters. We will be tracking your questions throughout the webinar, and we'll answer them at the designated Q&A section at the end of each presentation. All of your chat comments will only come to the presenters, but if you have comments or ideas to share, we'll forward them back out to the entire group. You don't need to raise your hand to ask a question, simply type it into the chat box. Should you get disconnected during the webinar, you can reconnect using the same link in your confirmation email. You should be hearing the conference audio through your computer speakers, but if your audio connection is unclear, you can dial in using the phone number in your confirmation email or that we've shared in the chat. If you're having technical issues, please send us a chat message and we'll try to assist you. This webinar is being recorded and will be archived on the TechSoup website. If you're called away from the webinar or if you have connection issues, you can watch a full recording of this webinar later. You'll receive an archive email within 24 hours that will include a link to the recording, the PowerPoint slides, and any additional links or resources shared during the session. If you're tweeting this webinar, please use the hashtag TS4LIDS. We have someone from TechSoup live tweeting this event, so please join the conversation there. 
TechSoup Global is dedicated to serving the world's nonprofit organizations and libraries. TechSoup was founded in 1987 with a global network of partners. We connect libraries and nonprofits to technology, resources, and support so that you can operate at your full potential and more effectively deliver programs and services to better achieve your mission. TechSoup has helped to distribute over 14 million software and hardware donations to date through our product donation program. We offer a wide range of software, hardware, and services, including software like Microsoft Office and refurbished computers. For more information about TechSoup product donations or services, please visit TechSoup.org. Again, thanks for joining us for today's TechSoup for Libraries webinar, Measuring Program Outcomes, a Toolkit for Small Libraries. We have two guests joining us today. Samantha Lopez joins from the Public Library Association in Chicago, where she is a project coordinator for Project Outcome, which we'll learn more about in just a few minutes. Robin Treslow joins us from Prince Frederick, Maryland, where she coordinates adult programming and manages public relations at the Calvert Library. My name is Crystal Schimpf, and I'll be your host for today's webinar. Assisting us with chat and Twitter, we have Ginny Mies and Susan Hope Bard from the TechSoup team. We also have Emily Plagman from PLA and Project Outcome joining us in chat to respond to questions as they arise. We'll be on Twitter using the @TechSoup4Libs handle, so you can also reach us there. Now, we will have time for questions throughout the webinar today, so please send us those questions as they arise, and we'll address as many as we are able to. Some of those responses may come in chat from Emily, and some will come during our designated Q&A sections during the webinar.
If we aren't able to answer your question during the webinar, we will follow up later with an email to give you a response. Again, this webinar is being recorded, and all of the slides, resources, and materials will be included in the archive, which you will receive within about 48 hours after the event. We'll start off today by hearing from Samantha, who will share information about Project Outcome, a free set of tools and resources for public libraries. She'll start by talking about why measuring outcomes, and not just outputs, is important for libraries. Then she'll share the Project Outcome toolkit and resources, as well as ideas for how you can use outcome measurement data in your library. Then Robin will share her experience measuring outcomes at the Calvert Library across four programs, and how they overcame challenges along the way. We're focusing on smaller libraries today, but if you've joined us from a large library, you should still be able to apply these concepts and utilize the toolkit. The toolkit is currently only available to public libraries in the U.S. and Canada, but if you've joined us from a nonprofit, you're welcome to stay on the line and adapt any of these ideas for your own setting. We'll have time for questions after each speaker, and as I said, Emily will also be responding to questions in chat throughout, so please send in your questions as they arise. Now, before we get into our content, just out of curiosity, we'd like to know if this webinar is your first time learning about Project Outcome. So if this is your first experience, you can just select Yes, and then click Submit. If it's not, then select No and click Submit, and you'll see the responses as they come in. I'll give you a few moments to put in your response and send it to us. At this point, I can see as the responses are coming in that a strong majority say this is your first time learning about Project Outcome.
And if that's the case, then we're happy to be able to introduce you to this resource today. Now, if you're already familiar with it, then we hope that today's webinar will give you some additional ideas for how to implement outcome measurement in your library. I can see that the responses have slowed down a bit here, so I'm going to go ahead and close the poll so you can see everyone's responses. And we can see that just over 70% say that this is their first time learning about Project Outcome. So you are in good company today if that's the case for you. Welcome, and thanks for joining us. All right, so now I'm going to hand things over to Samantha so she can tell us more about Project Outcome and the importance of outcome measurement. Sam? Thanks, Crystal. Hello, everyone. As Crystal mentioned, I am the Project Coordinator for Project Outcome, and I'm glad to be here today to talk about why libraries should measure outcomes and about the Project Outcome toolkit. So first, let's address the question: why measure outcomes? We know times have changed, and as a result, so have libraries. There's been an increased need for services and programs. The library's role has transformed into a community hub addressing a variety of community needs. But traditional measures only capture outputs. They don't capture the library's newly expanded community role. We know libraries offer a variety of important services, from digital literacy to early literacy programs and so much more. We also know how valuable patrons find these services. But numbers like these are no longer enough. How do we capture the library's impact on the community? In other words, what do these numbers actually mean? For the purpose of clarification, I want to take a minute to explain how Project Outcome defines the word outcome. An outcome is a specific benefit that results from a library program or service designed to help patrons. Outcomes are not outputs.
In this table, you see some examples of outcomes versus outputs. Some outcomes include: a job seeker is more confident in their job search after meeting one-on-one with library staff; an elderly patron is able to create an email account after attending a class at the library; and after attending a library story time, the caregiver feels they are better able to support their child's literacy needs and growth. Now that we're more clear on what we mean by outcomes, we'd like to know how many of you are currently measuring outcomes in your library. Please select one of the responses, then click Submit. Once you submit your response, you will see a summary of all the responses. If you have any ideas you'd like to share, please put them in the chat. We'll try to share out as many of your chat responses as we can. I'll give a few seconds here to get everyone's responses in. Now let's see our results. Okay, so a vast majority of you say you're not measuring outcomes. About 30% are using other methods. Some of you are actually using Project Outcome, which is great. Whether you are not measuring outcomes, or are already measuring outcomes and looking for new ways to do so, Project Outcome can help. Project Outcome is an online toolkit that is free for all U.S. and Canadian public libraries to measure outcomes of their programs and services. Project Outcome is managed by the Public Library Association and is one of many ways for libraries to measure outcomes. But the benefit of using Project Outcome is that it's free, easy to use, and takes the work out of creating your own questions or needing to be a data analysis expert. It does the work for you. The surveys themselves were developed by a specialized task force comprised of a diverse group of public and state library leaders, consultants, data researchers, and analysts.
The task force pilot tested the surveys in 2014, and they were finalized and launched with the Project Outcome website and tools at ALA Annual in June 2015. You do not need to be a PLA member to participate in Project Outcome. And while it was designed for public library staff, state library and non-library users can also register for free to learn more about outcome measurement. The Project Outcome toolkit includes access to the field-tested surveys measuring outcomes in seven library service areas; the Survey Portal tool, which helps libraries schedule surveys, track and store data, and access ready-made reports; the visually interactive Data Dashboard tool, which helps libraries analyze and understand their results; and all of the resources and training we have created to help libraries throughout the outcome measurement process. We have heard from our small library users especially that the combination of ready-to-go surveys and easy-to-use tools really helps library staff save time and energy in planning their data collection, leaving more time for decision-making and advocacy once the results are in. These icons represent the seven survey topics available within Project Outcome. The task force intentionally designed a short set of questions that captured the most essential patron outcomes the library would want to know. We have heard from our users how helpful it is to have an off-the-shelf set of questions that they don't have to create on their own, and that patrons find the short, simple surveys easy to complete and are willing to take the time to do so. You do not have to measure across all seven topics. You also do not need a certain number of responses to start measuring your impact. We suggest starting small with one survey, seeing the results, and planning for larger programs from there. The surveys are six questions long, and they are designed to be administered directly after a program or service has been delivered.
The first four questions each measure a particular type of outcome: knowledge, confidence, anticipated change in behavior, and increased awareness of library resources. As you can see in this picture, the surveys use a five-point Likert scale for the first four questions, allowing patrons to select from strongly disagree to strongly agree, with the option of not applicable. The last two questions are open-ended and designed to allow for additional feedback from participants not captured in the first four questions. We have heard from our current users that the open-ended comments are key to making programmatic changes. This is where programming staff learn what their patrons like most, what needs improvement, and what additional programs they'd like the library to provide. The questions cannot be changed or edited, but libraries can add a question or two if they like. This was intentionally done by the task force because the questions have a research component that ensures they are valid and provide consistent aggregation of data. The surveys come in both paper and digital format and are currently available in English and Spanish. I'll also add that the task force is currently working on the next set of outcome measures, which will give libraries even more ways to capture patron outcomes. These new surveys will be launched this June at ALA Annual in Orlando and will be available to all users following the launch. So this is the Project Outcome website. This is where you will access all of your surveys, tools, and resources to help you throughout the outcome measurement process. Again, registration and access are all free. Now I will walk you through the Project Outcome tools, which are available on the Surveys and Resources page of the website.
The Survey Portal is where you go to schedule and access all of your surveys; customize your surveys' program information, including program name and date; enter all of your survey responses; track program attendance and survey response rates; and review your auto-generated reports and data. Again, we recommend starting small with a program you feel confident about getting patron feedback from. You do not need a high number of responses, but it's always best to aim for a high response rate no matter the size of the program. This screen shows what you will see inside the Survey Portal, where all of your survey scheduling and data collection takes place. Here is where you will go to schedule your surveys, see all of your currently running surveys, get your digital survey links, enter paper responses, and access all of your reports and data sets. In a recent presentation with Iowa libraries, our co-presenter was from a small library, and she reported finding the survey scheduling process very easy and especially liked being able to copy and paste the digital survey links onto iPads for patrons to complete as they left the program. No extra work was needed by her or her program staff. This screen gives you an example of what scheduling a survey will look like. You are walked through the process step by step, from choosing which survey you want to run, to naming your survey, to selecting your fielding dates. Project Outcome provides users with auto-generated reports they can use in presentations to their staff, board, or city council. The summary report includes community talking points and statistics about the survey topics, aggregate summary results of your outcome scores, and program detail information like program name, attendance count, and session dates. Our Iowa library presenter also found the auto-generated reports extremely helpful, especially because they eliminated any additional data analysis work on her end.
In your library data set report, you can see all of your survey responses, including the open-ended comments. All responses are also tied to branch location and program name and date, so you can really compare and contrast results within a single survey. You'll return to the Project Outcome website to access the Data Dashboard tool, which is where you will view, interact with, and analyze your survey results. Once survey responses are entered into the Survey Portal, your results will automatically populate in the Data Dashboard. The four data dashboards help libraries learn more about their results. They provide high-level summary scores and quick comparisons to national and statewide averages. They help you visually explore your data and see relationships and correlations between survey topics and outcome scores, which can help drive decision-making and identify gaps in service or impact. Your service outputs are also captured in your dashboard, since outputs and outcomes work together to help libraries better tell their impact story. You see a lot of data points here, but if your library only wants to run a few surveys, or only survey two of the seven topics, that's fine. You will only see the data relevant to your library's collection needs. Here's a closer view of the Detail Dashboard page filtered down by survey topics. Here we can see the library, state, and national averages in the center of the row, with the scores going left to right from strongly disagree to strongly agree. These results are color-coded, with weaker, lighter-colored scores to the left and stronger, darker-colored scores to the right. Here's where you also have an additional view of your open-ended comments. The data dashboards are designed to help you clearly and quickly see your impact without needing to sift through rows of Excel data. If you prefer to work in Excel, then the Survey Portal reports provide that for you, too.
We've tried to cover every way a library would want to interact with and learn more about their data. I know that was a very quick overview of the tools, but you can learn more about how to use them on the Project Outcome website, which hosts a variety of helpful resources. You can find all of the Project Outcome resources under the website's Surveys and Resources page. You can find resources by clicking the hashtag categories on the left-hand side of the screen, which will point you to resources specific to that topic. There's also a search box to help you filter through the resources. Here are more examples of the types of resources available to help you throughout the outcome measurement process, from getting started, to how to talk to patrons about surveys, to influencing your programming. Project Outcome also hosts three monthly webinars that expand on these resource topics. All of the archived webinar recordings can also be found on the website. Since launching last June, Project Outcome has seen a lot of participation, and a variety of programs and services have been measured. Over 11,000 patron surveys are currently aggregated within the Project Outcome system. The Education and Lifelong Learning survey is our most commonly scheduled survey, which libraries use to measure popular programs like teen STEAM programs and adult skills programs such as knitting, book clubs, and ESL classes. The more libraries participate, the more outcome data will be aggregated, and the field will move toward outcome measurement as common practice. Listed here are some examples we have heard from libraries already implementing the surveys, and the programmatic decisions they have been able to make as a result. One library used the surveys to measure the effectiveness of their Science Kit service for one month. The library put the Education and Lifelong Learning paper survey in each Science Kit they loaned out. They received about 30 responses, which was a 17% return rate.
What they learned is that most parents and kids were unaware of the additional science programming the library provided. One response actually said the library should provide kids programs so kids will want to come to the library more often. This unawareness shocked the library and resulted in the library inserting their program brochure in each kit, so all Science Kit users could learn more about the library's hundreds of kids programs and when they were offered. 90% of the brochures did not come back with the kits. This was a small, quick, and inexpensive change for the library to make. Most libraries are getting questions about why the library still matters. Many Project Outcome libraries have been able to use their positive results to reinforce their value in the community when presenting to their board. They have been able to use the summary reports and data dashboard views to give a clear and concise message about their patron impact. One Project Outcome library said they were able to change the conversation with their board, and internally with staff, from numbers being down to what are we accomplishing and what is happening in the lives of our patrons. This was all thanks to conducting a few Project Outcome surveys and finding out what their patrons really needed. Now I will take any questions you may have. Crystal? All right, Samantha, thank you for sharing such a comprehensive overview. I know there is so much with the Project Outcome toolkit and resources and what's being done there, but I am glad you were able to share such a broad overview of it with everybody. We have been getting some questions in the chat, and I know we have been sending off some individual responses, but I would like to address some of them now just to clarify for everybody. Now, I know you said earlier how we are able to access the toolkit and the resources, but just in case anybody missed that, can you tell us again: how can people gain access to all of these resources you have just shared?
So to get access to the tools and resources, you simply register on the Project Outcome website. So you will go, you will hit sign up, enter your email, get a confirmation, click the confirmation email, and then answer a few questions about where you are from and what library you work for, and that will direct you and link you to your library's account. We allow for multiple registrations for libraries, so you can have multiple people in your library all registering on their own, all linked to the same account. So within the Survey Portal you see everyone's data from your library, and the same with the Data Dashboard. So everything lives behind the registration wall of the Project Outcome website, but signing up is quick and easy, and again, it's free. Great. And just to reiterate that last point, how much does this service, this set of resources, cost? Free, free, free. Great. And who is able to use it? Excellent, that's what I was just going to ask. So this is currently available to public libraries in the United States and also in Canada. Do I have that right? Correct. Great. So those are the important things we want to make sure everybody knows before they walk away today, so thanks for clarifying that. We also had a question come in earlier about the languages the surveys are available in. Are they available only in English, or is Spanish available as well? We currently have the surveys available in English and Spanish, and we know some libraries are requesting other languages. We haven't really grasped what the strongest need is for certain translations, so we're still gathering feedback from our users, and we will be adding other translations in the future. All right. Now, one person has asked a question about the staff time involved, and of course we're looking at small libraries today, and staff time is always a concern there.
So do you have any indicator of how much staff time is involved in entering data, from what you've been hearing from the small libraries using it? Is this something that a small library would be able to approach on a smaller scale? Yes. Project Outcome was specifically designed for libraries of all sizes: small, medium, large. So we know that small libraries have limited staff time and capacity. The great thing about the Survey Portal is that it kind of forces you to plan and think ahead. So if you go in, schedule your survey, decide in advance which programs you're going to measure, and print out your survey or get your digital link, you'll have it ready to go. If you use the digital surveys, there's no data entry on your end. It automatically populates in your reports. It automatically populates in your dashboards. If you are collecting paper surveys, we give you an extra week after your survey fielding date is closed to enter all of the data manually. And entering the data is super easy. The screen looks like the digital survey, so staff take the paper surveys, go in to enter them, and it looks like they're just completing the online survey themselves. So there's no Excel work. There's nothing. There's no additional confusion. It just looks like you're completing the survey online. You click submit, and that will also populate all of your reports and all of your dashboards. Excellent. So the goal being an easy-to-use tool for libraries of all sizes, including smaller libraries with less staff available to work on it. Sounds great. Now, you mentioned that the surveys could be entered from paper, and someone has asked: are the surveys printable? Clearly you've just answered that in your last response, but could you tell us a little bit about the ways that people can access the surveys, both for printing them out and for access on a computer or mobile device?
So once you go into the Survey Portal and schedule your survey, you'll choose which survey topic you're going to measure. You're going to answer a few more questions about your dates and which programs you intend to measure. And then you'll be able to click a checkbox and a link that will give you a PDF download of the paper survey. So the surveys do not live on the Project Outcome site, and that's done on purpose, so that libraries know ahead of time that they can't just print out the survey, hand it out, go back in, and enter data for that day. They have to plan at least a day in advance through the Survey Portal. When you schedule a survey, you automatically get access to a digital link. And they're really easy to use. We heard from small libraries that they were able to just put the URL in on an iPad and have patrons complete the survey while leaving the program or the library. And that all lives within your Survey Portal dashboard, the screen that I showed earlier. You just copy and paste a URL. So whether you're putting it on a desktop computer after a digital learning course, or on an iPad or an iPhone, however you want to use the digital survey, it's super easy. You just copy and paste. Excellent. Well, I think we have time for one more question, and then we're going to hear from Robin, who's going to tell us what it's been like to use Project Outcome surveys in a smaller-sized library. So we'll hear about Robin's experience. But I also just wanted to share a comment from Peggy saying that they are especially interested in applying this to writing grant applications and final reports for grants. And when you think about using the outcome results, I think that's another great example of how this outcome measurement data can be used. So Peggy, thanks for sharing that.
And then the last question I'm going to ask right now, and then we will have time for more questions and answers at the end of the session so please continue to send them in, is a question about survey fatigue, and this comes from Creed. Of course we know that our patrons in our libraries are bombarded by many surveys in all different directions, and that can cause what is sometimes referred to as survey fatigue. So Creed's question is how does one ask the relevant questions without overwhelming? And Samantha I thought you might be able to tell us a little bit about the questions in the project outcome surveys and how those were designed to perhaps be more manageable for library patrons. So could you respond to that? Yes. So we all know about survey fatigue. We'll probably be surveyed after this webinar. And the task force knew that too. So going in they knew they wanted the surveys to be short and simple and easy for patrons to complete. So that's why they limited themselves to four major patron outcomes. So knowledge, confidence, application, and awareness. We know that libraries want to collect everything as long as you're surveying someone. You want to ask them 100 questions about how they learned about the program, where they're coming from, their demographic information. But that really limits your response rate. And what we're trying to get here is a high response rate. So you can add an additional question but you can't collect it within our tools. So if you're going to add an additional question it needs to be outside the project outcome toolkit because we specifically designed our surveys to be aggregated so that you could talk about data over a long period of time. All right. So that gives us a little bit of a window into project outcome tools and resources. And Samantha, thank you for sharing that presentation with us and also for answering some questions. We will have more time for questions at the end. 
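To make the survey design Samantha describes concrete, here is a minimal sketch of aggregating responses from the four scaled questions. This is my own illustration, not Project Outcome's actual implementation: the outcome names, function names, and sample numbers are all assumptions. Each respondent answers the four questions on a 1-5 Likert scale (1 = strongly disagree, 5 = strongly agree), with None standing in for "not applicable."

```python
from statistics import mean

# The four outcome types Samantha names: knowledge, confidence,
# application (anticipated change in behavior), and awareness.
OUTCOMES = ("knowledge", "confidence", "application", "awareness")

def aggregate(responses):
    """Average each outcome across respondents, skipping N/A answers.

    `responses` is a list of 4-tuples of Likert scores (1-5) or None.
    Returns a dict mapping each outcome name to its mean score
    (rounded to two decimals), or None if every answer was N/A.
    """
    scores = {}
    for i, outcome in enumerate(OUTCOMES):
        answers = [r[i] for r in responses if r[i] is not None]
        scores[outcome] = round(mean(answers), 2) if answers else None
    return scores

# Three made-up patron surveys from one program
sample = [
    (5, 4, 4, 3),
    (4, 4, 5, None),  # this patron marked "not applicable" on awareness
    (5, 5, 4, 4),
]
print(aggregate(sample))
```

Keeping the questions fixed, as the task force did, is what makes averages like these comparable across programs, libraries, and time.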
But right now we want to move on and hear from Robin. So Samantha, thank you again. Thank you. And we'll have you come back on at the end for some more questions if we have time. Now, I did want to share an opportunity that Project Outcome is gearing up for. If you are attending the ALA Annual Conference in Orlando in late June, there will be a free Project Outcome enrollment workshop, where you will be able to learn at a deeper level how to use Project Outcome in your library. Now, this workshop is free, but you do have to apply for it, and space is limited. So I know we're going to share that link for you in the chat. In fact, we already have. The deadline to apply is May 22nd. So if that is something you're interested in doing, you'll want to check out that link, and we'll include it in the archive as well. But a great additional opportunity for free training on this project. Now let's hear from Robin, who is going to tell us how her library used some of the Project Outcome surveys to measure outcomes. Robin? Thanks for including me. So I'm just giving a little bit of my bio here, so those of you who work in small libraries can see I wear many hats: adult programming, marketing, publicity, IT, great partnerships, basically whatever my director asks me to do. Jack of all trades, which I suspect many of you experience as well. And I've always heard you should include a baby or a puppy in your slideshow, so there's my baby. And now a little bit about Calvert Library, because I guess the definition of a small library can vary. This is us: four locations plus a mobile services department. We serve a population of 90,000, and our circulation is a bit over a million per year for the whole system.
We own about 350,000 items, and in a typical day, at our main library we have about six to seven staff on the floor at once, and about two to three staff on the floor at each branch. So that's sort of what we look like, and what I think of as small. And then our county bio. We are an agricultural community. We used to grow tobacco, a long-time history, and that was followed by rapid growth, where a lot of folks moved in. We're within commuting distance of Baltimore, D.C., and Annapolis, and some folks go south to Northern Virginia, et cetera. So we have really a mix of farmers and city folk in our community. About 62% of our population actually commutes out of the county for work. And I've heard that what you do for fun defines you, so that kind of tells you what our community is like: we love farmers markets. So, an overview of what I'm going to talk about today: there are four different surveys that we've done through Project Outcome. I'll talk a little bit about our methods of implementation for each of those surveys, what we learned about those methods, and then what we learned overall from our survey results. And within that conversation, you'll hear about obstacles we had to overcome and so forth. But overall, it's doable. So the first survey we did came about because we had signed up for Project Outcome. Several of us attended the pre-conference last June, and we were super excited to be involved. And then summer happened, and summer reading, and we kind of forgot about Project Outcome in the rush of summer. Then, when the dust settled and the kids were headed back to school, we thought we should do a survey. Of course, we had our summer readers register for summer reading, and we had about 2,000 email addresses from that. So we emailed a link from Project Outcome to a survey. And we didn't get very many responses. We also sent a reminder email saying please respond.
We still ended up with only a total of 61 responses, which is only a 3% return. And you heard Samantha say the goal is to get a good return rate. So we learned that that was not the best way to go about administering the Project Outcome survey. What we're going to do differently this year is that when people register for summer reading, we'll tell them that we're going to do a survey, thinking that they'll form this idea of, oh, there's going to be a survey, and I'll be prepared to do it. And we'll probably send it earlier. Some people complete summer reading within four weeks, and if we're sending it out at the end of August, they've forgotten about it by then. So we're going to send it earlier and maybe do some weekly reminders. So that was the implementation piece that we learned about. But 61 results wasn't none. We got a lot of good information from those 61 people. There was repeated feedback about the software we were using for summer reading: that it was clunky, that the reading log wasn't really fun, not motivating. And we heard that enough that we thought, well gee, we need to improve that. So this year we're trying new software that really gamifies the program. It gives badges, and basically we think it's going to be a lot more fun to use. And of course we'll do a survey and see if our customers agree. But it was important for us to hear that from our customers, and Project Outcome was a great way to do that. So for the second program that we did a survey for, we learned from our last lesson. This was a self-defense workshop for grown-ups, and we had people register again for that, so we had email addresses. We emailed those 15 people the day after the event and got seven responses. Seven responses was almost 50%, so we thought that was an improvement. The third event that we wanted to survey was a Clean Up Your Record workshop, about how to get your state arrest record expunged.
And we didn't think collecting emails or registration was going to go over as well with this crowd, so we thought we had better do a paper survey. We did consider, well gee, if we have a QR code they could go to a link on their smartphones. But we were afraid to make too many assumptions about this audience and thought the paper survey was safer. So that's what we did. We did paper surveys. We had 46 attendees, and 32 paper surveys were returned. You can see our results there. I know someone asked about how much staff time it takes to enter the data. We're fortunate that we have a lot of volunteers at the library; we have an older, aging community, so it's easy enough to ask folks to enter it for us. Honestly, I swear it only took her half an hour to enter it. But it probably wouldn't have gotten done within our week if we hadn't asked for a volunteer, and she did it. And it was a great outcome. And I should say I hadn't pulled up any of the reports we've gotten for this before, but the reports are really awesome. So what we learned specifically about this event was that there was obviously a need for this workshop. The turnout was great. The people who responded on the paper surveys were so positive. They loved the presenter, who happened to be our prosecutor for the county. They said she gave great information, they understood what to do afterwards, and it was highly informative. So it was wonderful for us to be able to tell this professional volunteer: they loved it, it was wonderful, thank you so much. And to have that specific feedback for her. And then I loved the response that someone gave, that there was hope, which really made it feel very worthwhile. And then we also heard that we should do something similar for folks who have federal arrests, so we are going to do that. So it was, like I say, a good result. We learned a lot. Our fourth program came when we were invited to help with this webinar today.
And I thought, oh gee, we haven't done a survey in a little while; maybe I had better check it out to see if anything has changed. So we had a program on April 12th called Connect with Your Children Using Minecraft, for parents. And for that survey, I knew that Samantha said you have to plan ahead. But about 30 minutes before the program, I went onto Project Outcome and said, gee, I want to set up a survey, and discovered that you can't actually schedule one for the same day. But I got around it. We overcame this obstacle by saying, okay, well, the program is really tomorrow, and now I can print a paper survey right now. So I had surveys available to hand to the participants, just with my dates off by one day in the system. I would encourage you all to plan ahead better than I do. But if you're at a small library where you wear a lot of hats, you might encounter this too. So there are ways around it. And then of course I entered the results the next day, and we got those results. So overall, what we've learned: I feel like we've scored pretty well, and I look forward to comparing us to national averages. But one area that we felt we could improve on was the impact on confidence. People didn't say that, yes, we had a huge change in our confidence levels. So why is that? Was it because people already felt confident in this area? Or maybe they needed more hand-holding? So we really want to follow up on that and figure out what we can do to increase confidence. And then the other piece, as the marketing person, is this impact on awareness: we seem to not have helped people be aware of other things that the library was offering. And I thought, gee, we should do better at cross-marketing, because people who come to a library program are probably going to come to another one if we tell them about it. So that was good information to have.
Our awareness could be improved. How we've used the data, I've mentioned some of those ways. We have a real formal plan about how we're going to use it. We're really excited to see what it looks like nationally when we compare our library's impact to other libraries'. But overall, we're just excited to be part of a national data collection, and I can tell you it's really not that hard, even for a small library. Well, Robin, thank you for sharing some of your experiences. I think it's great to see how you've put this into action in a few different ways. The question I want to start out with is, do you have any advice for small libraries that are looking at this? Where should they get started? If you were starting again now, what would you recommend starting with? Well, there are some great videos on the site that you can watch that give you some overviews of how to do this, and we've watched some of them. I admit I have not explored all the resources on the page yet, but it's rich in resources. So I would say find something to watch, get a good feel for it, and go for it. Great. And actually, to that point, I know we've seen a few responses from people who have, just in the webinar time today, gone on to try to register, and a few issues came up. So I just wanted to address those in case you've had them as well. And for those of you who are very eager to get started, I'm glad to hear it. First of all, you do need to get your account approved, and that can take up to 24 hours; you'll get a confirmation email. So if you first go in and it's waiting to be approved, know to watch your email for that specifically. The other issue we heard about is somebody who went in to try to start one of the resources that had some interactive video content, and if you have a lower-bandwidth internet connection, it may be that because you've got the webinar going at the same time, there wasn't enough bandwidth.
So if you've experienced that, or if you experience it, know that some of the resources are videos, so they take a bit more bandwidth, and maybe try them at a different time. We certainly recommend waiting until the webinar is over, just so that you get a chance to really look at those. And like Robin said, just dive in and give them a try. Robin, I did have another question for you. I wanted to go back to that question of survey fatigue and see if you have experienced any of that in your library. Do patrons seem willing to fill out your surveys, or are they expressing any fatigue? And if so, how do you work around that? Okay, so interestingly, we've had the best return on the paper surveys, but you've seen the surveys and they are really quite short. It does not take long at all to answer them. So far really no one has objected. If you look at our return on the summer reading survey, that wasn't great, but I do think that's because we asked so late; we really needed to have done it earlier in the whole process. So I don't know. I think it's a pretty easy survey, so I would say no. All right. Well, Samantha, I'm going to have you come back on, because we've had a few questions about the administration of the surveys themselves and where you might use them. One person asked specifically about using the survey tool to measure collection development. But I think the bigger question here is, what types of programs and services can the surveys be used for? Is there a way you could give us a brief answer for that? So what I will say about collection services is that the Project Outcome surveys are specifically designed to collect patron outcomes after attending a program or service. And by service, I think the science kit example is a good example of how to measure patron outcomes from a service. So they're not designed for collections.
The task force had to really look at what were the easiest outcomes that they could collect and which services they should cover, and they had to narrow it down. And they chose their seven core service areas. But we recommend taking the Project Outcome questions, and if you want to write your own surveys that measure things like collections, then by all means do so. We're all about promoting outcome measurement in general. It does not have to be restricted to our seven survey topics or to using our surveys. But the surveys are really designed to be given right after attending a program, to capture the immediate perceived benefits that a patron reports. All right. Another question, Samantha, about the surveys in general: it seems the surveys you've talked about come at the end of a program, but Kathy says many grants like a pre-survey and a post-survey. Does Project Outcome have any type of pre-survey or other survey types besides the ones you've told us about today? So Project Outcome does not have a pre-survey. We do recommend that libraries measure the same programs multiple times to try to compare and contrast differences, but that's not a pre- and post-survey. We are currently working with the task force to develop the next set of measures, which are not pre-surveys but follow-up surveys. Those will be launched at the ALA annual pre-conference that we talked about earlier, and they aim to capture patron-reported changes after a period of time has passed. So those surveys will aim to answer questions like, okay, you attended a job search program; did you search for a job in the way that you were taught? Did you receive an interview or a job offer since then? Those types of questions. If you want pre-surveys, Project Outcome is working with the task force and an outside consultant to develop guidelines on how to write your own outcome measures, which will include tips for pre- and post-surveys and skills testing.
So those are coming later in 2016; I don't have a definite date for you. The follow-up surveys will definitely be launched at the end of June, but for the guidelines on writing your own measures and for pre- and post-testing, I'll say fall 2016. All right, great. Another question we got, and I think part of this may have been answered by the fact that these surveys are about programs and not just day-to-day attendance. But Terry asks, how does a library like ours, which only sees about three people a day, most of them kids who are here to play on the computer, use these surveys? So how can a library with a very small population or perhaps small program attendance utilize the Project Outcome surveys? And I think there's also an underlying question here, which is, is there a number that's too small? Is there a minimum number of surveys needed to utilize this program and these resources? There's not a minimum number of surveys or a minimum number of attendees for a program that you should measure. If you are a small library and have small programs and few attendees, then that's fine. That doesn't mean you can't measure the impact you had on those three attendees of your program. Plus, you might have the highest response rate. So we recommend measuring programs not based on the number of attendees or popularity, but really on what's important for your library's goals and alignment with your community's needs. So if it's a program where you really want to emphasize that you're having an impact on the community, and it's a small program, we recommend you survey that program, no matter how many attendees there are. Great. Samantha, I have one more question for you, and then Robin, I'll have one more question for you. We've had so many great questions come in, so some of these we will follow up with later via email. I know Emily's also been answering some directly for you in the chat, but we've got time for just these two more questions before we do a few more announcements and wrap up for the day.
Samantha, the question I have for you is really around the idea that when people submit a paper survey and happen to know the person who's presenting, or maybe even on an online survey where they know the presenter, people may not always be as honest. The asker says that ratings increase substantially when the survey taker thinks the presenter can trace the source of the survey, so we might not get as accurate a result. So does Project Outcome have any recommendations for encouraging more honest and accurate responses from survey takers? We've heard this from several libraries. We all know that people who are already attending library programs will typically respond that they love the library and they love their librarians. We do recommend maybe having a volunteer administer the surveys, but if you're a small library and you don't have a volunteer, it's just you. We recommend starting off the survey administration process by saying: this is not a reflection of me, this is to help the library, all feedback is welcome, please be open and honest. The surveys are also anonymous, so I would emphasize the anonymity of the surveys as much as possible. The responses are not tied to any individuals, so patrons can be as open and honest as possible. We also recommend maybe you leave the room, or there's a drop box; whatever makes the patron feel more comfortable filling out the survey and being honest about it. All right, Samantha, thank you for that. Robin, we have time for one more question for you. And this is maybe from the programming side of it: how do you use the results to adjust programming when you're dealing with other people on your team? Do you have any recommendations on that, having looked at the results so far and considered some changes for the future? How do you coordinate that with others working in your library? Yeah, for us it's really about ownership.
We ask that the people who are doing the program take charge of the survey. I will usually set it up for them, but I let them know what the different options are and which one seems to be the best fit, and that we're going to do it. And then they administer it. They're the ones that look at the data first. I think it's that ownership piece. Once they have that in hand, they sort of want to do the tweaking themselves. I'm not sure if that answers the question, but that's how I heard it. And that was great; from your perspective, I think what I wanted you to provide there was what your experience had been in working with programming staff in your library. So thank you very much. Thanks again, Samantha and Robin, for sharing what you know. I'll invite everybody to stay on the line for just a few more minutes as we wrap up here. I know we had a lot of information today, and just a reminder that it will come out in the archive email. All of the resources and links that we talked about today will be shared in the archive email, and you should receive that by the end of the week. We also have some other webinars coming up that you might be interested in. Next week, on May 2, we have a webinar with some updates on what's new at TechSoup. So whether you are new to TechSoup or have been using TechSoup for years, it's a great way to learn about the latest product donations and offerings. And then on May 18th we have another library-specific webinar on how to teach older adults digital literacy skills. This one is focused on public libraries and features both a free resource and some good practices for library programming. You can register for these webinars and view archives of past events at TechSoup.org. We'll have a survey in just a minute that will ask how you felt about today's webinar and give you a chance to provide us some feedback.
So please stay on the line if you can. But before we sign off, I just wanted to ask: do you have a story to share? TechSoup for Libraries is always looking for great examples of how libraries are utilizing technology to feature in our webinars and in our blog posts. TechSoup for Libraries addresses the specific technology needs of public libraries, and we also have a monthly newsletter where we do that. So if you've got a story to share, you can do that at TechSoupforLibraries.org. You can also sign up for our newsletter there to read stories from other libraries and hear about upcoming webinars. So visit us online there. Please do stay on the line for just a moment longer to take that brief survey about today's webinar. Thanks again to our guests for sharing their expertise today. I think it was really great to hear about Project Outcome and the resources and toolkits that are available there. Thanks to ReadyTalk for being our webinar sponsor, and thank you to all of you for joining us today. Have a great afternoon.