Hello everyone and welcome. I'm Brenda Haug and I'm the facilitator for today's session, which will feature Sam Becker talking about the Impact Survey. Thanks everyone for being here. Before we dive into the content, I'll talk about ReadyTalk, the technology we're using for today's session. You should be hearing audio right now through your computer speakers or through your headphones. If that's not working for some reason, if it's choppy or for whatever reason, you can also use the phone for audio, and we'll put the phone number into chat for you. So again, I'm just going to double check and make sure that everyone can hear us now; let us know in chat, and we'll help troubleshoot if you're having any audio issues or other technical issues. We'll use chat throughout the session. Feel free to use it; people are sharing right now where they're located and what the weather is like. You can also ask questions and share your experiences, and if you have web resources that are relevant, feel free to share those there. The number one question we always get asked at this point is, will this be recorded? And the answer is yes, it will. Later today you'll get a follow-up email message, and it will include a link to the session recording. You'll also get a copy of these PowerPoint slides that we're using, and then any websites that are discussed or shared during the session. I know Sam has a number of websites she's going to mention, and we'll include those, along with any that are shared in chat. Today's webinar is being brought to you by several groups. I work with TechSoup for Libraries, which is part of TechSoup, an organization that helps nonprofits and libraries use technology to serve their communities. And TechSoup is one of the organizations that is part of a coalition called the EDGE Initiative. 
And during her presentation, Sam is going to be talking about the connection between the EDGE Initiative and the IMPACT Survey. So we'll go ahead and start with Sam and hear about the IMPACT Survey from her. So welcome, Sam. Thank you, Brenda. I'm really, really happy to be here and so glad to see so many new names on the participant list. I'm really excited to be able to talk to you today about the IMPACT Survey. But before we get started, I wanted to do a poll and just ask you how long it's been since you surveyed your community about their technology. So there is a poll up on the screen right now, and if you could answer that, it will help me understand more about where you are coming from. So the IMPACT Survey was developed around 2009, when we were doing the Opportunity for All study. What we wanted to do was provide a way for libraries to participate in our research about public access technology and also to supplement a national phone survey that we knew would miss a lot of important users of library technology. As we were figuring out how to get libraries to participate, we came up with this way of having them post links to our survey through their library websites. And in return for their help, we were able to give back to the library community reports about technology use in their particular libraries. So over the past several years we've been working to develop the IMPACT Survey into a tool that libraries can use on their own to gather information about how patrons use their technology services, particularly about what kinds of outcomes they are experiencing as a result of the resources and services that you provide. We wanted to create a way for libraries to get information that they can use to figure out what kinds of resources and services they need to provide to patrons to support them in the kinds of tasks they are most likely to do in your particular community. 
And then we wanted to make it really easy for libraries to present the findings from the survey to key stakeholders, so that you can have information about your users to take to the Rotary Club or to the City Manager. Sam, can we share those poll results with everyone? Sure. Okay. So there you can get a sense of where the participants are at and how long it's been since people have done a survey like this. Well, I'm glad to see that many of you, it looks like about 60% of you, have actually done some sort of community technology survey, which is super. We really hope that the burden created by trying to do surveys will be lessened by the IMPACT Survey, and then you'll be able to do it more often, maybe once a year or every other year. So for the IMPACT Survey, there are a few really big benefits that we are trying to create for libraries. We're trying to save you the time and energy that it takes to develop survey questions on your own. A lot of you are coming up with different kinds of survey instruments for different purposes, and it's hard to figure out what to ask. It's also hard to know whether or not the survey questions that you're asking are actually valid; in other words, do respondents understand what the question means, and are they answering it in the way you intend? So we developed the survey instrument itself. We've tested it. It works really well. People understand it. And so it saves you that process of having to develop those survey questions. We also wanted to save you the time and expense of programming your own web survey. A lot of you use SurveyMonkey for different kinds of patron surveys. We've set it up so that you don't have to do any of it. As soon as you sign up to use it, you're able to access it, and there's no programming that you have to do. And then finally, we realized that the biggest barrier to surveying really is having the staff to take the survey results and do something with them. 
So we wanted to create a way for libraries to get the results in a format that they can use, and not have to spend staff time creating pie charts and presentations and that kind of thing. So as I mentioned, the survey asks questions about activities in core outcome areas. These are outcome areas that our research showed were important, both for patrons being able to complete important tasks that lead to better self-sufficiency and for what policymakers want to hear about in terms of how patrons are using library technology. We also have questions in the survey about library use, regular library use and visits, and also questions about how the patron uses your library website, so you'll be getting some information about that as well in your report. So I just wanted to walk through really quickly how you get set up, just to show you it's very easy, and give you some information about how it works and the best way to optimize your response rates. And we have some tips on the website about that also. So the basic steps are that you create an account, you install a link on your website, you allow the survey to run in your community for two to four weeks, and then the very next day you're able to go back to your account and get the results in these customized reports. So in the registration, if you go to our website you'll see a button that says Register Today. In the registration process you'll be asked to supply an email address for your library system. It makes it a lot easier for us if you use your official library email; that way we can confirm you more quickly. When you're setting up you'll see a drop-down menu with states. If you choose your state, then the library systems will appear in the next menu below. And if you don't find your library on the list, please contact us. We may be able to add you. We've also had a lot of libraries that want to share the survey. 
They may be some sort of cooperative: library systems that are officially separate according to the Institute of Museum and Library Services (IMLS) but operate completely together. So we have some ways of dealing with that, but it has to be handled on a case-by-case basis. Sam, is it just public libraries that can use the survey? What about schools or academic? Right now it's just U.S. public libraries. We just added this morning the ability to make it available for military libraries as well. We are in the process of figuring out how to add libraries from outside of the U.S. We're also looking to get some information from the schools, and particularly the community college libraries that we've talked to, about how the survey would work for them. Some of the questions in the survey may not be applicable to them, and we would need to do some retooling of the survey and develop the technology to deliver it to them. So that's our longer-term plan, probably over the next year or more, to make it available to school libraries. But in the shorter term we will be adding the capability for libraries outside the U.S. Okay, thank you. We also have the ability, if you refer to your library by a name other than what is official with the IMLS, to let you change that here. Also, if your library frequently uses an acronym, maybe sometimes even more frequently than the full name, you can add that here as well. Those are the names that will appear in the reports that you get. We're really, really excited to announce a new feature on the website, which is the ability for you to designate a website for your patrons to be redirected to when they're done with the survey. Previously it just went to a generic thank-you page. Now you can use your library web page and redirect them there. 
Or for libraries that want to offer some sort of an incentive for participation, you can redirect them to a site where they can leave their email address for a drawing or something like that. And I'll talk about that more when I address promoting the survey to get good results. You want to take a couple more questions? We're getting a bunch. Okay, one question is, the survey is available for free for the first year. Any indication of what the pricing model will look like after that? Yeah, so it's free until next October. After that we're going to mostly try to get state libraries to purchase a license for their entire state. But for individual libraries, we expect first of all it will be on a sliding scale depending on the library's budget size and population. We anticipate that the cost will be somewhere in the neighborhood of $50 to $500, depending on the library's size. So we think that's a really good price point. We're able to keep our costs low by automating a lot of the report generation, and as more libraries use the system we'll be able to continue to lower the price. Okay, thanks. Okay, so then we ask you about your library, some basic information about your library data. We've pulled in data from the IMLS that you provided probably a couple of years ago, so we give you an opportunity to update those values. It's mostly about your population and your funding. These data will appear in some of the reports to contextualize the findings, so it's good if you can provide the most updated values for those. We also give you the opportunity to update your locations. If you're a library with multiple locations, you're provided with a list of those locations, which you can change, close, or add to as well. Those locations end up being used to populate a branch selection map that will appear for your patrons when they click on the survey link from your library website. 
So this allows us to provide you with a data set that shows which survey responses came from which library branch, and also which comments came from which branch. We also have an intake questionnaire. This is a little bit of information for the reports and a little bit of information for us. We're constantly making improvements to the system as we get feedback from libraries. We want to understand what libraries are doing and what they need. We also want to look at the relationship between the things that you offer and the kinds of outcomes that your patrons are experiencing. That helps us contextualize the Impact Survey on a national level and do advocacy there as well. So we appreciate you filling out this form. Everything on the form you should be able to answer off the top of your head, so hopefully it won't take more than a few minutes. When you're all done with that, you can begin to get ready for the survey by selecting the dates that you want to field the survey in your library. We recommend you run the survey for around two to three weeks for an average-size community, around 150,000 people or better. Smaller libraries in small communities need to run the survey longer in order to get a good response. So we recommend that small libraries start out at four weeks, and if they're still not satisfied with their results, we can extend up to six weeks for them. What do you consider a good response rate? Is it a percentage of community size, or how would you determine what a good response rate is? No, the survey itself is not representative; it's always going to be a convenience sample. So we really want to get you enough responses so that when we're calculating the percentages, they're meaningful. And the size of the response is really dependent on how many people are in the community and how many people in the community use the library. 
So we like to see, for little libraries, response counts in the 40 to 50 range, and I'm talking about very small libraries. For larger libraries we usually see 2,000 surveys or better. We don't like to see surveys that have fewer than 10 responses, and we encourage libraries to do some additional work to promote the survey when that happens. Okay, good, that helps. All right, so when you select your fielding dates you're also given the option to use a paper version of the public access survey. In some libraries and in some communities, your patrons may feel more comfortable filling out a paper survey. They may prefer to do it rather than use their computer time. We don't get a terribly good response rate on the paper surveys. The survey itself looks a little intimidating on paper; when it's online, there's a lot of skip logic that shortens the time the patron spends on the survey. But if you want to use it, you can. Once you click the paper survey button, we provide you with a PDF of the survey that you can print out and leave at your library. When the survey is closed, when your fielding date has passed, you'll have 7 days to enter those surveys into our system. We have a special portal for you to enter those paper surveys, so it goes much quicker than if you went through the patron side of the survey entry. So it's a feature that some libraries may want to use and others may want to skip. Okay, so a couple of questions coming in about the things we just covered. One was back to what's a good response rate. You said for a little library the 40-50 range, so getting 40-50 surveys filled out for a small library is what you consider a good response? For a small library, that's doing pretty well, and I'm talking about communities under 10,000. The thing to keep in mind is that this is real data. These are real people who are doing things on your library computers. 
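The fielding-length and response-count rules of thumb from this part of the session can be summarized in a small helper. This is an illustrative sketch only; the function name and the exact population thresholds are my own paraphrase of the guidance given here, not part of the IMPACT Survey system:

```python
def fielding_plan(service_population: int) -> dict:
    """Suggested fielding length (weeks) and response target,
    paraphrasing the rules of thumb from this session.
    Thresholds are illustrative, not official IMPACT Survey values."""
    if service_population < 10_000:
        # Very small libraries: start at 4 weeks, extendable to 6;
        # 40-50 completed surveys is considered a good result.
        return {"weeks": 4, "max_weeks": 6, "target_responses": 40}
    if service_population < 150_000:
        # Mid-size communities: roughly 200-300 responses is attainable.
        return {"weeks": 3, "max_weeks": 4, "target_responses": 200}
    # Large systems often see 2,000 or more completed surveys.
    return {"weeks": 2, "max_weeks": 4, "target_responses": 2000}

print(fielding_plan(8_000))
```

The point of the tiers is simply that smaller communities need longer fielding windows to accumulate enough responses for the percentages in the reports to be meaningful.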
So even though it's not representative, it's still good information on what folks are doing. Okay, and then for the larger library, the 2,000 number, that would be 2,000 surveys filled out being a good number for a larger community? Yes. And somewhere in between, we like to see for average-size libraries 200 or 300, and that's usually pretty attainable. Gretchen Pruitt may be able to speak to that when she joins us. Another question was about the paper survey. So should a library choose the paper survey, or do you recommend they all offer it as an option, so people can either fill it out online or also have some paper surveys available? Yeah, so it's totally up to you. We're pretty agnostic about this. It's a way of reaching out to certain types of people. In previous versions we found that older folks and very low-income people are slightly more likely to use the paper survey, but the difference is not huge. So it's really up to you. I would say that the deciding factor is whether or not you have the personnel or volunteers who can enter those paper survey responses into our system. All right, moving on. So when you're ready to start the survey, when your fielding dates come up, you'll go and get your links. These are custom links, each a URL that takes the patron to the survey, and each link has code in it that records for us which library it came from. So it's really important that you use the links we provide, the exact URLs we provide. And we have several different options for linking to the survey; we recommend that you use more than one of these approaches. Our favorite approach is the light box. We provide you with just a tiny snippet of code, and that goes into the head of your library's website and creates the light box. 
If the patron doesn't follow that link or closes that light box, then it won't appear again for that patron when they go back to the library website. We know that patrons go back and forth from the library's home page, and we didn't want that to pop up every time. So it's important to also have a button or a banner on your website, so that if they close out the light box, they can still get to the survey. For buttons and banners, we have a variety of different sizes and orientations, so that one of them will hopefully fit into a spot on your website. If you have some sort of a news banner or something, you can just insert them there. The code is here too. We host the button, so you don't have to download anything; you just need to insert this little snippet of code where you want the button to appear on your library website. And this can be used with your content management system, so if you're using Drupal or WordPress or a platform like that, you can use this also. So you could put this on Facebook too? You could take some of that and put a message about the survey on your library's Facebook page. Yes, yes you can. That's definitely a great approach. If you want, you can also make your own button for the survey. We provide you with the core link there on the button page, so if you have a particular style or you don't like the buttons that we're providing, you can certainly make an image of your own. All you'll need to do is add to that link the path to wherever your image is hosted. And if you have questions about that, we can certainly help you and walk you through it. Great, thank you. Once you've done that, or about three weeks before you launch the survey, you should really start planning how you're going to make the survey available in your library and how you're going to promote it with your patrons and the rest of the community. So we have a timeline with some hints and tips about how to increase your response rate. 
We also have some pre-formatted documents that you can use. We have a table tent you can put up next to the computers, signs populated with your URL that you can hang around the computers, and sample emails that you can send to your library list. I know that this is a concern for a lot of libraries, and some libraries that previously used the Impact Survey weren't satisfied with their response rate. So it's really your part of it to promote the survey. One of the things you can do is offer a prize or an incentive to patrons who complete it. By using that redirect link that I talked about earlier, you can send them to a Google form or something like that to leave their email address for a drawing, and that is a pretty good incentive that some libraries have used. Other libraries have been really successful by extending computer session times. During the survey period, some session management software may allow you to extend time only for people who click through the survey. Not all of them are capable of doing that, but it does tend to increase the number of completed surveys. Also, if you have a very busy computer lab, you may want to set up a terminal that does nothing but go to the survey, so that people who are waiting for a computer to free up, or who are done with their session, can go fill out the survey there. You definitely need to place a lot of signage around the library, especially where patrons are using the Wi-Fi, so that they know the survey is going on. If you have some sort of a login screen for your Wi-Fi, you might want to put a link to the survey there, so that patrons can see it in case they bypass the library's website. We also recommend that you send an email to your patron list that includes the link. And a lot of libraries have been very successful getting notices in the local paper or making radio announcements. 
And that just drives people to the website where they can click on the link and take the survey. We have found in our research that library patrons are unusually willing to take surveys. Generally speaking, on average over the two times we piloted this project, of the patrons who begin the survey, about 25 to 30% actually complete it, which is quite high for a web survey. Should we pause here for questions? Let's take a couple of questions. Something we've been asked is that some people would like to just see a sample survey. Is there a website they can go to where they can see a live survey, or a sample survey, without actually signing up and putting in their own information? That's a really good idea, actually. No, we don't have that. We do have test accounts, so if you have a library organization that just wants to poke around and see how it works before they decide to use it, they can do that. You can also view the survey questions. But that's a great idea, to have a test library site where people can see how it works, so I'll bring that up to my crew and see who can do that. How long does it take people to take the survey, on a rough average? The median time is about 6 minutes. Some people, if they are using the computers for a lot of things, if their whole life is there at the library on the public access computers, it will take them probably about 10 minutes. But most people fall into the 4-6 minute range. Okay. And is it worded in a way that's primarily geared towards adults, or would it be something that children could take also? Right now it's designed for people over the age of 14. So we don't allow people under 14 to take the survey, but it is certainly appropriate for teens and young adults. 
Particularly, we ask questions about how patrons use the library's computers for educational purposes, so it's good to get those teens and young adults answering. Okay. And this one I'll ask you, but maybe others on the call who have done surveys like this can share too. For most libraries, has it been kind of a passive approach to getting responses, or have you had libraries where staff actually approach people and directly ask them to fill out the survey? We've had libraries do both. Most of them take a more passive approach, although the light box is kind of a more in-your-face advertisement. But in very small libraries, where you might only get 4 or 5 people a day coming in, we've definitely had librarians personally ask people to take the survey. Okay. Great. Okay, good. Thank you. So during the survey you can go to your My Impact Dashboard and keep track of how many surveys have been submitted; it will show up there and gets updated. If you're not satisfied with your response rate as you're getting close to the end of your fielding period, and you haven't reached that 4-week maximum, you can extend it by returning to the fielding dates tab. If you're at 4 weeks and you still want to extend, you'll just need to shoot us an email and we'll unlock that for you. But we want to check in with folks who are going that long, to make sure we're engaging in activities to maximize the response rate. Once that's all done, your fielding date has closed, and you've confirmed that you entered all the paper surveys you collected, the very next day you can come back to the site and download reports that contain the results of your survey. We have several different reports available, including a comprehensive report that shows all of the survey responses, and it's nice. It's graphic. It's got tables. It's also got text that talks about what the survey results are. 
It's a pretty long report, so maybe something that is more useful internally. But we also have other reports that are designed for you to share with external stakeholders, like your city manager or your Chamber of Commerce. We also now have the ability for you to actually download the data set of your survey responses. So if there are things you want to look at that aren't included in any of the reports we've created, you can do that on your own using Excel with pivot tables or something like that. This comes in handy if you have particular questions, for example, about how certain demographic categories of people use the library computers versus others, that kind of information. We do have some of those cross tabulations, but there are certainly more demographic breakdowns that could be run against the data. So that's a nice additional feature for you. That was actually one of the questions we were asked. So information that's not something people respond to, but information that tells you, for example, whether a survey was completed on a library computer or from outside the library, IP addresses, or browser types, that kind of thing. Is that information available to the library? You will get information in the data set about how the patron accessed the survey: at home, in the library, or on the paper survey. We won't be passing on any information about IP addresses or browsers; the survey itself does not collect that kind of information, so we don't actually have it to give back. But if you're running the survey, you can certainly install Google Analytics on your website to see where the traffic that subsequently clicks on the survey button is coming from. Okay, and then one more question; people are wondering about mobile devices, smartphones, tablets, and how well the survey displays on those devices. The survey will display on a tablet or smartphone. 
It's not designed especially for that, but it's a pretty simple interface, so it will show up just fine on those devices. And of course, if your website is optimized for mobile, then the button will also show up there. I also want to point out that the survey itself meets accessibility requirements, so people who are using assistive devices can also take it. Great, thank you. All right, well, I'm really, really pleased to have here with us Gretchen Pruitt. She's the library director at the New Braunfels, Texas Public Library. New Braunfels used the Impact Survey in 2011, and they just recently finished up their latest survey. They also happen to be an Edge pilot library, so they've used the results of their Impact Survey for some of the activities related to Edge. So Gretchen, hi. Hi, Samantha. Hi everybody. Thanks for being here. I was hoping you could share your experience about how you used the Impact Survey and the benefits that you found from using it. Absolutely. First of all, I can very much attest to the fact that you finish it, and 24 hours later your reports are up, because we just finished our second round; well, we said the 1st of December, but realistically we finished it yesterday, and I just pulled down the results today. So I have only briefly glanced at them, and I'm going to want to compare them to the prior year, but really, truly, 24 hours later we have our reports. So that was pretty wonderful. The Impact Survey that we did in 2011 coincided with our invitation to be an Edge pilot library; they really came through about the same time. But of the two, the Impact Survey was initially the one that I was more excited about, because like a lot of library directors out there, you get a lot of people coming in and saying, what are people doing in the library? We don't know. We don't have software on our public networks that tells us what websites people are going to. 
And our IT department did not, and I will use the past tense there, our IT department did not consider our public access computer technology to be a high priority for them. So they didn't put a lot of resources into it; they just locked it down as much as they felt they needed to. And whenever we approached them about patron needs, the response was always, why do they need to do that? Why do you need more bandwidth? They shouldn't be going to YouTube. They shouldn't be doing these things. A very negative attitude from our IT department about the importance of public technology access. So we really wanted some kind of credible capture mechanism, credible evidence, that backed up what we saw. Yes, you walk through the public computer aisles and some people are on Facebook, some people are playing games on Facebook, some people are looking at websites for fun, but a lot of people do a lot of different things every day on your Internet connection, and trying to capture that was very difficult other than anecdotally. And we all have a few success stories, like the lady who came in and found a job. One of our prize success stories was the lady who used our public computers to get her degree online and then got a job with one of the local businesses. So we always trotted her out as our success story. But other than her, we didn't have a lot of other good data. Once we rolled the Impact Survey out back in 2011, there was a lot of reluctance on people's part to answer some of the questions. There was some confusion about what we were asking for, and so we had to clear up that we were not interested in what you do on your computer at home. We were really interested in what you are doing on the computers when you are in the library, and those are not always the same. We knew that we had some databases that could only be used in the library. 
So one of the questions was, if they came in to use Ancestry.com, for example, are they staying to do other things while they are there, or are they just coming in, using Ancestry, and then going back home? The Impact Survey gave us a lot of that kind of data. A shocking figure, which I use all the time, was that about 70% of the people using our computers in the library had access to computers in other places, including their homes. That's been a key statistic for me to put out, because as people get more and more devices, the question was, well, isn't the library going to become irrelevant? Or again, why should I increase your bandwidth? They are not going to be using the library. So we were able to say that we are not just serving people who don't have access to computers at all; we are also seeing a lot of people coming in who want the value added that the library brings. Then as the EDGE program developed, which we were really happy to be a part of as a pilot, though we were a little bit nervous because we weren't sure what all was going to need to be captured, it turned out that the EDGE capture mechanism absolutely dovetailed with the questions on the Impact Survey. Had we not done the Impact Survey, it would have been much more difficult to really have the knowledge about what our patrons were doing on the public technology in order to assess where our strengths were and where we needed to put more resources. And that's really what both of these tools are about: what are you already doing well, and what do you need to do better? And then there's comparability. Most governing bodies of libraries, especially public libraries, don't want to put more into the library than they have to, because we all have competing interests in the cities or counties or whatever body we are a part of. So they want to do enough, but they don't want to go overboard and put in more resources than necessary. 
So the comparability to other libraries, both of our size and within our state and then nationally, was really critical when I talked to our stakeholders. We just finished our survey. We got 308 responses, which I know doesn't sound like a lot, but since we were looking at the subset of people who really use the computers in the library, that's actually a pretty decent response, and we're pretty happy with it. Again, we just closed the survey yesterday, so we haven't had time to really analyze all the information. But on our initial survey we only had 40 people take it, so we already have hundreds more responses than we had before. We got people to take the survey passively, so we did not actively go up to folks and ask them to take it. We had paper copies available, but nobody used them, because we really were focusing on people using the technology in the building, and most of our patrons were comfortable filling out the survey on the public computers or on their own devices if they were using our wireless network. We used the pop-up box, and we did capture people's emails afterward so that we could give a $50 Target gift card to one lucky person. But interestingly, less than half the people who took the survey actually gave us their email address to be entered into the drawing. We also tweeted it. We put emails out. We put it on our Facebook page. We had links from basically every place. We used the table tent signs. We used signs in the computer carrels, but we didn't actively go up to people and ask them to take the survey. So we feel like the response was pretty good. We did use the press release that was prepared, and that went out in our local newspaper as well as on our local radio station. And New Braunfels is a city of about 60,000, so that helps to put it in perspective. We have about 35,000 active card holders, and we have 29 public computers.
So the percentage of card holders actually using the computers is fairly small. We did, again, have some confusion about what we were trying to capture. So we had to correct the perception that we wanted to know what every resident of our city was doing on their computers at home. We really just wanted to know what they were doing on the computers or the network from the library. And once we cleared that up, I think that was the main question that we really had. We are looking to repeat our EDGE benchmark assessment in 2014, and so we are going to be very happy to have the Impact Survey to be able to answer all those questions. But also, between now and when we do the EDGE assessment, we've got information about public computer usage, because we continually amaze and surprise our city stakeholders with what people are doing on the library computers. Since our participation in 2011 we've gotten budget support for a dedicated IT person in the library. We doubled our bandwidth, and we replaced our public computers. But most importantly, our IT department now takes us seriously. And I think it's been months since I actually had somebody from the IT department say, why do they need to do that, again? So I would say that if for no other reason, not having to answer "why do they need to do that" because of the results of the Impact Survey was probably the best result of all. Yeah, that's great. Gretchen, we had a question. How do you define active card holders? Having used their card within the last three years. Okay, thank you. Gretchen, I really appreciate you sharing those stories. And I think a couple of things that happened in New Braunfels are really reflective of what we're seeing across the country. The first thing that you pointed out was that 70% of your folks said that they have access somewhere else. And that is the same statistic we had for our national survey in 2009.
And there are a lot of really important reasons why people are still using public access when they have access elsewhere, but especially because they're getting help from librarians. So it's a really important thing to talk about when you're advocating for your libraries, because as Gretchen said, there is a perception that someday, magically, when everybody has a computer at home, they won't need public access. And our research has shown that that's just not the case, as far as we can predict. In fact, just looking at our results, like I said, I just got them, 61% of the people using the computers in the library reported having one-on-one technology help from a library staff member. And so that's pretty important information for me as I lobby to make sure that I keep my staffing levels up, that computers are not taking over, and that the staff is an increasingly important component. And then as a director, the other thing that I love is the question about how helpful library staff technology help was. The choices were not too helpful, somewhat helpful, and very helpful. And we got 94.9% very helpful, 3.4% somewhat helpful, and only 0.8% not too helpful. That one grumpy person. That one, or they didn't know. I mean, sometimes people come in with some pretty esoteric questions about software where the percentage of the population who would even know what it was is pretty small. So we can't be everything to everybody. But when I look at that user satisfaction statistic, it says, okay, I've got 3.4% to go. But we've put work into raising staff awareness of the importance, and that's something I really didn't touch on, but we have a facility where we have paraprofessionals on the desks as well as professionals, and everybody's expected to answer a basic level of questions.
And convincing the staff that this is a critical skill that they need to learn, that they need to put time into obtaining, and that what the public is doing is not frivolous, that just because they play a game on Facebook doesn't mean that's all they did on the computers that day. So sometimes convincing your own staff of the value of the service they provide is the first step, and sometimes the more difficult step. Yeah, we've heard that quite a bit too, that the results of the survey really do have an impact on library staff, and also on IT departments who, as you experienced, often don't really understand how important public access is or what kinds of things people are doing on the computer. So it's great to hear that confirmed as well. Sam, could I ask a couple of questions before we move on to EDGE, just clarification questions about the Impact Survey? Sure. Okay, we're getting some time frame questions. So the survey is open now, correct? Yes. And it just opened a month ago, is that right? Right, yes. And we are in beta mode, which just means forgive us if something goes wrong; we'll fix it as quickly as possible. And so that is a new version of the survey, so it has just been updated, is that right? That's right, yeah. We did do some revisions to the survey instrument to reflect some of the suggestions that libraries made to us. Okay, and then what's the time frame? So if someone's ready to start now, they can do that. But if someone isn't ready until, say, next April, is that too late to start? No, no, you can use the survey at any time. It's totally up to you. A lot of times you have other surveys that you field during the year, or you have certain times of the year that you would prefer. So you're able to set those survey fielding periods any time that you want and run it at your convenience. Okay, and one last question for you, Sam: can you add custom questions to the survey? No, we don't have that ability.
One of the upsides of the Impact Survey is that we have a very, very large data set, or we will have a very large data set, with all the same data from many different libraries. So that helps us a lot to be able to present national findings. But also we've gone to a lot of trouble to develop the questions and make sure that they're valid. So for right now we don't have that capability. We will, however, be adding some additional types of surveys in the future. This one is focused on public access. We intend to develop one that focuses on readership and devices, following along with the recent Pew studies. And as libraries make suggestions for other kinds of surveys they'd like to see, we'll add those too, but always being careful that we validate the survey questions. Okay, and then this is sort of for Gretchen and both of you, I think, just to clarify. So Gretchen said she focused on in-library use, that subset of people who were using computers in the library. So that's just a choice that was made for New Braunfels, but overall that's not the focus of the survey, correct? The focus of the outcomes portion of the survey is on people who are using public access technology, either the computers or the Wi-Fi. So that's kind of the biggest portion of it. But before they get to that point, there are a number of questions that also pertain to use of electronic resources, like eBooks and audiobooks that they download and databases that they access through the website. So it's geared towards both. It makes sense for people who are coming into it from home through the library website. And if they say they've never used a public access computer or the library's wireless, they don't have to answer any of those additional outcome questions. Okay. If you're a library with multiple locations, can you collate report responses by location? So in all of the reports the results are in the aggregate. So those pre-formatted reports show your aggregate responses.
The report that shows the comments, so we have two open-ended questions in the survey, one that allows people to describe other ways that they use public access and the other that provides them with the opportunity to make suggestions. The report that contains those comments is broken down by branch. If you're interested in how the results break down by branch, you can download the CSV file and do that analysis yourself. Each individual survey response is marked with the branch that it came from, if they selected a branch. So you'll be able to do that. Great, thank you. We've got about five minutes left, and I know you wanted to talk about EDGE too. Yeah, I'm just going to kind of skip forward and briefly touch on how these two work together. Gretchen talked about how she used it. The Impact Survey and my research group have been very much involved in the EDGE Initiative since its very beginning. So we've really worked to make these two initiatives work together. The Impact Survey provides some resources that you can use in your advocacy efforts and needs assessment for EDGE. And it also actually gives you the criteria to answer yes to some of the assessment questions. So if you're familiar with EDGE, or you're going to be taking EDGE in January, there will be an online assessment that will ask questions about the library. And there are several questions that you can answer yes to if you have used the Impact Survey. So there are questions about whether you surveyed your patrons, and you can answer yes to those. Collectively that is about 50 points on the assessment, but it also provides you with materials that you can use to support other activities that are recommended by the EDGE benchmarks. So with just a couple of minutes left, I wanted to give a chance to answer any outstanding questions and get any feedback that you have. Good. So we have a few, and then again, please feel free to keep sharing if you have more questions.
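As an editorial aside: the do-it-yourself branch analysis Sam describes, downloading the CSV of individual responses and breaking the results down by branch, could look like this minimal Python sketch. The column names ("branch", "used_wifi") and the inline rows are hypothetical illustrations, not the actual Impact Survey export schema; a real export would be read with csv.DictReader.

```python
from collections import Counter

# Hypothetical rows standing in for the downloadable CSV export, where
# each row is one survey response tagged with the branch the respondent
# selected. Column names are assumptions for illustration only.
rows = [
    {"branch": "Main", "used_wifi": "yes"},
    {"branch": "Main", "used_wifi": "no"},
    {"branch": "Westside", "used_wifi": "yes"},
]

# Total responses per branch -- the per-branch breakdown that the
# pre-formatted aggregate reports don't provide.
responses_by_branch = Counter(row["branch"] for row in rows)

# Wi-Fi users per branch, as one example of a per-branch tally.
wifi_by_branch = Counter(
    row["branch"] for row in rows if row["used_wifi"] == "yes"
)

print(responses_by_branch)  # Counter({'Main': 2, 'Westside': 1})
print(wifi_by_branch)       # Counter({'Main': 1, 'Westside': 1})
```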
One was, you mentioned earlier that there are test accounts so that people could look at what the survey looks like. How would they get access to those test accounts? Well, if they just want to look at the survey instrument itself and know what the questions are, that's available with a link on our website, impactsurvey.org, so you can look at the survey questions in paper form. Test accounts we're giving mostly to library consortia or co-ops or kind of umbrella organizations that work with many libraries and might want to coordinate their work. There's nothing in those test accounts that is different from an account that a library would have if they signed up. Okay. And Gretchen, this one is for you. You said that because of the data you were able to show to the city IT, you were able to get your bandwidth increased. What was your increase, from what to what? Someone would like to know. Our bandwidth went from 10 megabits per second to 20 megabits per second. And we're getting ready to make the argument to double it again. But at 10 megabits, one of the arguments was that the whole city was only running on 10 megabits. So why would the public need so much bandwidth? We were able to show what the public was doing and all the different needs they had, and that helped them to then go ahead and agree to make the public bandwidth bigger than the city's. The city is now at 20 megabits also. But that was part of the problem with them. Good. Thank you. And Sam, this one is for you. It's from someone who had done the previous Impact Survey, and it's a question about the data and what they can extrapolate from the results. I'm going to put this in chat, so if you want to glance down there, take a look. So if 34% say they use computers for job finding, is that a legitimate deduction?
So I think there are a couple of ways that you can approach deducing information about your community as a whole from the results of the survey. One of those ways is to look at the demographics of the respondents in certain categories and then look at the population of those folks overall in your community. So if, for example, you're seeing on the survey that 40% of the people who are using public access for health purposes are over the age of 65, and then you see in your particular community that your population over the age of 65 is 20%, then you can use that to kind of extrapolate up to your population. This is just kind of back-of-the-envelope quick figuring. This is obviously not scientific. But one of my favorite evaluators, Harry Hatry from the Urban Institute, says it's better to be roughly right than precisely ignorant. And so you can use this information to think about, oh gee, if we have this large population of older folks, and older folks are using the computers for health, maybe we should have a program for older folks on health use of computers or something like that. And a question about the survey results: are they available to organizations other than libraries? And I'm not sure if that means in a particular community or the overall results. So the results of the survey are available to the library that fielded it and also to your state library. We have a portal for state libraries where they can go in and retrieve aggregate reports of all the libraries that have run the survey in their state and also see what the results are for individual libraries. And we've given them that capability because they're oftentimes such a critical advocate for public access in the state that it's important for them to have that information too. And how is that aggregated on a national level? Is that sort of data going to be made available?
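As a side note, the back-of-the-envelope extrapolation Sam described a moment ago, comparing a demographic's share of survey respondents against its share of the community, can be reduced to one line of arithmetic. The figures below are the hypothetical numbers from her example, not real survey results:

```python
# "Roughly right" extrapolation from survey demographics, using the
# hypothetical figures from the example above.
share_of_health_users_over_65 = 0.40  # from survey respondents
share_of_population_over_65 = 0.20    # from community census data

# How over-represented older adults are among health-purpose users
# relative to their share of the community.
representation_ratio = (
    share_of_health_users_over_65 / share_of_population_over_65
)

print(representation_ratio)  # 2.0 -> twice their population share
```

A ratio well above 1 is the kind of rough signal Sam suggests acting on, for example with a health-focused computer program for older adults.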
We have someone who advocates for libraries to get more broadband and is wondering if the results of the survey would be available to her. So we will be releasing reports about the results at the end of the year. Once we've collected a year's worth of data through libraries that are using the Impact Survey, we will issue a report. We're going through a process of trying to figure out how to make the data set itself available for other people to use. We're designing an interface right now where people can play with that data on their own. But we're not sure right now about how to release the data in a way that protects the library from release of information that it may not want released, and also respects the patrons' privacy as well. Okay, and then one last question before we sign off: how is the survey kept current? So we are constantly looking at the survey to make sure that it has relevant questions. Every year we go through the survey, removing questions that we found had poor response rates overall and adding questions on subjects that libraries have expressed interest in. So we're always happy to hear suggestions from libraries, and we keep track of those. And then when we do our revision cycle, we'll look through them. Okay, good. Well, we have reached the end of our session, and I know people have things to do. I want to thank you so much for doing this. This has been really informative. We're getting good feedback in the chat. So thank you both to Sam and to Gretchen for taking the time to talk about these things with us today. Just as a reminder, later today we're going to send out a follow-up email, and we'll have links to the different sites that we've talked about. We'll have the PowerPoint slides, and we'll have this recording too. And we encourage you to share this with other people if it would be useful to them. So again, thank you, Sam, and thank you, Gretchen. Thank you. Thanks to TechSoup for doing this. This is great.
Yes, thank you to both of you. And I encourage libraries to sign up now. Great. So as we close you'll get an evaluation form, so feel free to weigh in on that and let us know what was useful to you and what suggestions you have for future sessions. Thanks everyone, and have a good rest of the day.