Good morning and welcome to this week's edition of Encompass Live. I am Krista Burns, your host here at the Nebraska Library Commission. Encompass Live is the commission's weekly online event where we do presentations on various NLC activities or anything that we think may be of interest to our Nebraska librarians. We do these sessions every Wednesday morning at 10 a.m. central time. They are free and they are recorded. So if you are unable to attend a live session, or if you just want to listen again to one that you've attended before, you can always go to our archives and view anything you want to there. We do have guest speakers sometimes, and sometimes we have our own commission staff doing presentations, as we have today. So today we have Catherine Brockmeyer, who is right here at the Library Commission, and we have the third in our three-part series about doing surveys. So this is the last one. If you did not attend the first two, the recordings are up there. You will have that link sent to you after this one, and then you can have access to all the recordings and go through the entire series as you want to. So I will hand over to Catherine now to go ahead and take it away. All right. Good morning. I'm Catherine Brockmeyer here at the Nebraska Library Commission. I want to welcome you to the third installment of the three-part series on conducting surveys. This is analyzing data and reporting methods. Some people say "day-ta." Some people say "dah-ta." You know, tomato, tomahto. Whatever you'd like to say, I'm not too picky. All right. It's not that fancy. There we go. Okay. I'm the research analyst and special projects associate here at the Nebraska Library Commission. I wear a couple of different hats. One of them is as a grant writer. Another is as an evaluator.
I have a background in survey research and methodology, and so I'm going to share with you today some things that I've picked up, some tools of the trade, and some things that I found on the internet to supplement that information. There's my contact information. Please, at any time, if you have any questions about any aspect of today's presentation, I am more than happy to follow up with you, provide more details, talk with you over the phone, walk you through something. That's what I'm here for. Please use me as a resource. I'm a librarian. Ask a reference question. All right. Here we go. Who's joining us today? Perhaps you're a library director, or an employee whose director has said, okay, go and get this done. You might be a trustee, or you might be a friends or foundation board member. We also might have some people from non-profit organizations that picked up on this webinar today, and we're so glad you're here. What are your credentials today? Just that you're willing to try to navigate through the survey research process, especially today with data analysis and reporting. Thank you. Where are you from? Here are a few states that may be represented today. I'm sorry if I missed one. I tried to go through the list and put up these little clip art. They're so much fun. I like them a lot. Here are a few more. We've got a nice wide variety of people from across the United States. I have a Midwest accent, so I'm hoping that you're going to be okay with that. We'll go as smoothly as possible. Word of warning: I talk very fast, and I do have a lot of information to cover today. I will try to keep it down to an hour. We may go over a little bit. I won't be offended if you need to leave early, but I do talk pretty quickly, and that's one reason why we record these sessions, so that you can go back and view them again and pick up what you need to.
I did send out three documents, one of which was an outline for notes so that you can try to keep up. I don't typically send out the presentation, the PowerPoint, in advance, because I want you to listen to what I have to say and absorb some of it; I want you to sit back and take it in as much as you can without trying to follow too closely what's printed out in front of you. After the session is done, this PowerPoint presentation and any of the other documents that Krista shows will also be posted along with the recording, so you will be able to download them and have them for your own reference afterwards. Thank you, Krista. What are we going to cover today? We're going to cover questionnaire coding, which I'll explain, data tracking, and data entry. We're going to talk about the basics of data analysis and reporting methods, including employing tables and graphs. And please don't be overwhelmed by this. Yes, some of this information that I'm going to present comes from a background in survey research, but a lot of this is also common sense, and so this is just a jumping-off point for you. This may be all you need to get your work done, to do the job that you've been set to do, so please don't be too overwhelmed by what I'm going to explain today. Some of this you may not actually even need to employ, but I want to give you as extensive an overview as possible. So the first thing we're going to talk about today, well, let me explain a little bit of what we already covered in the previous two sessions. In the first session we talked about how to set up a survey and the reasons for setting up a survey, and we also looked at a question bank and different kinds of response categories for different types of questions. In the second session we talked about data collection. And so now this last part is the third leg of our journey.
One of the first things that you're going to do, actually, when you create your survey is create a code book. And I'm going to show you a code book. You can do this a couple of different ways: from your email surveys, from your print surveys, and also from web surveys. I'll show you the email survey and print survey first, and we're going to walk through it. Here it says a little bit about one-to-five versus five-to-one. We're going to talk about which direction you're going to code, one versus two or one versus zero for yes/no, using nine for non-response, and addressing sensitive questions. So I need to pull up, I'm pulling up, the code book example. I did send this out to you, but I scanned it for us to look at. Okay. What you do, if you have a print or email survey, is you print it out and create codes for the data entry people to refer to, so that we're all talking the same language. The first thing that you do on your code book is show your data entry people that they need to write a response number, starting with one, the date that they're entering the data, and their initials, so that you can go back to that person if you have more than one person entering your data. Over here on the left, you will see that you're going to name the first question Q1, and I'll show you how you move all of this over into Excel eventually. For example, your first question you would label Q1, or you may name it Q1.overall or something; you may get more detailed. You see here that I had numbered very satisfied to very dissatisfied from one to five, because I was thinking, oh, the first entry should be a one and the rest should follow in sequence. However, the more I thought about it when I entered data, what came to my fingers was that very satisfied tends to be the highest number. When you talk about things on a scale from one to ten, you go from worst to best.
And so I did switch this over, and I would recommend that your most positive response be your five or your ten and your least positive response be your lowest number. Also, you'll see here that even though it's not a category, if they don't answer this question, you might assign the number nine or 99 for no response. Number two, here, is one where they can check any of the following. And on the left, you'll see that I numbered them: Q2.comp. You could also do Q2.computer, or Q2.a, and then the next one would be Q2.b. When you name them, it helps the data entry people keep track; you can see here I have 10 aspects of question number two. And so unless you've got somebody who's really comfortable with data entry and can just whip through those and trust their fingers, you might actually name each of your columns with the concept of the question that you're asking. So: computer, browse, mag, program or attend, copy, check myself, check child, return, other. If they check it, you assign a one. If they don't check it, you can either assign a two or a zero. I prefer zero; that's just the way I was trained. Down here, under other, you would create a column that says other, and they check one or zero. Your next column is going to be for text, and if they do respond, you type in the text. Then your next column is going to be back to Q2.recall, or don't recall, or Q2.g, or wherever we're at. Number three, your columns are going to say something like Q3.a through e, or fiction, nonfiction, DVD, kid, teen, something to that effect. If they responded very satisfied, you assign a five, all the way over to very dissatisfied, where you assign a one. If they didn't respond to the question, use a nine or 99. And you work your way across: each of your columns across the top is going to be one of these five aspects, and your responses are going to be these five numbers. Your next column is going to be something like Q3.comment.
This is where they have a chance to write as much as they like. It's part of question number three; they're still responding to aspects of your question. So keep it with that question. I wouldn't number this as question four, because you're asking them to comment on the current question. At this point you would enter the text of what has been written. If nothing's written, you just skip it; you don't need to put in nine or 99. Number four has four categories. In a way, it's two categories: you're mostly interested in yes and no. But you want to give people an out, because for some it wouldn't be truthful to say yes or no. You could count them as no, they're not a fan of the library on Facebook, but what you really want to know is, of the people who are on Facebook, how many of them are a fan. So actually, this question and its categories aren't as well written as they could be. This probably should say: yes, I'm on Facebook and I'm a fan; no, I'm on Facebook but I'm not a fan; I'm on Facebook, but I don't know if I'm a fan; and the last one would be, I don't have a Facebook account. If they don't answer the question, again, this is nine, no response. Now, let's talk about non-response for just a little bit. If somebody skips a question, or they skip a huge section of your survey but answer some of the other questions, and you see this as a trend or some sort of pattern, it could be that you have asked some sensitive questions. This would be considered non-random non-response. Random non-response is when they just miss a question and forget to answer it, or they answer some and think they'll go back and answer the others, and then forget and don't go back. Non-random non-response is when you're asking something like a sensitive question, and they are not comfortable answering it, and so they skip it.
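Pulling together the coding conventions just described (1-to-5 scales with 5 as most positive, 1/0 for check-all items, and 9 for no response, including those skipped questions), a code book can be sketched as a simple lookup table. This is only an illustration in Python; the column names and category labels here are hypothetical, not from the actual survey:

```python
# Minimal code book sketch. Conventions: 5 = most positive on 1-5 scales,
# 1 = checked / 0 = not checked for check-all items, 9 = no response.
# All column names and labels below are made up for illustration.

CODEBOOK = {
    "Q1_overall": {                     # "Overall, how satisfied...?"
        "very satisfied": 5,
        "somewhat satisfied": 4,
        "neither satisfied nor dissatisfied": 3,
        "somewhat dissatisfied": 2,
        "very dissatisfied": 1,
        "no response": 9,
    },
    "Q2_comp": {"checked": 1, "not checked": 0},   # one check-all aspect
    "Q4_facebook_fan": {
        "yes, I'm on Facebook and I'm a fan": 1,
        "I'm on Facebook but I'm not a fan": 2,
        "I'm on Facebook, but I don't know if I'm a fan": 3,
        "I don't have a Facebook account": 4,
        "no response": 9,
    },
}

def code_response(question: str, label: str) -> int:
    """Translate a written answer into its numeric code."""
    return CODEBOOK[question][label]

print(code_response("Q1_overall", "very satisfied"))  # 5
```

A data entry person (or a script) can then translate written answers into codes consistently, and any label that isn't in the code book raises an error instead of silently entering a typo.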
One way to get around that, especially on online surveys, is to require that they provide an answer. However, that will cause some people to abandon the survey entirely, because they really don't want to answer that question. So you run that risk. And typically, the questions that are sensitive have to do with policy and money. So if you're going to ask about a tax levy, something to that effect, you might have some people who aren't going to want to answer it. At the end, if you're asking demographics, if you're asking age, if you're asking income: income is one of the most common questions that people refuse to answer or do not answer. All right, so we've got our code book. Now we've collected our surveys; they've come in. And right away, I would recommend that you go ahead and start entering a few of these as soon as possible. But before we do that, I skipped a slide here; we're going to talk about tracking the surveys. The first one that comes in, you're going to number one, and you're going to do this sequentially. You want to give each of them a unique identifier, an ID number. You don't want to identify them by the name of the person who has responded. You probably have mentioned that their responses are going to be confidential. Their names are not something that you're going to enter into your database unless they provide it to you. Also, when you're done and you've received all of the surveys that you're going to collect, you can calculate your response rate. If you have knowledge of how many surveys were sent out, the response rate is the number of surveys you have collected divided by the number of surveys you sent out. That's easy to do when you email them out to people, unless they forward them to someone and you get multiples.
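The response rate arithmetic is simple division. A quick sketch, with made-up counts:

```python
# Response rate = surveys collected / surveys sent out.
# The counts below are hypothetical.

def response_rate(collected: int, sent: int) -> float:
    """Fraction of distributed surveys that came back."""
    return collected / sent

print(response_rate(10, 20))    # 0.5, i.e. 50%, but only 10 surveys to work with
print(response_rate(50, 100))   # 0.5 as well, and now 50 surveys to work with
```

The same rate can mean very different amounts of usable data, which is why the raw count of returned surveys matters as much as the percentage.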
If you're doing face-to-face, if it's a convenience sample, in other words you try to grab as many people that come by the front desk as possible, you may not be able to calculate your response rate, because you don't know how many total people came by your desk. If you're doing this on the web, one way that you might possibly calculate your response rate is to count the number of visitors to your website and have your survey be available for a month. Then, if you have your analytics and you're able to count the number of unique visitors to the front page where your survey is located, you can tentatively, with a grain of salt, calculate your response rate. Also, if you have a 50% response rate, that's great. But if you only sent out the survey to 20 people, that means you only got 10 back. That's not so great. If you sent out the survey to 100 people and you got 50 back, that's the same response rate, and it's better, because the number of surveys that you have to work with is 50. So if you're sending out a large survey and you get an 80% response rate, congratulations. If you send out a really large survey and you get a 35% response rate, that's still pretty good. If you send out to 2,000 people and you receive even 200 or 300 back, the rate is lower, but that's still a lot of data points to work with. So these are coming in. Your surveys are coming in, and you've got your data entry people all set up and you've bribed them with M&Ms. Data entry is the process of getting your surveys' responses into the computer. One way is to get your data from a coded survey and enter it into a spreadsheet. In other words, somebody has taken the survey on paper and pencil, or you have interviewed them and written it down on paper and pencil, and it has later been entered.
With that method, there is room for error, because one person writes it down and another person enters it; there's a step in there where mistakes can creep in. Another way to do it, which is more efficient, is that the respondent or the interviewer enters directly into the computer. So if you're doing a telephone interview, the person conducting the interview is actually doing the data entry, because they're putting in the responses for the respondent. If a respondent is taking an online survey, they also are entering the data directly, because there's no transfer from when they enter to somebody who enters the data afterward. Let's look at a completed survey real quickly. So up here, this should be in red: somebody brings this in, they assign a number one, they put the date that they've entered it, and they put their initials. On the left, somebody wouldn't actually go down and write these in, but this is just to show you what they would enter. So number one would be a five. Here we have zero, one, zero, zero, zero, one, zero, one, volunteer, and then a zero. And that might be a burden on your data entry people, so take this into account when you're designing your survey. If there are glitches and hiccups, it can be painful for the data entry people. Again, if you so choose to do something like that, M&Ms are always a very, very nice thing to give them. Down here, q3.fiction is a five, and across the row it goes five, five, four, three, three. And then they did comment: I don't have kids or teens at home anymore. This helps explain why they were neither satisfied nor dissatisfied about the children's books and teen books variety. Last question, q number four: are you a fan of Happy Place Library on Facebook? Yes, and so you would put in one. So please keep your data entry personnel in mind when you are designing your survey, so that there isn't such a burden for them. I've done that before.
I've done something like number two, but I had 30 things. It was a whole compendium of things that people were asked if they had done when they were training their interns, all the different dimensions of training, and I had to thank my data entry people profusely. So think very carefully about what you're going to do with this information; if you're just asking because you're curious, that's something you may want to skip. Next step. And I need to move a little more quickly if we're going to get through everything here. Data cleaning and edit checks: there are three phases, which are screening, diagnostic, and treatment. You're looking for data entry errors, incomplete answers, illogical answers, answers out of the possible range, or respondents selecting more answers than are allowable. Those are the different kinds of things you need to look for when you're doing your data cleaning and edit checks. Print surveys are prone to typos, so you might also look for that. So, the screening phase: you may have some missing data, inconsistencies in the data, strange patterns in your distributions, and extreme values. To check that, and I would do this early and possibly often, you might run your frequencies to look for odd data points that stick out. Or you could check random questionnaires against the data entry. In other words, if you have 50 that come in, pick a couple out and check every single answer going across. You don't want to have to check every single one, and you don't want to re-check all of somebody's work; you need to trust that they know what they're doing and that they're able to enter the data correctly. Diagnostic phase: this is where you try to identify the cause of the strange data points. It could be incorrect entry by data entry personnel, or it could be respondent error, or the question may have been misunderstood, or the data may be real. So for example, you ask people how far they drive to get to your library, and somebody either types in or writes down 100. And you're thinking, 100 miles to get to my library?
Wouldn't it be 10? Did they just mistype it and add an extra zero? Or is it possible that they commute to your city and they use your library? So that's the diagnostic phase. In the treatment phase, you look for impossible values. If you ask people how many hours a day they do something and they write 25, well, they're saying one more hour a day than is possible. And so you need to find the correct value by returning to the original questionnaire, following up with the respondent, or deleting the value if it's anonymous. Make a note in Excel or on your questionnaire. There are also the possible but unlikely extreme values, such as 100 miles. If plausible, these remain unchanged; if they are really unlikely, you can impute the group mean for that variable. Or you can flip a coin: if it was yes or no, and you're working with 200 or 300 surveys, you can flip a coin. You may only delete a respondent as long as it's noted in the report that some values were excluded from the analysis. Also, one thing that you can do in SurveyMonkey, if you have the subscription, is require that some answers be in a certain format or, I believe, within a certain range. OK, we're on to data transformation. Sometimes you need to transform date of birth to age, or you need to account for some missing values on questions. So if there's a non-response or missing value, you do the 99 or 9. Another thing that you may need to do is recode some of your scales. So for example, you asked a question that was from very dissatisfied to very satisfied: negative to positive, 1 to 5. Now, here's an example of a question where you'd need to recode, especially if you're getting into high-level analyses. You ask respondents to agree or disagree that the library provides valuable services, and that they would recommend using the library to friends and family. Those are both positive statements. But then the third one is: there's room for improvement in services.
That one carries kind of a negative connotation, and if they agree, that's a 5, but in a negative direction. This is high-level; I just wanted to mention it. So now you need to plan for data analysis. And perhaps I should have put this first; I did mention it back when we were talking about survey construction. The plan for data analysis actually begins with your survey objectives. So for example, you may want to determine your patrons' level of satisfaction with library services. This then continues with the appropriate measurement, or the suitable questions on your survey. So for example: overall, how satisfied or dissatisfied are you with the services the library provides to the community? That would be measured on an ordinal scale of categories from very satisfied all the way down to very dissatisfied, so five categories. And it ends with a detailed specification of the statistical tests to be performed. With this one, you would be able to provide descriptive statistics of the variable, and we'll talk about those in just a moment. OK, descriptive statistics. Even high-level analyses start by running frequency distributions, summary statistics, cross tabs, and filters; most analysis starts with those before ever getting into high-level analysis, which we're not going to talk about today. Descriptive statistics do exactly what they say: they describe the basic features of the data in a study or survey. This is where you look at each questionnaire item individually. First, you can run frequency distributions for each question; this is most common. You look at the numbers or percentages of respondents who select each response option, and this can also be displayed in tables or graphs. I'm really hoping that we're going to be able to look at some of this in Excel; I think we're going to have time to do that. The second thing that you can do is compute summary statistics for each question.
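Before moving on, here is what that first step, running frequencies for each question, might look like as a sketch in pandas. The coded data below is made up: a Q1 satisfaction item on the 1-to-5 scale, with 9 for no response. This doubles as a screening-phase check, since stray codes or a spike of 9s show up immediately:

```python
# Frequency distribution for one question. Hypothetical coded data:
# Q1 on a 1-5 satisfaction scale, 9 = no response.
import pandas as pd

responses = pd.Series([5, 4, 5, 3, 9, 5, 2, 4, 4, 5])

counts = responses.value_counts().sort_index()
percents = (responses.value_counts(normalize=True) * 100).round(1).sort_index()

freq_table = pd.DataFrame({"n": counts, "%": percents})
print(freq_table)
# An out-of-range code (say, a 7) or an unexpected pile of 9s would
# stand out in this table right away.
```

The same table, formatted up, is what ends up as the numbers-and-percentages breakdown in a report.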
A lot of us are familiar with these, either from hearing them in the news or reading reports. They are designed to provide concise descriptions of the distribution of answers to your survey questions. Those are measures of central tendency and range. Measures of central tendency: let's say you have these answers, ranging from 8 to 20, and there are 10 responses. These are interval answers, not response category assignments, for this example. So you take all 10, you add them up, you divide by 10, and that gives you the mean, 12.8. Mode is your most common response; 10 was the most common response, with three occurrences. Your median is your center-most data point, and you need to put these in order from smallest to largest. If you had 11, your median would simply be the middle number, but this is 10 data points, so you don't have a single center number; you take the average of the two middle values, 10 and 13, and that's your median, 11.5. Your range is 8 to 20. You can also look at the mean of something like a response category scale of very dissatisfied to very satisfied. People are so used to doing that kind of thing on a scale of one to five with the actual response categories. They see that three is neutral, one is most dissatisfied, and five is most satisfied, and basically in their head, on a scale of 100%, the categories are approximately 20% away from each other, so they are of equal distance. So in a way, it is a scale of one to five, and you could possibly calculate the mean, even though you're not really calculating the mean between somewhat satisfied and very satisfied; you're calculating the mean between four and five. Let's say you come up with 4.7. You can see that the tendency for the whole group is very close to very satisfied. Be somewhat careful how you use that and what kind of inference you make from an average based off of response categories. Going back, that was summary statistics; we talked about range. Cross tabulation. OK, let me see if I can explain this.
Think of a two by two grid. You have a yes/no question about whether or not they are a fan of the library on Facebook, and you're interested in men and women: how many men are fans and how many are not, and how many women are fans and how many are not. So along the left, you might put gender, and along the top, you might put yes and no. This is where you would enter people's yes/no responses based on whether they are female or male. One of the best ways to do that is in Excel, with what are called pivot tables. Another way to do this, if you have the subscription to SurveyMonkey, is that they will do the cross tabulations for you. Filters, similar to cross tabulation, allow you to look at responses by isolating a category of interest. Say you're interested in computer users, people who said that yes, on their most recent trip to the library, they used the computer, and you're wondering how satisfied or dissatisfied they are with the services you provide. So you filter for computer users, and then you look at your measures of central tendency or your frequency distributions in terms of satisfaction. Do you have any questions so far, Krista? How are we doing? OK, all right. We are halfway through, and we're going to talk about reporting for about 15 minutes or so, and then we're going to go into Excel and take a really good look at some of the things that we talked about at the beginning here. OK, we are on to reporting. So take a breath; we're changing lanes a little bit. We've done our data entry, wahoo, and we've done the analytics, wahoo. I'm sorry, Wahoo is a town in Nebraska. Anyway, wahoo is what most people say. So we're going to move on now to reporting. And I'm sure that there are people out there that know more than I do about reporting; this is just from my general experience.
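Before changing lanes entirely, the analysis pieces just covered (summary statistics, a cross tab, and a filter) can be sketched together with the Python statistics module and pandas. All of the data here is invented for illustration; only the 8-to-20 list is chosen to be consistent with the numbers quoted earlier:

```python
# Sketch of summary statistics, a cross tab, and a filter. Made-up data.
import statistics
import pandas as pd

# Central tendency on interval answers (hypothetical list consistent
# with the 8-to-20 example: mean 12.8, mode 10, median 11.5).
answers = [8, 9, 10, 10, 10, 13, 15, 16, 17, 20]
print(statistics.mean(answers))    # 12.8
print(statistics.mode(answers))    # 10
print(statistics.median(answers))  # 11.5, average of the two middle values
print(min(answers), max(answers))  # range: 8 20

# Cross tabulation: gender down the side, fan yes/no across the top.
df = pd.DataFrame({
    "gender": ["F", "F", "M", "M", "F", "M"],
    "fan":    ["yes", "no", "yes", "no", "yes", "yes"],
})
print(pd.crosstab(df["gender"], df["fan"]))

# Filter: satisfaction among computer users only.
df2 = pd.DataFrame({
    "used_computer": [1, 0, 1, 1, 0],
    "satisfaction":  [5, 3, 4, 5, 2],
})
computer_users = df2[df2["used_computer"] == 1]
print(computer_users["satisfaction"].mean())  # about 4.67
```

In Excel the cross tab would be a pivot table with gender as rows and the yes/no answer as columns; `pd.crosstab` produces the same grid.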
Please take everything that I say and use it in your own experience, do your own investigation, and make the reports best suited to your needs. I just need to throw out as much information here as possible to cover a wide variety of needs. All right, reporting. First, let's talk about your audience. Who is this report going to be for? And what kind of a report is it going to be? You might think about that, too. Audiences: is this the general public? Then you want your results to be the main points, summarized with clarity and insight, and not laden with numbers that tell them very little. Your audience may be a participant attending a presentation at a conference. In this instance, some jargon may be acceptable, because they are in the know with regard to your subject matter. If you are reporting to a funding agency, perhaps because you received a grant, you'll need a summary of the data analysis, the statistics you used, and your interpretation of the results. You're expected to demonstrate how your objectives and other goals you described in your initial grant proposal were or were not achieved, and you use your results to support or not support your goals and objectives. And definitely look at your grant guidelines to find out what they're asking for in terms of your report. You might be reporting this information to your boss or to a board of trustees or a committee. In this instance, you want to demonstrate a sophisticated and scientifically sound design, because any related policy decisions that they make based off of this could have a profound impact on people's lives. So you want to be as accurate and honest as possible. You'll need to provide at least an executive summary that clearly summarizes the key findings of your study and any relevant methodological points.
Being able to condense a research project into one or two pages of information is a skill well worth working on when you're dealing with your employers or other agency boards or organizations. And then keep in mind: are you going to be asked to make a recommendation, or are you going to leave that up to someone else, someone who's reading the report? The last audience here is the press and the media. Sometimes you'll hear about Gallup in the news, or about a CNN poll. If you're going to report some of the findings from your survey to your local newspaper or television station, be very clear with them about the limitations of your study: that you didn't prove anything, that you took a snapshot. That's what you might say, that you took a snapshot of what's going on at the library or what people are thinking. And actually have them play that back to you to make sure that they understood what you were saying. Because once it goes to press, it's often taken as truth, and you might be having to put out fires, especially if you're talking about things that would impact funding for a new library or improvements, that sort of thing. Format: we're going to talk about the format of a formal report, and you might keep some of this in mind if you're doing something like an annual report. You usually have a title page, a table of contents, your introduction and the objectives of your survey or study or project, an executive summary or highlights, and your conclusions and recommendations. Then you would probably give the complete findings of the study, employing supporting charts and graphs, and you may actually have an appendix. Let's take a look at a couple of reports. I'm going to cover everything I just said in this outline example, except for the title page and table of contents. So, the introduction or statement of purpose: why did you do the survey?
You can read here that the Board of Trustees is in the process of forming a strategic plan, and in order to get an idea of general impressions, the board conducted a survey. What were your outcomes or objectives? Which dimensions of knowledge, attitudes, or behavior did your survey cover? Specifically, they chose to investigate attitudes toward library services and library use behavior. What were your methods? In other words, what was done and with whom? The survey was conducted on such-and-such a date; questionnaires were mailed out, to whom, and how; how did you get the surveys back; how many were collected; and what was your response rate? Your results: what did you find? So I just made this up. The survey was conducted, questionnaires were mailed to this number of people... uh-oh, uh-oh, I didn't do this. I copied and pasted and didn't finish my document. OK, I did not finish this. So for your results, I'm going to go to another document; we're going to talk about that. And then you have your conclusions and recommendations. Catherine didn't do her homework. She did not do her homework. Close this document, pretend it didn't happen. All right, let's look at reporting examples. I mean, we have all written reports, probably in school at some point in time, and so we're all familiar: tell them what you're going to say, say it, tell them what you said, that sort of thing. At some point, depending on what degree of detail you need to give, and I have done this as a supplement, as an appendix to a federal report, you actually show the questions that you asked. You show the number of responses, and then you go through by category. So as you can see here, I had 50 responses to this question; 28 were very satisfied, and you give the percentage. Now, let's say that you sent out this survey to 50 people, and only 48 responded to this question. You have two options. You could put that you had 48 responses and go with a percentage base of 48.
Your other option is to say you have 50 respondents, and down below here you would have, including your number and percentage, "no response." For example, you had two that didn't respond, and you'd show what percentage that would have been. Especially if this is a sensitive question, you may want to go this route. Again here, number two: if you recall, what was the purpose of your most recent visit to the library? This is where people checked or didn't check each option. So 16 checked yes, then 24, seven, six, two, 16, one, 24. Six people checked "other," and this is the sort of thing they wrote in, and then three didn't recall. This shows a bar graph. When asked the purpose of the most recent visit to the library, respondents most commonly reported browsing items and returning items. Checking out books and using the computer lab were also common reasons. So you're taking these numbers and putting them into sentences, something that people can grab onto. Numbers may not mean a whole lot to them. They may not think that way; they may not think in terms of data. They would rather hear "most common," "second in line," "secondly most important to your patrons," that sort of thing. And then: in general, computer lab users are pleased with the services the library provides to the community. This is where I did a filter. We looked at computer lab users, and then, of those who reported using the computer lab during their most recent visit (this came from your filter), 93.8% were either very satisfied or somewhat satisfied with the services the library provides to the community. Interesting; now, what did I do? I also collapsed very satisfied and somewhat satisfied. I didn't say that 70% were very satisfied and 23% were somewhat satisfied. You can collapse some of your data, especially when you're looking at satisfaction in general versus dissatisfaction, and especially if you have a smaller group of people who did your survey.
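To make the collapsing arithmetic concrete, here is a small Python sketch. The counts are illustrative, borrowed from the example dataset used later in this session (4 neutral, 18 somewhat satisfied, 28 very satisfied), not real survey results:

```python
# Illustrative counts for a 5-point satisfaction question (hypothetical data)
counts = {"very satisfied": 28, "somewhat satisfied": 18, "neutral": 4}
total = sum(counts.values())  # 50 responses

# Collapse the two positive categories into one overall satisfaction figure
satisfied = counts["very satisfied"] + counts["somewhat satisfied"]
pct_satisfied = 100 * satisfied / total

print(f"{satisfied} of {total} respondents ({pct_satisfied:.0f}%) were satisfied")
```

Reporting the single combined figure is often steadier than quoting each small category separately when the sample is small.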
If you only have 50 who did your survey, you may not wanna break it down. Seven and 20, for example, you may want to collapse so that you have 27 in terms of satisfaction. Down below, here's the matrix. Here's where you say total responses were only 49, and you use that as your 100%. Otherwise, you might move "no response" over into a column to the left, and your total responses would be 50, and then 39 would be divided by 50 for your percentage, which is not what's here. This is 39 divided by 49, which is about 80%. And then we asked them to comment, and down here is where they commented. Now, if they used any identifying information, a name or anything like that, I would blank out that name and put in asterisks instead. You've basically said that you're ensuring their confidentiality, and also you don't want to create waves: you're trying to get across the concept, not who said it, not who it's about. You're trying to get across the information, and especially if finger-pointing is going on or anything like that, you need to, if possible, remove identifying information. Any questions so far? No, the only question was someone asking if we can get a copy of this document, and I responded in the questions, but also I'll say here that all of this, this example document, everything that is part of the session that Catherine is showing, will be posted along with the recording afterwards, so you'll be able to access and download it for your use. And I will try to fix the previous documents. Yeah, of course I'll make sure to fix it. And then: are you a fan of Happy Place Library on Facebook? Yes, no, don't know, I don't have a Facebook account. So while 24% of respondents have a Facebook account and like the Happy Place Library page, a full 40% have a Facebook account but do not report being a fan.
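The two percentage bases described here (counting only those who answered the question, versus counting everyone surveyed with non-response as its own row) differ only in the denominator. A minimal sketch with the example's numbers:

```python
surveyed = 50   # everyone who returned a questionnaire
answered = 49   # answered this particular question
count = 39      # gave the response of interest

pct_of_answered = 100 * count / answered  # non-response excluded from the base
pct_of_surveyed = 100 * count / surveyed  # non-response reported as its own row

print(f"{pct_of_answered:.1f}% of those answering; {pct_of_surveyed:.1f}% of all surveyed")
```

Whichever base you pick, state it plainly in the report so readers know what the 100% refers to.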
The other thing that you can do is get rid of the don't-knows and the don't-have-a-Facebook-account responses; then your total is 32, and the percentage of Facebook users who are fans would be 12 divided by 32, which is higher than 24%, obviously. And so you might make a conclusion or inference: there's room to expand Happy Place Library's presence on Facebook, because you have 20 people that aren't fans yet, and how do you reach them? So there's where you might make a recommendation or an inference, but keep those pretty general, and we'll talk about why. Writing tips. These are English composition suggestions that I have for you. Stay in the same tense. Use active rather than passive verbs. Do not write in the first person; you might say "we conducted a survey," but after that, pretty much stay in an active voice but third person. Try to remain unemotional and neutral, and keep it interesting. Nobody really wants to read dry information and a lot of numbers, so create a narrative if you can; tell a story. That's the biggest thing. And the other writing tip that I just mentioned was to remove identifying information, especially if you have minors who are answering your survey. I've done statewide things, and so I used their first name and their age and that was it. Or you can change their name; this is when you're getting into high-level studies, ivory-tower sorts of things. Oral presentations: this might be another way that you're going to do your report, as opposed to a written report or an annual report. And this is where obviously PowerPoint comes in. You can jazz it up, and you can use more graphs, because they don't take up all the space; they're just on a slide, and you can move on. So oral presentations make it a lot of fun to report, but again, keep it interesting and try to tell a story. Your visual aids, I'll try to cover that in just a second, but you're looking at charts and graphs.
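That recalculation can be sketched as follows. The fan and non-fan counts (12 and 20) match the example; the split between "don't know" and "no account" is hypothetical here, chosen only so the four categories total 50:

```python
responses = {"yes": 12, "no": 20, "don't know": 8, "no account": 10}

# Keep only people who could possibly be fans: those with Facebook accounts
facebook_users = responses["yes"] + responses["no"]                  # 32
pct_fans_overall = 100 * responses["yes"] / sum(responses.values())  # of everyone
pct_fans_of_users = 100 * responses["yes"] / facebook_users          # of users only
```

The same 12 fans read as 24% of all respondents but 37.5% of Facebook users, which is why you should always name the base you used.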
Considerations: interpretation and conclusions. The survey you've conducted is only one way of gathering information about the world around you, so there are other ways to gather your information, perhaps for a strategic plan, and you need to keep in mind that what you may have discovered is not necessarily the truth. This is what people have reported, but it may not be what's actually going on. Perhaps the people who responded are the ones who are the least angry or the most happy with your services, or the most angry. So you need to look at your sample and who responded and keep some of that in mind, but do your best with your methodology, and you should hopefully be able to portray an accurate picture of what's going on in your community or with your program. What are some of the limitations? This isn't a hypothesis-driven study. This is a survey, and it's not an ivory-tower sort of survey; it's not an academic survey. So you're not gonna have a hypothesis, necessarily, that you need to support or not support. Perhaps you have some things in mind that you wanna investigate, but typically you're not trying to prove or disprove something; you're more looking for trends. So I would suggest relationships, but not present things as proven. Instead say that the findings suggest that, or it seems that in some cases, or, for the sample of people that were surveyed, it can be concluded, because then you're talking about the responses from the people who actually did respond. Generalizability: look at your response rate and look at your sample, and ask to what other groups and populations you can conclude that what you found in your sample also applies. So for most of the studies we do, we can safely generalize only about those we actually surveyed, or at best those similar to the people we studied.
So if you're surveying library users, you might not wanna generalize to the population on the whole, because you're talking about people who come to the library and use your services, not necessarily your non-users. So if you're wanting to get the entire picture, you're going to want to sample your library users and your library non-users. Also, stay neutral. As Mark Twain said, there are lies, damned lies, and statistics. A lot of people say that people use statistics to explain what they wanna hear, what they think should have happened, and so they go fishing. And that's somewhat unethical. By ignoring some of the data or by leaving it out, you might be able to legitimately eliminate some of your extreme answers, but you need to say that you did that. Also, if you have somebody who's completely irate, there may not be any need to report that one data point, because it stands out to your board. And be careful about leading people to a conclusion by saying "this only" or "this is very important." So try not to be biased. We do have a comment about that. Just a comment from Stephanie saying: ah, the infamous massaging of the data. Yes, exactly. Don't do that. Right. Okay, we have time. We're going to Excel. I'm just five minutes behind. No, not a problem. Let's look. I only entered 25 data points and then I copied them, so it looks like I did 50, but I didn't. Go back to your completed survey, the one that's marked up in green and red. This is where you would enter the data for the first question, and you can see at the top it's labeled Q1. Before I do that, let's talk about the code book. One way you can do a code book is in a tab in Excel. You can put the name of the question right here, and then the response categories and what each number means. And it's kind of nice for people that are doing data entry in Excel that they can just refer to the tab. There was something else I wanted to talk about. Oh, okay. Before we actually do the data entry.
Here's where you bold your first column and your first row. You bold your first column so it helps you see more clearly by respondent, and you bold your first row if you're gonna run any frequencies or anything. Also, I froze the panes, and I shouldn't have. That was a big no-no, because then I went and did some sorting, and when I tried to go back, it only moved some of the data; it didn't move all of it. It didn't move the respondent numbers along the left. That was not a good idea. It's nice to be able to scroll and still see that you've got respondent number one, but it's a big no-no. I had to re-enter 100 surveys one time because I did that, because I was manipulating the data within my original tab and I had to re-enter it all. So I recommend against it. Okay, data entry. You might wanna make your columns wider so that you can see the whole label. And here's where we have number one: they responded very satisfied. So five, zero, one, zero, zero, and then you see the rest that I entered. And here's where you hit a hiccup: for open-ended answers you have to type text, and then you go back to numerical data entry again. So if you can, enter a few yourself, practice with fake data, and see what the burden is going to be on your data entry people. I recommend that. Okay, data analysis. There are three ways that I have found to do this in Excel. One is where you, for example, copy your column of interest. This is where you don't manipulate the data within your original tab. You see how I typed "original"? I don't touch this data once I've entered it. I don't move it around; I keep it in numerical order. I only copy it and move it over to somewhere else. So this is what I've done. Subtotals: if you're gonna use that, you copy the column, you sort A to Z, smallest to largest, so you have your threes together, your fours together, your fives. We didn't happen to have any ones or twos; everybody's either neutral or satisfied at some point.
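The code book idea (a tab mapping each numeric code back to its category label) can be sketched like this. The Q1 name and the five satisfaction categories mirror the example spreadsheet; the respondent row is made up for illustration:

```python
# Code book for Q1: numeric codes entered in the spreadsheet -> category labels
codebook = {
    "Q1": {
        5: "very satisfied",
        4: "somewhat satisfied",
        3: "neither satisfied nor dissatisfied",
        2: "somewhat dissatisfied",
        1: "very dissatisfied",
    }
}

# One respondent's row, entered as codes rather than typed-out text
row = {"respondent": 1, "Q1": 5}
label = codebook["Q1"][row["Q1"]]  # decode the number back to words for reporting
```

Entering codes keeps data entry fast and numeric, and the code book is the single place where the codes are translated back into words.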
And then you use your function. You go to Subtotal. And I have to say it's different in 2007 than it is in 97 through 2003, so you may have to go into your help and find out how to run subtotals. This is where you don't use the sum function, because then you'd be adding three plus three plus three plus four. No, you wanna count the number of threes, count the number of fours, and count the number of fives. So change this to Count, and add your subtotal to, it doesn't matter, Q1. And that didn't work. I'm sorry, I apologize. "At each change in response number," that's not the right one. We want "at each change in Q1." There we go. See, I sometimes don't know what I'm doing. It's okay, there we go. Here you can collapse these or expand them. But you can see here that you had four people who responded with a three, which, you have to remember, is neither satisfied nor dissatisfied; 18 responded somewhat satisfied; and 28 responded very satisfied. What can you do with this data? You can use your control key, select and copy. You can move them over here and paste, add up at the bottom with Sum, and get your percentages: four divided by 50, and so on, cell by cell. Now, a label like "3 Count" doesn't say anything if you're gonna create a chart or a graph, so you may want to redo it as neither satisfied, somewhat satisfied, and very satisfied instead of three, four, and five; that's gonna tell a whole lot of people a whole lot more. Another way to do it is to run descriptives. I'm not sure if it's in 97 through 2003, but it's in 2007. And that's where you have your column, you go up to Data Analysis, and then you run Descriptive Statistics. I don't see it on here. I don't see the function. I tried to do a dry run so that I would remember everything that I did, and I obviously don't remember everything, but there is a way to choose descriptive statistics.
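The sort-then-Subtotal-with-Count routine is just a frequency count. Outside Excel, the same tally might look like this, using the 4/18/28 split from the worked example:

```python
from collections import Counter

# Coded Q1 responses: 4 threes, 18 fours, 28 fives, as in the worked example
q1 = [3] * 4 + [4] * 18 + [5] * 28

freq = Counter(q1)   # how many of each response code
total = len(q1)
for code in sorted(freq):
    print(f"{code}: {freq[code]} ({100 * freq[code] / total:.0f}%)")
```

Note that this counts occurrences of each code rather than summing the codes, which is exactly the Count-versus-Sum distinction made above.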
And underneath, choose Data Analysis and then run Descriptive Statistics. And this is what you're gonna get for your answer. Here's where you get the mean, 4.48, so it's about halfway between somewhat satisfied and very satisfied. Your median, your center response, was a five. Your most common response was a five. Your range was two, because you went between three and five. Your minimum was three, your maximum was five. The sum of your numbers was 224, which doesn't tell you a lot in this instance. And then your count was 50 responses. Another way to do this is with pivot tables, which are pretty elegant and take a while to figure out. But once you get the hang of it; I'd try a tutorial or something if you can. You select your column, then that's your column label, your row label; actually, you just do your row label if you don't filter. Let me explain this. What I did was a filter. So these are your 34 people here who didn't use the lab. Here are your 50 people: 4, 12, 18, 6, and 10. In the column of zero are the people who didn't use the computer lab, by satisfaction, and in the column of one are the people who did use the computer lab, by satisfaction. And here's your grand total. So we broke it out so that you could dig in and find that there were six people who used the computer lab that were somewhat satisfied, and 10 people who used the computer lab who were very satisfied. So that's where you can filter. Again, I'm not going to go into great detail on this. You can contact me if you'd like me to walk through it with you one time or two times or five times. You can try a tutorial; those are also very helpful. Pivot tables can help so much, but they're somewhat unwieldy to start off with. Let's talk about charts and graphs for just a second. I do have a question. Okay. Not me; I mean, somebody has a question. Okay: if you're using SurveyMonkey, do you still suggest doing the coding? Oh. Because SurveyMonkey has a lot of its own reporting.
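Those same descriptive statistics can be reproduced outside Excel. Here is a sketch over the 50 coded responses from the worked example (4 threes, 18 fours, 28 fives) that yields the figures quoted above:

```python
import statistics

q1 = [3] * 4 + [4] * 18 + [5] * 28  # the 50 coded responses

desc = {
    "mean": statistics.mean(q1),      # 4.48
    "median": statistics.median(q1),  # 5
    "mode": statistics.mode(q1),      # 5
    "range": max(q1) - min(q1),       # 2
    "min": min(q1),                   # 3
    "max": max(q1),                   # 5
    "sum": sum(q1),                   # 224
    "count": len(q1),                 # 50
}
```

Recomputing the figures this way is also a cheap sanity check on what the spreadsheet's Descriptive Statistics tool reports.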
If you have the subscription, you can download it into Excel, and you can choose whether they code it for you or they give you the response categories. Okay. So for example, if your response categories went from very satisfied to very dissatisfied, they are going to number them sequentially from one to five for you. And so double-check your data against your first respondent: match up your respondent from the survey with a respondent in the results, and make sure that you've got it right, that it's either one to five or five to one. Typically what they do is go from the first response category to the last response category and number them from one to whatever. The other option is you can choose the written response categories, but then when you get those, every single cell is gonna say very satisfied, very satisfied, somewhat satisfied, somewhat satisfied. And that's not exactly what you want; you wanna work with numbers. So thank you for asking that question. So, if you do use SurveyMonkey, you don't have to do this yourself; it will do it for you? Well, if you do the free version of SurveyMonkey, they will run the frequencies for you, but you cannot download your data. Ah, okay. You can go through respondent by respondent and write the answers down, or if you have two screens or you wanna split your screen, enter the data into Excel yourself based off the responses. You can. So it depends on what you have with SurveyMonkey and what you can and can't get out of it. Right. Good to know. Yes. I signed up for the free version and went through that, and I was disappointed to see that you cannot download your data. But they do a good job of giving you those frequencies and percentages. What's nice in the subscription is you can do a cross tab. In other words, you can filter for, for example, just like this one, people who do or do not use your computers and their level of satisfaction. But that is, again, the subscription version. All right.
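The cross tab being described (computer lab use against satisfaction) can be sketched with plain dictionaries. The paired records below are reconstructed from the pivot-table counts in the example (34 non-users split 4/12/18 by satisfaction code; 16 users split 6 somewhat, 10 very satisfied):

```python
# (used_computer_lab, satisfaction_code) pairs matching the example's counts
pairs = ([(0, 3)] * 4 + [(0, 4)] * 12 + [(0, 5)] * 18 +
         [(1, 4)] * 6 + [(1, 5)] * 10)

# Cross-tabulate: crosstab[lab_use][satisfaction_code] -> count
crosstab = {}
for lab, sat in pairs:
    crosstab.setdefault(lab, {})
    crosstab[lab][sat] = crosstab[lab].get(sat, 0) + 1
```

Each cell answers a filtered question, for instance how many lab users were somewhat satisfied, which is the same breakdown the pivot table or SurveyMonkey's subscription cross tab produces.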
We are right at 11 o'clock. If you need to go, that's fine. Otherwise, I just wanna show you a few things about graphs and charts. Here's a bar chart. This was by usage, computer usage, and so I had the data here. It asked for your ranges. You need to make sure that your first row has a label so that the labels show up. So here I summarized, right here, and this is what the bar chart ended up looking like. You need to dig a little bit, but this is a pretty accurate portrait. Let me see, do I have a pie chart here? There's a pie chart. Sometimes it's harder for people to visualize. For example, you can see that these are all fairly similar, and so you really don't get an accurate picture that 16% is different from 20%, is different from 24%. And sometimes the slivers are so small that they're nominal. But this is a pie chart of whether they're a fan of Happy Place Public Library or not. If you just did yes and no, your pie chart would look kind of silly. Not necessarily; I mean, you get a good picture of what chunk of people are and are not fans. Take a look at them. It might actually be better just to do it in the narrative, that 12, or whatever percentage that is of 32, were fans. For some people, this visual type of stuff, when you're presenting to someone in authority who you need to convince of something, a lot of this graphical, visual stuff kind of catches their eye and then pulls them into the other things that you're trying to explain to them as well. Absolutely, right. It does summarize very quickly what you explain in the narrative. However, I would do both: I would explain it in the narrative and then also include the chart. Here again is the same data, and this shows in a bar chart the same information versus the pie chart. And this shows the difference a little bit more between 20 and 12, as opposed to the pie chart here. So 20 and 12 is no and yes.
That shows up a little bit more, but the differences between these three show up a little bit more here, between 8, 10, and 12. So visual graphics are good, but pick the one that looks the most impressive. Right, or show it to a couple of people and ask what they think or how they interpret it. That's also a very good thing to do. All right, that was it. I just need to plug a few things. There are some resources; these will be up on Delicious, and you can refer to them. These are some that I used for this presentation. Don't worry about trying to scribble down all those URLs; we will link to them with the recording. Absolutely. And what are your next steps? Well, this is where you take your idea and you run with it. This is the third part in the three-part series. If you'd like to go back to the first two and either view them or review them, put this all together. I'm hoping that this has given our attendees a good starting point. This is by no means comprehensive, because everybody's situation is different and everybody's needs are different, but please take what you can and use it for your situation. If there are any more questions, I'd be happy to field them. If you're needing CE credit, and also the evaluation, I'll plug that in just a second. We do have some sessions that have been archived here. Those will also be linked on Delicious, or we may actually show them on the page. Actually, when we put up the recording for session three, it shows links to the recordings for sessions one and two of this series. But we could link to the other ones. They're all related, yeah, absolutely. So all of these will be on the pages for all three sessions. We have three different pages, each with its own recording, but they link out to all the others, because they're all related and it's a three-part series.
As for upcoming sessions, we already plugged those at the beginning. And of course, I conduct surveys and I do evaluations, so I'd love to hear back from you. It's actually in the notes that I already sent out, at the bottom: the survey's already up and ready to go. It helps me with my future presentations, and if you would like to leave some comments and ask me to get back to you, I'd be happy to do so. So please take advantage of this and let me know how I did, and let me know how we did as a commission with our Encompass Live. I certainly appreciate your time today. Thank you very much. So it doesn't look like we have any more questions that were tossed in at the end while you were wrapping up. That's cool. But as Catherine said, contact her if you do need any help with any of this. Thank you very much for attending. Oh, Laura says thank you. Thanks, Laura, good to see you. And we hope you'll join us next week. As Catherine showed in the previous slide, we have "Communication: Getting the Word Out. Does Your Audience Hear What You Mean?" with Mary Jo Ryan, who is the Communication Coordinator here at the Nebraska Library Commission and will be presenting next Wednesday at 10 a.m. Central time. So thank you very much, and we will see you next time. Bye-bye. Bye.