So this is my little warning that we probably want to stop talking at our tables. Let's talk as a bigger group. I heard some very interesting things as I was going around the room; I had the luxury of eavesdropping on all these wonderful conversations. You don't necessarily have to report out in a sort of "we talked about this, this, and this" way, but let's just talk about some of the interesting things that came up. Anybody want to volunteer?

We talked about hours. Hours. Assessing the hours. That's something to bite off, right? How are you going to do that? That's okay, you don't have to have an answer right now. What I hope is that you go home with a thing like: okay, hours. I can start with hours. What about other people?

We talked about accreditation as an incredible opportunity to do a lot of formal assessment. So the accreditation pushes that are happening at the larger institution level can be the impetus. Well, they happen at this level too. No, what I mean is that even if you're a small library, you're embedded in a bigger institution, and that one has to be accredited. Yeah, so that can be an impetus for something, right? Is that what you were thinking? Well, you have to do a self-study; that's part of the process. So it makes you make the case in a structured way. And in a brief way, as Mary was saying. Like, you know, they give you five minutes.

What else were we talking about here? Website evaluation and assessment. Website. Scott, how are we assessing our website? Web analytics, usability tests, user surveys, user focus groups, and our own observations of the website through front-line experiences coming from our liaisons, our reference team, and our access services group. I am so impressed that you had that answer. Did you just come up with that at the workshop? Well, it's what I do. Is it easy, though, to do that? Well, we've added it in sort of like a modular program.
So we installed web analytics. Then we installed a few more web analytics tools. Then we installed a user survey program. And then we synthesized it all together. So it builds on itself.

Right, so you build on it. I think that's something to take away from that: you can think about this in a modular way. I think most of us procrastinate on doing things when we think about the entire thing as one huge thing to do. I know that's the story of my life; that's why I used to write my papers at midnight before they were due. I'm the worst case scenario here. Whereas if I had learned earlier in my life that I could do things in increments... I hope your habits are healthier than mine. What else were you talking about at your tables?

Understanding what assessment is. The word assessment is not something we use all the time. And what did we decide over here? It's understanding outcomes. So you probably understand what you're doing, and then you understand what happened after you did it. And I think sometimes that's where things fall apart. You just do it. You don't assess how it affected your patrons, the students, the faculty who were a part of it. That other person in the room — how did it affect them? And it's different from evaluating your performance as a teacher, which always sort of gets people on edge anyway. I think about assessment as more focused on: did this actually work for the students? Did they retain it? Did anything stick? Do they actually know what's in the catalog? What if I spent half an hour talking about what was in the catalog and they left not actually knowing that? And that does happen, even though they might be nodding. So how would you find out if it stuck? I actually get them to demonstrate that they know how to take those steps by themselves before they leave. Because it's really easy when we do it.
But it's not so easy when they have to go and do it once they've gone home and maybe things look different. So I get them to kind of demonstrate it. Or if I don't have enough time and I have a lot of students, then I get them to actually write it down. You get them to write it down, and make sure that they've written it down, so that it has a better chance of sticking with them. Yeah, at least even the first step. Jenny, did you want to say something?

We didn't mention this here: has anybody done anything like surveying former students — students that have gone on to four-year colleges, or students that have gone on to careers — about what they think the library did for them? We have an exit interview, but the library doesn't do it; it's a larger university thing. So it's just by chance if they mention that the library played a role. I think they're sent to seniors, so they're based on programs — exit interviews like student athletics, those and all of those. One of our goals, I think, is to help kids feel comfortable going into a library, so they can go into any library and not feel intimidated, or worried about asking questions, or not knowing what they can get. And I'd like to know if we did that. So how do you think you could do a survey? Our college can't even keep track of them once they graduate, unless they work right on the reservation. So I don't know how we're going to do that. This is what we call one of the obstacles. They're trying to do better at that. But seriously, I'm sure she has the same issues. People graduate from here and they go away, and you don't even have an address to contact them. Do you have an alumni association? No. We're developing one. I was just making some assumptions based on a bigger institution, where the majority of the people graduating from the college are alumni.

I did a survey a couple of weeks ago. I had ulterior motives.
But it asked things like how often you use the library, and whether the service and our employees helped you, and things like that. I got pretty much what I expected. But then I said I'd draw one and give out a gift card, and that got people to fill it in — they did it really fast. So you gave an incentive, and was it a hand-filled-out survey? Right. There were a lot of them.

At one table we were talking about survey sickness, where people get tired of surveys, whether they're online or in print. And I'm not saying they're a bad thing to do; we certainly use them. I think we just had a survey sent out to us about what kind of hand dryer we want in the bathroom. I have to say I laughed for about five minutes in my office. It came from our great associate dean, and bless his heart, people want to know what you think and what you want. But about everything, you know. So we have to be kind of careful how we use surveys too.

You know, another thing that I can say: we had a suggestion box, and all we ever got was "you guys are doing a great job, keep it up." We haven't had any complaints from students, which makes me think that they're not expecting enough of us — that they just have low expectations. Yeah. I don't know. Our students — not just our students, our patron base — know that everything on the res is grant driven. So they come in and expect to have to sign in somewhere. So they're not going to tell us we're doing a bad job. That's exactly what I was going to say. But we did have a confidential box with a slot, like a voting box. Is it helpful? All we got was "you guys are doing a great job." How do you get somebody to tell you you're doing a bad job? How do you get somebody to be critical? That's really interesting. Anonymously. Yeah, even anonymously. Sometimes. I forgot this spring — I've done the same paper survey for about 10 years. It needs to be revamped.
But there are a couple of people who are brutally honest, as far as I'm concerned. And when I see those comments, it's like, ouch. Maybe they're just being mean. Maybe they didn't really have anything happen and they're just wanting to be contrary. But it hurts when somebody says something negative. It's better when they say you're doing a good job. But you don't know what to improve if nobody's telling you you're doing anything wrong. That's right.

Abby, you were talking about statistics. I have every statistic anybody could ever ask for, and I don't know what to do with it. I have Excel spreadsheets like you wouldn't believe. I was complaining to Gary about this earlier this year. Don't you use them when you have to do an IPEDS report? Yeah, I have the numbers for that. But I have lots more numbers. I have hourly counts and gate counts and searches — just everything. And they're useful numbers to me. But how do I get the rest of the world to see that we need more funding? This shows that we are trying to do our job. This shows that students are coming to the library. What graph or chart, pie chart or whatever, would say to somebody else in the outside world that they either need help or they're doing a great job?

Well, what happens when you all together start comparing your statistics — comparing your hours and your student counts, comparing numbers — all of us? It would be nice, because the Montana State Library does that for all public libraries. When you go to the fall workshops, they hand everyone this cute little trifold flyer, and tribal colleges don't get that. So you would kind of look: oh, what is your library doing? Well, there's actually no reason why we couldn't use the form that the state library uses and answer all those same questions. The seven of us in Montana could do it, or we could do it for all 30, or however many of us there are in all of the United States.
I'm sure that getting a hold of the survey — that form that they use — is probably not a big deal. But we're all really different. I have 400 students; Emma has 120. I have 20 faculty; Emma has five. See, so it works, but it doesn't work, and the collection sizes are so different. But that's why you do per-capita ratios and percentages. So you could still make that work, I think.

But pooling that — one of the things that I'm reading in the assessment literature is that we are really good at counting things in libraries, and we've proven that. Emma certainly has proven that; I've seen her daily counts. But we're all kind of going, you know, nobody's really interested in these stats beyond a certain level. What do we need to do in addition to, or maybe even instead of, some of that counting? What would actually be more interesting to our administrators? The outcomes, possibly. The personal comments — the comments from patrons, both positive and negative. And also, how did you impact the lives of your students, either through those comments, or can you show that grades went up because of their classes with you? So, do you guys do that? Are you able to get to that level, to work with faculty and say, yes, the library is the one that helped raise these grades, or not? We have in the past surveyed the faculty that we've taught for with precisely that kind of question: do you feel that the library instruction helped your students do better research and write better papers, that sort of thing? And we haven't gone to the point of "and if you say yes, prove it to us." But really, the places that are experimenting with this are doing interesting things like counting citations — doing citation analysis.
So maybe you've got some teachers who you can work with closely — not every teacher, because you'd die, but maybe a couple who will let you examine the bibliographies on the papers, so you can see how many students are actually using peer-reviewed journals after you taught them what that means and how to find them. And then you tell a story out of that. You use those stats to tell a more outcomes-focused story. So that's one idea. I mean, I think there are lots of ways. We just have to almost have somebody hit us on the head with a mallet so that we quit counting things. Or we count things, and we put that aside, and we think in a slightly different way.

One of the things that I just asked people: what does it mean that your library is successful? What does success look like? How do you know you are being a successful library? And I hope you all heard that. I think that is a really important question, and it's going to come up in the next few slides here. It's like you read my mind. How do you recognize success? How are you going to talk about success? How are you going to identify it? How are you going to describe it to somebody else? If we just count success as 20 people a day walking into the library — well, that's okay, but it's not a very exciting success story, is it? Because it doesn't take the next step. It doesn't say "and these people were able to do X, Y, and Z," or whatever we want to do with that. Anything else you want to bring out from the discussion?

All right. Well, just from our group discussion here, I would love to see some kind of annual report of all the tribal college libraries — some kind of stats, something — because I come from a public library background where we had to turn those in annually, and we got to see how we compared to all the other libraries in the state. I don't know how I compare to all the other libraries.
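The per-capita idea that came up — comparing libraries of very different sizes by normalizing raw counts to per-student ratios — can be sketched in a few lines of Python. All library names and numbers below are made up for illustration; the point is only the normalization step:

```python
# Hypothetical annual statistics for two very different-sized libraries.
libraries = {
    "College A": {"students": 400, "gate_count": 18000, "circulation": 3200},
    "College B": {"students": 120, "gate_count": 6600, "circulation": 1400},
}

def per_student(stats):
    """Divide every raw count by enrollment so sizes become comparable."""
    n = stats["students"]
    return {k: round(v / n, 2) for k, v in stats.items() if k != "students"}

ratios = {name: per_student(s) for name, s in libraries.items()}
for name, r in ratios.items():
    print(name, r)
```

On these made-up numbers, the smaller library actually shows the higher gate count per student (55.0 vs. 45.0), which is exactly the kind of story the raw totals hide.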
Why don't you let me see if I can talk to the state library in Montana, since I kind of have a connection there, and if I can get their form — their Excel sheets, whatever they've got — and see if we can't get something going, starting fall semester, maybe. Would that work for everybody? Yeah. It sounds like something that AIHEC could probably actually get behind, as far as getting all of the libraries in. And I was thinking about it before you said it. I'm also wondering: so you have the data — what are you going to do with it? You can gather a lot of statistics. But it's got to count for something. You're going to have to say, you know, MSU spends this much per capita on the library, the University of Montana spends this much. Yeah, you can use it for that type of thing. Sure.

But it's still a lot of counting, too. And I'm not saying let's not count any of this stuff, let's not compile these stats, because it would give you a feeling for how you compare with others who are doing a similar job. Or maybe you'll pull in things like graduation rates, retention rates, library support — library instruction being done, and some reference transactions. It sounds to me like you need AIHEC to start doing this kind of thing. It does this on some level, but without touching the library. I mean, they keep statistics, if not more often, on things like graduation rates. But the library component could be added in. I think a lot of us fill out the ACRL report every year, and there's an NCES report — I don't know if anybody does that; I don't know where to get the statistics. Those two are the ones I think we're all supposed to fill out. We are, but... We're filling in all that data — can we get it back? They pull it all together, and you can't pick out your own. I don't quite understand. You can pay and get access to it. Yeah, you have to pay. I've paid; it's $150 a year.
I ran the tribal libraries' numbers, but not everybody reports. It's small; it's not everybody. Yeah. So what you want is AIHEC to start doing this and getting it back to you for free. Right? Okay, so somebody write that down. Yeah, Gary's going to write that down.

I want to go on to the last bit here, which I hope will put a little bit more perspective on this. What I like about this quote is that it talks about a journey. Assessment is a journey. And I would say it's a journey that really has no destination — you learn more, but it doesn't end. So you can't think about assessment as "well, we assessed everything last week, and now we don't have to assess anything else." It should really be an ongoing thing. And it is worth thinking about. It is a complex enough, important enough process that you want to spend some time reading some of the articles, maybe, that are on this list. Are these extra? I think this table got shorted. Sure. And actually pay attention to this, and get outside help with it, because it's definitely worth it.

Then there's this whole problem of how you begin something like this — choosing an initial focus. You could go in so many different directions. One of the things that comes up again and again in the literature is that whatever you do, your assessment needs to be tied to your larger institution's vision or mission statement or strategic plan, as well as to your own, so that you're showing through your assessment activities that you're being accountable to the larger mission of the institution. You're tying the library to that really directly. So that's one thing to do. Look back at those documents, reread them, and see what pops out. Okay — so by assessing instruction, I can clearly show how we are contributing to the student success that is stated in the mission.
You can also look at the squeaky wheel — not like Jenny's library, where there's only positive input. What are your users saying? "Oh, there aren't enough computers" — and I think, John, you were talking about the computer usage issue. So maybe that is the thing that pops out at you as the place you want to start, to get your feet wet with assessment. That's why we wanted to assess the commons: it was new, and it seemed to be a really good place to focus.

You can also look at your budget to understand your investment. If you're paying a lot for a service, or if you're spending a lot of time on it, that means you're paying a lot for it. Maybe that's the place you want to start, and library instruction can often be that service that is costly.

You know, some of you are having to do assessment on your own. Either you're in a small library, or it's been written into a director's job description that you're the one who does assessment. And I would make the plea that you change that in some way — that you bring others on board with you at whatever level, including thinking about getting help from other faculty on your campus, who have an expertise you may not have and who have an interest in the library. Maybe they're the ones that use library instruction the most. Ask them to help you formulate an assessment plan. Not to put all the work on them by any means, but to get their input and to get that sort of camaraderie that will help you keep momentum.

Take some kind of inventory, whatever that means to you, of your assessment skills. What are some skills you need for this kind of job, for assessment? What do you think? Attention to detail. Analytical skills. Being organized. Computer skills — knowing how to use Excel, let's say, maybe. Yeah, Jenny? I think being able to look at it from the user's viewpoint — not "what are we doing," but "what do they need and want." Statistical knowledge. Statistical analysis.
Maybe that's what you were saying about analytical skills. And don't let the fact that you identify a hole stop you from starting this. Because, of course, we all work in education. We know how to learn things. We know where to go to learn things, and how to get help. So explore resources and seek advice from people out there who you perceive are doing a good job at this. And that can be, you know, emailing somebody, or it can be writing to the author of an article.

That brings something up. We've done some surveys, but I would be interested in seeing surveys that other people have written, so we know how to ask questions — what questions to ask, some examples. And how would you go about getting those? Email everybody. You guys have a listserv, right? So I foresee, Jenny, that you're going to put a message out there saying: is anybody surveying their students about how they're learning? Does anybody have surveys to share — and what you learned after you got them back, like "we should have worded that question differently," because people didn't understand what you meant and gave you really bizarre answers.

Some of this is stuff I've already talked about — and please, please keep the bibliography. Yeah, we've got to keep every sheet of paper together. I can also have Mary Ann send it out on the listserv in electronic format, which would make it a lot easier for you to click on some of those long links that I have in there. Because I'm telling you, I was really amazed at some of the resources that are out there. I've highlighted in yellow the ones that I've cited within this PowerPoint. But there were a lot that I read that I just simply didn't quote directly; they just informed some of my thinking. You know, the Steve Hiller stuff — those are all PowerPoints that he has put up on the web, presentations he's done at ALA. And they've got some really great questions.
You will read them and you'll think, ah, why didn't I think of that? And they will help you determine what is meant by outcomes and assessment versus evaluation. There are some association resources. ACRL is really into assessment right now, and you'll see up at the top three links to some ACRL resources. One is the Immersion '13 program, Assessment: Demonstrating the Educational Value of the Academic Library. I've applied to attend that; I won't know until July sometime whether or not I'm good enough to go. I had to apply for money from our provost to do that. Because assessment is such a big issue everywhere, the provost was pretty free with it; it was not that hard to get. So you might think about that: not just having to go out and find grant funding, but going to your university or college administration and saying, I need to assess this, and this is where I want to go. If you identify a program, they're more likely to say, oh, yeah. And there is a whole Value of Academic Libraries initiative that ACRL is doing. That website includes a very long report that is full of detail about how you establish your value within your campus community, or within whatever community you need to. So this is a real mishmash of articles, websites, associations, and that sort of thing. Hopefully it is useful to you.

Once you've done your initial assessment, of course, you don't want to stop there. It sounds funny, but you need to assess your assessment. There is actually an article on my bibliography that is about assessing assessment. It sounds like a never-ending regression, but it's important to reflect on what you did, to figure out if you would do it differently — just like that little example of the survey question that people misread. Every time I do something, I think, gosh, I could have done that in a different way — and then it slips out of my head. Well, this is saying: keep track of what you need to do differently so you can implement it.
How can you streamline it? A lot of times when we first do something, we make it much harder on ourselves. Understandably — we're creating something from scratch, and we don't know what the best way to do it is. So how are you going to streamline it? How will you make it easier? Who else should you involve? Maybe you've created a group to assess something, and you realize at the end of the task, oh, I really needed that other person's perspective — because every person you have in the room brings in a different perspective. That's why it's really good not to do assessment on your own. There's the skills issue. And how will you use what you've learned? That's a really interesting question that we haven't really talked about. You go to all this work. You do more than just gather numbers: you tell a story. Where are you going to tell that story? Well, usually you want those numbers to justify budget increases, so you tell the story, with your numbers, to higher administration. Grant applications. Your website — why not have something on your website? User of the week: this person said this about the library; this person got this out of the library this week. There are lots of creative ways that you can use assessment outcomes. You just have to sit for a while and imagine what that could look like. Let's take a quick question. You always have to figure out: what kind of permission do I need to use somebody's story? Very good.

And the dream of all dreams is that you create an assessment plan for your library — one that includes a lot more than a single assessment: an assessment cycle. How often are you going to assess library instruction? You're not going to do it every day, maybe. You're going to do it once a year, at this time, for a certain period of weeks. That sort of thing.

I'm almost done here. How do you recognize successful assessment? It should be participatory, ongoing, and informative.
You should be able to do it and not kill yourself doing it. It should be useful, meaningful, and relevant to the larger tribal community. So this is my charge for you. It might be easier to do this by just deciding that, with every new thing we do, we're going to ask that question of how we will define success — the question that Gail brought up earlier. How will we define success for this project, service, or collection? If you create that as a staff, you are going to remember to assess your project, your collection, and your service at certain points. Grow your staff skills, draw others into your assessment plans, and don't forget that you are always going to need to refer back to those documents that govern the larger institution. You always want to be tied to that, to show that you're valuable and that you're worthy of money, notice, celebration. You want to be stars. This kind of thing will make you that.

I'd like to know what MSU or any of our tribal libraries are doing to assess their instruction efforts in particular. That's kind of what we're trying to focus on now — assessing our instruction. I'd like to just hear from anybody else what you're doing. Well, I'll just say from our point of view, we're starting into that. That's one of the reasons why I've applied to go to the ACRL Immersion program on assessment — because we haven't been good at assessing. We've done here-and-there evaluations of courses, but we haven't assessed. Mary Ann, another person who's part of the transition team, and I are trying to do a little outcome survey with all our Writing 101 students, to see if they know where the catalog is and what the difference is between it and the databases. So we're experimenting with a really brief survey for that. And that's where we are. I mean, we're not the assessment stars in the room for instruction. We do do the evaluation forms for every class on Canvas, including all the library for-credit courses.
And for the most part, though, the classes we teach are non-credit. But we do that. We're better than we thought.

One of the categories of assessment that is very impactful is the ROI studies — return on investment studies. There was one just released in Texas this spring: for every dollar you invest in a public library in Texas, there's a return of $4.42 to the economy. And that sort of thing is something that you can tell the legislators and others. For your public library, right — but that model can be extended to other settings.

Anybody else have a great instruction assessment story? We're assessing bibliographies for the classes we've taught. We've just started doing it. We've been doing it for about a year, and it's been hit and miss; the types of classes we're getting bibliographies from are inconsistent. So in the fall we're going to focus on just the entry-level English classes and getting those bibliographies, and we're assessing them for: is this a library resource or not? If it's a website, do we think the website is reliable? So we're doing that for students who have taken a library class, and then we're surveying professors to try to get their opinions about the usefulness of the class. They've been pretty honest with us for the last year. Their comments have actually been really helpful and have changed how we are delivering our library instruction. That's really great if you can get them to be honest, because a lot of times they'll just say it's a really vital service without telling you much more. That's kind of what we found when we were asking questions of our instructors, I think. So it sounds like you're doing a couple of things that fit together. So there are some ideas, Tim. And I would say, for instance, maybe you guys want to be an email contact for sharing the parameters of the analysis of the bibliographies.
Because that is what I'm hearing people are trying to do most commonly: analyze what sources students are using in their papers. And one suggestion would be to get bibliographies from papers of students who did not have library instruction as well, so that you can do a comparison. Who said that? I wasn't watching over there. I did — Valerie. Okay. Anything else anybody wants to say? Jenny? You talked about how doing a survey can get tiring, answering all those questions. Has anybody tried asking people just one question when they come in or when they leave? Person to person. Just one question for a week or a month or something, and just sort of record what they're saying. For classes, there are what they call the two-minute surveys, where you ask two or three questions at the beginning of the class and the same two or three questions at the end of the class. So it only takes a couple of minutes to evaluate and assess what the students are walking away with. I think we have to end now. Thank you. Thank you.
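The bibliography analysis discussed above — tallying whether each citation is a library resource, flagging cited websites for reliability, and comparing classes that did and did not get library instruction — could be sketched roughly like this. All category names and sample records are hypothetical; the reliability judgment would of course be made by a human assessor, not by code:

```python
# Hypothetical citation records pulled from student bibliographies.
# "kind" is one of: "peer_reviewed", "library_resource", "website".
# "reliable" is the assessor's judgment, recorded for websites only.
instructed = [
    {"kind": "peer_reviewed"},
    {"kind": "library_resource"},
    {"kind": "website", "reliable": True},
    {"kind": "website", "reliable": False},
]
not_instructed = [
    {"kind": "website", "reliable": False},
    {"kind": "website", "reliable": True},
    {"kind": "library_resource"},
]

def summarize(citations):
    """Percent of citations that are library resources (incl. peer-reviewed),
    plus how many cited websites were judged reliable."""
    total = len(citations)
    library = sum(c["kind"] in ("peer_reviewed", "library_resource") for c in citations)
    webs = [c for c in citations if c["kind"] == "website"]
    return {
        "pct_library": round(100 * library / total, 1),
        "websites": len(webs),
        "reliable_websites": sum(c.get("reliable", False) for c in webs),
    }

print("with instruction:   ", summarize(instructed))
print("without instruction:", summarize(not_instructed))
```

Comparing the two summaries is the "tell a story out of that" step: on these toy records, 50% of cited sources are library resources with instruction versus 33.3% without, which is an outcomes-shaped number rather than a raw count.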