Greetings, everybody. Welcome. I'm delighted to welcome you to our first webcast in Libraries That Learn: Using Evidence to Transform Library Services. The first webcast is on administering LibQUAL in the U.K. and it will be delivered by Selena Killick from Cranfield University. Our second webcast in the series, which is scheduled for February 10, 2015, will be delivered by Michael Maciel from Texas A&M University, and we will be developing more webcasts in the future. Without further ado, welcome Selena. The floor is yours. Thank you, Martha. Thank you everyone for joining us today. My slides are in British English, so you might have to interpret my English into American if you're not from the U.K. Also, in the U.K. we have our own way of saying LibQUAL, and we use it a lot. If you find that offensive, I'm sorry, but this is how it's going to roll from now on. This session is my last ever LibQUAL session, as I'm leaving my role as a LibQUAL administrator after 12 years, and it's going to give you an overview of setting up a survey from an administrator's perspective. I'm also going to share some information about my experiences of running the survey, how I've approached my data analysis here, and how we've actually used our results to improve our services at Cranfield University. To give you some background on Cranfield University: we are the U.K.'s only wholly postgraduate university, which in American terms is graduate only. We have no undergraduate students at all, and with that we have quite a small population, with about four and a half thousand students but about 1,500 staff, so it's an interesting mix. We only teach science and technology subjects and business and management; we don't do arts and social sciences. We have two campuses and three libraries, which means from our point of view branch library analysis is vital. And we were one of the first libraries in the U.K. to run LibQUAL, back in 2003.
We were actually the first country outside of North America to run it, and it was translated into British English back in 2003, just as I started in my role here at Cranfield. I've been running it annually since 2003, so I now have 12 years of experience of LibQUAL in the U.K. It's been a joyous task and a really good fun job for me for the last decade or so. Starting off, when setting up your survey I would suggest you allow yourself plenty of time for conversations, especially with your department heads. I try to spend some time discussing the different customisation options with our heads of department, for example the optional questions, the demographic questions and the position categories, to ensure that we're going to meet their needs at the end of the day. We are very much focused on what data we're going to get out of it after the survey is closed rather than on what questions we might think are interesting. We spend a lot of time with them doing this in order to get them on board with the results and interested in the results when they come out, and it's time well spent. The other thing I would suggest is that you spend some time with your student representative body, whether that be your student representatives or any other customer groups you may have, to ensure that the terminology you're using for your demographics and your position categories is terminology they will recognise when they come to complete the survey. For example, last year, with the option of choosing our own position categories, we used a combination of taught master's students and research master's students. Those categories made a lot of sense from our administration point of view, but unfortunately they were not terms our students recognised, and they found it quite challenging to complete that question in the survey.
In hindsight, we should have spent, and will now always spend, more time with our student representatives to ensure that the terms we use in our surveys are terms our customers identify themselves with. When it comes to marketing your survey, if you are using a sample or you have a small population, I would strongly suggest that you try to personalise your invitations. It's something we set up here a few years back and it's had a huge impact on our response rate. Previously we'd always used a general email message to all students and staff inviting them to complete the survey. After a few years of recognising that things were challenging and we were not getting the response rate we wanted, we ended up investing some time in personalised mail shots to all of our customers via email, with a direct message in a mail merge format, so it opened with "Dear John, we really value your opinion, please would you complete our survey?" By personalising the message rather than using a general one, we found that we had a higher response rate from our customers. We also tried to send the message to their preferred email address rather than their institutional email address. It might be a bit unique to Cranfield, but when 85% of our customers are distance learners accessing the survey off-site, they don't tend to use their institutional email addresses; they tend to use their personal ones more. So by communicating with them in the channels that they use, we managed to get more people responding to the survey. The other big win is always to get your academics and your library staff on board, promoting the survey face to face whenever they interact with our customer base.
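The personalised mail-merge invitation described above could be sketched like this in Python. The names, email addresses, and survey URL here are all invented for illustration; the point is one message per customer, opening with their own name, sent to their preferred rather than institutional address.

```python
# Sketch of a personalised "Dear John" survey invitation, built one message
# per customer. All names, addresses and the survey URL are hypothetical.
from email.message import EmailMessage

SURVEY_URL = "https://example.org/survey"  # hypothetical

def build_invitation(name, preferred_email):
    """Build a personalised invitation addressed to the preferred email."""
    msg = EmailMessage()
    msg["To"] = preferred_email  # preferred, not institutional, address
    msg["Subject"] = "We really value your opinion"
    msg.set_content(
        f"Dear {name},\n\n"
        "We really value your opinion. Please would you complete our survey?\n"
        f"{SURVEY_URL}\n\nThank you,\nThe Library"
    )
    return msg

customers = [("John", "john@example.com"), ("Priya", "priya@example.net")]
invitations = [build_invitation(name, email) for name, email in customers]
```

Each message would then be handed to whatever mail sender the institution uses; the mail-merge idea is simply that no two customers receive an identical, impersonal email.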
We try to spend some time with the library staff before the survey is launched to discuss what we're aiming to get out of the survey, what the survey is all about, and how they can help us encourage our customers to complete it whenever they meet with them. During the survey I always heavily encourage everyone to continue to promote the survey as much as they can. One tip I had from a colleague over in the UK at the University of Sunderland, the wonderful Kay Grieves, is a recent campaign where they actually staged their incentives. This wasn't with LibQUAL, they're not a LibQUAL library, but it was another format of feedback they were gathering. When they hit a certain number of responses their incentives went up, so the students started to encourage each other to complete the survey in order to get a bigger prize for the number of responses received. The response count was published on the library website with an almost thermometer-style ticker as the number of responses went up, and I would recommend it as a very good, creative way to get students to encourage each other to complete the survey. Another tip I picked up a few years back, and applied here, was to actually start answering some of the comments mid-survey. So when I sent out the reminder email I was able to say to people, in a general email: thank you if you have already completed the survey; we have spent some time responding to some of your comments already on our web page, please go along and look; and if you haven't yet responded, please take some time and respond now. We found that this had a huge impact not only on our website traffic to that "You Said, We Did" page, but it was also interesting to see the conversations between academics and our academic liaison staff about the comments on that web page.
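The staged-incentive thermometer could work along these lines; the thresholds and prizes below are invented for illustration, not Sunderland's actual figures.

```python
# Staged incentives: as the response count passes each threshold, the
# advertised prize grows. Tiers here are hypothetical examples.
TIERS = [(0, "coffee voucher"), (100, "book token"), (250, "tablet")]

def current_prize(response_count):
    """Return the prize currently on offer for the given response count."""
    prize = TIERS[0][1]
    for threshold, tier_prize in TIERS:
        if response_count >= threshold:
            prize = tier_prize  # keep upgrading as thresholds are passed
    return prize
```

Publishing `current_prize(count)` alongside the running count gives students a visible reason to recruit each other: the next threshold upgrades everyone's prize.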
It was clear that senior management across the university, who had all received that invitation, had spent some time reading the "You Said, We Did" page and were more aware of some of the comments and concerns we were receiving from our customers. We didn't shy away from those concerns, but the ones we knew we could answer quickly with marketing and promotion were published on that "You Said, We Did" page to help encourage people to use our services as much as we would like. Always, always reply promptly to any emails you may get from the LibQUAL survey as well, so that people can see their comments are not dropping into the ether without any thought or regard. Once your survey is complete, this is where the fun really starts, and the analysis becomes your biggest task. I do an awful lot of LibQUAL analysis and I am a whiz with spreadsheets; my best friend bought me this mug for Christmas and I absolutely love it. If you are not a spreadsheet wizard yet, I would suggest you spend some time training yourself up, going on a course, doing some online training materials, marrying someone who can, or employing someone to do it for you. The amount of analysis you can do with LibQUAL is infinite; your time, however, is not, and the quicker you get with LibQUAL analysis in Excel, using templates and the like, the better and more detailed analysis you can do in the future. The first analysis I ever did was our branch library breakdowns, and that was quite vital from my point of view as we have very different campuses and very different libraries. Initially that used to take me an awful lot of time and didn't leave much time for anything else. After a year or two I realised that templated approaches were the right way forward as we continued on our LibQUAL journey, and I would recommend, if you are running LibQUAL regularly and doing the same analysis year on year, that you start setting up your own LibQUAL template for data analysis.
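A templated branch breakdown can be as simple as one reusable function re-run on each year's export. The speaker uses an SPSS script for this; the plain-Python sketch below, with invented branch names and scores, just shows the shape of the idea.

```python
# Reusable "template" for a branch-library breakdown: mean minimum,
# desired and perceived scores per branch. The data rows are invented.
from collections import defaultdict
from statistics import mean

def branch_breakdown(rows):
    """rows of (branch, minimum, desired, perceived) -> per-branch means."""
    by_branch = defaultdict(list)
    for branch, minimum, desired, perceived in rows:
        by_branch[branch].append((minimum, desired, perceived))
    return {
        branch: tuple(round(mean(column), 2) for column in zip(*scores))
        for branch, scores in by_branch.items()
    }

rows = [
    ("Campus A", 5, 8, 7),
    ("Campus A", 6, 8, 7),
    ("Campus B", 6, 9, 6),
    ("Campus B", 5, 7, 6),
]
```

The same function runs unchanged on next year's data, which is the whole benefit of the template approach: the half-hour turnaround frees time for the detailed analysis.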
These days all I have to do is run one SPSS script and it spits out all of my branch analysis within half an hour. That gives me a lot more time to do the detailed analysis that moves us forward: identifying our successes, identifying our key opportunities for improvement, benchmarking our results, and doing the longitudinal analysis. Those are the things that really lead to actions coming out of our survey results at the end of the day, and I'm going to pick up on them in the next few slides. One of the most time-consuming things, from my point of view, is working with the comments. There is no quick win with this, because at the end of the day you still have to read them; they don't just get coded automatically, no matter how much we would love our qualitative software to do it for us. The great tip here, if you are going to use LibQUAL regularly, is to invest some time in training on qualitative software and start setting up a customer feedback database with your free-text comments going forward. For example, at Cranfield what I have done is take not just our LibQUAL comments but all the qualitative comments we receive about the library service from customer feedback surveys across the university, and feed them into one monster customer feedback database in NVivo. I use that to code and query our comments on an annual basis, holistically, outside the methodology of any one survey instrument. It's given us a better feel for our customers' concerns, letting us see what their themes are and what things come up repeatedly across different survey methodologies.
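A toy version of that cross-survey comment database (the real tool here is NVivo) might look like the sketch below: every comment keeps its source survey and its coded themes, so a theme can be counted across methodologies. All the data is invented.

```python
# Toy cross-survey feedback store: each comment records where it came from
# and which themes it was coded with, so themes can be counted holistically.
from collections import Counter

comments = [
    {"source": "LibQUAL", "text": "Not enough copies of core textbooks",
     "themes": {"textbooks"}},
    {"source": "module feedback", "text": "Reading list books unavailable",
     "themes": {"textbooks", "reading lists"}},
    {"source": "LibQUAL", "text": "The website search is confusing",
     "themes": {"website"}},
]

def theme_counts(comments):
    """Count how often each coded theme appears across all survey sources."""
    counts = Counter()
    for comment in comments:
        counts.update(comment["themes"])
    return counts
```

A theme that scores highly here across several different surveys is exactly the kind of recurring concern the holistic approach is meant to surface.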
I tend to find that customers will only talk to you about the things you've asked them about in the survey, so if you haven't asked about a specific local issue, which you may not have done by using LibQUAL, you may not be aware of some of the things that are coming up regularly in other areas. It's always worth looking outside just the one survey methodology to see the other areas where your customers are providing feedback. The other thing I do with my NVivo database is link up our qualitative comments with our quantitative data. So, for example, if I have a key opportunity for improvement around a certain question, say IC1, I can find anybody who scored us low for IC1 and pull up their comments quickly. I can then look to see whether they've detailed their concerns in their free-text comments, and find out if there are any specific actions I can take directly from the things they've told us there. Working with the comments takes resourcing, and this side of things does take a lot of time. It's very, very valuable and useful, but you have to be able to staff it in the first place. So if you're beginning on this journey, I would suggest you do invest the time and training, but be prepared to do something with the results; coding for the sake of it is not going to get you anywhere at the end of the day. I've spoken a bit about our key opportunities for improvement, and this is how I tend to break down my analysis these days. I start by looking at the items with the highest desired scores and the lowest adequacy or superiority scores, and I scatter-graph these using the scatter graph templates that are available on the LibQUAL website these days. They've only recently been published, so if you haven't heard of them before, don't panic, it's nothing personal.
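The quantitative-to-qualitative link-up described above (finding the comments of anyone who scored a question such as IC1 low) is, in miniature, a filter like this. The question code IC1 is LibQUAL's; the responses themselves are invented.

```python
# Pull the free-text comments of respondents who scored a given question
# low, so their concerns can be read directly. The data is invented.
def low_scorer_comments(responses, question, threshold):
    """Comments from respondents whose score for `question` is below threshold."""
    return [
        r["comment"] for r in responses
        if r["scores"].get(question, 9) < threshold and r.get("comment")
    ]

responses = [
    {"scores": {"IC1": 3}, "comment": "Can't access e-journals off campus"},
    {"scores": {"IC1": 8}, "comment": "All good"},
    {"scores": {"IC1": 2}, "comment": ""},  # low score, but nothing written
]
```

In NVivo the same query runs over coded nodes rather than a list, but the logic is identical: score filters first, then read only the comments that survive the filter.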
Have a look in the resources section under charts and templates and you'll see some instructions and the scatter graph template that you can use yourself if you wish. We focus particularly on the items with the highest desired score and either the lowest adequacy or the lowest superiority score as our key opportunities for improvement, and normally we identify about four or five items each year as the ones we're going to focus on to try to improve our satisfaction scores and customer perceptions. For each item, I look specifically at the longitudinal trends, and at whether this year is just a blip or there has been a continual downward trend in performance and perceptions in that particular area. I also benchmark each question at item level, so for everybody in the UK, all of my UK competitors, if you're listening to this, I am using your data this way: I will identify who has scored better than us for that particular item. I look at their perception scores rather than any gap scores, because I recognise people have different expectations at different institutions; I'm looking for institutions that have higher perceptions than us on that particular question, so I can go and speak to them and find out what they are doing that we can learn from. I then also look at our free-text comments around those areas, to see whether there's anything specific in the comments that we can turn into an improvement or an action going forward. If we can't identify anything straight from the comments, we look into further research to identify where we can act to improve our customers' perceptions. For example, we've run a focus group programme in the past, and we are currently doing ongoing usability testing on our website, alongside other quick surveys and polls with our customers, to find out what they think of our services and how we can improve.
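The selection of key opportunities, highest desired score combined with the lowest adequacy gap (perceived minus minimum), can be expressed as a small ranking. The item codes below are LibQUAL's; the mean scores attached to them are invented.

```python
# Rank items as "key opportunities for improvement": high desired score
# first, then the smallest adequacy gap (perceived - minimum). A similar
# ranking could use the superiority gap (perceived - desired) instead.
def key_opportunities(items, n=2):
    """Return the n item codes that most warrant attention."""
    def priority(item):
        name, minimum, desired, perceived = item
        adequacy_gap = perceived - minimum
        return (-desired, adequacy_gap)  # high desired, low adequacy first
    return [item[0] for item in sorted(items, key=priority)[:n]]

items = [
    # (item, minimum mean, desired mean, perceived mean) - invented values
    ("IC1", 6.0, 8.5, 6.2),
    ("AS3", 5.5, 7.0, 7.4),
    ("LP2", 6.1, 8.5, 6.0),
]
```

A negative adequacy gap (perceived below minimum, as with LP2 here) is precisely the region the scatter graph templates make visually obvious.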
The great Megan Oakleaf once said: if you're not going to communicate your results, don't bother doing the analysis in the first place. We always look to give our feedback to our customers but also to our staff, and the biggest tip I will give you here is to always tailor your message to your audience; each group demands, or deserves, a different message. We feed back to our customers along the lines of a thank you, telling them who won the incentive and what key actions we are looking to take and change in the future. I tend to focus on the things that we know we're going to be able to achieve rather than the things we'd like to do. There are a number of things we'd love to do every year, but unfortunately ideas are a lot easier than actually getting things achieved. Sometimes it comes down to politics, sometimes it comes down to funding, sometimes it comes down to sheer time, but we can't change everything overnight. So we always go to our customers with the messages about the wins: the things we know we're definitely going to change this year, the things that have already been put in place, signed off and funded. When it comes to feeding back to the library staff, I don't tend to give them the radar charts; I tend to go with telling them about changes to perceptions in percentage scores, because it's a very quick way for them to understand and interpret what I'm talking about. My department heads, however, like the radar charts, and they also like our zone of tolerance bar graphs, so they get to see all of those. When it comes to the senior stakeholders, I tell them the key successes but also our key opportunities for improvement and where we're going with those. I don't tend to show our senior stakeholders the radar charts, because they don't tend to be able to interpret them quickly and easily, and I don't want them to feel bamboozled or confused by the data.
Fundamentally, what we're continually looking to do is this intrinsic cycle of data collection and analysis leading to informed decision making. Data collection and analysis without using them to make decisions is just wasting everybody's time. Likewise, making decisions without actually knowing the data and having the analysis behind them risks spending a lot of money and time on things that our customers just don't want from us. Some of the actions we've actually taken at Cranfield include improving our academic liaison to increase our understanding of our customers' needs. For example, we have a particular issue around textbook provision; we work hard with our academic community to ensure that we have reading lists in time and are able to provide the texts they are recommending to their students. We also work to embed our information literacy training with our staff and students, and have increased our book procurement to meet those needs. We work with our IT department to improve computer services; that's not necessarily within our control, but we do use the data to reinforce our arguments for increased and improved provision. We've actually used LibQUAL to adjust our opening hours, not in the sense of opening longer, but actually reducing our opening hours based on our customer feedback. Initially, when we started using LibQUAL back in 2003, we had reduced opening hours, and we increased them in 2008. We then found that we'd increased them too far and were actually exceeding our customers' desired expectations for opening hours. With that data, and some other feedback we had in different formats, we reduced our opening hours again so that our perceived score sits just under the desired level of expectation.
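The opening-hours calibration described above can be stated as two small checks against the zone of tolerance: provision should sit between the minimum and desired expectation scores, ideally just under desired, and anything above desired is overshoot that can be scaled back. The scores used below are invented for illustration.

```python
# Zone-of-tolerance checks for a service score such as opening hours.
# The minimum/desired/perceived values in the examples are invented.
def in_zone_of_tolerance(minimum, desired, perceived):
    """Perceived service sits between minimum and desired expectations."""
    return minimum <= perceived <= desired

def exceeds_desired(desired, perceived):
    """Provision overshoots what customers actually want."""
    return perceived > desired

# Overshoot case: perceived 8.6 against desired 8.2 -> scale back.
# After reducing hours: perceived 8.0 sits just under desired, in zone.
```

The decision rule is the one from the talk: when `exceeds_desired` is true, resource can be redeployed elsewhere while the score stays safely inside the zone.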
It's still within the zone of tolerance, but we're not spending too much time opening the building unnecessarily, and that staff time has now been redeployed to help improve access to our electronic resources. That's something we're still working to improve, but it is always high on our list of priorities. I've also worked with the SCONUL community to discuss how they're using LibQUAL locally, and I've been supporting the SCONUL community in using LibQUAL since 2003. Some of the things other institutions have been doing include using LibQUAL to secure additional funding to provide increased resources, and enhancing access to their resources through the purchase of cross-searching tools and other databases. Institutions have used LibQUAL to directly improve their help services and their information literacy training, whether in person, online or through chat services. Upgrades to the physical space have been quite key in the UK, and there have been a lot of projects to improve library spaces in the last few years, including increasing IT provision, providing a variety of different working styles and sections, from group work space to independent study space, and actually introducing zoning and noise control within our physical facilities. These have been received very well by our customer base, and we have seen a direct improvement in Library as Place scores within the UK SCONUL community year on year. Fundamentally, we always assess to improve, not to prove, and LibQUAL has been found to be a useful tool for that assessment at Cranfield and at other institutions in the UK, and we will continue to use it. Thank you very much for listening, and if you have any questions I'd be delighted to hear from you. Thank you, Selena. This was wonderful.
I love your story about the opening hours: how you used the feedback first to increase them, and then used the fact that you were surpassing the desired expectations to calibrate them to just under the desired level. This is a wonderful story. I have a question that is probably on the minds of our audience: you won't be staying at Cranfield University; would you like to share where you are going? Yes, certainly, Martha. I am moving to the Open University. I am going to be their library services manager in charge of student engagement, so we will be looking to work with all the departments across the university to hopefully improve our customer services and improve our student experience. For me, I look forward to the possibility of doing something different in a library without a space, right? It is a completely distance-learning university, for those of you who have not heard of the Open University. We have a quarter of a million students, but we don't have a physical space in the usual sense; we do have a physical library, it would be unfair to say we didn't, but our students don't tend to come to us. They are distance learning students, so it is an interesting new challenge for me, and I look forward to opportunities there for future collaboration. Thank you. Thanks, Martha.