to the CNI Digital Scholarship Planning webinar series. And if you participated in a previous session, welcome back. I hope you're all doing well during this difficult time of the pandemic. We're so pleased to have close to 300 registrants from five countries and a wide variety of institutions. I'm Joan Lippincott, Associate Executive Director Emerita of CNI, and I'll be moderating the nine sessions of this series. Each of you is registered for all nine sessions. Don't worry if you need to skip some. We'll have recordings available for all sessions, as well as a set of questions to guide planning discussions on your own campus. Both the video and discussion questions for session one are now on the website. We have two speakers for this session and we'll take questions after each. Please type your questions in the chat box at any time. In addition, after the formal one-hour session is over, we'll open the mics in case some of you wish to verbally ask questions of the speakers. The chat box is also available to communicate with each other or with me or our technical lead, Beth Sechrist. During the presentations, all participants will be muted. For this third session, we'll learn about assessment initiatives related to digital scholarship programs, both needs assessment and ongoing assessment. I'm frequently asked about good models for assessment, and I'm so pleased to welcome our presenters today: Aaron Brenner, Associate University Librarian for Digital Scholarship and Creation at the University of Pittsburgh, and Maris Mandernach-Longmeyer, Head of Research Services at the Ohio State University Libraries. They have very useful and thoughtful frameworks and examples to share with us. Their bios are on the webinar site, and I won't take any more time with introductions in order to give our speakers more time. So over to you, Aaron. Thank you, hello. Very glad to be speaking with you all today.
Thank you to the organizers for having me and thank you to all the participants for joining. My name is Aaron Brenner. I am currently Associate University Librarian for Digital Scholarship and Creation at the University Library System of the University of Pittsburgh. Today I'll be talking about assessing campus needs for digital scholarship program development. I'm gonna draw on work that colleagues and I have done at our own institution, Pitt, when we were starting our own digital scholarship program within the library in 2014. This was an information gathering and strategic planning initiative for the library, but in many ways the activity of the project itself had a transformative effect that went beyond gathering information. The initial process that I'll describe was fairly formal and time intensive, in part because we were launching an entirely new area within the library at the time. But we've also reused aspects of this process in more lightweight forms when we have launched subsequent aspects of digital scholarship programming. Now I recognize that those of you listening in are likely coming from a range of institutions of different sizes and types and that your digital scholarship programs may be at different stages. So I'm hoping that whatever your own situation, you may find something of value from this example when you're considering new or ongoing program development. First, I'd like to start with a premise that shapes my views on this topic overall. And that is that the practice of digital scholarship, and the library's roles within it, is fundamentally relational rather than transactional. I don't think that's a particularly controversial thing to say and it's consistent with how we're talking about digital scholarship already in this series.
So on Tuesday, for example, when Pam Lach was talking about the space in which she works, she said, we wanna emphasize the humans in the space rather than the technology. That's the same spirit that I'm talking about here. And of course, bringing up something relational, we have to note that in our current moment, everything that is relational is different. We're working at a distance from each other and we're more mediated than ever before by technology in between. But still, it's worth remembering how much our visions for digital scholarship and practice emphasize partnership, collaboration, participatory and public-facing projects, co-discovery, and a greater place for the library and its people within the full cycle of research, teaching and learning. All of these values are relational and they all require trust and mutual understanding. And if we bring that perspective to assessing campus needs, it suggests to me that the method and the stance of assessment should be relational as well. So that can mean a few things. It can be about the process of inquiry itself. It can be designed so that it enacts qualities that develop and strengthen relationships. For example, showing interest and curiosity in others, listening, reflecting back understanding of what you've heard, co-creating. So these are basics. But thinking of needs assessment with the relational frame can also mean making an effort to learn about existing relationships and capacities in your environment and then using this framework to situate your program in the library. Here I've learned a lot personally from the civic tech and data communities' use of ecosystem mapping, which is intended for actors in that space to better understand existing capacities, connections, and needs. And building on that, it's worth considering that there are multiple relational systems that we are working within.
Digital scholarship development in libraries is often, in the way that it plays out, a form of organizational change and repositioning. There's a major question, then, that we face in how the development of our digital scholarship programs involves or doesn't involve the entirety of the library organization. Because of this, I believe it's crucial to give attention to internal library relational systems as well. And of course, the support system of peer and professional networks and communities of practice is also essential. So additional relational systems like these influence the design of program assessment as well. Given all of that, I'm not gonna advocate for any single way or formula to follow in conducting assessment for planning a digital scholarship program, because so much is inevitably dependent on your local context and environment. What I am hoping you'll take from my example are strategies and a certain orientation towards doing this work. So, to the process at Pitt. The process was called a strategic audit of the library's support for digital scholarship. It resulted in a written report, and the link to that report is on the screen here. The report discusses the methods and findings in some detail. And it also contains a set of recommendations for our local context that ended up serving as a roadmap for the next several years of program development. I'm not gonna focus much on our specific findings and recommendations. That's because this report focused on Pitt's environment and yours is certainly different. And in fact, all of our situations are now different in many ways because of the pandemic, both in how it's changing us immediately and how changes will continue into the future. The other thing to mention is that any discrete assessment process is necessarily a snapshot of a particular moment in time.
So I mentioned that one of the successes of our work was that it provided us with a roadmap for program development. But to be frank, I'd put the lifespan of that particular roadmap at somewhere around three years. So this kind of work, like all strategic planning, is not a once-and-done process. I am gonna talk about some things that I hope might be useful to you generally and that might not be obvious just from reading the report. And so there's three parts to this. The first is the context for the needs assessment and how that context influenced its design. The second is a look at the methods employed, and I'll try to pull out those which I think might have been the most important and useful. The third part is some discussion of the limitations and what follow-up from the process continues to be most challenging for us. Okay, so starting with context and design. The impetus for doing this assessment project was our sense that things were developing around the university that we in the library needed to be better engaged with. At the time, and this is 2014, that included research data management and a growing digital humanities community, especially faculty who self-identified and self-organized around DH. There were faculty hires in digital roles across the humanities and social science departments, and those same departments were often talking together about how to better incorporate digital methods and tools into their curricula. And then elsewhere, we were seeing related changes in libraries and higher ed. So as one specific example, at this time, several of us were enthusiastically reading and passing around Jennifer Vinopal and Monica McCormick's article, "Supporting Digital Scholarship in Research Libraries: Scalability and Sustainability," that's on the slide, and talking about that study and the thinking behind that paper. So our challenge at the time was that we had little or no user-facing digital scholarship services.
Although we did have well-established programs in digital library collections and digital publishing. So we had a lot of components. And we had no physical presence specifically supporting digital scholarship on campus. Even more significant though, while we felt the library had a key role to play, it did not have the positioning on campus to be considered a strong actor in the space. And we didn't have the internal organizational positioning either. And what I mean by that is not just positions that might show up on an org chart, but more like an organization-wide vision for what we were hoping to accomplish. So our library director at the time made the decision to develop digital scholarship with a new unit and a physical presence in the main library, and to do this through a process that would both design and launch the program. So that was assigned to me, and the process was what we're calling here the strategic audit. The project took place over six months. During that time, I had some release time and vocal support from the library director. Both of those were key. It meant that my role had a certain authorization and character that was not part of the regular business of the library. I was able to act as a kind of consultant or researcher for the library, which had a different kind of signaling and ended up permitting different kinds of interactions and conversations. So I approached our study influenced by Vinopal and McCormick's paper, in which they interviewed a number of local faculty and a number of peer institutions. Those interviews used a semi-structured interview design, which I'll discuss a little bit more soon. Along with these, I added conversations with internal library units that had some stake or involvement in digital scholarship, and also a scan of the local environment at Pitt and in Pittsburgh and a review of relevant literature and reports.
And it was clear, I think from the beginning, that this wasn't just a matter of identifying some new service areas, but that this process would need to touch on something about the library organization. So, some thoughts and more details about the method. There are a couple of specifics of the design that I'd like to mention. First, in libraries, we like to survey. I remember a few years ago reading the Ithaka S+R library survey, which asks directors how their library gathers feedback and information about library services and collections. And the results showed that after informal conversations with students and faculty, by far the next most common means of gathering feedback was through surveys. I would argue surveys are better at measuring within some framework that is already known to us, such as perhaps how many people are aware of a particular existing resource, but they're not as good for discovering what is unknown or discovering what is nuanced and complex. Surveys also don't do much to strengthen relationships or build mutual understanding, back to that relational frame. Instead, we know that many people end up ignoring them, or they get annoyed or fatigued by requests for survey participation. So rather than a survey, the methods of this project were modeled on qualitative research using semi-structured interviews, as I mentioned, and I ended up doing qualitative coding for analysis of those. And just to continue with that a little bit, designating something as a special study in this case and adopting some semi-formal research methods, rather than, say, describing this as an environmental scan, also ended up doing some useful signaling. So it made it a bit easier to do things like cold call people, whether locally or nationally. It also gave some cover to bring people together internally and ask very difficult questions within the library organization. Qualitative research like this is an inherently social activity. In fact, it's an activity of social exchange.
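As an aside for readers following along, the analysis step mentioned here, tallying themes after interviews have been qualitatively coded, can be sketched in a few lines. The interview IDs and code labels below are entirely hypothetical, and real qualitative coding involves far more interpretive work than a frequency count; this is only a minimal illustration of the bookkeeping.

```python
from collections import Counter

# Hypothetical coded interview data: each interview is tagged with the
# themes (codes) assigned during qualitative analysis.
interviews = {
    "faculty_01": ["data_management", "collaboration", "training"],
    "faculty_02": ["digital_methods", "training"],
    "faculty_03": ["data_management", "publication", "training"],
}

def code_frequencies(coded_interviews):
    """Count how many interviews each code appears in."""
    counts = Counter()
    for codes in coded_interviews.values():
        counts.update(set(codes))  # count each code at most once per interview
    return counts

freqs = code_frequencies(interviews)
# freqs["training"] == 3, i.e. the theme came up in all three interviews
```

A tally like this can help surface which needs recur across conversations, though the stories behind each code matter more than the numbers.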
Some writers have called a researcher doing qualitative work a human instrument of data collection. And sometimes the role of researcher as an influencer in this process is called into question. It's sometimes known as the observer effect, and usually meant negatively, something that might bias or invalidate findings. But in this case, the work helped us to develop a social infrastructure on campus around digital scholarship in which the library was a significant actor. And as another aside, I was pleased to see in the recent OCLC report on developing research partnerships across campus, which has just come out in the last week or so, a very similar frame used there. So, some keys to this method, in my opinion. The interviews with the faculty were not designed to ask about our library at all, but instead to learn about what the interview subjects were doing, how they were doing it, and what was important or challenging for them. So specifically we asked about their major areas of research and teaching and then a series of more focused questions about their collaborations, their research tools, methods and needs, teaching and training needs, and sections related to specific areas and interests of the library. So in our case, those were data management practices, publication and sharing of research outputs, their use of repositories, and how they tracked citations and research impact. We did not promote the library in these interviews. We did not ask anything about existing library services. So it was really meant as a process of discovery. Secondly, as mentioned, the design of the assessment looked at multiple relational systems at the same time, which ended up really helping to inform a holistic understanding of the situation that we were in. And finally, although I did ask liaison librarians for contacts to interview, at that time that did not end up providing many leads. So I used snowball sampling, which is a method of asking study subjects to recommend other subjects.
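For readers who think in code, snowball sampling can be pictured, under very simplified assumptions, as a traversal of a referral network: start from a few seed contacts, interview them, and follow their recommendations outward. The names and the referral graph here are entirely hypothetical.

```python
from collections import deque

# Hypothetical referral graph: who each interviewee recommended we talk to.
referrals = {
    "prof_a": ["prof_b", "prof_c"],
    "prof_b": ["prof_d"],
    "prof_c": ["prof_a"],  # referrals can loop back to people already seen
    "prof_d": [],
}

def snowball_sample(seeds, referrals, max_subjects=10):
    """Breadth-first traversal of the referral network from seed contacts."""
    seen, queue, sample = set(seeds), deque(seeds), []
    while queue and len(sample) < max_subjects:
        person = queue.popleft()
        sample.append(person)  # "interview" this person
        for ref in referrals.get(person, []):
            if ref not in seen:  # only follow fresh referrals
                seen.add(ref)
                queue.append(ref)
    return sample

sample = snowball_sample(["prof_a"], referrals)
# → ['prof_a', 'prof_b', 'prof_c', 'prof_d']
```

The sketch also hints at why the method reveals the campus relationship network: the traversal order itself records who is connected to whom.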
It works well, and it had a great side effect of helping to reveal to us the network of relationships that existed on campus. So, turning to some limitations and ongoing challenges. As I mentioned, we did find this process immensely helpful and we took many specific actions as follow-up as we had hoped, including launching a new digital scholarship program in the library, establishing new physical spaces and labs, and expanding staffing. But because I'm focusing on the value of this assessment as a generalized approach, I wanna mention a few things that, and it's now several years later, seem to me to have been limitations or things that continue to be difficult to work on. So you may have noticed in the listing of interviews that I conducted that students were not included. Students, particularly graduate students, have turned out for us to be a very important audience for the library's digital scholarship program. We did not include interviews or focus groups with any students in our initial study. We did hear consistently from faculty that they want their students to have experiential learning opportunities with digital scholarship that go beyond workshops. But initially for us, the needs of faculty were more legible and the faculty were a little easier to recruit into conversation. But we have learned that listening to students and paying attention to building relationships there is extremely important. Students, of course, have a very different set of circumstances for engagement because what motivates them and constrains them is different. Building a more formal student experience in our case has required navigating university administrative bureaucracy about student programs, things like compensation and funding and permissions. And it's been harder than it should be, but building a relationship with the administrators who handle graduate studies in schools and departments has helped a lot.
Next, the internal organizational aspects that I've mentioned a few times already are very hard and continue to be hard to shift at a broad scale. And maybe this is a plug for the next session in this webinar series, which I believe focuses on staff. So I will be interested in listening to that. But in our report, the first recommendation said that digital scholarship should be treated as a core service of the library, the entire library, rather than the responsibility of a single unit. But getting library staff broadly connected to digital scholarship has been a challenge. So we can see it looks like some cross-organizational structures are necessary, but there's not a single obvious model to develop staff capability across units or to reconcile various identities and responsibilities around digital scholarship. We have had the most success there in developing staff as practitioners in their own right rather than as service providers. So as an example, through digital humanities interest and practice groups that are internal to the library. Third, as I've already mentioned, doing a discrete and large-ish needs assessment project means that the findings will be a snapshot of a particular moment in time. The findings were extremely useful, but it has also been striking to me how quickly the picture of our local environment went out of date. Pitt has had a fairly intense amount of leadership and structural change during this time. So within just a few years, we had a new library director, a new provost, CIO, vice chancellor for research, a new school of computing and information, and that list goes on and on. But then on the other hand, I think that amount of change is not totally atypical either. So this has affected, among other things, our sense of campus partners. Many of the partners that were identified in the initial assessment don't even exist now in the same way, or the people with whom we forged the strongest relationships have moved elsewhere.
Along the same lines, in terms of rapid change, we found that new interests and technologies and practices will emerge very quickly, probably not a surprise. There were many things that our initial scan did not capture that are now very relevant to us at Pitt. For example, 3D, VR, and AR technologies, and civic open data through partnership with a regional open data center based at the university. But we also acknowledge that these kinds of unanticipated needs are to be expected. So it's impossible to recognize, much less predict, everything that may be relevant in a scan or assessment. Instead, we find it's most important to be in a position to hear those emerging needs and to participate in how they are realized. To bring us back to our assessment study, I can point to that process as a spark that helped get us into that position. And on the whole, I'm glad to say that's where we've been able to continue to operate. So I'm gonna conclude here with a summary of some of what I think are the significant social outcomes from the method used, and also hoping to end on a positive note. Our assessment process registered the library's interest and stake in key areas within the university. It helped reposition the library as a listener instead of a seller of services. It was a gateway to partnerships and participatory design in the spaces and services that the library was developing. It was itself, as a process, a gentle kind of advocacy for certain practices that are particularly important to the library or where we think we contribute value, such as open scholarship and data curation. So just the act of talking about these things, for example, with some faculty was a kind of advocacy. It helped to build relationships, and this of course is the theme of this particular talk. It helped to build relationships and identify existing relational networks.
And then finally, it seeded the new spaces and new services we were developing with actual rather than hypothetical users. So I am stopping there, saying thank you, and looking forward to your questions and discussion. Thank you so much, Aaron. Your work piqued my interest back in 2014 with your study, and it's fascinating to hear how you've built on that work. I have a question to start things off, but I want to encourage all our participants to type questions into the chat and I'll alert Aaron to them. Early in your presentation, Aaron, you mentioned that you felt the library didn't have the positioning in 2014 to actually start working on digital scholarship services or programs. How did you realize that, and how would you suggest to our participants to understand whether or not they have positioning? I like the word, the term positioning. Sure, I think some of the ways that we sensed that or felt that was a sort of lack of people reaching out to us to discuss things in areas where we thought the library had value and expertise to contribute, and in places where we wanted to grow. So some specific examples of that: I mentioned that at that time, there were a lot of stirrings about research data management at the university level, a lot of development of interest in digital humanities, people self-identifying as DH. We did not find people coming to the library, or coming to people who worked in the library, to ask questions or to feel like they had a place within the services that the library offered. And so given that dynamic, I think what we were wary of, what I'm trying to get across, is we could stand up services, but if people aren't thinking about the library, they don't know us. They perhaps see the library as fulfilling a different kind of role. I think it sets up those services to not do so well.
And so what we really wanted to work on was to build up those social relationships in a really meaningful way, not as a way to just say, get to know us, now come work with the library, but so that people on campus who had these needs felt like we were listeners, that we were curious about their work, and that the services we were designing took account of their specific needs, knew them, so they could see themselves in them. So I think there's a variety of ways to do that; the particular study methods that I describe are one. But again, I had just skimmed that OCLC report I mentioned about cross-campus partnerships in the research enterprise, and I think there were suggestions like just getting involved in campus governance and showing up, frankly, in different places as a really important basic step to take. Thank you. We have a request for you to share the link to your original report, but why don't you hold off on that? I don't wanna distract you because we have several questions and I want to get to those. So when we go on to Maris, maybe you can add the link in the chat, please. I'd be happy to do that. So the next question is, you mentioned there's been a lot of change since you conducted this study. Do you have plans to build on or expand it for the current climate? If so, what would it look like? Yeah, that's a great question. We have not done a process that is quite so heavyweight as the one that I described here. The first process in 2014 was a six-month-long process and it involved release time for, in this case, me, and a series of fairly formal interviews. We have done smaller versions of this in more targeted ways. So for example, about two or three years later, we focused specifically on GIS services within the library and we reused a lot of the methods, but on a smaller scale. I do think it's probably time for us for a larger scale scan like this again.
And I think ideally, it would probably look pretty similar to what I described. But I don't have specific plans to share for that just now. Thank you. And by the way, one of our participants kindly entered the URL for the report into the chat. The next question is, I'd like to hear your thoughts on how we might carry that relational frame into ongoing assessment, such as how we are contributing positively to our campus partners' mission. I like very much the relational frame that you put around assessment. I'd like to hear more about how we might continue this mindset into other areas of digital scholarship assessment. And thank you as well for this presentation, very helpful. Oh, glad to hear that. As presenters, we welcome that. And I think Maris, who's following me, is gonna talk about ongoing assessment. You know, I have been giving a lot of thought to ongoing assessment in our particular case. And I think that the way this question is asked is right. And my personal opinion is that here we need to start from articulating goals. So before we even begin to think about indicators or measures, we think about what our program goals are. And then it's a matter of designing or looking for indicators that would help us assess progress towards those particular goals. Again, I don't have really great specific examples from Pitt to share there because that's a process that we're just beginning. But that is the general frame or approach that I would take to that. So, you know, the short version is probably that assessing the health of relationships isn't gonna look just like the counting of attendance at workshops or the number of, again, transactions in a consult ticketing system. But it's gonna look probably something more qualitative. It may be based more on stories. It may be based on things like inclusion in certain campus-wide initiatives or plans or hiring initiatives, things like that.
That's a good point, Aaron. I would like to mention a study done at the University of Calgary on cross-disciplinary projects, for which I served on an advisory group. At the end of those projects, or the time period of the program, the faculty wrote an assessment in response to very specific questions. And that was a way to gather some of that relationship information in a qualitative way. And it was really fascinating how specific some of them were about the contributions of the library. And Tom Hickerson, I believe, is a participant today. And if there's a public version of the report, I hope he'll type that into the chat. Our final question for this segment, and we can have more at the very end, is: would love to hear more about how many faculty you spoke to, and were these one-on-one meetings or focus groups? How did you find a way to get their attention? Right. So it was, oh gosh, I don't remember offhand, something like 10 to 12 faculty interviews at that initial phase. They were all one-on-one. I think they worked really well as one-on-one because they were pretty expansive. I have also been a part of focus groups, which absolutely have value. But in this case, because there were so many different things that we wanted to ask each interviewee about, and we very much wanted their personal take on all of these things and to build an individual relationship, one-on-one interviews worked really well. And then I mentioned that snowball sampling, where at the end of each interview, I said, who else should we be talking to? Who do you recommend that I get in touch with? And that's a great way of opening doors to people, to get that direct referral. But as I also mentioned in the talk, being able to say, I'm conducting a study on behalf of the library into digital scholarship needs around campus.
I think that was a much more effective way of getting the attention, as you say, of the interview subjects than to say, we'd like to get your opinion about library services. So I do really recommend that particular strategy. Thank you. We have one more question, but I'm going to save that till after Maris' presentation; I think it's one that both of our speakers might want to respond to. So Aaron, thank you so much for both your presentation and your very thoughtful responses to the questions. I'll ask you to stop your screen sharing and then for Maris to begin. Over to you, Maris. I'm working on it. Take your time. And if Beth can provide any assistance, please ask. I'm going to go ahead and ask Aaron that final question. Acknowledging the notion that standing up services is easy, but organizational change is hard, what would you recommend libraries do for their internal assessments to derive strategies for building digital scholarship as a core library service? With hindsight, is there an approach you wish you'd used? Go ahead and answer if you don't mind, Aaron, and we'll ask Maris to address that later. Sure, that is a complex topic. I think that's a really big topic. I think there's multiple strategies. The one that I mentioned in my remarks, which I really do advocate for, is to focus on developing library staff as practitioners. I think there's a lot of emphasis in digital scholarship on the processes being learning processes, and the way that we learn those is by doing them. And on the other hand, there's also a lot of, I think, anxiety or uncertainty among people in the library who may not have been trained on these particular methods about their capability. And so I really feel like one very, very good way to bridge that gap is to do internal library programs which focus on cultivating staff as practitioners themselves.
And I think that helps both in the learning but also helps build that kind of confidence that's so important to doing this kind of work with others. Thank you, Aaron. Maris, over to you. Your slides look fine. OK, perfect. Thank you. Wonderful. I am Maris Mandernach-Longmeyer, head of research services at the Ohio State University in Columbus, Ohio. And I'm going to talk a little bit about assessment and what we've done on an ongoing basis. So for a little bit of context, a little about Ohio State: we're a top 20 public university, 61,000 students, 15 colleges. We are a land grant institution, with 200 undergraduate majors and 250 masters, doctoral and professional degrees. And a little about the libraries. We're a top 10 ARL library. There are 13 departmental libraries. And so on your screen right now you can see Thompson Library, which is situated in the center of campus. And we have a number of specialized collections as well. What is highlighted over on the left is the 18th Avenue Library, which is where our research commons and our research services department are housed. So while we serve across campus, we have the public space on the third floor of that otherwise 24-hour, seven-days-a-week library. What you'll see here is the entirety of the research services department. You can see from the titles that it's a great team to work with, and you can sort of see the scope of services offered: digital humanities, data visualization, and geospatial information services are the main focus areas. You'll also see in the bottom right some of the vacant positions that we've had for a number of reasons: data services librarian, data services specialist for outreach and education, and the research impact librarian. As Aaron mentioned, I'm going to echo a lot of what he said in how our services were started, but also sort of what is driving how we continue to evolve and develop ongoing services.
A lot of our focus ends up in how the libraries can support the research lifecycle, from planning research to conducting research to publishing to increasing impact. Across the libraries, we do this through referrals and triage, consultations, education and workshops, and showcasing research. And what we found through our research commons is that space as a service is actually a valuable asset that we in the libraries can provide. All right, to talk a little more about space and services: our service planning started back in 2012. By the time I was hired, the libraries had already done some graduate student listening sessions, and there was a white paper on this evolving mode of research support. We had some initial strong partnerships with the Office of Research, especially the grants office. And rather brilliantly, during our listening tour and scanning process, our partner in the Office of Research grants office said, well, you have all these other spaces; why are you waiting to start your services? So we listened, and our services started in the fall of 2014 with a lot of workshops held in other library spaces. Our physical space opened in January of 2016: 10,000 square feet dedicated to research at all levels, and supporting research really has been our focus. It's not the only space in the library that does this, but it's the one dedicated to research support. We built our approach on partnerships around campus, a hub-and-spoke referral model. As you can imagine, at a school the size of Ohio State there's a ton of duplication of services, so one of our main tenets early on was: how can we not duplicate services? As I mentioned earlier, it's a 24-hour library, so the space had to look physically different from other spaces in the libraries.
I'm giving you a bunch of context about spaces and services before I jump into data and assessment so that you have a frame to build from. Here's what the physical space rendering looks like. You can see the different spaces outlined there: some for individual work, some for consultations, a main point of referral, and some more private spaces. Next I have some pictures of what that looks like. There are whiteboards everywhere, places for brainstorming, a lot of movable furniture, and we keep up with the tech that's available for the space. In the upper left is the dedicated computer lab with advanced software available, some consultation spaces are in the upper right, a conferencing room is in the bottom left, and our main colloquia space is in the bottom right, set up there just for general study. You can move everything around and bring materials out of our storage closets, so we can comfortably seat probably 60 to 70 people in the room. We've held colloquia here, seminars, and lectures with people both in person and remote. It's really built around flexibility. So that's a little bit about the spaces. We also have a separate website where you can find resources for the kinds of services we provide, from finding to managing to visualizing data, digital humanities, GIS and mapping, and research impact, as well as guides curated both by folks in the library and by our partners who provide consultation and education in the space. So again, we don't have to have the expertise; we have to know the expertise and then invite it into the space. Now, the data we gather. We get a bunch of data, and as Erin was mentioning, some of it is useful for quick decisions and some of it just builds on data we're already sitting on. We track consultations, in person and via email.
We gather both event registrations and actual attendance. We do website tracking with Google Analytics. We do space headcounts with Suma, which we're transitioning to Qualtrics. We gather our reference interactions and live answers. We also track the number of partners we've had, semester by semester and year by year. All of that we've gathered since 2014, or 2016 for anything related to the space. Since then, we've added to that initial list: room reservations, mediated and unmediated; software requests; LibGuide usage; and social media reports. We also track how many of our partners cross-promote events. So, as a prelude: is it useful? The short answer is, it depends. As I mentioned, we have been able to implement some quick fixes. We made updates to the website where people weren't finding information or where the original navigation bar was confusing and didn't match user expectations. We were able to extend hours: graduate students were lingering in the space longer than we expected, so what was originally nine to five got extended to nine to seven. We've been able to identify new partners, make some small adjustments to the spaces, reconfigure the computer lab, and examine workshop offerings and consultation hours; some people who had been offering consultations initially are not anymore. And then there are additional areas where we've looked at environmental scanning. What you'll see in the image there is a lot of what I do with my team: about every 18 months we sit down, talk about what has gone well, and plan out the next three to five semesters. How are people thinking about the services we're offering, and how do we evolve to meet their new needs? That got us thinking: is there a better way to tell the story? We know we're doing well on campus; we see high usage of the space and services.
Our experts are getting tapped to capacity. But is it making a difference? In 2017, we weren't really sure. Before I get to that, though, I have a little environmental scanning interlude. As Erin mentioned, this really ends up being a continual process. What I ask of everybody on our team is that all new focus areas conduct a scan, and these are the areas that have done them so far, since about 2014. Leigh Bonds, who is on the webinar today, wrote an article about the process; if you're unfamiliar with it, her piece on first things first for conducting an environmental scan, especially in the humanities, has some really good tips on how to do it. Some of the methods she used included snowball sampling. And we're just starting to get into the question of how you stay up to date on your environmental scanning, because as Erin mentioned, and as the folks on Tuesday mentioned, everything changes. All right. In 2017, we started working on a logic model as a test for the library of how we could assess whether the program is working. Our assessment librarian at the time, Sarah Murphy, helped guide us through the process of articulating long-term impact and short- and mid-term outcomes, and then looking at all of the inputs. She and I wrote this up, and the link is there for our forthcoming article in College and Research Libraries. We used the W. K. Kellogg Foundation Logic Model Development Guide, which walks you step by step through the process with examples; it was really helpful. But an overarching structure like this is fine, and yet, like many strategic plans, it can lend itself to: great, we've made it, now it sits on the shelf. So one of the changes we made at the same time was to pull together this data gathering plan. This is an excerpt that maps to that relationship building. It identifies different audiences.
It lays out the assessment question we're trying to answer and what the criteria for success would be. You can see some of them are just flat numbers to get a baseline; others look at percentages of the folks we would like to reach. And it captures the data source and how frequently we were going to gather it. Along with the slides, I also sent the full logic model and data gathering plan to Beth, so I think it's up on the website. Now I should ask: is it working? There were a couple of things that got in our way. The first thing I would say is, find somebody on your team and assign it to them; it needs to be part of someone's job description. And the task dates we set probably aren't the best measures to go after, because they aren't specific enough; we want to know that in August, we're going to gather these. There are pieces of it that definitely worked, and some others that we have iterated on to improve. And then, of course, just as folks were asking, great, you're doing this, right? A whole bunch of changes happened. We had promotions and reclassifications, and departures and retirements that allowed for reconfigured positions. We had some successful searches and some failed searches. More broadly, the libraries went through a strategic planning process. In addition to the changes in the libraries, we saw changes on campus: a new VP for research, and partners who either retired, no longer had alignment, or had new priorities established that meant they couldn't offer services in the way we had originally planned. This is not to say we aren't using the data; we gather a lot on an ongoing basis and use a lot of it for various reports and for tracking over time.
We keep our event archive up to date, and we also have a pretty robust list of who has offered which workshops or programs, which semester, who was the lead, and who they partnered with in the library, just so that we can think about what was successful and what we might want to repeat in the future. We use this data to assess existing partnerships and potential new campus partnerships. We share a bunch of this information with our campus partners immediately following workshops, but also at the end of the year. And one of the things Erin talked about too is how you find out what is going to be useful to your campus partners; we do a lot of just asking them. Other folks who have used us end up focusing on research support, and so they need a lot of metrics: how many successful grants are coming out of it? Did anybody come to a workshop and then follow up with a consultation? We have the ability to track a lot of that information. We have also been fortunate to have some extra funding from the libraries to get an MBA student to do a social media analysis for us, and from that we were able to work with our library communications department to enhance the website; I think that was about a year ago. Most recently, we've been using this data, and an analysis of the network that's present, to give our donor relations and fundraising folks researchers who can then be highlighted in university and library fundraising videos. All right. So we end up doing a lot of annual reports. As I mentioned, last year we piloted this partner report, and partners got the kind of information listed there: attendance breakdowns and consultations. This year we did a survey of those who had used consultations and individual events. Partners also get feedback from attendees.
Annually, we send subject librarians reports so that they can see who in their liaison areas has made use of the workshops or consultations available through the research commons. And individual presenters also get a report for each event and for the semester as a whole. All right, one of the other changes we've made: through promotions and vacancies, we were able to take the program manager role and add assessment to it. These three bullet points come straight from that program manager job description; it's written into her job now to track these things and help coordinate that assessment strategy. I highly recommend this, and I can't say enough about our program manager and how wonderfully she has tackled this in her current role, which she's held for a little over a year. And then, 2020. All of our services went virtual. We were able to virtualize some workshops pretty easily, and making them virtual actually increased our tracking capabilities, since different kinds of reporting happen through Zoom and we have the ability to record and save the sessions. Those are synchronous. We also had a number of experts who took this opportunity to turn their workshops into asynchronous, self-paced learning options. Our research impact librarian, who has since retired, piloted this a year and a half ago with the research impact challenge: through a LibGuide, she released a new challenge each day, going out via email over the course of a week. Our GIS librarian adapted this to do an ArcGIS challenge that ran once a week over the course of a month, and he has also taken what used to be an in-person workshop and turned it into a self-paced map guide that walks the user step by step through the process. Our visualization team is looking at this as a possibility for the future. And in digital humanities, an endorsement for teaching has been created through this.
It was already in the works, but it's a different kind of virtualized offering. Currently, and this was mentioned, the folks at Yale said on Tuesday that they hadn't done this yet, but it sounds like as of Monday we will have our computer lab's specialized software available virtually, so folks can reserve times and map into it. All right, so where are we now? This is hot off the presses; we're in this process now and had a meeting in our department just yesterday looking at how we can refocus key areas for the department. We asked each of these focus areas to define its mission and goals, how it determines success, and what future plans one, two, and three years out would look like. We then started compiling all of those to highlight department-wide themes, which you'll see over on the right: a connected campus, building awareness, advancing research, and expanding library expertise. What we're going to do next is match those to the existing library strategic priorities, so both a bottom-up and a top-down approach meeting in the middle, to determine our ongoing assessment strategy, future programming considerations, and short- and long-term goals. What we're still working on, and haven't yet agreed on, is how to fill out this matrix for each of those areas. I raced through that, but I'm happy to entertain questions on what I have. So I'm going to stop my share and throw all of the links I mentioned into the chat, so they're all there in one big pile. Thank you, Maris. Thanks so much. I think many of us want to go, whoo, that's a lot of different kinds of data and different kinds of studies you're doing. If you don't mind, I'd like to go back to the question that Erin addressed and see if you have your own take on it, acknowledging the notion that standing up services is easy, but organizational change is hard.
What would you recommend libraries do for their internal assessments to derive strategies for building digital scholarship as a core library service? With hindsight, is there an approach you wish you'd used? Now, Erin stated a belief that digital scholarship should be seen as a core library service. I don't know if you have the same perspective, and your library system is even larger than Pittsburgh's, so we'd be interested in any kind of response to that. Yep, we are actually grappling with that ourselves right now. We have a working group looking at additional digital scholarship support needs across the libraries, and it has folks from all across the library: special collections, preservation, cataloging, acquisitions, and I'm going to miss a whole bunch. So it covers all aspects of the process from beginning to end, because what we have been finding is that every question starts fresh. Every time a researcher comes, even with the simplest request, the answer to date has typically been, it depends. Is this a service we offer? It depends; it's been case by case. And we're trying to figure it out: we know there are scalable services we could offer, we just have to be able to offer them equitably and reliably, so that researchers are able to predict what services are available. We have not had a ton of clarity within our library, so I think we're on the way to those discussions, and they have been ongoing. Everyone realizes they're important, and they're hard to have without a mandate from the top saying, yes, you're just going to do this, which is not what we have. We know we've offered services long enough that we should be able to figure out what is scalable and what we actually want to adopt as a service; it's just a matter of articulating that and getting some agreement. Thank you. This question refers to something fairly early in your presentation.
How did you know what modifications you needed to make? Were they things you found out yourselves, or did the users tell you, formally or informally? For changes to the website, we knew it was a little clunky; we looked at our Google Analytics and saw that people were not getting to specific areas, and as we added more content, we realized we needed to do some audits. So we had our user experience specialist at the time do some card sorting with users, and we also stopped users, mostly grad students, at the desk and asked: do you know about the website? Do you think about using it? Where would you go to look? Similarly with other modifications. For hours, we looked at space usage: every hour on the half hour, the students who manage the concierge desk go around and do headcounts, and we very quickly saw people weren't leaving at five. There were still questions, and switching to seven was a pretty easy switch. We let undergrad students into the space after the time the desk is no longer staffed. Thank you. Since we're at time now, I'm going to thank our speakers for their really thoughtful and very specific presentations, which have given you lots of ideas for what you might do on your own campuses. And thank you to all of our participants for their excellent questions. Our next webinar is on Tuesday, September 22nd, and our speakers will discuss staffing. And we're going to end the recording.