Good evening, everyone, and congratulations on making it to the last session of the first day. Thank you for joining us; we appreciate it. My name is Laurie Alexander. I'm the associate university librarian for learning and teaching at the University of Michigan, and with me is my colleague Doreen Bradley, who's the director of learning programs and initiatives. Doreen and I are going to focus our remarks on a recent library analytics pilot we did about our course-integrated instruction program. Before we jump into the details, we want to set a little bit of context. This is a longstanding and very robust service. It's changed over the years: we've incorporated new technologies, teaching practices, and concepts in support of information literacy and inclusive teaching. It's a very popular service. We get a lot of requests from faculty and graduate assistants to provide scholarly research instruction, ranging from critical thinking to source discovery and evaluation, learning technologies, and academic integrity. It's also a haphazard service. We're a very distributed environment, and we don't have an information literacy requirement, so we respond to the requests we receive, and we also do a lot of promotional activities around the service. So it's no surprise that over the years we've engaged in a variety of assessments in order to continually align this service with how people teach and how people learn. But I also want to put a fine point on the fact that this service has direct impact, because it's the one place where we're directly in the curriculum, and that's very important to us. So we have been doing a lot of imagining and reimagining: if we could elevate this service and have a more significant impact on the undergraduate learning experience and student success, how might that look? We know that scholarly practices are changing.
User needs are emergent, and expectations of when and where learning happens are really becoming iterative. And we pride ourselves on our commitment to align and reimagine our services and to embrace innovations and emerging practices related to information literacy. So some of the questions we've been asking ourselves a lot recently are: What does it mean for our team to actually be a learning team? How do we build, test, iterate, and rebuild this service over and over again? We have questions about how to embody diversity, equity, inclusion, and now accessibility within our course-integrated materials. We have questions about how the curriculum itself is changing: new demands are being brought to our service as faculty think about different ways of creating learning experiences. And we have a lot of questions about how frequently we teach and in which courses. The common thread across these inquiries is, yes, how to understand our current program, but also how to actually unearth and surface the new possibilities for course-integrated instruction and understand what that impact could be. And this got us thinking about learning analytics, by which I mean data about learners for the purpose of optimizing learning and student success. It also got us thinking a lot about library analytics, by which I mean library data that's primarily used to improve our services. As we started thinking about analytics and their emergent nature, we started to think about how we could use data to inform our decision-making in a different way. And it's interesting to note that these conversations Doreen and I were having, and that Doreen's team was having, mirrored many of the other conversations happening across our library.
We've recently conducted a large review of our services and spaces. We brought in consultants to do that, and the shift is really about how we think about our spaces as service-centric instead of collection-centric. A lot of the questions we were asking mirrored this larger library discussion. And I'll give a shout-out: tomorrow morning there is a session by some of our colleagues who will be talking about that study, so if you're interested in more on that, I would highly recommend going. So we wanted to design an experiment, the one we're going to talk about today, scoped around data from our library instruction request system and the university data warehouse, and around how we could map and bring those two sources of data together to frame our thinking about programmatic changes. In our time today, Doreen and I will spend a few moments on our campus context around learning analytics; for us that's really important, because our campus is highly immersed in it. We'll talk about the complexities and programmatic challenges that arise when you start using analytics. We'll share how we went about scoping this experiment and the lessons we learned from it. And then we want to open it up for your thoughts and questions. So, to talk for a moment about our campus engagement and alignment around learning analytics: there is a very big commitment to the exploration of learning analytics at the University of Michigan. It's neither top-down nor bottom-up; it is everywhere. It's in our water, it's in our air. And it's coming from a lot of whys.
Those whys are the desire to collect, analyze, and use data to improve learning; to assist instructors in achieving a wide range of teaching goals; to respond to the changing ways we interact, and are expected to interact, with students; to find out how to effectively use emerging technologies; to provide service enhancements; and, most importantly, to think about how students can become agents of their own learning. If they have access to their own data, what else might they want to do? So we want to go over a few of the activities that are happening on our campus. I'm going to start with the Learning Analytics Task Force. Several years ago, our then provost charged a team of faculty to think about learning analytics and provided funding that people could apply for if they had ideas for learning analytics projects. In fact, Doreen and I applied for a grant through that as well. The task force was chaired by a faculty member, Tim McKay, who is a leader in analytics. And that leads me to one of the other bubbles, which is faculty champions. All the learning analytics projects have faculty champions, and that makes a significant difference in their success. One of the things the Learning Analytics Task Force did was host an event called SLAM, Student Learning and Analytics at Michigan. Its main purpose was to start building community around analytics: people could come together, whether they were working on a small project or a big one, and present. It was like a speaker series, and it started to build community. Another result of the task force's work was the recommendation to build a Digital Innovation Greenhouse, which is now offered through our academic innovation group.
And they are looking at: when you do an experiment around learning analytics, what do you have to think about when you want to scale it? What infrastructure might you need? There's been a lot of work around campus data practices. We have data stewards, and there are best practices around that. In the last two years, our dean spent a lot of time working with a group of faculty on issues around privacy: what are the expectations for consistently collected data, and what steps would people need to go through to request and use that data? Two of the main points that came out of that were working with the IRB, and the creation of an MOU that sets out the expectations for researchers' use of that data. More recently, another activity has been a MOOC created by several faculty, which really delves into the practical things that you, as an administrator, a faculty member, or even a student, need to be thinking about related to learning analytics. That MOOC was offered, I think, within the last three months. And finally, Unizin is a consortial effort that we're part of, where infrastructure is being built, and one piece of that infrastructure is around learning analytics. All of these activities happening on our campus around learning analytics stem from the desire to think about student success. That's what they're rooted in, and that's how they get expressed in different ways. So, some of the complexities: once you start thinking about analytics, whether learning analytics or library analytics, issues very quickly arise around storage and access. How are you going to analyze the data? What are the policy implications? Privacy, IRB.
There's the emergent nature of analytics itself: each new analytics project opens up new possibilities, and how do we think about that? There's the data itself: as we found in the study we did, which we'll share in a minute, when the data's incomplete, what do you do? Is it interoperable? There's a lot of discussion right now about Caliper and the role it might play in helping systems talk to each other. What are the different sources of data? What about our vendors? And there's expertise itself: when we started to work on this project, we quickly realized we didn't have the expertise to do some of the analysis we wanted to do. So how would we go about finding that expertise? So we scoped out a project that looked specifically at our curriculum-integrated instruction and connected it to data collected about students within the UM Data Warehouse. We wanted to look at questions like: Whom do we teach for? What part of the curriculum do we teach in? And how might we make better-informed decisions about that? I don't know about your instructional data, but prior to this project, we had slips of paper that said things like "class for psych." We just didn't have much data about where we were teaching, so we had to start thinking about what kind of data we might want to collect in order to look at that. We knew that an experiment like this would take several elements. We had to identify what questions we wanted to ask. We had to think about where we would get the data: Sally is the name of our instruction request system, which Doreen will talk about in a second, and LARC is the part of the UM Data Warehouse that deals with student learning records. How would we clean the data? What statistical analysis might we want to use? And then how would we take what we found and engage stakeholders to talk about our future needs?
In order to move this project forward, we quickly realized we couldn't do it on the fringes of people's time. We had to put a dedicated team together. Doreen stepped up as the lead. We hired a graduate student who brought a lot of the evaluation and assessment expertise around data crunching that we needed. And Teresa Stanko provided a lot of project coordination; I can't overstate the importance of that project management role in a project like this. And now Doreen's going to take us into the specifics of what we did and what the data showed us.

Thank you all for being here again this late in the day. I have a number of data slides that will help you wrap your mind around this; it's hard to jump into somebody else's data that they've been looking at for a long time, but we hope we can accomplish that today. The sources of data that we had were twofold. We had Sally, which is short for our scheduling app for library instruction, which is how it got its name. We have data on all of our curriculum-related sessions back to July 1, 2013, the start of our fiscal year. As Laurie mentioned, prior to that we had really scant data. We didn't know whom we were teaching for, where we were really strong in the curriculum, and where we were weaker in showing up, curriculum-wise. And we had the UM Data Warehouse, which I think has data back to the mid or early 1990s on all the students who've been through the University of Michigan. LARC is a snapshot of that, and the data is actually extracted. It stands for Learning Analytics Data Architecture, as you can see. It has three slices of data that we can look at. First, data at admission: What do students look like? What do they bring to the university with them? Their demographics, high school, test scores, socioeconomic status, all of that.
Second, data for the current semester they are in, in case you want to look at things that are going on concurrently. And third, data covering their whole time at U of M. All of this information is extracted from the data warehouse and put in U-M Box, so that researchers who are working with it can download it, and that's what we do.

Before Doreen goes on, I just want to talk for a second about the culture shift that happened when we started using Sally and started asking people to record not only English 125, but section 102. There was a lot of discussion we had to have, because that was a cultural change for librarians who were used to just going off and teaching without recording much; they would record that they taught a session and the number of students, but not that little piece of data. So we spent a lot of time on culture shift. Having the section number, so that you can go back to the registrar's data and find out exactly who was enrolled in that class and get down to the student level, was essential, and we didn't have that before. So we did a lot of backtracking to get that onto some of our older data.

One of our slides here tries to walk you through how we actually got to the data we analyzed for this study. We started with the N at the top, about 199,000 students who have gone through the University of Michigan and are in the data warehouse, and narrowed that down by a process of elimination. We dropped students who started at the university before 2013, because our Sally data only went back that far. We eliminated transfer students because we didn't have their complete histories; we wanted to be able to track their four years, or five years, however long it takes them to complete their degree at the University of Michigan, so that we could really see our program's impact at the university.
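As a rough sketch of the section-level join described here (matching Sally session records against registrar enrollment to reach individual students), with invented record shapes and IDs standing in for the real Sally and warehouse schemas:

```python
# Hypothetical sketch: joining instruction sessions to enrollment by
# (term, course, section). Field names and data are invented.

# Sally: one record per instruction session, now with a section number.
sally_sessions = [
    ("FA2016", "ENGLISH 125", "102"),
    ("FA2016", "UC 280", "001"),
]

# Registrar: one record per (student, section) enrollment.
enrollment = [
    (1001, ("FA2016", "ENGLISH 125", "102")),
    (1002, ("FA2016", "ENGLISH 125", "102")),
    (1003, ("FA2016", "UC 280", "001")),
    (1001, ("FA2016", "PSYCH 111", "005")),
]

# The join: a student was "taught" if any of their enrolled sections
# had a library instruction session.
taught_sections = set(sally_sessions)
taught_students = {sid for sid, sec in enrollment if sec in taught_sections}
print(sorted(taught_students))  # [1001, 1002, 1003]
```

The section number is what makes this join possible at all; with only "a class for psych" on a slip of paper, there is no key to match on.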
We ended up with about 25,515 students that we analyzed for this project. We took a quick snapshot to see whom we are actually teaching for at any given time. Out of those 25,000-some students, there are about 8,000 we never taught for, and about 16,000 to 17,000 we did teach for. In orange is how many of those we taught for during their first year at U of M. So the majority of the time we see our students is during their first year; only about 10 to 13% we see in subsequent years. That's not as much as we would hope in upper-level courses, where students are in their majors. Another question we asked is where we see these students, and this slide shows the top ten courses. These are just two years' worth; we have the previous two years, 2014 and 2015, as well. Our top two seem consistently to be UC 280, which is a course for our Undergraduate Research Opportunity Program, where students are paired with faculty doing research. It's a very research-intensive program, dominated largely by biomedical sciences and social sciences. And then English 125, which is our basic college writing course, the kind of course where most institutions' libraries teach a lot. So those are the top two places where we see students. Women's studies has lots of cross-listed courses; that was a big issue we had to work out: how do we deal with data for cross-listed courses? Engineering 100, which is on the list, is a writing course for engineering students. This felt right to us as the place where we're putting most of our effort, particularly with UC 280. We put a lot of time and effort into organizing that course, figuring out what we teach, and making sure that all of our instructors who teach for it are teaching similarly, to the same learning outcomes. So it felt good that we're spending a lot of time on these courses, and we're definitely teaching a lot of students. We were happy to see that.
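The elimination steps just described (keep only students who entered in 2013 or later and who are not transfers) amount to a simple cohort filter. A toy version, with invented records standing in for the warehouse extract:

```python
# Hypothetical sketch of the cohort filter; field names and toy data
# are invented, and the real criteria came from the warehouse extract.
students = [
    {"id": 1, "entry_year": 2010, "transfer": False},
    {"id": 2, "entry_year": 2013, "transfer": False},
    {"id": 3, "entry_year": 2014, "transfer": True},
    {"id": 4, "entry_year": 2015, "transfer": False},
    {"id": 5, "entry_year": 2014, "transfer": False},
]

cohort = [s for s in students
          if s["entry_year"] >= 2013 and not s["transfer"]]
print([s["id"] for s in cohort])  # students 2, 4, 5 survive both filters
```

In the study, the same two cuts reduced roughly 199,000 warehouse records to the 25,515 students analyzed.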
We looked at cohorts by entering year. The fall 2013 cohort is the group of students that just graduated in April and May of 2017, so that was the first group we were able to track for an entire four years through U of M, and we taught 71.2% of those students at some point during their career. Fall 2014 is actually our current seniors, so we'll be adding data on them in May, and I expect they're already past the 71.2% of our previous graduating seniors, so their 71.4% should go up even more. And subsequently down the list: the fall 2016 cohort are our current sophomores. We saw about 55% of them during their freshman year, and we expect that's going to go up at least 13 or 14%, so we're pretty much on track for that group. The spring and summer cohorts we grouped all into one. We're very heavily integrated into some spring and summer programs, so it wasn't surprising that we see 87% of those students; we see most of them coming in. Frequency and timing was a big question. We wanted to know when we see students and how many times we may see individual students. We have lots of anecdotal evidence: we always ask students to raise their hand. Have they had a library session before? Have they had one this term? And we always hear of students who've had two or three sessions in the same semester with us. We're trying to figure out where we're seeing these students, because we can then go back programmatically and figure out some better approaches. If they're seeing us twice in the same semester, they probably don't need that, unless they have a really specialized need or research assignment. So this is some data on that. The middle column shows, of the students who ever had a library instruction session with us, how many sessions they had in total during their time at U of M. 51% have seen us one time.
27% have seen us two times at some point, and so on; 3.8% have seen us five times. So we need to do some digging and see where those places are that students are seeing us five times, and whether that makes sense. The column on the right is, of the students who had library instruction in their first year, how often did they see us? 52.8% of the students we taught for saw us one time in their first year. But again, 5% saw us three times in their first year, so we want to do more digging and find out whether that makes sense for what they're actually being asked to accomplish in their curriculum. This is a really interesting slide, and I know some of you are far back and may not be able to see, but we looked at gender. We looked at lots of questions about diversity, equity, and inclusion, which are really important to us. The orange on there is who we taught for: in our classes, 58% of the students were female and 42% were male, which is quite different from our student body population, which is much closer to 50-50. The most recent incoming class this fall was actually 51% female and 49% male. Among the students we don't see, which is the blue, 64% are male. We were really surprised by that, and it tells us we need to go back and look at some of the programs on campus that are more heavily male-dominated, like engineering and some of the hard sciences, that maybe don't ask for library instruction quite as often. On the other hand, the top two classes we taught for on the previous slide, UC 280, our research class, and our writing class, are very interdisciplinary. So this one is still kind of puzzling: why do so many males fall into the category of students we don't see?
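The session-frequency columns just described boil down to counting sessions per student, then tabulating how many students fall at each count. A minimal sketch with invented attendance records:

```python
# Hypothetical sketch of the frequency tabulation; the attendance
# records and IDs are invented toy data.
from collections import Counter

# One entry per session attended: (student_id, term).
attendance = [(1, "FA13"), (1, "WN14"), (2, "FA13"),
              (3, "FA13"), (3, "WN14"), (3, "FA15")]

# How many sessions has each student attended in total?
sessions_per_student = Counter(sid for sid, term in attendance)

# Of all students we ever taught, what share attended exactly k sessions?
counts = Counter(sessions_per_student.values())
share = {k: v / len(sessions_per_student) for k, v in sorted(counts.items())}
print(share)  # each of the three toy students has a distinct count
```

Restricting `attendance` to first-year terms before counting would give the right-hand column of the slide.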
We looked at first-generation students, keeping in mind that they may come to campus needing more support than students whose parents have higher levels of education. At U of M, our definition of a first-gen student is anyone whose parents have less than a bachelor's degree, so this goes all the way up through an associate's degree. The blue is students who had library instruction and the orange is students we have not taught for. We were really happy to see that we teach for the good majority of these students in all of these categories, keeping in mind that on average we teach for about 69 to 70% of any class of students that goes through. So this data felt very comfortable for us. We also looked at race, again from an equity and inclusion standpoint. We spend a lot of time focusing on programs that reach out to underrepresented groups on campus, so we wanted to see, for all the time and effort we're investing here, is it paying off? Are we actually reaching these students? You can see that by and large we are: for African-American students it's 83%, and we were thrilled to see we were reaching that many of our students on campus. Some of these are really low numbers, and we wish they were higher, like only 11 Hawaiian students in four years, but at least we feel better that we saw 82% of them. So again, this shows us that all the effort and outreach to these programs is paying off; we're seeing these students. Student retention is one measure you will see very frequently in any kind of library or learning analytics project out there. U of M has such a high retention rate, about 99%, that it was really hard for us to see any impact of library instruction here. These are totals for the last four years on some of the data that we see.
So there may be a correlation in the 1% difference between those who had instruction and those who did not, but nothing statistically significant. And the last slide is instruction by discipline. I'm going to walk through each of these separately, because they're very different, but this is what Laurie was really trying to get at: once we have richer data, we can go and start discussions with some of our programs and schools on campus. The blue is engineering. Our engineering librarians were feeling that they were not as well connected in the curriculum as they wanted to be, but they had no data to actually show them where they were connected. Here we can see that 49% of students in the School of Engineering have had library instruction, and that 78% of those sessions happen during the first year. So while we see less than half of the engineering students, we see almost all of those we do see in their first year, which means we're really not integrated into the upper levels, where engineering students really do their design and research work. We're probably seeing them in their college writing course. And how many have we seen two times or more? Only 27%. Our engineering librarians have now taken this data and drafted a proposal that they're taking to the college administration and curriculum committee, proposing just that: that they be integrated into the writing course in the first year and the design courses in the fourth year. Moving on, nursing is the green, and this is kind of a mix. Our nursing curriculum changed about two years ago, so this data represents both their old curriculum and their new curriculum. There's very high integration here, which is not surprising; most of the health sciences programs are usually more integrated than our other programs. But the thing they were happy with is that we see at least 75% of their students two times or more, all the way over in the right column.
That tells them their new curriculum is working: they're integrating more information literacy throughout their program. Public health is the one anomaly here. Public health is a brand-new undergraduate program on campus, and students enter in their junior year. The school actually asked us for data on their entering class to see where those students had had library instruction in their first two years at U of M, because in designing their new curriculum they wanted to know where their students were coming from and what their starting point was. So we looked at the public health students, in orange, and all of their incoming students had had library instruction during their first two years in some other course at U of M, and 34% had had at least two or more sessions. That was wonderful news for them, because in designing their program they didn't have to start at square one with these students. Any information literacy they designed into their curriculum could start at a higher level, with disciplinary information literacy tools and concepts. And Laurie's going to wrap it up.

Sure. Some of our key takeaways from this experiment: we really tried to think about how we'd use data for programmatic engagement. How could we use the data to start conversations that we previously hadn't been able to get in the door for? In at least three cases, that has happened. How do we improve our data collection? Going through this process, we learned a lot about how we collect our data, and a lot about how the UM Data Warehouse collects its data, and how we can take that into account when planning something moving forward.
We also know that we are short on expertise in this area, and we really need to think about how we're going to train and develop those skills among ourselves, or how we might acquire that expertise or partner with other people to bring it into these conversations. And finally, I think our main goal in all of this was to think about how to have alignment and engagement with campus, and how data can change that alignment and engagement. And I think we have three minutes, so we can take questions now, or people can come up in a few minutes and ask questions. We'd be interested in hearing your thoughts and questions at this time. Or you might want to head out to the reception, and those who are really interested can come up and chat with us; that's fine as well. So why don't we do that? Thank you very much.