So I'm really excited to be with you all. This is not only the first Data Science Day, it's my first Data Science Day. So I'm really pleased to be invited and included in this lineup of my colleagues sharing today. My focus is gonna be on learning analytics. I'm a former librarian, although you never really turn in your library card, so to speak, so I consider myself a librarian, although I've been at the iSchool for about 15 years, and that's where my practice and research and teaching is all embedded. So I'm gonna be talking about learning analytics, but from the perspective of libraries. I hope that those of you who aren't library folks will also find something useful in this talk as well. And just in case, I'm gonna share the slides; if you wanna follow along on a separate screen or if you need a screen reader or something like that, that's the link to the slides. So here we go, let's get to it. We only have a half hour to talk about this great topic. As you're going through the talk, I would invite you to add questions. Jay's gonna be monitoring the chat for me, but I also thought I'd give you all a few questions to think about as well. These are some of the ones you might wanna be thinking about, particularly if you are library focused: To what degree should librarians be involved in learning analytics? What data might libraries have to contribute to understanding student learning and success in an academic context? What ethical decisions, and there are many, need to be made with regard to this work? If we got data back from this process, what kinds of decisions and actions could we make and take, and what conflicts might arise as we go into this territory? I wanna acknowledge also that what I'm about to share with you came from a few other places, but primarily from two IMLS-funded grants that we had at the school.
I was lucky enough to spearhead these, but each of them involved many, many other individuals. The first is Library Integration in Institutional Learning Analytics, or LIILA. And the second one is CLLASS, Connecting Libraries and Learning Analytics for Student Success. And I want to drop the links to those projects into the chat. There we go. In case you wanna explore those as well, I'm gonna allude to them a few times in the next 30 minutes. At each of those links, there's more than you might wanna know, but also some useful content about each of these projects, including who was involved, what was accomplished, and greater explication of some of the things I'm just gonna be able to nod at today. So I just wanted to make that acknowledgement and also acknowledge my gratitude to IMLS for funding these projects. So let's start with a common definition of what learning analytics are, specific to academic libraries in higher education, which is where my focus is. Learning analytics in higher education has been defined many different ways, but I think a good working definition for our next few minutes together tonight is this: the use of institutional-level systems that collect individual-level student learning data, centralize it in a record store, and serve as a unified source for research seeking to understand and support student learning and success. The term learning analytics gets used differently in libraries sometimes than this definition, so I wanted to be clear about where I'm coming from for this talk. To give you a really oversimplified map of what that might look like, I present you with this beautiful diagram in soft colors, so as not to be overwhelming.
So the idea here is that with learning analytics in higher ed, the data comes from many, many different places, including student information systems, operations data, learning management systems, iPASS data (we'll talk about iPASS systems in a moment), and all kinds of other data coming from different parts of campus. All of that data then is consolidated in an institutional record store or data repository. It can go by a lot of different names (data lake, data puddle, data stream), but it's in one place in the institution where it is, or should be, protected by policies, procedures, practices, governance, personnel, chief privacy officers, technical security, and whatnot. The purpose of bringing it all together is analysis, so that educational researchers can run queries and correlations based on research questions or user stories to seek to understand the student's experience. Of course, we don't set it and forget it. This kind of data needs to be continually assessed for bias and error as we try to use it to understand what helps students be more successful and what might get in the way of their success. The folks who would then be able to view this data, perhaps in its native form or more likely in dashboards, would be on the right-hand side: students, faculty, institutional folks broadly across the campus, and advisors. And I've alluded to in those boxes why they might need to know that, right? So faculty need to be able to improve their courses and the curriculum overall. The institution should be able to understand what's getting in students' way and dismantle those hurdles, not just for individuals but for large swaths of students. Advisors and students, most importantly, by getting their own data back, can start to use that data to make decisions, be more empowered, and have more agency over their learning journeys. You'll notice that there are these elements on this diagram that are not the same color as everything else.
So library data is sort of in a navy blue color, and so is the word librarians. And that is because, for the most part, generally speaking, library data is not included in institutional learning analytics in institutions of higher education, in this country at least. And librarians are often not privy to the results of the data analysis, even though they would be natural partners in implementing good decisions that come out of the data. I said that I would explain in a moment what an iPASS system is. This is sort of a flavor of learning analytics that many campuses have rolled out, including our own for undergraduates. These sorts of solutions are generally referred to as iPASS systems: integrated planning and advising for student success. The idea of an iPASS system is to do some of the things I just talked about, to bring together information about students in order to connect students, faculty, advisors, and others, in some cases librarians (not usually, though more and more so, yes), to sort of unite that educational team around students and their journey towards success, and to give students back information about their pathways and decision points and things that they might be able to reach out to or take advantage of. And ideally this is supposed to happen in close to real time, because in a student success journey even a week can be too long. So we're trying to take sort of slow higher ed data and speed it up so that people can make decisions on it. There are lots of different companies that have waded into this space, and some institutions also have their own homegrown version. So this isn't exactly learning analytics, but it is definitely a flavor of it that has a lot in common. So overall, what's the point of all of this? The point of learning analytics is to help educators discover, diagnose, and predict challenges to learning and learner success so that we can do something about that, right?
Or maybe the decision is inaction, but typically an action would be the result of our learning. And this is really important for us to better understand the student experience, and also for students to be able to understand the landscape, especially those that don't have another way of interpreting the environment they find themselves in, because maybe they're first in family, or maybe other challenges are presenting themselves and they have to do lots of things at one time. And so any assistance and pointers and suggestions that they can come to, either on their own by looking at their data or by being connected in a more direct way, should be useful. So on that screen it says deploy active interventions to benefit students, and interventions is a horrible word. It sounds like something you would not want to be a part of. It sounds like, I don't know, surgery or something. But what is an intervention in this context? There are really macro-level interventions and individual-level interventions. So on a broad, large-scale level, we're talking about systemic and structural changes to higher education practices, processes, and policies, so that the learner experience can be improved and obstacles that we inadvertently set in students' way (or maybe not so inadvertently, I don't know, that would be bad, but it might be the case) can be removed, right? To dismantle and get rid of those obstacles. Also, on an individual student level, we can facilitate communication: allowing learners to see their own data and see how their learning behaviors line up with their intended trajectory or with what their peers are doing, or you can imagine many different ways to present that information. To notify students and their educational partners of important events or milestones.
To make connections when students need or want to seek some sort of service or resource on campus but maybe don't know what exists or don't know how to reach out to that service or resource, and otherwise link students with things that they have a right to participate in because they're members of the community but might not otherwise know about. In library land and academic libraries, the term analytics, as I alluded to earlier, does not always mean these things. Sometimes the term learning analytics gets used to mean general information literacy assessment, or a one-off or episodic correlation study between some sort of library engagement and some sort of metric of success. And it's also not primarily about proving that libraries are valuable, although sometimes that is a side effect, and lots of folks who have to make administrative decisions are interested in to what degree the library is helpful and impactful and how we can do more of that. But that's not what it's primarily about. It's about improving that learner experience. So it's important to remember, in the context of libraries, that learning analytics is only one approach to understanding the student experience. We have always used surveys, focus groups, interviews, ethnographic approaches, and lots of other strategies and tools for understanding the student experience, but all of those tools have gaps, as does learning analytics. And so taken together, used in conjunction, we can better understand what's going on. But learning analytics is not a panacea and does not necessarily replace the strengths of those other approaches. It can, however, give us an idea of where to apply those other approaches. So focus groups, surveys, and interviews are all labor intensive, not only for the librarians and other professional staff that put them together but also for students to participate in them.
And so learning analytics might give us a way to understand the environment in a basic way and then learn where we would want to apply those more "why"-oriented, deep-searching types of strategies to delve deeper. We think there's a problem here, or there's an unexpected success here, so let's find out more; then we'll use our other approaches to do that. But learning analytics can give us a scan of the situation. And it also is, as I've tried to emphasize a few times, primarily about understanding and supporting and improving students' experiences. Okay, so what can this help us do in libraries if we engage in this work? Well, this is where I'm gonna point to those links that I gave you at the outset, because what we can do with this is a voluminous piece of thought and communication in and of itself. So I'm just gonna point to a few things in those reports for those of you who are interested in going further. First of all, learning analytics can help us answer a great number of research questions, or action research questions, that librarians have long had about student learning and success: what libraries can do to support that, and what we might be doing that does not support it, so we could stop doing it. We could also use the data to take action, including instituting better communications with students, improving library collections and services and facilities based on what we find, making better instructional decisions (because librarians teach all the time), and just better implementing our decisions based on data rather than anecdote or perhaps partial information that we might get from our other traditional approaches. I also think it's really important that this work could integrate librarians into the larger institutional picture. If you are a librarian in an academic context, a goal is always to be part of the campus and to be fully integrated into the life of the campus.
And so this is another way that librarians can do that. And certainly when the results of larger learning analytics processes and projects come out, librarians are key people to help implement change because of their connections and because the library is a safe third space on campus. And so there are lots of benefits that librarians can bring to this work overall. And finally, if you like user stories, there are almost a hundred in the LIILA report of ways that the researchers on that project, and the librarians and learning analytics experts that helped with it, could imagine that this kind of work could help students, librarians, advisors, faculty, institutional researchers, and institutional leaders and administrators do their work better with this information. But I don't have time to go into all of that tonight, or you'd have to hang out with me quite a bit longer. Now, there are tricky spots in this work, and I wanted to highlight those as well. Libraries have long valued privacy, user privacy. We still do. And so the idea of collecting individual-level data is fraught. That's probably understating it, but it's a complicated space that I think is unique in some ways. And we have conversations around the privacy of data that I think other professional spheres don't have in the same way. So again, I can't go into all of those in depth, but I did wanna at least acknowledge those tricky spaces where the terrain is difficult around this work. The first one is misconceptions about learning analytics, and we've sort of alluded to that with the misunderstandings of what it even is and what we can do with it.
But certainly the privacy issues are another tricky area: understanding the difference between maintaining data at all or getting rid of it as quickly as possible, understanding where residual data lives in our libraries through connections with the supplier community, the vendor community, and thinking about what data we would never want to have because someone might come and ask us for it. And librarians hold sacred the responsibility of having information about what people are reading and not sharing that. So thinking through all of those contexts: how do we keep information that is sensitive confidential, only keep what we need in order to understand important research questions, and make sure that other data is not kept? The "let's keep it all and see what it will reveal" approach is not really aligned with the ethos of librarianship. So we have a lot of conversations around that in this space. There's also the sort of notion of knowing more about our students. Actually, you know what, I think I have slides for this. Yes, I do, there we go. So for a long time, librarians have worked without detailed information about their users across the board, including in academic contexts. But we also need to think about the opportunity to have more detailed information about students and what we might be able to do with that information that would be helpful to students. We might be able to take action and make decisions with students and on behalf of students if we had more information, rather than knowing almost nothing and being left with assumptions or partial information to make decisions on. The other side of that is that when you know more, that also puts, I guess, a weight, an obligation, a responsibility on you to do something with it. If you know that there's a problem and you do nothing about it, that is a different space than if you had no idea there was even a problem.
So knowing brings more responsibility, and we have to, as a profession, wrestle with that and think about how that changes how we might do practice. In library spheres, one of the main outputs, we hope, of interaction with the library is learning. And learning analytics gives us an opportunity not just to look at how persistently or quickly students move through their educational program to completion, but also at what they are learning along the way. Up until this point, we've sort of, in libraries, tried to correlate what we are doing in libraries with our students persisting through their programs. Do they complete their programs or degrees? Are they getting the grades? All of those metrics are large grain. They don't get into the details of what was actually learned. And so learning analytics presents us an opportunity to get more into that space. I think one of the things that's most motivating about this work is the opportunity to understand our students better, instead of looking at student data as monolithic or having a very rudimentary understanding of how different aspects of students' identities influence their success and their strategies for success in higher education and in libraries specifically. This gives us an opportunity to be more knowledgeable about how different aspects of student identities make a difference, right? So not just race and gender, but degree program, major, what year, what else is going on in their curricular life, and so on. How does all of that make a difference, and what can we do to take a more nuanced view of students rather than just designing for average: let's just throw all the numbers together, aggregate it all, and make our best guess about the average. Well, there's no such thing as an average student. And so that puts us in a poor position with regard to making advancements in equity and inclusion. I mentioned the labor involved in understanding the student experience.
Also, we have an issue about labor and equity with regard to different kinds of institutions. A big institution like Syracuse might be able to do some of this work. A smaller institution, like a smaller liberal arts college or a community college, might not have the staff to be able to do this kind of work in one-off episodes. So having a system that will help them gives all institutions an opportunity to learn from the detail that we can get from learning analytics. There are lots of tricky places about who makes the decisions about data. Right now, the vendor community makes a lot of those decisions, maybe institutional leaders to some degree, but mostly library vendors. And if we wanna have ownership in the library and with students, we would need to negotiate that territory. This slide is sort of the same content, a little bit different, but mostly the same. And so all of those are about whether we should do this work. This last one is about how we could do this work. One of the things that you know about libraries, if you're in a library, is that there are many, many different systems operating at any one time. They spit out data in totally different formats, and bringing those together and formatting them in a standard way, I mean, it doesn't happen, right? So if you try to take library data as it is now and ingest it into a learning record store, it could just be spaghetti, right? Just formatted differently, a mess to try to deal with and normalize. But the second project I talked about, and you also have the link in the chat, is the CLLASS project. And that project resulted in an interoperability standard that takes care of this problem. I won't go into a bunch of detail because we don't have a lot of time, but Caliper is the name of the interoperability standard. It's a common interoperability standard across educational technologies. You can see the list here on the right; we are the newest profile in their standard.
It's basically a JSON triple with an actor, an action, and an object. And for the library profile, there are three actions: library use, which is designed for spaces; library resource use, which is designed for collections and use of materials, whether that's physical materials or digital materials in the library; and participation, so asking a reference question, attending an exhibit or an instruction session. And it looks like this in real life. This is a simple Caliper packet, and this is a more complicated one. The first one doesn't identify the user, for privacy reasons; the second one does, and it also says what course the person is reading this journal for. So the standard is agnostic in terms of how much data you put in; that's a design decision by an individual library organization, in collaboration, I would imagine, with their institution. Okay, so if you are involved in education or in libraries, what are some of the things you can do in this space? Well, you can think about what this might look like at your library and at your institution. Think about what research questions you would want to answer. Think about how this fits into your current strategic plan, what you would need to do in terms of educating your colleagues and your staff, and how you would act in this new space that's coming. It's coming. And just in terms of a general action plan, you could think about: What questions do I want to ask or answer? What policies or procedures would we need to get ready for this kind of work? What kind of data is okay to include in this kind of work, and what would we never want to include or record in any way? And who else do we need to bring into the conversation? Clearly at the top of that list would be students, because it is their data. Learning analytics isn't new in higher education. It's been around for at least a decade, but it isn't going away either.
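To make that actor, action, and object shape concrete, here is a minimal sketch in Python of what a Caliper-style library event might look like. The field names, action names, and example values here are illustrative only, not the official Caliper library profile schema, and whether to record an identifying actor at all is the kind of local design decision the talk describes.

```python
import json
import uuid
from datetime import datetime, timezone


def make_library_event(action, obj, actor=None):
    """Build a minimal Caliper-style event: an actor, an action, and an object.

    Field names are illustrative, not the official Caliper library profile.
    If `actor` is None, the event is recorded anonymously.
    """
    event = {
        "id": f"urn:uuid:{uuid.uuid4()}",
        "type": "Event",
        "action": action,
        "object": obj,
        "eventTime": datetime.now(timezone.utc).isoformat(),
    }
    if actor is not None:
        # Recording the actor at all is a local privacy decision.
        event["actor"] = actor
    return event


# A simple, anonymous space-use event (no actor recorded).
simple = make_library_event(
    action="LibraryUse",
    obj={"type": "LibrarySpace", "name": "Main Reading Room"},  # hypothetical
)

# A more detailed resource-use event that does identify the user and the
# course the resource was used for (all values hypothetical).
detailed = make_library_event(
    action="LibraryResourceUse",
    obj={
        "type": "LibraryResource",
        "name": "Some Journal",
        "isPartOf": {"type": "CourseOffering", "name": "IST 101"},
    },
    actor={"type": "Person", "id": "urn:uuid:example-student"},
)

print(json.dumps(detailed, indent=2))
```

The three library-profile actions named in the talk (library use, library resource use, participation) would all follow this same shape; only the action and the object change.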
So this year's EDUCAUSE Horizon Report, which is a major report that comes out of the educational technology and IT realm in academia, has listed learning analytics again as a key technology and practice. This has been true since at least, I'm going to say, 2011. This year, the section on learning analytics included, and this is the screenshot of the first page of the learning analytics section, our project here at Syracuse, the IMLS-funded project, on the very first page. We got a shout-out, which was great. They did a lot of analysis and asked people about their opinions on learning analytics. You can see that it ranked highly in terms of addressing equity and inclusion, an important aspect that we've only alluded to tonight. And really this work is increasing in importance. Post pandemic, we've known for a long time that online learning is increasing, but it's increasing even more quickly now. In the past, libraries have really relied on proximity-dependent approaches, and we can no longer really trust that those will get us what we need, because we do have that increase in online learning, but also there are populations of students that have always been sort of invisible to those approaches. So if you're going to do a focus group with students who come into the library, you never hear from students who aren't in the library, and so on. So this is a growing area, despite all those tricky places that we have to figure out. This is a list, which I won't go through item by item, of what happens if libraries don't pursue in some way involvement in learning analytics, whether it's with library data or just participating in the larger workings of learning analytics at their institutions.
I've alluded to some of these: the danger in designing for average and leaving folks at the margins who don't fit whatever average is, missing opportunities to give students back their own data, missing opportunities to talk about privacy in the overall conversation on campus, and the like. So these are some of the resources I wanted to share with you. I also wanna give you, let me put this in chat, two more resources. We're doing okay on time. This is the Caliper library profile, and also a conference paper that we wrote about a year ago about some of the issues that I alluded to tonight. ACRL, the Association of College and Research Libraries, also has a learning analytics toolkit. A lot of that content heavily borrows from these two reports that you see here. Okay, that was a whirlwind. I hope that was interesting to you. I would love to hear any questions that you have. I'm gonna stop sharing so I can see a little bit better. Jay, do we have any questions come in? We did not have any come in in the chat. Does anyone have any right now? I'd love a question, send me a question. Carly's here, Brenna's here. I bet some of you folks have questions. All right. One question that's kind of been asked of others is just your path into this industry. Obviously, you mentioned starting as a librarian, and now you have much more of a data background. So just kind of what led you into that. Yeah, so before I was a librarian, I was a public school teacher. And so it's sort of baked into how I think about things: continuous improvement and inquiry, and using all resources possible to understand the situation and also try to effect change in the future. And it's bothered me for a long time how many students are not able to complete their educational journeys. They take on debt, but they don't complete, so they don't get the credential that would take care of that.
And this is disproportionately affecting students by lower socioeconomic status, by race, and, well, what you would imagine to be the case. And that's not okay. And so with any resources we can get to make better-informed decisions and take actions to improve the future, I think it's an obligation to do that. I work in this area, but I don't consider myself a data expert. I partner and align myself with folks who have more technical chops than I do. But at the end of the day, as a team, we get the job done. And so for those of you who are maybe interested in data science, but maybe don't think that this is an affinity area for you naturally, having the larger vision is also an important element that is essential. And so I would encourage you to think through that. Oh, here's the questions. Yeah, we just had some questions come in at 7:30. Yeah, but this is the beginning of the student talk. It's fine. The student panel, we have 45 minutes first. So I would say definitely answer the questions. And then for everyone kind of paying attention in the chat, we will shut off the questions for Megan after the two, looks like from Carly and Nika. Okay, great. So Carly, hi, Carly. So Carly asked about hesitancy for libraries to share data: do you see some chances on the horizon with the growing interest in data librarianship, and things shifting? Yes, I do. So some of the partners that I work with commonly are the University of North Carolina at Charlotte, the University of Michigan, the University of Minnesota, but there are others, like the University of Denver. There are library suppliers like EBSCO that are interested in this space. If you have time, you can look at their new product. It's called Panorama. And I don't have any inside information on that, I'm just observing it from a distance. You know, I think that the need to address equity and inclusion has to drive some of this work. We need to do better as libraries at doing that.
And I've been in the library field for 20 years, and we haven't moved the needle nearly far enough. That is an understatement. And so, you know, we need to bring all of our evidence to bear, data being a big part of that. And so I think data librarians having those increased skills makes this less mysterious and more understandable and doable, and gives us more agency over it. I hope that made sense, Carly. And then: could you give us an example of a successful learning analytics project implemented on a campus? Yeah, okay. So I would look at the Georgia institutions for examples. There are examples throughout higher ed, all over the place. Wayne State is another institution that you could look at; there have been just tremendous changes in their practice. I mean, just doing learning analytics doesn't solve anything. You have to then use the data to make better decisions for students, right? So it's always in the application. And the Georgia system and Wayne State are two that come to mind as having made big differences in retention and graduation for populations that they had trouble making inroads with previously. So those are two places I would look. It's such early days yet with libraries that I can't give you examples of that yet. But I believe that we will have more examples in the coming years. This Caliper profile is brand new. And so we're working with the supplier community right now to see how they can implement it, as well as individual libraries. So hopefully it'll make a difference. That's certainly the intent.