So excited at this turnout, this is fabulous. I'm so happy to see all of you, some familiar faces and some hopefully soon-to-be familiar faces. I'm Megan Oakleaf. I teach in the library school at Syracuse University as an associate professor. I used to be the coordinator of instruction at NC State University Libraries, and then I got my doctorate at Chapel Hill and turned to the dark side. So now I teach LIS and make baby librarians. I don't really enjoy that phrase, but I would say that's right. And I'll let my co-presenter Malcolm introduce himself. Go, do it now. Sure. I'm Malcolm Brown. Hi everyone. Thanks for coming. I'm the director of the EDUCAUSE Learning Initiative. And before that I was at Dartmouth, working often and happily with my fellow librarians. Jennifer Taxman is right here; she can attest to that. Oh, fabulous. Okay. And I hope that maybe even more people will come in. I'm excited also that the doors aren't locked anymore, because I know they were locked earlier today. I couldn't come for the whole session today, I have small children, but I saw all the tweets about the excitement this morning. And I thought, oh, they were up at four. How sad. But if you're tired, don't mind; we can all be tired together. So I wanted to talk about this topic. I enjoy talking about this topic, and I hope that you guys will bring a lot to it. Malcolm and I are planning to talk for maybe half the time or a little bit more, just to give a grounding. And then we really want this to become a discussion. And we really mean that. We want you guys to talk to each other, talk to the full group, and really sort of suss out some of the details of this issue. Because it's new territory. New technologies mean new decisions. We need to think about strategic connections with the rest of our institutions, making sure that the library is at the table for these discussions.
And being a resource based on our areas of expertise, and also learning from others. So I'm really excited that you came today, and I want to thank you for that right up front. I want to start by dealing with the difficult term in our title: learning analytics. I was talking to Malcolm about a year ago about learning analytics and what I thought the future was related to this for libraries. And our first conversation was sort of a dance around: what do you mean by learning analytics? And then when we tried to talk to other people in the library community, we had the same dance. And we ended up really working out what we meant, and then also making a distinction between two likely definitions of the term. So I'm going to give a more official definition of learning analytics in a minute, but just for the purposes of this early part of our discussion: basically, when I say learning analytics, I mean using and analyzing data, usually big data, educational data, to advance student success and student outcomes. Now, what that immediately put us into, though, was navigating the territory between: what is that, versus what is student learning assessment? What is that, versus what we ended up calling library analytics? So let's sort this out a little bit, because I think if we want to have a discussion, we have to get on the same page. And you might not agree with my definitions, but let's try to agree for the purposes of the discussion. Not with everything I say, but just this piece of what we're talking about. Okay. So in terms of the distinction between this and assessment: my background, my research area, my teaching focuses on information literacy assessment, which I think is relevant here for student success and student learning outcomes. And if I was going to give a 30-second summary of where we've been in libraries in this territory, I would start it here.
With a long, long history of surveys, getting at affect, getting at confidence level: how do you feel about what you've learned? Unfortunately, sometimes it's how do you feel about the library and its teaching, which doesn't really get at learning, but is interesting. Then, I would say about a decade and a half ago, there was a heavy emphasis on tests, and a hope that we could come up with some sort of objective measure of information literacy. And that still persists. None of these things go away. Ever. Nothing ever goes away. Then we got into rubric territory: we're going to look at artifacts of student learning, and we're going to use rubrics to try to describe what we want to see in terms of learning. Articulate what it is. Articulate how we're going to assess it. And again, that hasn't gone away either. And then, I would say, probably in the last five years, this has been an area of emphasis as well. So back in 2009, and Mary Ellen's in the room, so she'd correct me if I'm wrong, ACRL recognized the need to help librarians articulate their value and impact. And that started off with the Value of Academic Libraries report. And in that report, we defined lots of different ways to talk about value and impact. Most of them, the ones in white, are things that we have a lot of literature on, a significant value literature. But the shift in the ACRL Value Initiative was to focus on the impact of the library, defining it in terms of the goals, missions, and purposes of the institution and how we fulfill those. Not negating any of those other conceptions of value. But this project has really focused mostly on the library's impact on institutional goals, missions, and needs, which is rooted in our values and our beliefs about learning and about higher education.
Okay, so since that time, almost seven years ago, we've added some return on investment research and a lot of correlation, not causation, research trying to pin down what individual students are doing in the library and what that means for GPA, for retention, and so on. So that's where we're sort of focusing now and into the future. And the research in this area was almost non-existent in 2010 and has now proliferated rapidly. So there's lots to learn about. ACRL has been busy at work. They conducted summits with higher education leaders, provosts and chancellors and whatnot, and then did the Assessment in Action project. And hopefully a lot of you are hearing terms and phrases that you're familiar with. So this has been an area of interest in our profession. Now they're looking at ways to expand what they've learned from this program, and other ways to deliver professional development to more people in lighter-weight ways than maybe an entire commitment to Assessment in Action. They've got a new research agenda coming out this spring. So it's still moving forward in a big way. And in some ways, the first image on the Value Report cover was like a drop of water. (I'm glad you're here.) And now I think the next cover should be a fire hydrant, because it's really expanding. And I think what that demonstrates, and it's relevant to our conversation today, is the interest in documenting the impact of the library on student outcomes, student learning, student success, however we define that; it's a little bit different on different campuses. So that really completes our arc up to this point. The next stage, I think, and again, we don't get rid of any of the rest, I believe is learning analytics, and paying attention to what's going on in this area in our institutions of higher education.
This does not indicate that each one is significantly better than the rest; it's more of a timeline than an evolution of improvement. They all have damning weaknesses. They all have great advantages. So it's not like you pick one and go with that; all of these things are complementary. Okay. So I alluded earlier to the need to clarify: what do we mean by learning analytics, for at least the purposes of this conversation, and what do we mean by library analytics? And I think what a lot of librarians are talking about right now in terms of learning analytics actually ends up being what we had to define as library analytics to make forward progress in the conversation. And what I mean by that are the sort of correlation numbers that we've been talking about. So: students come to the reference desk, or participate in instruction, or check out more documents; what does that mean for GPA, retention, and so on? And that's the beginning of this larger thread in our research. But it's not the same as what our institutions mean when they say learning analytics. They're related but not synonymous. So if we move instead, for the rest of today, to an institutional focus, the library within a larger institution, I want to use this as our working definition of learning analytics. And it's not new. It's the one that keeps being cited over and over again. So there's nothing really inspiring about my selection of this definition, but I think it's clear and I think it works: the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs. So to some degree you could say that library analytics is a part of that. But that's not really what the rest of higher education is thinking about when they're pushing forward in their progress on learning analytics.
This chart, and also the one on the next slide, because I couldn't decide between the two, I like them both so much, shows a progression of what the institution considers learning analytics, what they're talking about when they talk about this. Starting with descriptive analytics, which describe what's going on in terms of our student populations: who's succeeding, who's not. And then trying to figure out why that's happening; that would be the diagnostic stage. So we're kind of in the bottom left-hand corner right now in higher education generally. Moving, everyone hopes, to greater emphasis on predictive analytics: predicting what might happen so that we can swoop in and intervene. If we can figure out where there are problems in the curriculum, we can make instructional improvements. If we can figure out where students are struggling the most in terms of their coursework, we might be able to intervene with those at-risk students. So: trying to be helpful with the data that we have, rather than just sitting on it. This is a really fun chart too. Basically the same thing, but with a little bit more detail. In all cases, I would say, and Malcolm might disagree with me, we're sort of below that bar going across. I think most of the things that are on the higher level are sort of future, and things that are underneath that bar are more current, and even sometimes a little bit aspirational. So: what's going on? Where is it happening? How often is it happening? What's the problem, and what do we need to do about it? And then the future in this area would be above that bar, mostly. Now, the goal of learning analytics is not to gather more data, but to do something with the data. The goal is to intervene or take action. Now, that might mean setting policies or procedures, or staging interventions. That makes it seem really confrontational; I think we need a different term.
But you know, interceding on students' behalf to help them learn. Sometimes these interventions might be passive, like getting a notification. They might be active, asking students to go meet with someone at tutorial services if they're struggling in a writing course, or maybe meeting with a librarian if they're struggling in a research-based course, or something where the librarian has the right skill set or whatever. But connecting all the dots within the institution actively, giving students, ideally, a nudge and not a kick in the seat of the pants, moving them in the right direction, being helpful to the students. Also, ideally, we want these interventions to happen more in real time. Finding out at the end of the semester what happened is a little too late for some portion of students, especially if you're focusing on at-risk students. Even finding out a couple of weeks past when the problem is occurring is too late for many students: if the bottom falls out of their finances or their personal lives or whatever and they stop going to class, waiting two or three weeks might make it irretrievable to get them back on track. So we want this to be as close to real time as possible. And I would also add here that when you're taking actions or making interventions from learning analytics, or planning to do that, those actions and interventions should be judged not on how many students we sent to tutorial services, or how many faculty members we helped find the places in their courses where students are getting stuck, but rather on what that action actually resulted in. Do we have more retained students? Do we have students getting better grades? Do we have students reporting higher satisfaction? What have you. So where does the data come from in these systems, generally speaking? It comes from places that already have data, but it's being brought together in a new way.
So: the student information system; that's usually historical or static data like socioeconomic status, high school GPA, SAT scores, that kind of thing. Learning management systems, whether those are being used for online courses or campus courses; there's a lot of click-through data and activity data in those systems that could be drawn into a learning analytics system. Information from publishers, like what students are doing with their textbooks. Clickers, videos, surveys. Even how often students are going to the dining hall. Did they stop eating? (Do they record when you're eating too much? I'd sign up.) Or if they stop going to the gym and they had an established pattern, maybe something's going on; maybe their advisor needs to know. Or maybe they don't. Maybe that's an overreach. We need to talk about this. Why would we want to bring all these things together? Well, mostly it's for pedagogical purposes. Mostly we're looking to improve the curriculum where students are struggling. Where are the trouble points in a given course, or a given path to degree completion or program completion? If at-risk students are struggling, why are they struggling? When are they struggling? What can we do about it? I've prepared a few notes because I don't want to forget anything. Oh, this is also a really important one: helping students become more aware of their own learning, a metacognitive role. In places where the learning analytics system is open to students, which many of them are, they can see when they're maybe starting to fall off, or go from a green light to a yellow light or a red light. And that also helps them realize something's going on. It's not just my professor realizing that something's going wrong with my grades or whatever; it's me realizing I've gone from green to yellow, and I need to do something about that. So: thinking about those things, and having conversations with their advisor as a result.
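A toy sketch may make the green-yellow-red idea concrete. This is not the actual algorithm behind any real early-alert product; the inputs, weights, and thresholds below are invented purely for illustration of how a few LMS-style signals could roll up into a traffic light.

```python
# Toy sketch of the "traffic light" idea behind early-alert systems.
# Real systems weigh many more inputs; these thresholds are invented
# purely for illustration.
def risk_light(current_grade_pct, logins_last_week, assignments_missing):
    """Return 'green', 'yellow', or 'red' for a student's standing."""
    score = 0
    if current_grade_pct < 70:
        score += 2
    elif current_grade_pct < 80:
        score += 1
    if logins_last_week == 0:
        score += 2
    elif logins_last_week < 3:
        score += 1
    if assignments_missing >= 2:
        score += 2
    elif assignments_missing == 1:
        score += 1
    if score >= 4:
        return "red"
    return "yellow" if score >= 2 else "green"

print(risk_light(92, 5, 0))  # on track
print(risk_light(75, 1, 1))  # slipping: worth a nudge
print(risk_light(60, 0, 3))  # at risk: time to intervene
```

The point of such a function is not the arithmetic but the visibility: the same signal can be shown to an advisor, a faculty member, or the student themselves for that metacognitive nudge.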
And then also, I think we need to acknowledge that this has a small focus, in terms of individual students, and it also has a large focus, in terms of our institutions. And that means business purposes, too: keeping students retained, getting them to completion of their programs and their degrees. Those are educational goals. They're also business goals. So the two things go together. Now, like I said earlier on the slide with the arc, all of the approaches to assessment and student learning improvement and student success have advantages and disadvantages, and I want to acknowledge a couple of rather significant areas of concern as well. One of them is organizational culture. There are lots of institutions, and my friends seem to work at all of them, where people say that they care about data-driven decision making, but they don't actually walk that talk at all. And more data, in a culture that doesn't use data to make its decisions, is really not all that helpful, right? It needs to be actionable. I was at an ACRL Value Summit years ago where Charlie Blaich said, you know, you could have all the survey results, all the data you want, but if it resides in binders or on someone's, you know, H drive, it's not actually being used, and that's pointless. And that's actually a good way to kill an assessment effort as well. So organizational issues are really an important area of concern. And if your campus is like that, the library should be part of, or around, the table when these issues and concerns are hashed out. I think there's a place for libraries at that table in terms of lots of different areas of this... Am I over time? No? Okay. Of this movement, I think libraries have a lot of skills to bring to bear, whether it's understanding when students struggle and intervening in their education, or understanding how to use research data, or having professional ethics around the use of data.
So there are a lot of things that we can bring to the table that help with the organizational culture issue. Also: data, right? You need to have good data, complete data, data that is protected, data that is used properly. Some of the systems, when you talk about learning analytics, are homegrown; some of the systems are vendor-produced. And when you have systems that aren't homegrown, you might have questions about the proprietary or closed nature of some systems. You need to be able to understand them, because you may someday have to defend decisions that you're making based on data, and you don't know how that data is coming out of the system. I am not an expert in data quality, but lots of people are. Then there are the cohorts. If you're looking at small cohorts, particularly for at-risk sorts of things, small cohorts, as we know from research, can lead to spurious findings if you're not careful. So you need to be careful when you slice and dice down to areas where you could identify people, or where your data is actually no longer high-quality data because you've cut down to such a small population that the results are not legitimate. I think it's also worth saying, and I've been talking about this in the value conversation, that learning analytics are, at least currently, built a lot on correlation rather than causation. Looking at how things correlate is not the same as knowing what's causing them. So you could tell a student, these five behaviors tend to lead to more success, but you can't say, do these five things and you will get an A. So understanding what to promise and what not to promise is a really important part of the process as well. So Malcolm and I are going to take a moment, starting now, to talk about different types of learning analytics systems that are present and developing in higher education.
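To make the correlation-versus-causation point concrete, here is a minimal sketch with entirely made-up numbers: library visits and GPAs for ten hypothetical students correlate strongly, but nothing in the arithmetic says the visits cause the grades. Motivated students may simply both visit more and earn higher GPAs.

```python
# Toy illustration (made-up numbers): library visits per semester vs. GPA
# for ten hypothetical students. A strong positive r does NOT establish
# that visits cause higher GPAs.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient for two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

visits = [2, 5, 1, 8, 12, 3, 7, 10, 4, 6]
gpas   = [2.4, 3.0, 2.1, 3.3, 3.8, 2.6, 3.1, 3.6, 2.8, 3.0]

print(f"r = {pearson_r(visits, gpas):.2f}")  # strong positive correlation
```

Note too the small-cohort caution from the talk: with only a handful of students in a slice, a coefficient like this can swing wildly and produce exactly the spurious findings described above.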
I'm going to show you the two easy ones and I'm going to make Malcolm do the hard stuff, but I'm really excited about what he's going to talk about. I want to talk about something that's referred to as an iPASS system, an integrated planning and advising system, and these are rapidly proliferating across our campuses. Malcolm's going to cover some of the more developing areas. A good first example, and some of you may be familiar with it, is Course Signals at Purdue. It was created there; it's now part of the Ellucian product line, so it's now vendor-supplied. But the idea here is that students and advisors, and I believe also faculty, have access to modules that show, as I was referring to earlier, green, yellow, red, with customized notices and automatic notices to students about what's causing those things. You probably don't need a customized note about not attending, but some things require maybe faculty or advisors to say something specific. I included these links not because they make for a beautiful slide, but because, as I was learning about this myself, these are the places that I thought gave you the most information in the least amount of time, right? High-impact links to learn about this sort of process. Syracuse, my institution, has also bought into a version of this; it's not this version, but they're rolling it out with our undergraduates this semester, and graduate students, I think, either next year or next semester. So this is happening a lot, and fast. There's another example slide that I wanted to share, and this one is from the University of Michigan, where a lot of great work on learning analytics is happening.
This is a screenshot from their system that is used by advisors so they can identify at-risk students using learning management system data. If you want to learn more about this kind of thing, Timothy McKay and others at the University of Michigan are great ones to watch, and he tweets, so you can pay attention to that as well. Okay, so the ones I just showed you started out homegrown, but there are lots of others as well. I also wanted to acknowledge a few pioneers in our own field who have been doing this kind of work. The University of Wollongong in Australia is different from US institutions in that they were early adopters of using data to look at student success, and then took it to their campus. So that's kind of fun. The other examples here are not exactly that, but they're really progressive in terms of their use, and I believe, based on my conversations at the Library Assessment Conference, that this slide is going to fill up pretty fast, because there are lots of campuses that are just getting involved in this; their campus might already be involved in a learning analytics initiative, and now the library is too. Scott Walter is standing in the back looking dapper, and he could talk to you about what's going on at DePaul in terms of the library's integration as a point of intervention for students who are struggling. At Texas Tech, Laura Heinz sent me three or four sentences that I wanted to read to you about what they're doing there. She said: I asked for the subject librarians, the personal librarians, to be included, and they were able to add them to the list of tutors, so they have access to the system. Advisors and faculty are able to refer students to the librarian for assistance. Students are able to make appointments using the system, due to its capability of syncing with the public calendars of the librarians.
Librarians are able to leave comments for the faculty, and to reach out to classes reminding them to set an appointment with their librarian for research assistance. This has just launched this semester; faculty and students are slowly adopting it. The reporting feature, which I hope to be able to use in the future, will allow me to look at the students who did meet with their librarian for a course and their final grade in the class. So you can see how this is starting to work out. The University of Minnesota presented on this at the Library Assessment Conference, and all their presentations are available on the web, so I won't go into detail on that. Okay, how do I hand this over? I'm done. Malcolm, go. Thanks, Megan. All right. So what I would like to do in my segment is to come at this from a certain angle, and by coming at it from that different angle, I think you'll find that we arrive at the exact same spot that Megan just brought us to. What I hope to do is show some thinking that's coalescing about digital learning environments in general on our campuses, which will put a spotlight on the opportunity that exists. It's opportune now to be thinking about learning data at the institutional scale rather than the unit scale. Okay. But first, this is an academic meeting, so there's a quiz. All right. Let's play around. This flag: what is it the flag of? Is it supposed to be a fraternity? Anyone know? This is the flag of Esperanto. And I'll come back to Esperanto, and to speaking in Esperanto, in a moment. So what I'm going to describe is some research that we did at EDUCAUSE with some money from the Gates Foundation, and these are the two reports we produced on it, with a little URL down at the bottom, so feel free to grab these reports if you're curious about them and if you want to hear about this piece in more detail. So the question from the Gates Foundation was: what should the next LMS be?
What should we go buy? What should it do? You know, whether it be LMS 2 or LMS 5 or LMS 523. But when we started off with this research, we quickly came to the realization that that's the wrong question, because whether it's LMS 10, 25, or 537, you're still in this box, thinking about the LMS. And we felt that there's no single application up there that could possibly serve the diversity that you see across all of higher education. So the wrong question is: what should the LMS be? We decided that this notion of an über-application, which the LMS is sometimes seen as, is the wrong approach to a digital learning environment. So we decided to cast it aside. And we came up with this name, which is fairly hideous, something that, I guess, only its framers could love. We called it the next generation digital learning environment. It's next generation in terms of trying to get away from the baggage associated with the LMS. It's digital because the digital infrastructure and our digital environment enable everything that we do. And it is a learning environment, and not just a learning management environment. So that's why we came up with this name, with an even more hideous acronym, which is NGDLE. Okay. So what we are now thinking the right approach is, is what we call component architecture: you bring in a variety of digital componentry that allows faculty and students to scaffold together to create the environment that will best support their learning and teaching. That might mean in some cases that you have an LMS augmented by a series of applications. It might mean that you have an LMS that is so covered over by these augmenting applications that you don't really see it; it's not at the forefront of the user experience, it plays a kind of backdrop role. Or you could have no LMS at all, just a framework of applications. So our thinking is entirely agnostic.
We're not saying that one of these is better. What we're saying is that we need the ability to construct a framework that will support our learners and our instructors. And what really underlies this is openness, an openness based on standards. That's why we're using this Lego metaphor, and that's why there was a Lego picture at the beginning. Legos, as you know, allow you to build almost anything, as long as the pieces adhere to a certain standard. This is the actual specification. As long as those little pegs adhere to these standards, you can build something like this. You can build whatever you want to build with pieces that can be of any shape or size; as long as they adhere to certain standards, you can fit them together to suit your purpose. We're thinking that's the best way to approach this. And that's why this sort of Esperanto idea is so important to us. And now, with respect to learning data, it gets even better, I think. In our thinking, we've identified five functional domains that we think this next generation environment needs to address, and the focus now is on the first two: interoperability, and learning and learning data. Okay. So right now, learning data is largely siloed. We have it from a variety of sources; it could be from the LMS, the student information system, from all over the place. The fact that this data is siloed is one problem. The fact that it's all recorded in different languages, so to speak, one system speaking one language and another system speaking another, is another problem. So what we're thinking is: if this remains the way it is, with everything siloed in different formats, then trying to bring it together to form a larger set of learning data is going to be a very, very hard process, as long as it stays siloed and speaking in different languages.
So the idea that's emerging, around the Caliper standard, is something that's called the learning record store, and this is at the heart of two very important standards around learning data. One is xAPI, which is sometimes called Tin Can. The other is the Caliper standard from IMS Global. They both have this notion that what we want to do is create a reservoir or deposit of our learning data, and have it all in the same language, all in the same location, so that we can then use analytical techniques to make use of the data: to improve our learning environments, improve the learning experience, and promote student success. So that's the core of the idea. This is a graphic from the xAPI site. Again, we have students having a variety of experiences, using a variety of componentry. Why not be able to gather the learning data from all those various components and have it flow into a reservoir, so that you can have a much more detailed, integrated picture of what the student is doing? Caliper is talking about the same thing; it's a similar standard, from IMS Global. They're saying: look at the variety of experiences and tools that students are using; wouldn't it be good if they all spoke the same language, so that you'd have this learning data repository, and on the basis of this interoperability you would enable innovation and also feed it? So now let's look at a few campuses. These two campuses were out of the gate early, and we had a session on this at another conference; I'm just shamelessly borrowing slides from that session. Let's first look at what UC Berkeley is doing here. You probably can't read this all too well, so I apologize, but what you can see there on the left side is a variety of applications, and by means of the Caliper and xAPI open standards, they are feeding data into a learning record store, which again makes it available for analysis.
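For a sense of what "speaking the same language" looks like in practice, here is a minimal xAPI-style statement sketched in Python. The actor-verb-object shape and the ADL verb IRI follow the xAPI specification; the student name, email address, and activity URL are hypothetical.

```python
# A minimal xAPI ("Tin Can") statement: every learning event is recorded
# as actor-verb-object JSON, so any tool can deposit it in a learning
# record store in a shared format. The names and URLs are hypothetical.
import json

statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Student",
        "mbox": "mailto:student@example.edu",
    },
    "verb": {
        # Verbs are identified by IRIs so different tools agree on meaning.
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.edu/library/tutorials/citation-basics",
        "definition": {"name": {"en-US": "Citation Basics tutorial"}},
    },
}

print(json.dumps(statement, indent=2))
```

Because a tutoring platform, an LMS, and a library tutorial can all emit statements in this one shape, the learning record store can aggregate them without bespoke, pairwise integrations.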
And they are out of the gate in terms of designing the record store. This is some technical detail; we can make these slides available, so if you're interested you can grab it, but I'll skip through this for the sake of time. They are also thinking about student agency and privacy. They have a number of privacy principles already guiding their use, collection, and aggregation of this data, so they've been doing some thinking about this. And they also have some recommended practices; they not only have a collection of data, they have a series of practices around it to guide their use of it. The University of Kentucky has been working on this for a good two or three years now. And again, think about the old days: if you were approaching this from an IT perspective and said, now we need to integrate this into one of our central administrative applications, and you had to do all those integrations by hand, that's an immense amount of work and an immense amount of expense. And then when either one of the two changes, either the central host system or the remote application, you have to do the integration all over again. That's why these standards are so important: they eliminate that cost and that loss of time, and allow you to make these connections much more effortlessly. And that's the nice thing about a learning record store: if all your ancillary products are talking this Esperanto of learning data, then they can all deposit it effortlessly into the learning record store, where it's available for use by the institution for its own legitimate purposes. So again, they're talking in exactly the same way: a variety of applications, all speaking the same language, depositing data in a single location so that it can be made into something that makes sense for teaching. I won't go into the details of Caliper, but the nice thing about it is that it is designed to capture learning data, and it is extensible; the set of definitions and expressions in Caliper is not fixed, it can be extended. So that was a quick tour. We're going to now go into the discussion portion. Megan, back to you.

Okay, so what we wanted to do is really hear a lot from all of you: facilitate you guys talking to each other, and then reporting out to the larger group, so that we can all have a free and fun and not at all, you know, fraught-with-peril conversation about an initiative that's live on many of our campuses. We wanted to start you off with a general question, which is: what's the library's role with respect to institutional learning data and the like? What is our role? What could our role be? What should our role be? Of course, that assumes my perspective, which is that we should have a role, so maybe you can push back on that as well. So what I'm going to ask you to do is turn to your elbow partners, the people near you, and if you don't like that person and need to move, that's fine. I want you to engage this question for a little bit, and then we're going to ask different people to report out and get everything going. I wanted to do it this way because, you know, some of us are introverts and we need to reflect and then talk small first. So maybe about seven to ten minutes to talk with your partners. Just a couple of things, I know it's time: I wanted to point out a few places where you can learn more. I know they're going to be sharing these slides. My GA very helpfully selected, out of the huge annotated bibliography she's done, four resources that are great starting points. I've also left two handouts in the back if you want to grab one. We're also going to be talking about this at the ACRL conference: one presentation that's all librarians, and another one where Malcolm is going to reprise his role, along with some others on the panel; I'm really excited about moderating these discussions. And Malcolm has also contributed some continued reading here, so if you've already started and you're at the next stage, here are some additional things you can look at as well. So I want to thank you guys all for coming. We can stick around and talk, but I want to also release you, because I think you're needing that.