Welcome to the CNI Spring Membership Meeting webinar. Today is the last day of a two-month-long program of webinars representing the sessions for the online CNI Spring Membership Meeting, and I'm really delighted to be able to introduce today's session. I'm Joan Lippincott, Associate Executive Director Emerita of the Coalition for Networked Information. This afternoon's session is "The On-Again, Off-Again Career of Learning Analytics," and I'm delighted that Malcolm Brown, the Director of Learning Initiatives for EDUCAUSE, is going to be our presenter today. I've had a long association with Malcolm in his role as the long-time leader of the EDUCAUSE Learning Initiative, or ELI, and I can't think of anyone better to give us this view of learning analytics. He's going to tell us about a survey that was done by several associations, and I'm also very eager to hear his own perspectives on the findings of that survey. So Malcolm, take it away; over to you.

Thank you very much, Joan, for that kind introduction. And thanks to CNI for this opportunity to come chat, and thanks to all of you who have joined us today. I know everyone in this time of the pandemic is really very, very busy, so I doubly appreciate your taking the time so we can have this conversation. And I'm very much looking forward to hearing your thoughts about learning analytics.

Okay, so yes, we are going to be talking about learning analytics in higher education in this hour. You could rename the session "The Case of the Missing Analytics," though I'd be giving away the punchline a little bit if I did so. I don't know if "missing" is the right word, or whether we want to say it's merely tardy but not entirely missing. What I'm here to do is give you some food for thought, and I'm looking forward to the discussion at the end to hear your takes and perspectives on what's going on. So this session is, I think, less of a presentation and more sense-making; that is to say, there are things going on in our environment, and what do they mean? This is particularly important because the material I'm going to present was all prepared before the pandemic came in and disrupted pretty much everything. So a big question for us to consider at the end, for our discussion, is this: given all that we've discussed in the session up to that point, what now is the effect of the pandemic on all this thinking about learning analytics, and what is its prognosis as we go forward and work our way through this crisis? So there's a lot to cover and a lot to talk about.

Also, I would say that you who have joined this session have your paws on a keyboard, or very near a keyboard, if I'm not mistaken. So I would encourage you to make use of those paws near the keyboard: feel free to comment in the chat or pose a question in the Q&A so that we have something to work with and to discuss. Please participate, because again, I'm very much looking forward to your perspectives on what's going on here.

All right. In order to assess where learning analytics is today, we need to do a little bit of history, and this is going to be a selected history that takes us up to the present. Now, learning analytics goes back at least to the 1990s, maybe to the 80s, and for a long time it was called data mining, or educational data mining. But this article that appeared in 2007 was one of the first times it really began to connect with a larger audience.
What was interesting about this article is that it could point to work being done at other institutions prior to 2007. As you can see here, Baylor, Alabama, Sinclair Community College, Northern Arizona, and Purdue were already beginning to work with educational data with a view toward improving learning outcomes and more student and instructor success. So when you look at it from that perspective, it looks like learning analytics is off to a good start. It hasn't settled on its name yet, but still, this is the usual pattern: some pioneering institutions get the inkling of a good idea, they start to work with it, others catch on, and so on and so forth, and momentum grows and builds.

Then, following on three years later, Kim Arnold wrote this article on the work being done at Purdue. Those of you who have been around for a while will recall the Purdue Signals project; it was one of the first major implementations of learning analytics in higher education, and she wrote this article to report on it. Now, she said in this article that the Signals project had delivered early successes but that significant challenges remained. Just look at this little snippet of hers right here. Do you see anything that is interesting, or maybe even mildly problematic ("problematic" may be too strong a word), anything that makes you go, hmm? Is there any phrase in here that might trigger that kind of reaction? Well, if you're like me, it's that right there. So what is she talking about? Okay, significant challenges remain. And again, remembering that this was written in 2010: some of the challenges she refers to are that data is frequently maintained in diverse areas, and so it's hard to pull the data together into something coherent that can be analyzed for the purposes of analytics. But she also said something very interesting, and I'm reading from the article: once the data is pulled, the nature of the instructional process may make it difficult to analyze. Hmm, we all say, hmm, that makes sense. Learning is very hard to measure, and it's hard to analyze and to understand whether it's being successful or not. So that is something that lays the seeds for things, I think.

And this is a screen from the Course Signals project. As I said, it was one of the very first projects to get underway. You can see they used a traffic light metaphor to indicate where students were in terms of how they were performing. The idea here clearly is that if it's green, it's fine; if it's yellow, you keep an eye on them; and if it turns to red, there's danger, and you begin to pay attention. So in this case, if you look at student E in that row, student E starts out green, turns yellow, and then turns red. That would be a way to signal to the advisor or the faculty member or the instructor that maybe student E should be contacted, just to see if they need some additional help or support. Seems like a great idea, and they did have some success initially. (A minimal sketch of this traffic-light logic follows at the end of this passage.)

Shortly after this came the first conference called the Learning Analytics and Knowledge Conference, in 2011. It was held in February in Banff, Canada, and it was a gathering of researchers who had for some years already been doing research in the field of learning analytics. Now, Banff is in the Canadian Rockies, and I can remember, because I went to this conference, that it was really cold there. But it was also very exuberant, or exhilarating, because there were a lot of these fresh ideas.
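Before going on: to make that traffic-light idea concrete, here is a minimal sketch of the kind of rule a Signals-style early-alert system might apply. The thresholds, field names, and inputs here are hypothetical illustrations, not Purdue's actual model, which combined grades, LMS activity, past performance, and other factors in its own proprietary algorithm.

```python
# Minimal sketch of a Signals-style "traffic light" risk indicator.
# All field names and thresholds are hypothetical illustrations,
# not the actual Purdue Course Signals algorithm.

def risk_signal(grade_pct: float, lms_logins_per_week: float,
                assignments_submitted: int, assignments_due: int) -> str:
    """Return 'green', 'yellow', or 'red' for one student in one course."""
    submission_rate = (
        assignments_submitted / assignments_due if assignments_due else 1.0
    )
    # Red means clear danger signs: a failing grade or heavy disengagement.
    if grade_pct < 60 or submission_rate < 0.5:
        return "red"
    # Yellow means worth keeping an eye on.
    if grade_pct < 75 or lms_logins_per_week < 2 or submission_rate < 0.8:
        return "yellow"
    return "green"

# Example: a "student E" drifting from green toward red over three checkpoints.
checkpoints = [
    dict(grade_pct=82, lms_logins_per_week=5, assignments_submitted=4, assignments_due=4),
    dict(grade_pct=71, lms_logins_per_week=2, assignments_submitted=6, assignments_due=8),
    dict(grade_pct=55, lms_logins_per_week=0, assignments_submitted=6, assignments_due=12),
]
for week, snapshot in enumerate(checkpoints, start=1):
    print(f"checkpoint {week}: {risk_signal(**snapshot)}")  # green, yellow, red
```

A real deployment would, of course, learn its thresholds from historical data rather than hard-coding them, which is exactly where the "difficult to analyze" challenge Arnold describes comes in.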
And I came back very enthused about the notion of learning analytics. So I was guilty of writing a brief, a report on this conference, and here's something I wrote back in 2011. Here again I would ask: do you see anything in this thing that I wrote that might be problematic, or might be overly optimistic, or maybe just not as careful a formulation as might be warranted? Well, what stands out to me is this term "rapid adoption." You know, and we know this: just because something is a good idea doesn't mean it's going to be adopted, either slowly or rapidly. I mean, there's that Rogers book on the diffusion of innovations: an innovation can be a great idea, but if the right context isn't there around it, then it might not get adopted. So I think maybe this phrase of mine, "rapid adoption," was a little bit of a rush to judgment.

Then, at the end of the year 2011, the Chronicle of Higher Education was getting into it and putting out articles on learning analytics. They said you can think of it as Moneyball, or a kind of Moneyball comes to higher ed, which is an interesting metaphor. But metaphors are dangerous. Are they saying that analyzing student learning works the same way as analyzing a ballplayer's performance? We need to be a bit careful about that. In the same article, they also mentioned there are skeptics who worry that it will introduce some sort of mechanization of learning and teaching in higher education, that everything will be done rotely and the same. So here's another thing to be thinking about as we head toward our discussion later in the session: do you see that happening at all in higher education? And they also, I think rightfully so, brought up the concern about the ethics and privacy issues surrounding learning analytics.

And of course, there was some hype that flew around here. Those of you who've been around for a while might remember this kerfuffle around Knewton. This was a vendor who brought out a teaching product that was sort of analytical and also somewhat adaptive, and the CEO was unfortunate enough to go around saying that Knewton can kind of read your mind, that it was sort of a magic pill. Obviously that raised a lot of interesting commentary. Even NPR picked this up, albeit 18 months later. And in connection with this, this was the instance where Michael Feldstein made his famous remark that Knewton was selling snake oil, and I think history bore him out as being correct. So that article in Inside Higher Ed appeared on January the 25th, 2013.

Now let's go ahead by one day. A day later, Thomas Friedman brings out this opinion piece in the New York Times. Although he's talking a bit about the impact of the MOOC and the potential for students to be able to put together their own curricular study, buffet-like, by taking a course here and a course there, he is also talking a bit about learning analytics and things of this sort. And he quotes Rafael Reif, who at the time was president of MIT, saying that there's a new world unfolding and everyone will have to adapt. And the question is: has that happened? Has a new world unfolded? And if it has, has everyone adapted? Back then we might have had some skepticism. But again, here's where the pandemic comes in: has the pandemic actually shifted things so much that we could say that that is an imperative now?
Food for thought. The next step was for institutions to form collectives and actually begin pooling their data. Because, again, the more data you have, at least in theory (and this was anonymized data), the more cases you can see, the more patterns you can identify in that data, and the better equipped you will be, so goes the thinking, to understand what's going on with your students and to design more effective interventions that will contribute to their success. This was something led out of WCET; if I'm not mistaken, it was called the PAR Framework. And it's interesting that in this particular Inside Higher Ed piece they were asking, is this a Match.com for higher ed? It's kind of interesting in hindsight to see that connection, given what Match.com has become. But nevertheless, there was an article in EDUCAUSE Review that covered this, written by Ellen Wagner, as you can see here. And here's a snippet from her article. I'm going to pause for a second to give you a chance to read it.

OK, so it is interesting here that they're thinking about covering the ground. You can see the ambitions for learning analytics quite clearly: they are talking about expanding the data set that is tied to interventions, so that it will contribute directly to both student and instructor success. I keep coming back to this point because it seems, at least to me (I don't know if you would agree), that the basic idea of learning analytics is something so valuable that who could argue against it? We make data-informed decisions about how to construct our courses or how to intervene with learners in order to contribute to overall success. Who could argue against that?

So at this point, because a paw is going up here, I'm going to pause and just check in with Joan very quickly, to see if perchance any questions or comments have come in that we should address before moving on.

Yes. To the participants, pardon me for neglecting to say at the outset that you can enter your questions at any time in the Q&A box that you'll see as part of your screen. I'll also be monitoring the chat, but ideally we'd like you to put your questions in the Q&A. Malcolm, I do have one comment or question, and that is that on one of the slides you said you thought the problematic part of the quote was the rapid adoption of learning analytics, but I would have chosen "affordable." Are you going to address affordability later in the presentation, or do you have any comments now on that?

Yeah, actually, that's really apropos, Joan. Thank you for that, because I hadn't thought of it. But now that you mention it: last year, before all this pandemic stuff hit, I went to the IMS Global conference. And some of the schools were talking about these great architectures they had set up for gathering all this learning data into one big pot. They were talking about their cloud servers and how they were enriching Caliper data and all this sort of stuff. And I was scratching my head and saying, well, this is great. You know, we have UC San Diego, we have the University of Michigan, and some others. These are all big schools, and they can afford this. But I wonder about a smaller school that maybe couldn't. So I think you're definitely right to point to the potential cost of this as a factor.
So thank you for bringing it up; very legitimate.

Thanks, Malcolm. No other questions right now.

All right, then. So let's mosey on here. Let's turn our attention, to finish up our look at history, and visit Gartner-land. Everyone probably knows this graph that they use year after year to chart the course of technologies. As you know, hype and expectations go way up, almost off the top of the chart, and then you're plunged into this disillusionment period, and then it plateaus in the area where it finally comes to rest and might be productive. So I was interested in asking: how did they chart the course of learning analytics through the recent years? Let's take a look.

This is their hype cycle, as they call it, for higher education for 2014. As you can see, I've pointed it out, because this is really a thicket of various things going on here. You can see right there, there is a reference to big data. But if you look carefully and try to find learning analytics, go ahead, you won't find it. So in 2014, learning analytics was not yet on the Gartner radar in terms of higher ed technology.

Now let's take one step forward to 2015. Again, because this is a thicket, it's kind of hard to find things, so let's look around here. I'll point out that it's now become big data that's beginning to plunge down. It's like a roller coaster ride, plunging down into that scary trough. I don't know if that was your experience on a roller coaster, especially those big ones, but it's like you go over the peak and you go down and it's like, whoa. So anyway, there you can see big data going down into the trough; and also, look what's appeared here. Learning analytics has suddenly appeared, and it's climbing right up to the top of that plunge. So that's where they had pegged learning analytics in the year 2015.

2016, here we are. And what has happened in the meantime? They've noticed that it's not just big data anymore but big data in education, which makes sense, because this is the hype cycle for education. And you've probably spied learning analytics per se right at the top, before it takes that scary plunge into the trough of disillusionment.

Now we go on to 2017, and let's see where things are. There's your big data in education, at about the bottom, and learning analytics is taking its plunge right down into the trough. And here's 2018; let's look again at where things are. Now they're calling it education analytics. I'm not sure why they renamed it; maybe someone who's participating in the session knows and can enlighten us in the chat, but they renamed it education analytics. And big data in education became, I guess, master data management. So that's where those things are. And here, this is the last one, because I don't believe the one for 2020 has come out yet: education analytics is approaching the bottom, and master data management is pretty much where it was before, right there just beginning to grind its way out of that trough into something more productive.

Now, the interesting thing, and I always wonder about this, I don't know if you folks do, is whether they think this hype cycle is something kind of metaphysical that always obtains, so that it's just a matter of finding things on it. What if something doesn't fit the pattern?
I always hold this a little bit loosely, but it's interesting that they're describing this disillusionment process in the latter half of the 2010s. So let's hold that thought, that suggestion, let's call it, just for a moment. I want to look at another way of measuring things, and that's the Horizon Report. Now, this is a chart from the 2019 Horizon Report, and I chose the 2019 one because we changed the methodology for the 2020 Horizon Report, and this is something we don't do anymore, namely, try to predict when various technologies will see what we called mainstream adoption. But it is interesting from a historical perspective, because it was addressing analytics technologies, and that's right there. So I want to zoom in and look a little more closely at how this fared in the Horizon Report over the past number of years.

So let's take a closer look. If you recall, what the Horizon Report used to do, up until and through 2019, was to make these predictions: how soon will this particular technology or practice see mainstream adoption? Now, mainstream adoption was not really all that well defined, and there were other sorts of problems with the methodology, but let's not worry about that. Let's just look at this and see what the results were, and see if it tells us anything. For the year 2012 and that report, it was predicted that analytics technologies would see mainstream adoption in two to three years. So that's what that says there. In the following year, 2013, again it's two to three years, and you could say, okay, that makes sense: maybe it was three years away in 2012 and only two years away in 2013. So that's fine. Then in the 2014 report, it was in the narrowest horizon, the most immediate of the horizons, which is less than a year. And up to this point, you could say, yeah, that makes sense: it was slogging its way through that medium adoption horizon, now it's in the shortest one, and it should cross over soon. And then when you look, it wasn't even mentioned in the 2015 report, and you say, yes, it's seen mainstream adoption, so it doesn't need to appear in the Horizon Report anymore. But then it reappeared: in 2016 it was again placed in the less-than-a-year-to-mainstream-adoption horizon, and again in 2018, and again in 2019.

Now, this I think indicates at least a couple of things. One, the prediction business is a very dangerous business, particularly if you write your predictions down and people can come back and say, well, you know that thing you predicted? It didn't really happen. So this kind of looks like analytics technologies is sort of hedging its bets; or at least the expert panels were not quite sure what to make of it, or were inconsistent, or maybe somehow sloppy in their methodology. Who knows why, a little bit, here. However, if you look at it from a different perspective, it's interesting, because it kind of echoes, I think (or does it?), what we just saw in the Gartner curves, which is this disillusionment. And maybe this is telling us that it's actually rather accurate, in the sense that the adoption is not happening nearly as quickly as we thought it was going to earlier in that decade; that this hesitant, or slowed-down, or however you want to characterize it, adoption of learning analytics is in fact being accurately described in the Horizon Report by this, may I call it, stuttering in the adoption timeframes.
So my question here is: are they telling us the same thing about the course of learning analytics in higher education? Again, that's a question we can come back to for the discussion; I would be really interested to hear your thoughts on it. Okay, Joan, Kitty has just put his paw up again, so I'm going to pause here just to see if there might be anything in the chat.

We do have a question from Sarah Pritchard from Northwestern, and she writes: what struck me about that quote was the allusion to generating lots of data that would support research for years to come. And that was said at a moment when there was far less fear about invasion of privacy and misuse of big data. So I think this has actually come up for a number of years in discussions of learning analytics, but probably at some points more strongly than others. What's your perspective on that?

It's possible that things were a bit more naive, a good eight years ago, about the ethics issue. We are going to see, and I'm going to give away a little bit of the evidence here, that ethics and privacy are probably still under-addressed, so to speak, in higher education, for sure. But here's the question. What the PAR Framework was doing, they were pooling their data, in a way that's similar to what Unizin is doing now to some extent, and it was all anonymized. So you couldn't trace a particular record back to a specific learner, because the name had been removed; you didn't know who it was from. All you do is look at some of the data, both the demographic data of the student and their activity, and try to understand patterns based on those statistical analyses. So the question is, and I'm not sure what the right answer is, but the question is: is that a violation of privacy, if you do that? If you knew that it was Johnny's or Sally's record you were looking at, then obviously we'd have a privacy issue; and we'll come back to that, I think, later in this session.

Thank you, Malcolm. And I think Diane Goldenberg-Hart from CNI wants to add something here. Diane? Oh no, not at all; sorry if I miscommunicated. Okay. Okay, please continue, Malcolm.

All right, thank you very much. Okay. So now, let's see, come on. There we go. So now I'm going to go back to my kind of mystery novel or detective story metaphor. And I'm probably mixing my metaphors, so forgive me, but I'll use a courtroom one here. I'm going to lay three exhibits in front of you that are analyses of, or perspectives on, the adoption and use of learning analytics in higher education today. So we're moving up now to 2018 and recent days. And as I said, I'm just laying this evidence in front of you, these exhibits, so to speak; it's kind of like a lawyer saying exhibit A, exhibit B. I have three of them. And then we'll stop and see what we want to make of them in discussion. So that's the approach here.

So, I found this article really interesting, and you can see the names of the authors and the title there. This is a meta-analysis of articles on learning analytics in higher education, and it makes really interesting reading. I'm going to try to summarize some of their main findings here in the next few minutes. Here are the parameters the authors were working with: they selected some 252 papers that had appeared in this timeframe, and those are the papers they analyzed in this meta-analysis. And they were very clear about their main research question.
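Before digging into that question, one quick aside on the anonymization point from the Q&A just above: here is a minimal sketch of what stripping direct identifiers from learner records before pooling them might look like. The field names and record layout are hypothetical illustrations; this is not a description of the actual PAR Framework or Unizin pipeline.

```python
# Minimal sketch of de-identifying learner records before pooling them
# across institutions. Field names are hypothetical; this is not the
# actual PAR Framework or Unizin process.

DIRECT_IDENTIFIERS = {"name", "student_id", "email"}

def deidentify(record: dict) -> dict:
    """Strip direct identifiers, keeping demographic and activity data."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

raw = {
    "name": "Sally Q. Student",
    "student_id": "U1234567",
    "email": "sally@example.edu",
    "age_band": "18-22",
    "enrollment_status": "full-time",
    "lms_logins_per_week": 4,
    "course_outcome": "C",
}

# Caveat: removing names and IDs is necessary but not sufficient; a rare
# combination of demographic fields can still single a student out, which
# is one reason the ethics question keeps resurfacing.
pooled_record = deidentify(raw)
print(pooled_record)  # only age_band, enrollment_status, activity, outcome
```

That gap between "the name is gone" and "no one can be re-identified" is part of why the privacy question raised in the Q&A remains open.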
Okay, so back to the meta-analysis: this was the main research question they were approaching the work with. It's kind of interesting, thinking about teaching and learning in higher education as "scientific knowledge," but let's let them have that. They were interested in what we really know (I think that's another way of phrasing the question) about the application of learning analytics in higher education. And what they did was unpack that broad research question into four sub-questions: is there evidence, from the 252 papers they reviewed, that learning analytics (1) improve learning outcomes; (2) improve learning support and teaching; (3) are scaled, or "deployed widely," as they put it; and (4) are used ethically? Seems like a good set of four questions, right? I mean, those are your key questions about learning analytics, and if you could answer them, you would come a fair way, I think, toward understanding just what's happening or not happening with respect to learning analytics in higher education.

So let's take a peek. This is one of the charts from their article. I'm going to explain what it means, but you can see on the x-axis those four sub-questions I just enumerated on the previous slide, and these are the results. Let's look at what these individual bars mean. Let's go with number one. They found that there was little evidence showing improvements in students' learning outcomes, based on these articles on learning analytics. The way to read that column in the chart is that the dark blue there, as you can see, is something just short of 10%; that is the percentage of articles in which there was clear evidence that learning analytics did support or improve outcomes. The remaining light blue block represents articles that maybe suggested an improvement, but where it wasn't quite clear from the way they presented it. So those are the ones that are not quite clear but may have some support or some evidence for improving learning outcomes. But this means, if you look at it, that something like 72% of the articles they reviewed had no evidence at all about learning analytics improving student outcomes. At least that's the way I read what they're saying.

Let's look at column two. This was about learning support and teaching, and there was much more evidence here: as you can see, in 35% of the articles they reviewed there is evidence, and beyond that, a large number were in that not-quite-clear territory, with some suggested evidence, maybe not as clear-cut as the 35%. So on that point the literature seems to be pretty strong: there seems to be evidence for learning support and teaching.

But now, when we go on to "has it scaled?", boy, look at this. In 94% or 96% of the articles they reviewed, they saw no evidence of wide deployment of learning analytics. So it was sort of onesies and twosies: a course over here, a course over there, and not in a general sort of way. And that last column, whether they're used in an ethical way? Well, 82% of the articles didn't even mention ethics at all. So that's a little disheartening.

So here's a comment that I have excerpted from the article. I'll pause here a second to allow you a chance to read it. Now, the emphasis in red is obviously mine, but here we have "potential" and "transfer."
And so, as we saw from the history, learning analytics was perhaps unduly burdened by this notion of great potential. The burden of potential was commented on several decades ago by that philosopher Linus, from the Peanuts comic strip; I always love this cartoon about great potential. So like Linus, perhaps learning analytics is saddled with this burden of great potential.

But here's another comment, and this gives us a little bit of room for hope, I think. I'll pause here again to give you a chance to read it. What I think this comment is getting at is that for a long time there's been a debate that a lot of the data that has washed up on the shores of learning analytics has been proxy data: how many times do Sally and Johnny log into the LMS, or do they access something? Which is okay as far as it goes, but it really doesn't help you that much in understanding learning processes and things like that. So if there is this realization that we need to get beyond these sorts of proxies and probe more deeply into the learning experiences, I would take that as a good sign. Okay, so that's our first exhibit, ladies and gentlemen of the jury.

Here's exhibit B. This is from a group called Tyton Partners, who did a survey of about 1,000 faculty and administrators in 2019 about learning analytics, and we'll explore some of their findings right now. I don't have a reference for this, because it is unpublished research at the moment; I'm going off an internal document they circulated with the Every Learner Everywhere group, but they will be publishing it in the future. All right. Okay, so they were doing this research on behalf of Every Learner Everywhere, or ELE, and as we know, the Gates Foundation is very much concerned about equity and inclusion. So that's why there's this particular spin on things, but still, there's information here of quite general interest as well.

So let's look at their key findings. There's this potential-versus-realization discrepancy here too. Their finding is that people think it's a great idea, but, let's say, the spirit is willing and the flesh is weak here, perhaps. Less than 10% of respondents state that there is robust training support; that's not so good in terms of seeing it adopted, and support for the implementation is generally lacking.

So let's look a little closer. Here was one question they asked: what were the most important institutional priorities for learning analytics overall? You can see that little dotted red box, which again represents their interest in tracking performance gaps across student groups; but look at what respondents identified: increasing retention rates and course completion. We just saw, on a previous slide, the suggestion that we should probably be moving away from that emphasis onto something a little deeper. But there are a lot of good ambitions here: improve access and equity, encourage faculty to innovate, increase diversity, and things like that. So again, all these aspirations for learning analytics are things you can hardly argue with; the question is, how far are we from realizing them? And this is kind of interesting in their findings: they felt that the two-year institutions were, for the most part, ahead of the four-years in terms of the use, and successful use, of learning analytics.

Now, the assessment data that they leverage: what data are they looking at when they do their learning analytics? This one, I don't know what your thinking is about this.
We see summative and formative assessments at the top, then frequency of things like engagement with the instructor, then demographic or pre-matriculation data, and time on task. I don't know what you think about that prioritization, whether those are the right things to be looking at or prioritizing; that's something that invites lots of discussion and debate, I think, and I'm not sure that that's the way the world should work. And what are they using it for? Well, again, some of the ways it's being used are aspirations you could hardly argue with: to improve teaching practice, to promote better interventions, to improve learning, and things like that.

Okay, let's go on now to support. This one is also a somewhat ambiguous result. There's a roughly 70% chunk who don't know about the support for faculty, and also a fair number of folks who say there just isn't any support, period. Moving on now to this one: policy for guiding the use of learning analytics, which I think would hook into things like privacy and ethical use. Here the result for faculty was that 24% said just plain no, and another 48% said they didn't know; administrators had a bit more optimistic a perspective on how this was being conducted at the institution, with a few more saying that yes, definitely, there is policy in place. But still, over half of the administrators are saying that they either don't know or that no, it just isn't in place. And again, this is echoing the earlier slide, which showed that the two-years are a bit ahead of the four-years in this regard.

Okay, let's move on to the final exhibit, ladies and gentlemen. This is a survey that EDUCAUSE did what seems like ages ago, right? Anything you did in January and February, doesn't it seem like it's from a different time period or something like that? Anyway, we were curious about the maturity of learning analytics practice in higher education, so we did a survey, and here are the results.

Here are the survey demographics. We asked respondents: are you responding on behalf of your entire institution, that is, bringing an institutional perspective to these responses, or only a part of it? It was somewhat evenly split, but a slight majority were responding on behalf of the institution, taking an institutional perspective. Here's a little word cloud of the titles of those who responded. And here are the institutional demographics: you can see a slight predominance, at least a simple majority, of doctoral institutions, but we did get input from other kinds of institutions as well.

And when we asked questions inviting respondents to assess the maturity of their institution, we asked them to do it in this framework. The first level is that the thing we're asking about is absent: either absent entirely, or just so uncoordinated that it is as good as absent. The second step up, the more mature stage, would be limited to a few areas, in the initial planning or piloting stage. Developing is the next stage: is it really getting under way and seeing broader adoption? Then established: everyone says, oh yes, we know we have learning analytics in place, and I can access it, say, from my LMS or something like that.
So the scale runs from an established practice that people know about up to one that's highly optimized, that is, not only established but with the kinks worked out, working well, and people finding and deriving value from it.

So, maturity is low, according to the respondents to this particular survey. On average, schools self-assess as being between the initial and developing stages. Again, remember, that's one of the reasons I took us through the historical perspective on learning analytics: to show how far back learning analytics goes. It's not as if it's something that appeared in 2018. At most 20% of schools have some aspect of their learning analytics practice at the established or optimized stage. And assistance for learners is still largely absent. This breaks down to how folks self-assessed their institution across these particular parameters. The collection and analysis of learner data is the most widespread of these practices, as you can see from the results. But as you step down the list here, helping other stakeholders, aiding continuous improvement, we see the absent-or-initial fraction growing and the developing-established-optimized fraction shrinking. So it looks like some of the real payoffs, helping instructors and helping learners, are at the bottom of the list, which is probably not where they should be if the world worked the way it should.

Key result number two: very few have met their goals or visions for learning analytics. Only 3% of those responding assess their institutions as having achieved 80% of their initial learning analytics vision or goals, and about 45% have achieved 20% or less of that vision or goal. So here it is in chart form. You can see that 3% there on the far right side; those are the respondents who felt they were in that percentage of achievement of their initial vision for learning analytics. And you can see where the biggest single group is: way back on the other side, on the left. So there that is.

Third key result: some respondents suggested that schools are focusing on student success initiatives at the cost of devoting resources and energy to learning analytics. I'm wondering if that's the case at your institution. From the results we got back, student success analytics was seen as sometimes being in competition with learning analytics programs. But there is some fuzziness in all this, and I find this whenever people talk about student success: what exactly does it mean? Does it mean just advising, or does it mean student success in a more holistic way? And if the latter, wouldn't LA, that is, learning analytics, be part of that? Again, stuff for us to talk about in the Q&A and discussion.

We also found that there were core elements not in place: these core elements of leadership, funding, change management, data, and use of analytics are not in place at more than 50% of the institutions.

Now, we asked respondents to try to identify and talk about exactly what their strategic goals and vision are for learning analytics. This was a pick-two-or-three sort of question, so these percentages will not add up to 100. The top vote-getter was developing LA capabilities and field building. Okay, so that's their big goal. What happened to the learners and the instructors in that? But maybe it'll come later. Yes, so here it is: the next biggest vote-getter, so to speak, was student success. But look at this.
Now, in third place, comes "I don't know," or else they feel that it's not clearly stated by the institution. Only then do we get improving learning and improving teaching, and at the bottom, more specialized or general outcomes. So this is the way our respondents self-assessed the strategic goals and vision of their institution with respect to learning analytics.

And so let's look at this. Let's take a look at these elements associated with learning analytics and look at the numbers, in terms of whether each was rated as absent, only slightly deployed, or somewhat-to-mostly-or-fully deployed. These are the best ones, in terms of the smallest "absent" percentages and the largest "somewhat or mostly/fully" percentages. Even so, at almost 50%, it was felt that there was no visible support by senior leadership for learning analytics. 50% say they don't have a learner record store or data lake for learning data in place. And 47% said there's not really much going on in terms of privacy rights being discussed and understood.

Here are the next three. Does learning analytics have dedicated staff? 60% said no. Is diversity, equity, and inclusion integrated into LA? Again, more than half said no, it's not. And does the learning analytics resource you have on campus guide your instructional design and your technology implementations? Again, 59%, as you can see, saying not really, if at all. And then finally these last three; there is more, but I've decided these are the highlights, if you will, just to give you a sense of how this played out. Instructors relying on data for course design: not so good. Instructors having the support needed to utilize learning analytics, and, the last one, a community of practice around it: also not so good. So all of this suggests immaturity in learning analytics practice, if you look at this data.

And I'm just going to show you some quotes that we got. Here's one that suggests that, at least at some schools, there's this antagonism or competition between learning analytics and advising analytics. Is that the case at your campus? And is that the way the world should work? Here again, we see this emphasis on, or this suggestion that, because we're focused on student success, we don't really have time to do learning analytics, so it's not part of our strategic goal set for this year. Here's one saying, gee, there's, I guess, no leadership at the top, or no really committed and involved leadership with access to that information, perhaps, is the best way to summarize it. And this one characterizes their approach as wait and see: we don't really have a stated set of vision or goals. And in fact, the faculty there are up in arms, thinking that this might be an intrusion, that it constitutes surveillance of students and therefore is an invasion of privacy.

Now, I want to be clear: I just brought the problematic ones. There are a number of responses that were absolutely positive; there are schools doing good work. So I'm not suggesting by any means that it's all just a dark cavern and nothing positive is happening. What I am trying to do is to present, as I said, evidence that seems to suggest that the overall maturity of higher education with respect to learning analytics still has a ways to go. Perhaps that's the way to put it.
And headlines, this just in yesterday: as some of you may know, EDUCAUSE has been doing a series of what we call Quick Polls, to quickly gather information about practices that schools are adopting to respond to the pandemic crisis and get it out fast. We start on a Monday and publish the results on a Friday, so that folks have the information to help them with their planning. We just published one yesterday, a Quick Poll on student success analytics; you can dig it up on our Data Bytes blog. And here are the key results from that particular Quick Poll: in the wake of this pandemic situation, interest in student success analytics has grown. But it's focused mostly on technology usage, which is legitimate, because we have problems, as we know, with student access to technology and their usage from remote sites or from home. They're also trying to monitor LMS course content and course design. And the top priority is to have data in order to conduct interventions. And these are the topics of interest that came up. I invite you to go take a look at that blog post; it makes for interesting reading, at least as an initial glimpse of how the pandemic crisis has informed or influenced learning analytics.

Okay, ladies and gentlemen, those are your exhibits. There's your data. What do you think? What's the situation of learning analytics in higher education? Here, as we go into discussion, are some things we could talk about, but anything else you want to talk about is fine too. Do you think that learning analytics practice is still somewhat immature in higher ed? If yes, does it matter? I ask that seriously. If yes, why is this the case? Do you agree that the emphasis on student success has overshadowed learning analytics, say, at your campus? And given the pandemic, what are the prospects for learning analytics? So, Joan, what have we got?

Malcolm, I'm going to go back to some of the questions; some were back a number of slides, but actually they're stated more in the form of comments. First, Andrew Pace of OCLC notes that even in 2019 there was not a lot of discussion of ethics at the IMS Global meeting. So I don't know if you'd like to comment on that more.

Yeah. It's easy to get carried away by the technology and to overlook: oh yes, we do have to worry about ethics and privacy. So I would agree with that observation, for sure.

And a follow-up comment from Sarah Pritchard to her privacy point earlier. She says: perception of threat has greatly increased in recent years because of data breaches and social media misuse. So the factual reality of having the data anonymized is not really her point; there is likely more skepticism about adoption of learning analytics systems because of those optics.

I would love to talk with you more about this, because I'm not sure I fully appreciate the point you're making here. In terms of the risks to the data posed by break-ins and things like that, you really need someone who is up on cybersecurity, and I am not that person; likewise on whether the increasing move to cloud systems is an improvement in terms of security or actually introduces more risk. But there's probably more to your question; maybe you could rephrase it in the Q&A and we could come back to it. Sorry, I just wasn't quite sure I understood it.

And one more comment: Scott Walter notes that he finds this surprising, and I think this was about the conflict between learning analytics and student success.
He says his previous university clearly framed learning analytics efforts as being a component of, and a support for, its student success initiative. And that was really a point, Malcolm, you were making as well: that you weren't convinced it was something divorced from student success, but rather a component. Is that right?

Not divorced at all, I'd say. I mean, I think it would be unfortunate if we thought of student success in its narrower sense, as being just about student advising, instead of thinking of it in a more holistic way: can the student be holistically successful at their institution? Now, obviously their role and identity as a learner is probably one of the most important parts, if not the biggest part, of that identity. So that means learning analytics has a huge role to play in their overall success at the institution; which is not to say that advising is unimportant, or that the other supports students need in order to be successful overall at the institution are by any means unimportant. But I would think that learning analytics has the potential to contribute, and probably is now beginning to contribute, substantially to student success. So I agree; no question there.

Thank you. Please go ahead.

Well, I'm at the end of my slides, actually. So we're now ready to have a palaver, if you will, and to see if anyone wants to offer a comment in either the Q&A or the chat.

And we're going to offer a third option, if any of you would like to speak directly with Malcolm, whether that's Sarah, whom Malcolm asked if she wanted to clarify her comment, or anyone who'd like to pose a new question. If you pose it in the Q&A box or the chat, then I'll repeat it to Malcolm; but if you'd like to ask it directly, we'll unmute you, and to do that you need to raise your hand in the appropriate window. Okay, right now we don't have any other questions in the Q&A.

So, one thing I would invite folks to comment on: we've seen some evidence suggesting that maturity in learning analytics practice in higher education still has a ways to go. One, is that true of your campus? How would you assess your campus's maturity with respect to learning analytics, based on some of the parameters and measurements we've been talking about in the session so far? And what do you think its prospects are? And again, what influence or impact will the pandemic have? Because I can imagine that the pandemic has turned things entirely on their head across higher education, and probably in all dimensions of your institution: not only this big move to remote teaching and learning that we saw in the spring, but also, as we know, unfortunately, the financial situation for higher ed is pretty grim right now, and it doesn't leave us a lot of resources with which to maneuver. So does that mean that things like learning analytics are going to be put on hold because there's simply not enough resource to address them right now? Or, we've seen some schools say, I need to invest in teaching and learning because this is my huge challenge. I think it was Western University in Ontario: I saw a kind of help-wanted sign for 10 term instructional designer positions, because they felt they needed to beef up their instructional design support to meet the teaching and learning crisis they were facing. So I'm just curious whether folks could describe what's going on at their institutions. That would be great.
You can do that in the Q&A or in the chat, or by raising your hand to speak directly to Malcolm. Malcolm, I've also noticed this big uptick in postings for recruitment of instructional designers. In fact, I was talking with someone who's not in the field of education who was asking me, why is this going so poorly? And I explained how, even in very large universities, the number of instructional designers is usually way less than 10, for student populations of tens of thousands, with many, many courses and many faculty; it just doesn't scale for them to help really develop online learning, as opposed to what I think some people are rightly referring to as emergency online education, something like that.

Right, right; it wasn't designed that way. So the other thing that I think is going to become more important in our environment, starting now and into the fall and spring, is this component of all the supports that students are going to need. I know this session is focused on learning analytics, but if a student doesn't have housing, or can't afford to get a meal plan at the college, or is having psychological problems because of all the anxieties of this time period, are we going to use any analytics to help us understand what's going on with our student population?

Well, yeah: you need data. I mean, again, the basic idea behind learning analytics seems irresistible in terms of its logic, right? Get data, understand what is actually happening, and then you can plan your interventions and course redesigns much more effectively to meet those needs. But I think the big challenge for higher ed, looking ahead to the fall now, is that even though you can say, well, we only had a week to make this transition from face-to-face to remote, even the three months we might have now before the start of the fall term is not a lot of time, when you think about the years and years that schools have put into developing their online education practices, the ones who are good at it. I just had a conversation on Monday with folks at Davenport University, and they were talking about their work with HyFlex, the HyFlex course model. They've been at it for three years, and they now have 12 courses running on that model: 12 courses in three years, and they're doing what I gathered was a really good job of it. They started out kind of rough, but they're refining it; it took them three years to really hit their stride, at least that was the impression I came away with, for something as complex as the HyFlex model. So it's a challenge that almost beggars the imagination, I think. And part of it, Joan, as you rightly point out, is that I'm sure universities are under-equipped in terms of staff resources to meet this transition. So there might still be remnants of remote teaching and learning in the fall, simply because there's not enough time to acquire the expertise and experience of online learning. And there are questions about whether schools will be resorting to OPMs or something like that to help meet that need.

Malcolm, thank you so much. We're at the time to end the session now. I do want to alert the attendees that if you would like to remain on and just chat, you're welcome to do so. I'd like to thank Malcolm Brown for a masterful presentation. I learned so much, and I am so pleased that we will have this recording available for our CNI audience.
And I also wonder if you'll end up writing this up, because part of its value is the context: not just the current survey, but the context you provided, which was so rich and so informative. Thank you very much, Malcolm.