Right, good morning. Thanks, everyone, for coming. Just to be clear: if you were looking for a conversation about usability, you're in the wrong place; that's next door. We're talking about open learning and learning analytics. John's role at the Open Learning Initiative is to focus on the software side; I focus on initiatives and partnerships, and because we think the analytics space is one where we definitely can't go it alone, this felt like a good place for both of us to be talking.

Before we dive in: one of the things that's been pretty interesting to us about the past day and a half is just how much is going on in the analytics space. Fair warning that we basically rearranged our slide deck over the past half hour to try to capture a lot of what we've seen happening. And we really feel that, given the people in this room, maybe the worst way we could use this time is for the two of us to stand up here and pontificate. What we'd really like to do is move through the slide deck reasonably quickly and then move on to some discussion that pushes us forward on how we as a community can build better analytics tools, and better resources that can plug into those tools.

We take a scientific approach to course design. What that means is that we recognize that every course, every learning environment we build, is really a hypothesis about what's going to work, and we collect data to actually evaluate the experience and drive a continuous improvement process. We do this through communities of use and evaluation. So this is really showing what our biases are at the outset: a scientifically based approach to course development is the water we swim in, and it has a huge impact on how we perceive the analytics space. Obviously that's not going to be the same way all of you perceive it, and that's why the conversation becomes interesting. So what we'd like to walk out of here
with is, um: how many of you read our abstract? Sounded pretty ambitious, right? We thought so too when we reread it. What we'd like to do, though, is try to deliver a little on that promise and dig into what some of the tensions are in building reasonable learning analytics, especially along these dimensions: what's happening in use-driven design in the context of OER; how variety and adaptability seem to be creating some conflicts for us in building effective analytics platforms; what tools we can be building and what tools already exist; and what drives it all: how do we get the data, and what are the needs and challenges there? From there, let's talk about what a community-based plan for better analytics would look like, and hopefully commit to some action: what are some effective next steps in really building this stuff? There's been a lot of chatter about analytics over the past day and a half, and we feel this is a good opportunity to commit to some plans.

So, I'm sure a lot of you have seen our feedback-loop slide. The point we start from is that one of the greatest opportunities of technology-based instruction is the ability to embed assessment throughout the experience and capture data to drive powerful feedback loops: to better support the learner; to help instructors understand where students are in the learning process and intervene earlier; to better inform the science of learning so that we can refine our approach; and to effect changes in the course design experience. We view this as a huge, huge opportunity. To realize it, we need assessment and data. And the reason it's such an opportunity is that the huge amount of OER out there, and the enormous number of students using it now, not to mention all the potential students who will use it in the future, can build huge piles of data to drive these analytics systems. No longer is the information
that we're capturing confined to the individual classroom, confined to a single faculty member's analysis and interpretation. The size of the data here gives us the opportunity to do things that we think can't happen in contexts outside of OER. So we think OER, learning analytics, and especially the science of learning have a potential relationship that couldn't have existed prior to OER's proliferation.

That said, it's hard to make sense of this space, because there's so much variety. We pulled out some of the numbers just to give you an idea. You look at 19,310 modules at Connexions. There are something like 2,000 courses in MIT OpenCourseWare, 2,600 videos at Khan Academy, and all these different things are capturing different types of data, in different formats, in different systems; or maybe they're not collecting data at all. We have all sorts of spaces: 600 free courses at the Open University, and much, much more out in the commons.

So you might ask: what's driving this proliferation? We've made all these great resources, we've put them out there, opening access to them. But why do we have so many? A huge part of what's driving this comes out of OER's original mission to expand access. Very early on, the assumption was that if we have more stuff out there, then we've expanded access: the more things that are out there, the more students will be able to take advantage of them, and the more faculty will be able to take advantage of them. And access has been expanded.
I think this urge toward creating more and more stuff, and contributing your own stuff, has also been pushed forward by the philosophy and definition we have of what constitutes an OER: this emphasis on reusing, redistributing, revising, and remixing the content has led us to put more and more stuff out into the space. We'd like to think it's possible to continue with this philosophy but avoid the fifth R, which is re-creating things that already exist. Instead, what we'd like to do is keep thinking in this framework but add the notion of evaluating resources, so that we can take what's working and continue to build upon it, rather than continuing to re-create the same set of statistics courses, the same modules in biology. That re-creation becomes a barrier to reuse, and if we create opportunities to coalesce, and to improve through evaluation, we're actually getting back to our original intent, which is to revise and remix.

And this proliferation isn't just happening in the OER space; it's happening in your institutions, too, just on a different scale. Something that happens at Carnegie Mellon is that every department feels its students need to learn how to program, so they send them off to take an intro CS course. We go through these cycles where that course is embedded in the School of Computer Science and all the students travel there to take it; then the departments decide it isn't meeting their needs, and everyone builds their own intro to programming course. We have one in the business school, one in the IT program, and we'll have another course popping up even within the School of Computer Science. And this is crazy, right? It gets away from the entire notion of what a university ought to be: the notion that we have a center of expertise, and we're going to take advantage of it.
This isn't just CMU. Everywhere you go, folks in statistics feel they need to build specialized statistics courses beyond the core statistics that exist in the mathematics and stats department: business statistics, research statistics, statistics for nurses, medical statistics. So this isn't just a problem with OER, but the low cost of OER, pushing things out into the space for the world to use, actually exacerbates this pre-existing problem, or at least our pre-existing urge to create an awful lot of stuff. And we do want to acknowledge that context is important, that proper motivating examples are important in these environments. But there's also a foundational core that cuts across all these different types of statistics. So what we're not talking about is the one statistics course; what we're talking about is actually engaging across all these varieties to find commonality.

Just quickly: what's driving change in these settings? Why is it that I want to have 15 different computer science courses, or 20 different statistics courses? It turns out to be a lot of different things. In some cases folks are motivated by data, simply looking at student performance within their college or department. Market demand tends to be a reasonable driver for folks looking at what skills students need to become employable. But there's also an awful lot of intuition: I've got a feeling my course isn't working. Or simply preference. There's an ongoing debate with my college roommate, who teaches English at a small liberal arts school out in Pennsylvania, who calls me about once a semester to say, hey, I'm completely changing my intro to film class because I'm bored teaching these films. Is it that something about the old films doesn't work? I don't know.
He's just tired of teaching it. So there's real scope for changing things based simply on preference, and not on what we believe does work or what could work better. And that creates all sorts of problems. The result is that quality isn't the variable driving choice, and we're duplicating a lot of effort. It's hard for someone to consume resources if they don't have a basis for choice: they don't know what the pedagogical intent is, what context of use it's targeted for, or whether it worked at all. And once you hit this proliferation, reuse isn't really a viable option if you can't just find the thing that you actually need. It's hard to evaluate, because we're not building pools of data that are deep enough; the use is diffused across a large number of resources. And that makes it hard to improve, and hard to scale up the things that are successful. All of which is to say: is any of this actually effective? It's a mess.

And we're not the only people asking this question. If you were at yesterday's keynote (sorry, wrong slide; I can push the button too), you heard Dr. Cantor ask exactly the same question: how do we know what's working out there? How do we evaluate this stuff? So from our perspective at OLI, we would say that an OER is effective if it can demonstrably support students in meeting articulated, measurable learning outcomes in a given set of contexts.

We're going to risk throwing off the rest of our presentation right now and ask: is everybody comfortable talking about effectiveness in this way for the rest of the presentation? Did we miss something that you think is important? Wow. Yes, sorry, go ahead.
So the only thing I would add to that is, you know, some of the research you've done on the statistics course says that students are able to achieve the learning outcomes more quickly. Especially when we're looking at the challenges around developmental or general-education lower-division courses, the pace at which learning outcomes are achieved is the one other piece that's critical.

So that's what we intended by "in a given set of contexts": we have to acknowledge that context is important when we're measuring effectiveness. So we agree.

Can you define "measurable learning outcomes" for me? And maybe I'll explain where qualitative and kind of new pedagogical ways of thinking come in. Whenever I hear "learning analytics," or wherever I've seen it, I've seen it in terms of: here are multiple-choice questions, here's the way you're going through the module, and it will tell you where you are and where you need to go. But a lot of us work in, you know, program-based education, and social participation in a diversity of spaces, where that doesn't really apply. So I'm wondering how you guys are thinking about that and how your model connects to it.

Yeah, absolutely. That's a big question, so let me spend a few seconds on it. I can actually give an example. We're working on a number of social learning tools; one we're working with is called Knowledge Forum. It's a collaboration we're doing with the University of Toronto, and it's basically a discourse space for folks to engage in knowledge building: the idea of developing 21st-century skills by authentically engaging in discourse around a set of materials.
And so we're not doing multiple-choice questions there, but we can still have goals for students to come out of it with. One of the goals in statistics, for example, is that we want students to, as we say, authentically participate in the discourse of the domain, so that when they leave the course they can speak like a statistician; they can understand something in the New York Times. So how can we assess that? We can develop rubrics to measure their engagement in these environments. We can look at their peer interactions. We can look at the growth and spread of ideas in communities. We can look for new forms of assessment that go beyond multiple-choice questions. Multiple-choice questions are one vehicle, but there's also this project, the HP Catalyst Initiative, that's looking at this question of new forms of assessment, a global consortium project that we're involved in. So it's not that everything has to be cognitive tutors or multiple-choice questions; there are other ways we can use data, even in those contexts. But I'm glad to talk more about your perspective in greater depth.

In this definition, I'm not really sure about the phrase "demonstrably support students," what that means. It seems like that's a little fuzzy, and I'm not sure how you would actually evaluate it. Looking at a correlation between "students that completed this learning module" and "did better on this assessment": is that what you did?
Right, I'm not real sure; that's pretty fuzzy for me as a definition.

So I think what that touches on is how we're defining and measuring our learning outcomes. If our learning outcomes are actually measurable, then we're able to see that a student who has taken advantage of, or made use of, a specific OER has performed better on those outcomes, has achieved the outcomes in a better way, than students who haven't. That's great; that sets us up for where we're going with the talk. So if we can hold that question and come back to it at the end, we'll see if we've answered it as we get a little further.

So if we can move forward with this definition of effectiveness, the next question is: why are we only taking this on now? Well, it's hard. It's hard to even define; I mean, the discussion we just had gets at some of the challenges. It's costly. It's not something individual faculty members can do alone, simply because they're not given the time to do it, but also because there's a whole range of expertise needed around assessment, evaluation, and learning science, and we don't always have the mechanisms to support them in that way. It can be threatening: if we know something doesn't work, we have to do something about it. And we have disparate systems with different standards and different ways of collecting data, and then this question of how we measure effectiveness. And what this all comes down to is that to make this happen,
we need better enabling processes and systems. That's right. Even when the data is there, actually driving these feedback loops requires tools that let individuals, educators, and administrators look at and interpret what's happening in the data. This is where the continuing demand for learning analytics platforms is coming from, and one of the reasons it feels like a pretty hot topic of conversation at this conference: people are starting to recognize that we've got lots of information but don't know what to do with it.

So what do we do? Okay, we've had a big discussion about the need for learning analytics. But what does it really mean in practice? What is learning analytics, and how do we create and use it? We think this conversation ends up being difficult because often the tools, whether it's us building them or other organizations, end up attempting to build proxies for learning rather than attempting to authentically engage in the assessment and evaluate whether things are working or not. And this is natural, right? A lot of it comes down to what we do in the academy when we give grades: grades, in the end, are intended to be a proxy for whether or not a student is achieving their outcomes. Similarly, when we're talking about learning analytics, we're looking not just at the immediate actions of a student against a learning outcome, but at how that student exists in the larger context of a class, and in the larger context of a university. So we're using this phrase "learning analytics" to try to capture lots of different things, ranging from authentic assessment of learning outcomes all the way up to: how is this student performing and engaging in my class? How are they doing in their university or college career?
And so this brings us to our definition of what we're looking for in learning analytics, which is to say that data collection alone isn't enough: data isn't realized unless it's used, and the sense-making can actually be a lot harder than the collecting. Sometimes it's easy to collect data and really hard to make sense of it. So we're defining learning analytics as reporting that allows you to make decisions, to intervene, to be actionable.

I'll give an example. It's informative to know that there's a correlation between a given set of student behaviors and success: let's say logging into Blackboard, or registering for a course, or signing up for an OER by a certain date. We've actually heard this one before. That's great, it's really informative, and you can use it to predict at-risk students. But how do you intervene? That predictive power goes away if you make the intervention: make everyone sign up early, and the predictive power disappears. Or, we also often get instructors asking about things like participation data, which again is very useful to know: are your students participating, who's participating, what are they doing, how much? But is making every student do everything really going to give us a better picture of where their learning state is? Because that's what we're talking about.
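To make this concrete, here's a minimal sketch, in Python with invented data, of the kind of predictive-but-not-actionable analytic described above: a simple correlation between an early-signup behavior and passing the course. The numbers are hypothetical; the point is that a correlation like this can flag at-risk students, but mandating the behavior would not by itself change outcomes.

```python
# Hypothetical illustration: correlating a student behavior (signing up
# early) with course success. All data below is invented for the sketch.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx ** 0.5 * vy ** 0.5)

# One row per student: signed up early (1/0), passed the course (1/0).
signed_up_early = [1, 1, 1, 0, 0, 1, 0, 0, 1, 0]
passed          = [1, 1, 0, 0, 1, 1, 0, 0, 1, 0]

r = pearson_r(signed_up_early, passed)
print(f"correlation between early signup and passing: r = {r:.2f}")
```

The correlation is real and useful for prediction, but the causal story (and therefore the intervention) is exactly what it leaves open.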
We want to make sure that we're collecting data that can lead to interventions that improve outcomes. And part of this ends up being about what kinds of analytic tools we're trying to develop, making the distinction between academic management analytics, the things that help you predict at-risk students across the whole of your institution and figure out which behaviors are useful targets for supporting them; classroom management analytics, really understanding and predicting, across learning outcomes, how students are engaging with the material and where they're being successful; and then, down in the weeds, how students are achieving individual learning outcomes, where they're failing, and what you need to do differently in your classroom to serve those students.

All of this leads into the problem of data collection. To drive these systems, what we need are common mechanisms for data collection. This was made really apparent to me recently through participation in an NGLC grant; David Wiley and I were actually just having a conversation about this. When you have lots and lots of different kinds of data, at varying degrees of richness, being collected, it's almost impossible to apply a standard set of tools and give faculty information that's actionable. So one of the first steps toward being able to build a standard set of tools and platforms is getting at data collection. These, however, are technical problems, and the technical problems aren't the hard ones to solve. It's the second set, the social and policy issues, that are hard; if we can take care of those, the technical problem of data collection is easy.

So where do we want to go from here? The ideal situation would look like this. We have common data standards and comparable metrics that we can all take advantage of. We have analytics-enabled OER: we make these platforms and systems available as you create, so the power of data collection and analytics is just there. We develop some commonly accepted approaches to ownership and privacy, and we all share a community-based commitment to measuring effectiveness through assessment.

So how do we approach this? Well, let's start by bringing together what already works: what already works for data collection, what already works for gathering evidence, and what works for analysis tools. There's already a lot of great stuff out there, and that's what an awful lot of this conference has been about, at least from my perspective: finding out just how much is out there that either isn't well publicized or that we just haven't been talking about. We can run through these. Within OLI we have our Learning Dashboard; it gives instructors a learning-outcomes-centered view of student performance. We work with the Pittsburgh Science of Learning Center, which has a large data repository called DataShop, designed for learning scientists and educational data miners. The OLnet Evidence Hub is collecting evidence about what makes OER effective and what makes OER useful. We have projects like the Learning Registry, which is trying to tackle these data collection and distribution problems by forming schemas that let a lot of us talk about data in common ways. And there's also some work we're doing in our own project around media and evidence.

And yes, this is what the current CC OLI project has really been about: explicitly building courses that address the gatekeeper problem, but doing so in a way that isn't about building a single OLI OER intended to solve our problem. It's instead about creating a community that's going to help develop, evaluate, improve, and use that OER, so that once you have something that works, you're able to improve it in specific directions for the more individual contexts that you need.

So we have some existing things, but we also need to build new things in each of these same areas. A lot of this work is driven by different types of data, so we need to define what those types are. We have metadata; paradata; contextual data, the context of use; behaviors, like logging into a system, viewing a page, watching a video; and interaction-level data: did I respond correctly or incorrectly, what practice opportunities are available to students and what skills go with them, and what support did they receive from an automated tutoring system?
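To make those tiers concrete, here's a minimal sketch in Python of what a common interaction-level record might carry. This is not any real standard: `LearningEvent` and all of its field names are our own invention, purely illustrative of the kinds of information (metadata, context, behavior, interaction-level detail) such a shared record would need to hold.

```python
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class LearningEvent:
    """One illustrative interaction-level record spanning the data tiers
    described above. Field names are hypothetical, not a real standard."""
    # metadata: what resource is this?
    resource_id: str           # e.g. a module or activity identifier
    # contextual data: the context of use
    course_id: str
    institution: str
    # behavior: what did the student do, and when?
    action: str                # "view_page", "watch_video", "attempt_question"
    timestamp: float           # seconds since epoch
    student_id: str            # pseudonymous identifier
    # interaction-level data: outcome, skills, tutoring support
    correct: Optional[bool] = None  # None for non-assessment actions
    skills: tuple = ()              # skills tagged to this practice opportunity
    hints_used: int = 0             # support received from an automated tutor

event = LearningEvent(
    resource_id="stats-m3-activity-2",
    course_id="intro-stats-fall",
    institution="example-college",
    action="attempt_question",
    timestamp=1325376000.0,
    student_id="s-0042",
    correct=False,
    skills=("interpret-histogram",),
    hints_used=2,
)
print(asdict(event))  # serializable dict, ready to share across systems
```

The design point is only that when every system emits records with the same tiers, the same analysis tools can run over all of them.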
And then there's even rawer data that provides the opportunity for finer-grained education research to better the science: things like mouse clicks and individual interactions in the system.

We also think one of the other things we need to build is better mechanisms for making sure the data is shared. Right now all the data that's out there ends up wrapped up in individual LMSs. When we're lucky, we can get some common agreement: okay, I'm participating in this grant, and part of that means I'm going to feed the data back to you. But what we'd like to see is a world where taking advantage of an OER means explicitly agreeing to share the data. That's what this mock-up of a Creative Commons license here is about. We thought Cable was going to be in here; we were going to harass him about it. Lost my opportunity for that. The sense is that if I'm contributing a resource into the commons, one of the things I expect to get back from that contribution is that folks using it will put the data back into the world, so that we can take advantage of it and continue to improve the resource. Absent these kinds of mechanisms and policies, the data will continue to hide on individual servers, and really the only way to make this happen is through a community-based approach.

And this is the way our project approaches course development: we bring together different areas of expertise in interdisciplinary teams that draw on the expertise of Carnegie Mellon and also the other institutions that we work with.

On the previous slide, what was the SD? Oh, shared data. So you're proposing a new license that would be Share-Alike, but also... I don't know, it's at the top of the slide.
We actually argued a bit about whether it should be Shared Data or Evaluate-Alike, but that's something we can get into in conversation.

So the real benefit of this community-based approach, we feel, is that it allows us to build alignment in ways that help break that reusability paradox, because we can bring different perspectives on a domain together, find that common core, and build alignment around a common set of student outcomes. Yes, we'll have to adapt for business statistics; we'll need to add some different examples there; sure, we need to make sure the context is right. But if we can build pedagogical alignment in a community, then we create greater opportunities for reuse, and that paradox starts to break down. From our perspective, that's one of the only ways we're actually able to build better tools. I think this is one of the ways to resolve the tension between the infinite variety of courses and the fear that we're trying to build the one single course: communities coalesce around a resource, and it starts to be improved and developed in specific contexts.

So we bring this together to have a full spectrum, and by this we mean the full stack of data, so that we really capture a complete picture of what's happening in the system and in the use of the resource, and use that to inform improvement: a comprehensive approach to data. What this eventually gets us is that even the current analytics tools we've shown you end up being the tip of the iceberg; we end up being able to build incredibly effective learning intelligence systems that are not just going to improve the OER but are, in the end, going to improve the student experience.

And there is a consequence here worth acknowledging, which is that we're giving something up in this approach: we have to be open to having our minds changed by evidence. We have to take a scientific view of constructing these resources, and when things don't work, we have to be willing to improve things, abandon things, change things, modify things, to really find what does work and what the relevant contextual variables are that make it scalable.

So, next steps. We've outlined a bit of a general approach here. We were talking a lot about standards yesterday and the problems in that space, and we feel that innovation has to come first: innovate, then standardize, then scale. So we'll jump to our conclusions, which are that the ways to get at this are not just a shared community commitment to assessment and evaluation, but building out this definition of analytics-enabled OER, so that as we continue to create and improve OER we can do it toward this definition; a common approach to data; and then building these shared and private analytics platforms. It's recognizing that some of these platforms are going to need to exist in the commons, but that doesn't preclude folks from building their own platforms, which are going to exist anyway, whether commercially or inside individual institutions.
It's about speaking the same language. And then these platforms need to be the enabling infrastructure that lets the technology get out of the way of folks building these resources, so you can build whatever your OER is and not have to think too hard about all the technology, the data formats, and the interoperability concerns that go with an analytics-based approach. So: we've completely destroyed our intent to have a good discussion by talking too much over our slides, but questions or comments?

I think the things we're currently using most extensively are the Learning Dashboard tool that we showed, which is intended for instructors and which we're now building out to be a better tool for course developers, and the tools available from the PSLC's DataShop, which are publicly available. Further, we've taken some of our data and already made it publicly available there, so both the tool and the data are available, mostly from our stats course, though we're looking at other courses.

Are you using data mining? Yeah, we actually are. We have models; we actually have a competition going on around some of the models. We have a researcher who created a knowledge-component model for statistics, and we engage with the community of data miners, the learning science and educational data mining community, through the Pittsburgh Science of Learning Center. We run open competitions to try to find better models. The attempt is to beat the researcher: can you build an automated model that will do better than the researcher can do?
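For readers who haven't seen the kind of model these competitions are run over: below is a minimal sketch of Bayesian Knowledge Tracing, the classic student model fit against tutor log data in the educational data mining community. The four parameter values here are invented for illustration, not fit to any real dataset.

```python
# A minimal Bayesian Knowledge Tracing (BKT) update. The parameters are
# illustrative placeholders, not values fit to any dataset.

P_INIT = 0.3    # P(L0): prior probability the skill is already known
P_LEARN = 0.2   # P(T): probability of learning at each practice opportunity
P_SLIP = 0.1    # P(S): probability of a wrong answer despite knowing
P_GUESS = 0.25  # P(G): probability of a right answer despite not knowing

def bkt_update(p_known, correct):
    """Posterior P(known) after one observed response, then apply learning."""
    if correct:
        evidence = p_known * (1 - P_SLIP)
        posterior = evidence / (evidence + (1 - p_known) * P_GUESS)
    else:
        evidence = p_known * P_SLIP
        posterior = evidence / (evidence + (1 - p_known) * (1 - P_GUESS))
    # transition: the student may learn the skill on this opportunity
    return posterior + (1 - posterior) * P_LEARN

p = P_INIT
for correct in [False, True, True, True]:  # one student's response sequence
    p = bkt_update(p, correct)
    print(f"P(skill known) = {p:.3f}")
```

A model competition of the kind described amounts to finding parameters (or richer models) that predict held-out student responses better than the researcher's hand-built knowledge-component model.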
Sure, yeah. So what's the process, and the earliest date, to get a prototype or a reference implementation of this shared-platform idea, something that could federate experiences from lots of different sources and places?

It's something we're moving on aggressively right now, talking with some of the Learning Registry folks here, and others, about opportunities for bringing systems together. We're working with the TAACCCT grantees in the Department of Labor consortium, planning services for those folks, and we want that platform to evolve to provide some of these basic analytic functions, so that folks building those resources can bootstrap their way through these cycles in months, not years. That's the short answer. And so that's actually a call to everyone here: come talk to us, give us your thoughts on this approach, and let's figure out how we can work together to do the innovation, then standardize and scale. Thanks, and sorry to have run long.