Well, hello and welcome. I'm very excited about this opportunity to talk a little bit about our journey over the last year of taking a clear look at how user experience and instructional design really can and should go together. In this particular talk we want to explore how we as a community can take advantage of what the user experience field has learned through building web technologies and other, more commercial designs, and how that can play into our educational systems and the courses we design. One of the reasons I'm very excited about sharing this is that when we started this journey (and I don't know if I've ever actually told you this) I didn't really think it was going to make that much of a difference for our project, in the sense that we already thought we were thinking about what the learning outcomes are and what the students need to get through this piece of content. But when we actually started going through some of the give and take in the conversations, I realized: oh, this is a really important perspective to bring. Especially as we deal with students that aren't necessarily traditional students. As Josh was talking about this morning, these students are very distracted and have a very limited amount of time, so how can we design things that really help them focus where they need to focus? And that doesn't necessarily mean just in the learning design; we also don't want to create barriers to the learning that don't need to be there. Okay, so today we're going to give a little introduction of who we are and what we create, and then I'm going to turn it over to Bill to talk about user experience and some of the background of what's being brought into this.
Then we're also going to give a couple of concrete examples of what these collaborations look like, talk about some of the lessons learned and our future plans, and then I'm hoping we can leave a lot of time for questions, as I really do want this to be a dialogue and a discussion. First, who are we? We are a research and development group at Carnegie Mellon. We try to take the best that we know about how students learn, and the best that we know about what they need to learn as we talk to our subject matter experts, and then creatively and scientifically create learning environments that will improve teaching and learning within higher education. And if you were in the last conversation, you know that we also collect data on student interactions (very rich, longitudinal data), analyze that data, and use it to feed powerful feedback loops. That process is always being iteratively improved. So what kinds of things do we create at OLI? We create learning environments. What does that mean? First we bring together a team of people and look at the learning objectives: what do students really need to know in this domain or in this project? That's the foundation of everything we create. Then we ask: if we know what we want students to do, how are we going to assess that? How are we going to measure it? What are the types of things that will tell us whether a student is attaining these objectives or not? And then we think: okay, now we know where we want the student to go, we know where they are, and we know how we're going to measure whether they get there. What types of activities do students need to engage in in order to reach those objectives? As we create the learning environment, what we're trying to do is align all three of these things. That's what the learning environment really is: an alignment of those things.
So when you go into an environment, what we're trying to do is say: this is what you're trying to learn, this is how we're assessing you, and these are the activities, and they should align so the student can focus on what they need to focus on. When we create these learning environments we have a lot of people with different expertise on the team, and one of those areas is user experience. Over the last year we've really been working on how we can integrate this perspective more closely into the project. So I wanted to talk a little bit about user experience. This is a field that maybe some people in the room are not too familiar with. I'm not going to hit you over the head with lessons in it. You can always start with Wikipedia when you don't really know what something is, and it will give you the ISO definition of user experience: a person's perceptions and responses that result from the use or the anticipated use of a product, system or service. Okay, that's nice. So what does that mean for us at OLI? What we take from this is that we incorporate usability studies: the design, development and evaluation of the experience that students and instructors have, as well as authors who are creating things in the OLI environment. So, I really like this. There's an article out there, and I have the link to it here. It poses this question when it's talking about user experience in general, and it is an honest question: how smart are your users? A lot of people will dismiss user experience, especially when they're thinking: well, I know that my users are familiar with technology, they know how to use a computer, I don't really need to worry about it. And the answer may surprise you: it does not matter. They can be geniuses or morons, however you want to classify them, but if you don't engage their intelligence, you can't depend on their brain power.
So it's really important to know that just because someone may be capable of doing something if you sit them down and really direct them to do it, that doesn't mean that when they're reading Twitter feeds or doing other things they're paying as much attention. You can't count on having 100% of their attention. He has a nice chart that lays out a user's desire to be doing what they're doing and maps it on a grid against complexity. Something that's really simple to do and that people really enjoy, like Angry Birds, he puts way up there: it's a really simple interface, and lots of people love it. However, we live on the right side here. Learning is a complex activity. It is not like playing Angry Birds. And depending on the student or the instructor you're working with, desire can actually move up and down here. Either way, we have what he labels the danger zone; you try to avoid designing with low desire and high complexity, but we're going to be in there sometimes anyway. So in short, learning and instruction are complex tasks, and they tax the user in a way that a round of Angry Birds or Fruit Ninja (to name some other games) does not. So you can't think about ignoring the interface and the user experience, because these are people who are going to be taxed. They're passive. If you read the article, it gives this list of adjectives, and our users are all of those things. They're stressed; they have other things that they're doing. So you really need to focus on this experience. But I should also say, even the developers of Angry Birds pay a lot of attention to user experience; I'm just saying we need to pay even more. If you're in the field at all you know Jakob Nielsen, and this is a good quote of his: consistency is one of the most powerful usability principles; when things always behave the same, users do not have to worry about what will happen.
Instead, they know what will happen based on their experience. This is the kind of thing you want to set up for your users, so they're not spending their time thinking about where they're supposed to click or what they should be doing, and can focus on what you want them to be doing, which is learning or instructing. So let's talk about the history of user experience at OLI. As Renee was saying, we would really dig in on certain issues when we knew there was a problem. I like to call it HCI triage: we know we've got some problem here, so let's do some user studies and some evaluations and figure out what we need to do to go fix the interfaces. What we're working on now is integrating user experience into software development right from the start, and that's something separate from what we're talking about today; that's platform usability. Now we're also working on integrating it into the course development process itself, working with the content experts as they develop activities that are delivered through this platform. So how are we doing that? This is a simplified version of our course development process chart. We break it up into several phases, partly just to keep things straight, and it's not always quite this simple, but this is really where we're trying to bring in user experience and figure out exactly where it is most effective. Right now, in project initiation we do our needs assessment. Well, that's a perfect place to say: hey, are there places where we know right from the start that we're going to have issues, where we need to bring in some user experience expertise to take a look? This might be things like setting up user studies. It might be: gee, there's a whole new lab environment that we're going to be creating; let's take a look up front at the user experience before we actually start programming and then have to go back.
In pilot development, we always develop a pilot, either a module or a unit, some chunk of material that we test with students before we go out and create the entire thing. A lot of times in this area we'll come across new activity types, and I'll actually show you an example of one of those in just a minute. Then within the pilot evaluation, as the students are using it, there is a perfect opportunity to bring in some structured user studies. Full course development, I think, is still more of a triage, more of an as-needed stage. Hopefully we've worked out the bugs in the pilot unit, but there's always a chance that new things will come up. And then, ongoing, we see a need for continuing studies of the full course, but also consultations on emerging technologies that may be changing, or different student populations coming into the mix, that sort of thing. So this is one of the examples from the pilot module area. This came from a content expert. He was trying to create some activities for students in the anatomy and physiology course, and we were trying to come up with some new, creative, almost game-like things that students could be playing. He wanted to do crossword puzzles. It sounded very good pedagogically. There are definitely different cues and different skills that students are using: it taps into their spelling of the terms, it taps into memorization, it taps into all sorts of things. So we were talking about what it would take to actually program this type of interface, and we got, I guess, 20 minutes into the conversation, and we turned to you and said: what do you think? So I'll let you share your thoughts. So I'll tell you what I think. The role I play in the project is not to decide whether or not something is good for a student's learning.
My role is to take a look at what the user experience is going to be like engaging in this activity, and whether it is going to assist in that learning. And in this case, I had my doubts. Think about it: how are you going to do a crossword? On paper, there might not actually be a lot of issues. People are used to filling these out; they change their letters, they erase, they do whatever. On a computer, it's actually a bit more clumsy. You have to communicate things. You want to give them a box where they can just type the word, because this is memorization of a word, so you want them to get the whole word out; you don't want them going box by box. And if you don't do it box by box, you need to communicate how many letters they can type, and you need to resolve conflicts: you already have one word going this way, and it doesn't fit the word you just gave me, and you have to communicate that to the user. This is not unsolvable; you can do crosswords online. But it is going to get in the way when you're really just trying to get someone to be learning. So this is one (maybe easier to understand) example of where bringing this expertise to the process can say: hey, this might be a great activity, but when you put it onto the computer and you put an interface on it, you now have students who are doing two things at once. They're trying to answer the questions and learn the material, and they're trying to learn the interface with which to answer the questions. And you don't want students to have to be learning that interface while they're trying to learn. I do want to pick up on a point that you just made as well: our main goal is to make sure that we create things that enable students to learn, or to interact with the material in such a way that it's a very authentic learning experience for them. That's always our goal and our mission in any of these conversations.
So I did want to bring that to bear. And when we're talking about this as being an activity that would be great on paper but not necessarily online: do we want to spend all of our effort and our programming expertise trying to solve the interface problem, or are there other ways we can create these types of tasks that aren't going to be so taxing and are going to be more comfortable for the users of our system? Right, and it's a balance. That really was the question: well, okay, we can spend a bunch of time researching current interfaces and coming up with the best one we can, but is that really worth our investment, or can we just come up with a different activity that will still engage them? And memorization is a tough thing. Flashcards bore people to tears; they might be effective, but still. I'm not saying that we need Angry Birds to get students to learn, but you have to strike that balance between engaging someone and not creating some interface that they then have to learn at the same time. There's also some broader UX work that we've been doing. The most important thing here is user testing. User experience is a broad term, but my background is actually in human-computer interaction, which means you design something, you sit down with users as they use it, you make notes, you make videos, and you do this a lot. Then you put all of that together and you look at what things are working and what things are not working. You come up with possible solutions, you put them out there, and you do it again and again. So that's something where we are really looking to that broader process. This one piece was a narrow discussion of the crossword, but there are broader things in the system, such as the navigation, or the way quizzes and the quiz homepage actually give people negative affect, because it's got all these rules about how the quiz is going to be administered. That's actually important to pay attention to.
I'm going to mention accessibility here, because at OLI we're also taking a broader, non-technical approach to accessibility. There's Section 508 and WCAG, which, if you're dealing with federal funding or community colleges, you're familiar with, but we really want to go beyond those requirements. As part of that federal program (I think I've got all the right A's and C's in there) that I was talking about yesterday in the open session, we're collaborating with CAST, who do universal design for learning, and we're bringing that into the course development process too. That is accessibility, but it is making sure that these assets are accessible to everyone, in ways that go way beyond just handing them to a screen reader. Here's another example. We gave the one example of a really specific element in the course, where there's one activity and we're trying to figure out what we should do with it. Here's a broader application: this summer I spent some time mapping out our authoring process, and we were looking at that, saying, okay, first, what is it? One of the things we discovered was that there were different people using tools in different ways, which may or may not be best practices, but there are actually a lot of ways to do things. Once you actually interview people and watch them doing these things, you can map it all out and say: well, we can really streamline this for people, save them a lot of time, and really give them a better experience on the authoring side. And this is something that is not course specific; this is looking at the user experience for anybody who's using the system. One of the things that came out of that is that our tools don't even approach this next step, but a lot of the tools you see out there are really focused on word processing and blog publishing sorts of tool sets.
So now that we're looking at developing our full suite of authoring tools, what can we do from the user experience point of view to set up the pedagogical process of generating this content? And the last bullet there is really important: ensuring that the tool supports iterative development. We would love to bring things in (and actually CAST has some tools that do this) where you can look at the usage of the assets you're creating as an overlay in the authoring tool, to see where people are performing and whether they're using something or not. Those are the kinds of things that, if you really think about the users and what they're trying to accomplish before you start implementing a tool, really change the direction of what that tool is going to be. So, looking forward: a lot of what we've told you is process that we've started really using in the past year, heavily in the past six months, and so we've had a lot of little opportunities to examine activities that are much more complicated across the board, things like simulations. Going forward, we want to streamline. Since we're all still putting this into a process, there's still a little bit of: when should I send you an email about this? When do you want to be involved? So we're still figuring some of that out, but the bigger piece is going to be evaluating this process as a whole. If you've heard anybody talk about OLI, you've heard about iterative improvement, and one of the things that we want to help work on improving is this process itself. We have not been doing this long enough yet to have done an entire course front to back, to go through those five bubbles. When we have completed that process, we will make a concerted effort to go back, look at it, and say: okay, first of all, are we happy with the user experience that came out the other side?
Regardless of the process, do we feel like we accomplished what we needed? Or do we look at it and say there are still things in here that we need to address? And if there were gaps, how do we go back to the process and fix it so that we can catch those gaps, and how do we better utilize our time through this process? So that's really what we're looking to be doing at OLI: really integrating this user experience piece into course development, which we believe is actually very important, and fairly novel, to bring it out of the software realm and apply it to the development of OERs. And with that, we'll take questions. No questions? See, this is easier than Angry Birds. Thank you. So, based on your experience of using design in course development, what should you not do in course development? What should you not do? Reinventing the wheel is a very big one. If I go back to Jakob Nielsen's quote: when there is a sufficient widget (if you want to call it that), an interface, a device for asking someone for some kind of input, that already exists, rely on users' pre-existing knowledge. It's very, very tempting, especially if you are a software developer or you have some software developers, to say: oh, I'm going to write this really cool thing. And it might be really cool, but the person that you're giving it to does not care if it's really cool. They care if they're learning from it, and they would much rather have an experience where they can go and engage with that content than have to learn that new interface.
To echo that as well: I head the course development team at OLI, and one of the things that I've learned here is to not get too married to an idea because, oh, this is either really great, or we think this is a novel approach to a pedagogical problem we've been wrestling with. Instead, really try to take a look at our users as a whole and what other things are out there that we're competing against. A lot of times, when we talk about competing against all of those things, we think: oh, we've got to have a whiz-bang thing that looks just as good as Angry Birds, and it has to be engaging and fun. That's really good stuff, but it's not necessarily essential, and we have to keep our minds on what's essential: what are the actual goals, and what is going to make it easy for students to attain those goals? So that's been something that I've been learning. And if you're looking for something practical, this comes from that article; I actually keep this over my desk, because a lot of times people hear user experience and say: oh, you're just trying to fix things for people who can't use it, which is not what we're trying to do. So this person actually breaks it out: if you're thinking, oh, that's a stupid user who just can't figure it out, consider what that person probably is, especially given what Josh was talking about today about the typical user of your OER. Are they stressed? Well, let's take a look at some of the examples he gave. Tired? Most definitely; if you're taking muscle relaxers in order to sleep before you can go back to school, you're tired. Untrained? Typically the case. Passive? That will depend on the student. Independent? More and more. Distracted? Definitely. If you actually go into a student's dorm room, even at a traditional institution, and observe them using an OER (and not in an over-your-shoulder way; you give them time trackers and you monitor what they're doing on their computers), it's back and forth:
it's Facebook, it's mail, it's your OER, it's back to Facebook, it's the NBA ticker, it's back to Facebook to make a post, it's back to your OER. And this is the traditional student. So the users that you have, as much as we would like to believe they are dedicating themselves 100% to your OER at all times, are not. What did you learn? That's one that we're actually working on very deeply right now; a lot of passionate opinions. What's my opinion? There's a problem of dual function in navigation through OERs. If you're talking within, say, one module of things, a lot of times from a strict interface point of view you don't want to be creating new fancy things. For example, I would recommend against using a desktop-based version of CoverFlow to go through your content (and believe me, I've heard people suggest CoverFlow, because it looks really cool on a touch device). You want to be using things that look like the kinds of navigational tools you see on the common websites that your users are using. Look at (I mean, Google is not actually a good example) but look at someone who's got a lot of information and a lot of things they're trying to convey; I would suggest, say, Amazon and Wikipedia. As a matter of fact, I even prefer Wikipedia's navigation on the mobile platform to their standard navigation. But you don't want that to be overloading someone who is just trying to get somewhere. On the flip side, there's an argument on the pedagogical side that you want to make sure the navigation is at least exposing enough of your current area so that the user knows pedagogically where they are in their course material. That's a balance that we have actually spent quite a lot of time trying to strike, so check back with us in January and hit our site and see what we end up with, although I think we've actually got a pretty good prototype that we're going to be piloting internally and testing before January. This might be the same question but in a different way: is there a
tension between what's inside the course and what's outside the course, when you send the user off to a website that's not your website, or would you frame it, or something like that? Yeah, there is, although in our particular circumstance we very rarely send students off-site, so that's not something we've put a lot of attention to. But I will say, to try to give another practical example: users, whether you would think it or not, are still really passionate about not having windows pop up. For a while there was this feeling, because tabbed browsing had taken over and people were getting highly familiar with it, that it was okay to open new windows. User testing done by the Nielsen Norman Group this year still shows users getting confused when new tabs open up; they try to hit back and it doesn't work, so in some cases it's almost even worse than popping open new windows. One of the things we're doing right now in OLI is being very concerted about finding all the places where, even if it's a quiz, we used to just kind of pop into a new tab (you know, a big "click here to start" button that pops it out); we're trying to eliminate all of that as much as we possibly can. So that's one practical thing I can tell you about moving around and changing windows. How do you eliminate that? Well, in our case, you can do it in two ways. We're trying to bring things in line, sequenced inline: instead of sending you off somewhere, we actually find a way to sequence that into the materials so it's part of the natural logical flow. Is it embeddable widgets? Yes. And depending on what you're doing, you can also do overlay pop-ups; that's something people are getting a little more used to, although I'm not as much of a fan of those. Do we need to stop?
I'll get to all of these questions. Yeah. This is kind of a philosophical question. It kind of goes back to the constructivist versus traditional thing. I see this as: how do we make traditional instruction more usable, easier, more accessible? But maybe it's still a fairly artificial setting, maybe painful. It's like you're trying to inch the user experience out from that bottom right quadrant, up a little bit, to make it desirable, but it's still... No, that's part of the problem: we can't really do that. What we can do is make sure we're not driving them in the other direction. Poor interfaces can take people who want to use your site and drive them straight to the bottom. There might be ways to try to lift people up a bit, but the easiest, most destructive thing you can do is have a really bad interface that drives people away. Even if you have a great interface that, once they learn it, is really great for them, if they felt negative affect in their first experience with you, it's going to be a lot harder to get over that and bring them back up to enjoying what they're using. Really, the best interfaces are not noticed by users; the best thing you can do is make it so they don't know you have usability experts working for you. A follow-up to that: I was reading about the Khan Academy, and how one of its successes had so much to do with the interface and how easy it was. It's very common; it looks a lot like other sites that students have already used. How do we capitalize on that when our institutions kind of force a platform on us? Can you give me an example? For example, we use Sakai, and I enjoy Sakai. I like it, but the population of students that I worked with just the last 8 weeks are developmental, and it took them 3 weeks just to get to the point where they were comfortable navigating Sakai enough that they were able to
actually access the content and use it, and I don't want to experience that again with this group. That's difficult. Traditionally, a user experience person has difficulty convincing their company, and so there are actually training sessions on how you maximize ROI: you show that by improving the user experience you get better ROI, you get better click-through, you get better return. It's a lot harder to do that here, but in a market, in a corporation, that's usually the argument you make as a user experience person: I will save you money, I will make you money, by making your customers happy. The closest parallel that it sounds like you might have is: I'm trying to teach people, and this tool that you're having me teach with is not teaching them; it is getting in the way of my instruction. But without knowing more of the specifics, that's the only suggestion I can give. Yes? I have a question on that point. I went to the kickoff on Monday (we're the only California grantee), so this is a gathering of 20 million dollars' worth of investment, and I'm wondering, as you're listening to how they're going to roll this out, whether you're planning a two-year implementation. What role do you play in that particular process? Are they going to be open? Are there going to be courses or books? And ultimately, how are you going to participate to make sure this experiment by the administration actually might be workable? I'm glad you asked that question. OLI is part of the coalition that is providing support for all of the grantees. For all of the grantees, we're going to be creating resources on good course design, and I would expect user experience to be a part of that: different kinds of resources that people go through. There's also going to be a group of people who are going to be piloting with us, using our tools and our platforms, so that they can gather the data to build in some iterative improvement.
And then there are also going to be some projects that will be working with us in complete co-development. So if you remember that team circle that we had, they would be the domain experts coming in, and we would have all of our groups of expertise working with them, as well as a member from CAST who does the universal design for learning piece. So I think, depending on the projects and how things go, different projects will be working with us in different ways. But everybody will be getting as much as we can give and provide as far as best practices and lessons learned and talking through some of the more thorny issues. Does that answer your question? Kind of. I guess the last question I would have, building on that: do you imagine that all of these courses are going to be built in this program? For example, there are three silos here: there's green tech, agriculture, and healthcare and nursing. Are all of those courses going to be built out, and is it looking the same for the rest of the grantees? I think the important distinction is that we're not funded by that grant. We're funded by the foundation to support certain people who are coming out of that grant, and so we will be doing that, but we have not yet figured out who those people are that we're working with. Those three tiers of support, to whatever level we're doing them, will incorporate this user experience piece. The grantees may be creating various different types of resources. If they're working with us, it's most likely going to be things on our platform: modules, learning environments, that sort of thing. But we have not mapped out a plan or some of those relationships just yet. Okay. And when do you think you're going to start mapping some of those relationships? Because they're kicking off already as we're speaking. So how does that work? Well, here's how that works. Okay.
So I think... we didn't have a list of the grantees, and I'm not even sure if we have one yet. Once we have that list, then we can start talking and communicating with people, reaching out and seeing who's interested in working with us. And it's time for us to get kicked out. Yeah. But we can keep talking. Thank you. Thank you.