Hi everyone, I'm Steel Wagstaff. Welcome to our July Pressbooks monthly product update. I want to start off by showing you some new features and products that we've been developing and shipping. First, as many of you know, we released a network catalog not too long ago, and in doing so we did our best to make sure it passed all the accessibility guidelines, but we missed a few things, and we've now fixed those. Let me share my screen and show you, at a high level, what happened. In your browser, you can install tools that run automated accessibility audits. Here's an example of a Pressbooks catalog with a bunch of books and filters. When I ran the accessibility checker, it found five automatic issues. For example, we were using one of the ARIA roles incorrectly: we had applied a role that requires a child element, and the child wasn't there. We also had IDs that were not unique. Every time you use an ID attribute, its value needs to be unique on the page. We have a search bar and then a hidden search bar that displays only on mobile, but in the DOM the same ID occurred twice, so that needed to be fixed. And we also applied an ARIA label to an element that didn't need one; we went a little overboard and did it wrong, more or less. So that's what it used to look like. Now, I ran the same checker on a Pressbooks network with the new code changes, and look at that: no automatic issues in the report. Obviously, that doesn't mean we've addressed everything related to accessibility for every user; an automated checker is never sufficient on its own. But we did manage to remediate all of the automatic issues that were found.
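The duplicate-ID problem Steel describes is easy to check for programmatically. Here's a minimal sketch (not the actual audit tooling used; the markup and id value are made up) that scans an HTML fragment for repeated id attributes using only Python's standard library:

```python
from html.parser import HTMLParser
from collections import Counter

class IdCollector(HTMLParser):
    """Collects every id attribute value seen in the document."""
    def __init__(self):
        super().__init__()
        self.ids = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name == "id" and value:
                self.ids.append(value)

def duplicate_ids(html: str) -> list:
    """Return id values that appear more than once (a WCAG 4.1.1 failure)."""
    parser = IdCollector()
    parser.feed(html)
    return [i for i, n in Counter(parser.ids).items() if n > 1]

# A desktop search bar plus a mobile-only copy sharing the same (hypothetical) id:
page = """
<form><input id="book-search" type="search"></form>
<form class="mobile"><input id="book-search" type="search"></form>
"""
print(duplicate_ids(page))  # → ['book-search']
```

A real checker like axe does much more, but this illustrates why the mobile/desktop search bar pair tripped the audit: the DOM contained the same ID twice.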
And the issues that a client reported to us from that site report have been fixed in the catalog. If you notice other issues affecting accessibility or inclusivity for any of your users, please do let us know. They're important to us; we want to fix them. We want Pressbooks to be a platform that everyone can use and feels invited to use. So that's the first update. Okay, second: when you're publishing a book, you can enter metadata for it, and that metadata includes a whole bunch of information, like the author, the editors, the publishers, etc. I'm going to share my screen and show you a feature that lets you list the institutions that contributed to a book. This was requested not too long ago. The reason we built it is that sometimes you have a Pressbooks network shared by multiple campuses or colleges, and you want your catalog or the directory to display the institutions that contributed to a book so they can be filtered. There's a big list of institutions here that we manually created and maintain. It includes a lot of universities in the United States, Canada, Australia, the UK, and parts of Europe, mainly from the clients we support. Well, some French-speaking universities in Belgium wrote in and said, hey, we'd like our institutions to be listed, and you can now see that we've added several universities from Belgium. So, for example, I could pick any one of these five French-speaking institutions in Belgium and apply it to the book's information. You'll now see that this book has the Catholic University of Louvain as a contributing institution, and down in the metadata it will be displayed for people to see.
If you're watching this video and you're wondering, what about my institution? Why isn't it included? Please let us know, and we're happy to add other institutions as appropriate to the list we maintain. The really exciting thing is that I didn't write this solution. I didn't code it. Our friend Mitch, who's on the call, did; this is Mitch's first-ever PR. So if you're new to software development and you want to do something like this yourself, it can be done. Congratulations, Mitch, and thank you for contributing this feature to Pressbooks on behalf of our clients. Anything you want to add about your experience to encourage others? Well, thanks, Steel. It was fun to do. I have to say I do have a little bit of experience in coding, so I didn't find it overwhelming to find the right place to make the change. The process is really straightforward, and I was happy to do it. It was fun to be able to contribute. That's what I wanted to hear: it's fun to contribute. Thank you, Mitch. The last thing I'll mention, though I won't show it, is a bug that was affecting LTI launch links in the newest releases of Moodle. Moodle 4.x, the releases in the 4 branch, interpreted what's called the LTI message hint differently than we had initially supposed, and when that change happened, the LTI links Pressbooks was creating broke. So we submitted a change to the underlying library Pressbooks uses to generate that LTI hint, and now it's being properly parsed. If you're using the latest releases of Moodle, you should see that LTI connections and links are working as expected. I just wanted to notify people about that. Those were the big customer-facing changes. The big thing we've been spending time on and thinking about over the last month is our product called Results for LMS.
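For context on the Moodle fix: in an LTI 1.3 launch, the platform sends an `lti_message_hint` to the tool, and the tool must echo it back unchanged in its authentication request. This is a generic sketch of that step, not Pressbooks' actual code; all URLs and IDs below are hypothetical:

```python
from urllib.parse import urlencode

def build_auth_request(auth_url: str, login_params: dict) -> str:
    """Sketch of the tool-side OIDC step in an LTI 1.3 launch.

    The platform (e.g. Moodle) sends lti_message_hint in its login
    initiation request; the tool must echo it back unchanged in the
    authentication request. Newer Moodle releases parse the hint
    strictly, so a tool that rewrites it can break the launch.
    """
    params = {
        "scope": "openid",
        "response_type": "id_token",
        "client_id": "pressbooks-tool",                       # hypothetical
        "redirect_uri": "https://pb.example.edu/lti/launch",  # hypothetical
        "login_hint": login_params["login_hint"],
        "lti_message_hint": login_params["lti_message_hint"],  # opaque pass-through
    }
    return f"{auth_url}?{urlencode(params)}"

url = build_auth_request(
    "https://moodle.example.edu/mod/lti/auth.php",
    {"login_hint": "7", "lti_message_hint": '{"launchid":"ltilaunch42"}'},
)
print("lti_message_hint" in url)  # → True
```

The key design point is that the hint is opaque to the tool: treat it as an uninterpreted string and pass it through as-is.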
In this portion of the meeting, I'd like to share a research update that Michelle and I worked on. Michelle and I will split this part. It'll be a bit of a presentation, and there are slides that accompany it; if you want to see the slides or reference them later, I'll put them in the chat. This is basically what we know about the Results for LMS product and what we're focusing on improving for this fall and in the future. I'll start by sharing my screen and giving some context. If you haven't used Results for LMS before, it's a way to connect a book published in Pressbooks to your learning management system. We do that via a specification called LTI, which allows a third-party tool to connect into the LMS. When you've connected a book with LTI, students can launch the book securely and view it in the frame of the learning management system, and it will silently provision a Pressbooks account for them so they can access private content, with all the permissions they need to read the book, or even to create, if that's what you want. The reason we call it Results for LMS is that this add-on not only lets you connect the book to the LMS; when a student completes embedded H5P activities, it can also roll up their performance and send it back to the grade book. So you can see how well a student did on quizzes, quiz sets, or multiple choice questions embedded in a Pressbooks chapter, which helps give instructors and students some insight into the learning that's taking place. The first thing we want to talk about is the user journey: the phases of using this product.
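Mechanically, the grade passback just described is typically done over LTI's Assignment and Grade Services (AGS): the tool POSTs a score object to a line item on the platform. A minimal sketch of what such a payload looks like (field names follow the IMS AGS specification; the user ID and scores are made up, and this is not Pressbooks' actual implementation):

```python
import json
from datetime import datetime, timezone

def h5p_score_payload(user_id: str, score: float, max_score: float) -> str:
    """Build an LTI AGS score JSON body for a completed H5P activity."""
    payload = {
        "userId": user_id,                # LMS-issued user identifier (made up here)
        "scoreGiven": score,
        "scoreMaximum": max_score,
        "activityProgress": "Completed",  # learner finished the activity
        "gradingProgress": "FullyGraded", # score is final, not pending review
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(payload)

body = json.loads(h5p_score_payload("student-123", 3, 5))
print(body["scoreGiven"], "/", body["scoreMaximum"])  # → 3 / 5
```

The real request also carries an OAuth2 access token scoped to the line item, which is why an expired session (as in the multi-tab bug discussed later) means the grade silently never arrives.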
Michelle, as our UX expert and specialist, has broken this down and will give us a high-level overview of what an instructor does and experiences as they use this product. Michelle, I'll pass it over to you whenever you're ready. For sure. We've identified four main phases that instructors go through when they're using the Results for LMS product: foundation building, integration and testing, configuring for class, and revising. The first, foundation building, is where you actually build all of your materials. You're writing your textbook or using pre-existing material, and you're either writing H5P questions or double-checking existing ones and adapting them to your content. Then you're downloading and uploading these to the chapters and adding those sections into the book before you start the next phase, which is integration and testing. In this phase, you preview your activities to make sure they're correct, attempt them to make sure you get the max score, select them for grading, and configure a bunch of grading settings before exporting all of this into the LMS you're using. The third phase is configuring for class, which is all on the LMS side. You're importing the links and files you have from Pressbooks, adding non-Pressbooks readings, links, discussion forums, the things you really do to flesh out the class, and then publishing the course. Then you can preview and test the graded activities, as well as look at the grades that will be displayed once students complete the questions. And the last phase, of course, is revision. No textbook is perfect, and it always takes a couple of tries before really nailing it.
This is the part where you take the questions and comments, ideally from your students, and adapt your content for a better run the next time around. Thanks, Michelle. So, a little product history. We first built this product and delivered it in two courses taught at the University of Texas at Austin in the fall and spring of 2021. Then, in the summer and fall of 2021, Amy, who's on this call, and I conducted the very first pilot of the product. We had four or five instructors each term, from four or five different client schools, try it out. We did really heavy hand-holding and lots of interviews, just to understand: what is it like to use this product? What do you wish you had? What are your needs, desires, and feature requests? At the end of that, we released a first set of UX and UI improvements focused on simplifying things, making them easier, and addressing the major pain points. That was in January 2022, and we really hadn't touched the product much since then, until we discovered through real use that there were some serious reliability concerns. Initially we thought, oh, it's because users are doing it wrong. Then, in the course of deeper investigation, we found there were in fact some reliability issues we needed to fix in the product. We were able to replicate those and roll out fixes. So at this point we're confident it's doing what it's supposed to do, but it was a learning experience. So what were these pilots? I'll give a high-level summary of what we did, and Amy may want to chime in and fill in details if I've missed anything. We called it REAP: it's a bad acronym, but it stands for the Results Early Adopters Program. We ran it in the summer and fall of 2021.
We had participating clients from Bay Path University, the College of DuPage, Virginia Tech, the Maricopa Community College system, Open Oregon and Umpqua Community College, and the University of Washington. They used a combination of LMSes, mostly Canvas, but there were a couple of courses using Blackboard. In the pilot, we had two sets of goals. On the product side, I really wanted to know: does Results for LMS actually fulfill a need that instructors and learners have? Do instructors like it? Would they use it again, and would they advocate for other people to use it? Not only did it fill a need, but was it a satisfying experience? And what do instructors and learners wish it could do that it doesn't do now? In other words, what are the missing features, and what could make this better? On the support side, because we're providing this to instructors, and because it sends grades to the grade book, we really needed to find out how much support instructors and learners need in using this product, because it's different from the regular Pressbooks author interface. Another question was: how hard is it to create and configure graded activities and connect to the LMS, and how can we make the product easier to use? When we asked instructors what their goals were, they said they wanted to increase engagement and do formative assessment, and you can see some sample responses. I want to highlight a couple that I think were nice. One instructor said they want to automatically grade students so they feel incentivized to complete the activities. This was something we heard really commonly: instructors saying, hey, I'm asking students to do the reading, and then I have no idea whether they did it or whether they understood it. I don't find out until they come to lecture, and then I often have to throw away my plans because they either didn't do the reading, or did it but didn't understand it.
That was something they wanted to change. Another person said: I want my textbook to resemble what publishers provide, less dry, more bells and whistles. So it was really the interactive components that they were excited to build in. The other thing we heard people wanting was formative assessment. When we say formative, we mean activities you complete repeatedly, where you're getting immediate feedback to help you check your understanding. The important point is not the overall grade you're assigning (that's summative assessment); it's assessment that helps you learn as you're forming ideas and shaping your understanding. The one I really wanted to highlight here was: I'm looking for formative monitoring of student progress through the textbook, showing their progress as well as their challenges and areas in need of assistance or greater explanation. Instructors want visibility into what students are learning so they can help them learn more effectively, and students want the same thing. So those were their goals in participating. Amy and I did a bunch of things to support them; I don't need to go through this in detail, but this is what we did to run the pilot, and that's what we did in 2021. Here's what we learned. First, we did a survey, and seven of the eight instructors who participated gave us feedback; 85 students also completed a voluntary survey. The student response rate varied a lot across courses. Some instructors said, no, I don't want to give my students a survey, and some said, I'll give it to them, but I'm not really going to push it. Some instructors really said, hey, this is important, and all their students responded; for some, almost none did, and for some, literally none. So take the numbers with a grain of salt when you see them. Here's the first set of questions.
We asked: how well does this work for engagement and learning? Students told us that compared to other courses and textbooks, their level of engagement with this course was higher than average. We asked them on a scale of one to five, with one being much lower, five being much higher, and three being about the same; the average score was 3.8, which means somewhat higher than average. We also asked: compared to other courses and textbooks you've used, how effective was this in helping you learn the required course material? They ranked that a little higher still, closer to an average score of four. When we asked instructors the same question, they ranked it higher yet: they said this was very effective in helping my students learn compared to other courses I've taught in the past. That felt promising; we felt like, hey, this is on the right track for what we want. Obviously we'd love fives across the board, but these were the honest answers we got. Then we asked instructors how hard it was and how much effort it took. First: compared to other courses, how effective was the textbook in providing you with insight into what students were learning? If you look back, the earlier question was how effective it was in helping students learn, and they said very effective. But when we asked about insight into what students were learning, the answer was more like: about average. It didn't give them much more insight into what students were learning, just that they were learning. When we asked about the level of effort, they said slightly more than normal. And how difficult was it? Again, slightly more than average difficulty. These were things we expected: we knew it would be hard for them to do something new, and they confirmed, yes, indeed, it was harder than normal. So that was something for us to pay attention to. Finally, we asked about satisfaction.
How likely are you to use this product in a future version of your course? The score was quite high: 4.28 on average. How likely are you to recommend it to a colleague, or to advocate that your institution purchase this product for broader use? The score for both of those was 3.86 out of five. So that was the initial data we collected from the quantitative survey results. The next thing we want to highlight is the known problems, what we're working on now, and what we plan to do to address them, which is a big focus for us in the present and near future. Sadly, grade passback wasn't reliable. We thought it was, and in our testing we found that it was, but there were a couple of things students could do that would break it. For example, if a student launched an activity and started it, and then launched another activity in a new tab, the first session would get zeroed out, and the second tab would be the only active session. They could complete that activity and send a grade, but if they went back to the first one they'd launched previously, it no longer had an active session and wouldn't send grades. We didn't realize that was happening because it never came up in our testing; a student helped us verify it. It was a major issue because it was hard for us to replicate: every time we launched a session, it worked, but we hadn't launched multiple sessions in multiple tabs, and that posed a problem. So we had a handful of small bugs that were affecting reliability and were hard to pinpoint. Thanks to really hard work from our developers and our testers, Ricardo particularly, we fixed those bugs and shipped releases that we think account for all of the use cases we've seen. And to grow confidence, we want more people to use the product.
We want them to use it and have a bug-free experience that isn't frustrating for students and instructors. So that was the first issue. The second issue is that we don't really know who's using this product: which clients are using it, or how much they're using it. We need to increase visibility, for network managers and for ourselves, into who's adopting this product and who wants to adopt it. We haven't done a lot of product marketing around it, and we haven't really communicated directly with instructors; it's been something people have to ask us about, and we say, oh yeah, we can turn it on. That's something we want to change. We believe this can impact student learning, and instructors want to impact student learning, so we want to make sure that people who want to use this product understand what it is, understand its pricing very clearly, and can make informed decisions and really just try it. The big change here is that we're going to do more focused outreach to network managers and invite people to participate in no-cost pilots to try out the product and help us improve it. That's starting this fall. The third thing we learned, and Michelle alluded to this: instructors said there were sometimes too many steps; it made things that should be simple time-consuming. The configuration was just too complex initially. We did some work after that first round of pilots: we made a bunch of changes to the interface, made it cleaner and, hopefully, easier to use, and reworked some of the configuration settings. There's still a lot of work to do. At the time we built and improved this product, we did not have a professional UX/UI designer on staff, and now we do.
So Michelle has done some really cool work on UX heuristic analysis, and we've started doing user research; that will be a major focus of our fall pilot. How can we make this user experience easier? How can we reduce cognitive load? How can we make the whole flow easier for you? A fourth issue: instructors said, all you're giving me is a high-level aggregate grade, and that's not enough. One instructor said: I had to fix problems with grade submissions blindly and guess what the score should be, because I couldn't actually see the underlying answers the students gave. I have to say, if I were in their shoes, I'd feel exactly the same way. A grade alone is not real insight into student learning. So now we're actively researching a tool called a learning record store, which is a place where you store more detailed learning analytics about student attempts: they scored a three out of five on this activity; they answered A, B, and D; the correct answers were C and D. That kind of information needs to be collected and made visible for this to be valuable for instructors and clients. We have a lot of learning to do there, but we'll show a little demo of what's in progress later in the call, and I think that's a really exciting opportunity for us to actually give insight into learning. That leads me to the fifth issue: students don't know where to focus their energy for improved learning, and instructors don't know how well the course material is helping students learn. Where are my individual students struggling, and where is the whole class struggling? So we're trying to understand the highest-priority needs and interests for both instructors and students. This will be the focus of our fall pilot.
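The kind of record described above (answered A, B, and D; scored three out of five) is exactly what a learning record store holds, as xAPI statements. Here's a minimal sketch of one such statement in Python; the learner ID and activity URL are made up, but the field names follow the xAPI specification:

```python
import json

def h5p_xapi_statement(learner: str, activity_url: str,
                       response: str, raw: int, max_score: int) -> dict:
    """Minimal xAPI 'answered' statement with a result block, as an LRS stores it."""
    return {
        "actor": {"objectType": "Agent", "name": learner,
                  "account": {"homePage": "https://lms.example.edu",  # made up
                              "name": learner}},
        "verb": {"id": "http://adlnet.gov/expapi/verbs/answered",
                 "display": {"en-US": "answered"}},
        "object": {"objectType": "Activity", "id": activity_url},
        "result": {
            "response": response,                   # what the student chose
            "score": {"raw": raw, "max": max_score},
            "success": raw == max_score,            # full marks or not
        },
    }

# "They answered A, B, and D; the correct answers were C and D; scored 3/5."
# "[,]" is the xAPI delimiter for multiple-choice responses.
stmt = h5p_xapi_statement("student-123",
                          "https://pb.example.edu/activities/quiz-1",
                          "A[,]B[,]D", 3, 5)
print(json.dumps(stmt["result"], indent=2))
```

Because the `result` block carries the actual response alongside the score, a viewer built on top of the LRS can show instructors what was answered, not just the aggregate grade.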
We want to enroll at least 1,000 students and as many instructors as we can to use the tool, and we're going to be doing UX research interviews to find out what matters most to you and what you wish you knew that you don't, and then trying to rapidly build things that answer those questions and make the product better. So that's what we'll do there. Here's the sixth issue, probably the hardest of all: sometimes the activities just aren't very well designed. This is a do-it-yourself learning tool, and many instructors don't know how to make effective learning experiences. If you've been a college teacher, you probably know there wasn't a lot of training; at least for me, when I was teaching college courses, there was almost no training on effective teaching, and even less on instructional design principles or the science of learning. Earlier in this call I was talking with Simon, and over his shoulder he had this great book called How Learning Works. It's a super important book with lots of great concepts, but that knowledge is not widely distributed among instructors, and sometimes we make poorly designed learning activities. So we have some things we want to try to increase instructor awareness, and some ideas for how we can make it easier for effective learning to happen on the instructor side. In particular, Amy is going to be leading a couple of workshops in the next month or two that really focus on H5P and interactive activity creation. I'm currently working on a webinar series about how to create H5P activities. I myself have never taught, so I'm not an expert on instructional design the way all of you are, but I'll be talking about how to navigate the platform in order to create H5P activities. I'll show some really good examples of how others have used them, and where to find the right resources to create them.
So think of me as a hub for finding the right resources, not as a design expert. And if you have any tidbits to share, make sure to come. If you have instructors you know of who are going to be creating assessments or activities in the fall, please send them our way. For the second part of this webinar series, we're hoping to have a panel: if instructors want to come and talk about their experience building many H5P activities for their book, we would love to have them, so make sure to send them our way for that as well. Thanks. Okay, so Cheryl asked: at the University of Arizona we have a new institutional subscription to the H5P.com product, and we haven't implemented Pressbooks Results around that yet. For now, Pressbooks' built-in plugin does not connect to your institutional H5P.com instance; will Results connect to that instance directly, so faculty aren't potentially creating H5P content in two places? That's a great question, Cheryl, and you're right. Right now, the way people create H5P in Pressbooks is through an H5P plugin in the Pressbooks universe. We have talked with Joubel, the makers of H5P, about a wrapper for H5P.com that does the same thing as the current wrapper for the free and open-source H5P version. Nothing has been built. They mentioned they would be interested in building it so that we could build tighter integrations, but we didn't have an interested institution at the time, so it was a more theoretical question than a practical one.
The last I heard from Svein-Tore was a year and a half ago, and they said, yeah, we'd be interested to build this wrapper so you could consume it, but they would want money from us to do it. I think that's within the realm of reasonable possibility, especially if it helps you avoid a problem you're having, but we haven't seen any development on it: nothing has been started, though we have had conversations about it. So right now it's not possible, but in the future it definitely could be, and if there were demand for it, I would push hard to make sure it happens; we don't want that to be annoying for people. Currently, if you build an activity on the H5P.com instance, you can of course download it and re-upload it to the Pressbooks network. But our solution, for example, is going to require sending xAPI statements to a learning record store and then visualizing them. Those essentially have to be statements hosted on our infrastructure, because we have to direct them to an endpoint, and we won't have the same control over what's happening with H5P.com statements, at least to the best of my knowledge, unless there were a wrapper that allowed us to do that. That's all stuff we would need to explore with Joubel and Svein-Tore, because they host those activities there, whereas we host the activities built inside the books. That's the rub, I think. All right, the next thing I wanted to show is that we've been doing a lot of research into learning record stores, and this is pretty exciting. I don't know how much people know about xAPI or how much they care to know, but I'll give a little crash course and then we'll talk about the practical applications.
So xAPI refers to the Experience API, a specification for structuring statements about learning. It can describe both online and offline learning, but the basic format follows a triple pattern: every xAPI statement has an actor, a verb (an action), and an object, along with optional result and context information. For example, you could have a statement that says Steel answered question A and scored three out of five. The actor would be Steel, the verb would be answered, the object would be question A, and the result would be three out of five. That's the basic format of an xAPI statement. Every H5P interaction emits these xAPI statements, but if you don't listen for them, they just go into the ether and disappear. Up until now, Pressbooks hasn't listened for them; we said, oh, we're fine to let them go into the ether. But what instructors are telling us is that when they're using the Results product, they actually do want to know more about learning, and the method to do that is to collect those xAPI statements, structure and organize them, and then give you some insight into the patterns that emerge. So we're taking lightly structured JSON information and showing you things you might want to know as an instructor. Michelle and I have been designing and developing an initial prototype for what we're calling a chapter activity viewer, which would be part of the Results product and would help satisfy this instructor desire to see more information about student learning. Our goal is to work with our development team on a quick cycle and produce a minimum viable product, or proof of concept, that could be used as part of this fall pilot. So we're on a tight timeline, working really hard and really fast. What I'm going to ask Michelle to do is share a Figma, which is our design software, to give you a high-level overview of what we're planning to build, so you can see a sneak preview and give us any comments, feedback, or ideas
that you have as we're designing and building this. You'll see it for real soon, but this is the first official preview, and I'll hand it to Michelle again. For sure. I just want to reiterate that this is definitely part of the process; this is the very first stage, so it's not very beautiful, but we are working through the issues and how things should be laid out so that it's functional and works really well for instructors. I'll start off by saying it's a two-tiered approach, in the sense that we want to make sure that all the information instructors bring in about the H5P is considered. That's the main area, which is then filtered down to where you get the results from students. The things the instructor brings in are, of course, the chapter, the activity name, the type, the max possible score, and the actual answer for the questions that were created. Then, on the student side, we have the student ID, the activity name, the attempts they've made, the score, the response given, and the time those answers were submitted. In our first flow-through, if we had chapter three selected and we really wanted to look at one H5P question, say the Wisconsin flora and fauna flash cards, we could do that, and it would list all of the students who have completed that activity, so you'd be able to see their attempts, the score, the responses given, and the time submitted. Essentially, this is what it would look like. Then, of course, we have more iterations and little edge cases: what does it look like if you have more things selected? With multiple selections, are you able to expand and collapse them, so you're better able to focus on the content on the page? What if you have other sorts of activities, or you want to check the different attempts? And then we also have more detail in the popovers: with the time submitted, you should be able to look
at it through a calendar view as well as a more granular timestamp. You should also be able to see different types of attempts: all of the attempts, or the last one, the first, or the best attempt the student has made. We also have the chapter dialog popover. Applied to, say, all the attempts for this one multiplication activity, that's what it would look like, and you'd be able to check out the score from there. So this is what we're working from; this is a real base of the product. Now we're also figuring out exactly how we want to show the responses, and how we can make this information really meaningful for instructors, so they can engage with their students, maybe make their own refinements, and get the information they need in order to take action.
 Thank you, Michelle. I really love how smoothly you demoed that, despite it being a very early prototype. The things that I want to stress are the goals that are very important to us. If you could just share that screen again, Michelle?
 Sure.
 There are, I think, three key components that I want to highlight out of this. The first is that there will be, ideally in the top left of the page (that's what we're seeing now), that chapter selector. Depending on which chapter you want to look at, the statements at the bottom will be all of the activities that were generated within that particular chapter. Usually instructors think of the chapter as the assignment. So let's say I want to see what happened when a student scored 45 out of 50 on chapter three: this will be the details of that. Then you can say, let's look at chapter one next, and the bottom of the page will refresh and show you all of the attempts relative to chapter one. The second piece that's really important to us is that instructors should be able to drill down and select some or all of the statements in a given chapter. They should be able to select
by the type of activity or the specific activity. If you want to see how students did on just this particular quiz question, you could see that. Or if you wanted to see how a single student did on all of them, you could filter by student or by activity. We also want them to be able to filter by attempt. Maybe you're interested only in how students did on the very first try they took on a quiz, because that tells you more about what they actually learned from the reading, rather than that they got better with practice: where did they start out? So you should be able to filter by attempt. Those are the crucial features we think we want to have: the information should be displayed in a table format, giving the instructor the ability to drill down at the level of the student or the level of the activity. The third component we're thinking about is how to make this information display compact and interesting, so that you can get insight quickly, but also able to be expanded so that you can see more if you want to know more. We're trying to get a good, healthy default state that's just enough information for most use cases, and then give people the flexibility to see what they want to see or answer the particular questions they have. You can see some of the principles in play that we've used for things like the network catalog or the directory, where we present a large body of information and give you searching and filtering tools to see a smaller subset. The next stage for us is to build the technology behind this and the data storage, so that we can populate it with real data and then get it into the hands of instructors who have actual questions about student learning, and make it better based on what they tell us they want and need. So that's what's coming for us and that's what we're working on. Happy to answer any questions about what
we have in mind, the timeline, and what's going on with our pilot. But the big thing to mention is that this fall we're doing another round of pilots. Amy and I are recruiting institutions and instructors now. If your institution wants to participate, or you know of instructors who have built H5P activities and are interested in this kind of information, let us know. We'd love to sign them up. We want to help them succeed, we want to learn from them, and we want to make sure that what we're building is something people want and find useful for accomplishing their teaching and learning goals. So that's what we had to share on the results front today.
 There are just a couple of questions about pilot participation for the fall. The first question was: if we want to participate in the pilot but we aren't ready for this fall, will the pilot be available in the spring? The answer is yes, totally. We're anticipating this to be a fall-through-spring pilot, and maybe even longer, so if you want to participate in the spring, let us know and we'll get you in there. We're really focused on the fall, since that's about to start right now, but spring is also available. The second was: how do we learn more about what's entailed with this pilot? There's a document I just put in the chat that explains what the pilot is and the rough timeline, and there's an interest form you can fill out that says, hey, I'm an institution that wants to participate. It's going to ask you whether you have permission to set up the LTI configuration, or whether you've already configured LTI (which is the back-end stuff that network managers and LMS admins have to take care of for instructors), and then whether you've already identified instructors and books who want to use this, or whether you need help finding instructors and books at your institution to pilot the program. We can definitely help you find people, because we have a pretty good awareness of who's doing what
on your network. Or if you already have people in mind, that's great; even better. Our general feeling, from pilots we've done in the past, is that having a cohort of, say, three to five instructors at a school makes the experience a lot nicer, because you have people you're doing this with at your institution. But we can also make up cohorts across institutions, and that's fine too; we're happy to make you a friendly learning community. So one, or many, or however you want to do it. We're just eager to get this into the hands of instructors and to learn from them about how well this is actually meeting their needs. That's been the focus for us in the last month, and it will be the focus for us in August too, so expect to hear more about this at our August monthly product update. Instead of just a wireframe in Figma, hopefully we will be able to show you some working software and initial responses from our instructors at that point. Thanks, everybody, for coming out to our monthly product update. We really do appreciate all that you do and want to help you succeed in whatever way we can. Talk to you soon!