Alright. So, as we move through today's session, we're going to be making space for discussion, and you'll have opportunities to ask questions along the way. Feel free to use the chat if something comes up that you'd like clarification on, and raise your hand if you'd like to jump in with audio or video. Okay, so thanks for joining us. I'd like to begin by acknowledging that UBC Vancouver is situated on the traditional and unceded territory of the Musqueam people, and we would like to thank them for their continued tolerance of our presence on the land. I encourage everyone today to consider your own relationship with the land that you live, work, and play on, and to find out more about those who lived on the land before you. So today I'm joined by four other folks. Bo Sung Kim, like myself, is a learning designer at CTLT. Tim Cato is the team lead at the Learning Technology Hub, also at CTLT. Peter Staffschuk (I probably mispronounced your name, Peter) is a professor of teaching with the Department of Mechanical Engineering. And Amanda Bradley is an associate professor of teaching with the Department of Pathology and Laboratory Medicine. Okay, so here's our agenda for today. Bo Sung will start us off with a brief introduction to peer assessment in general and talk a little bit about the considerations for implementation. Then Tim will talk about the history and background of the peerScholar tool and its inclusion in the UBC learning technology environment. I'll talk briefly about how you can set up peerScholar in Canvas and what the workflow will look like for students when using the tool. Then Peter and Amanda will each discuss how they use peer assessment in class and how peerScholar was used to facilitate it, including their reflections and recommendations.
We'll also have a breakout activity with three topics you can choose from, and afterwards we'll have a chance to come back and discuss anything that you wish. We'll also give you some resources for learning more about peerScholar, if you'd like to explore further. So without further ado, take it away, Bo Sung. Hello everyone, this is Bo Sung from CTLT. I'd like to begin with a discussion of the different types of peer assessment. The first type of peer assessment is widely used in group-work settings. It's a great way to promote individual accountability in group work, and the instructor learns more about the group dynamics. In this case, students evaluate their group members' contributions and collaborative behaviour, and iPeer has been widely used at UBC to support this type of student peer assessment. The other type of student peer assessment is done on a peer's product: a writing assignment, an oral presentation, or a multimedia product. It can be used for individual or group assignments. Students first submit their work, which could be a draft or a final submission, and then use the instructor's rubric or assessment criteria when they review their peers' products. At UBC, the Canvas peer review tool and peerScholar have been used to support this traditional way of doing peer assessment on products. In some specialty cases, students first need to submit their work in the same way; the difference is in how students provide feedback on a peer's product. For example, using CLAS, a student can provide annotated feedback directly onto a peer's product, and for video assignments in particular, students can provide time-specific annotated feedback on a peer's work. And ComPAIR presents peers' work in pairs, so students can use their innate ability to compare peers' work when they prepare feedback. Next slide, please.
So now we want to understand your reasons for wanting to use student peer assessment, especially on products, and what concerns you have about using it. Under the View option at the top, you can access the Annotate tool. Please use the type-text tool and type your benefits and challenges onto this screen. Yes, peer assessment, particularly summative peer assessment, has been used in the classroom to save faculty marking time. One of the challenges is that it's time consuming to set up, especially when you're using a third-party tool like peerScholar; you definitely need to set some time aside to understand the interface of peerScholar or whichever peer assessment tool you choose. Then there's the question of what to do if a student doesn't complete the assignment. That's one of the biggest challenges many instructors experience, and later Pete and Amanda will share how they prevent it from happening. So why do we need to think about the benefits? Because we need to get student buy-in so that they can really enjoy participating in the peer review process. And as for the challenges, we can overcome them by preparing the peer assessment with care and choosing the right tool to support the process. Could one of you take a screenshot? These are really great, but because of the time we have, I'd like to go to the next slide, where I'll share some of the resources you can refer to when designing and planning peer assessment. Thank you so much. So here are some great resources you can refer to when you plan out student peer assessment. First, CTLT recently updated its resource pages, and there's a section about student peer assessment. There you can access handouts with tips and strategies on student peer assessment, especially on products, and there are some faculty use cases.
So you will see how other UBC faculty have been using student peer assessment in various different ways. From the LT Hub, you can access the instructor guides for the various student peer assessment tools, and you can also access some shorter stories about use cases. The third example is a student peer assessment training workshop. This is a sample training workshop developed for two psychology instructors who couldn't find the time to train and prepare students for peer assessment, so we created a self-directed module to do that preparation. In it, students can test their understanding of the assignment and the rubric, and then they have a chance to use the rubric on a sample assignment and compare their rating and reasoning to that of the instructor. And the last one is the planning worksheet, a template developed by a psychology instructor at UBC. By using this planning worksheet, you will not miss important elements when you are designing student peer assessment. Student peer assessment is a complicated kind of assignment because it inevitably adds multiple components and multiple submission steps for the students. You have to make decisions about how many peer reviewers to use and how to distribute your grading to acknowledge each phase of the peer assessment process. So you will want to bookmark these resources. Next, Tim will talk about the history and background of peerScholar. Thanks, Bo Sung. And sorry for the brief pause in the screen sharing before. So yeah, I'll give a brief overview here of how we got to where we are today with peerScholar as a tool at UBC. PeerScholar actually originated at the University of Toronto. Some great colleagues there in psychology, Dwayne Paré and Steve Joordens, started it up, and it was very much based on their research and their goal of improving learning.
It wasn't driven by a goal of making as much money as possible, which is typically what we see with a lot of educational technology these days, so that's a nice difference from some other tools. We went through our formal pilot process in the LT Hub a few years back, and the tool had been used prior to that. They're on what's called V3 of peerScholar right now, but I know Pete and others had tried it back in the V2 days. It had a much older interface back then, but it still did the job well. More recently, we ran a working group on student peer assessment tools at UBC. Pete actually participated in this group along with a handful of other faculty members, and it resulted in some recommendations to LT Hub leadership, based on a survey conducted by the faculty members in the group, on discussions of those results, and on a look at the market in terms of what was available right now as well as what we already had experience with at UBC. Out of that, peerScholar came out as a key recommendation for traditional peer assessment. iPeer was recommended for team members evaluating each other's work, and ComPAIR came out as offering something different that we can't otherwise get. And then, of course, there's the peer review tool built into Canvas; there's no sense turning that off, so it's there to stay too. That is essentially where we got to from that group, largely last summer. As for how things work: we have an enterprise license right now, so there's no cost to you or your students. The other model they offer, which some other institutions have adopted, is having students pay $10 each. We didn't want to go that way at UBC, so the cost is covered through the LT Hub. What I would say is that I consider peerScholar to be a more advanced peer review tool.
Certainly, a lot more thought and research has gone into its design, and a lot more effort has gone into the software compared with Canvas. A lot of effort has gone into Canvas overall, but if you look specifically at Canvas peer review, Instructure, the company behind Canvas, doesn't have a lot of capacity to focus specifically on it. So it hasn't really changed much over the years, whereas peerScholar is something that has evolved and that they keep adding to. Yeah, I got some stats from them. Now, I'll caution that I'm pretty sure the sandbox courses are included, so the 126 courses is probably higher than the actual number, because I know there are quite a few sandboxes. But the number of students should be reasonably accurate, because we don't generally have that many students in the sandboxes. So it's reaching quite a few students, and those numbers are over the past year; it's gotten quite a bit of adoption. And I will say a unique thing about working with this vendor: their support is very responsive. They're on East Coast time, but we've had many times where Pete or I or somebody else has sent them an email in the evening our time and gotten a response later that evening, and I know that's quite late, into the wee hours, for them over there. So they're very responsive to any problems. Because they roll out their changes quite frequently and quickly, occasionally there will be a glitch here or there, but the moment we send it over, they're pretty quick to address it. So, a little bit about how peerScholar looks in action. The student workflow goes like this: students first enter the Create phase. Once you as the instructor have outlined what the assignment is and put in the instructions, the students create their work for the assignment.
They can type it right into the interface, paste it from another app, or upload a file, and once the assignment is scheduled to release, they'll be able to submit it. The next phase is Assess: students are given other peers' assignments to assess, and they use the rubric that you provide and offer comments to their peers. Then there's the Reflect phase, where students receive the feedback from their peers and can work on revisions based on the comments they were given; they can also reflect on how that peer feedback influenced their revisions. That's the basic workflow they go through. Setting it up in Canvas is pretty easy. As Tim mentioned, it's not the built-in tool that comes with Canvas; it's an external tool, so you just have to go through a couple more steps to set it up. When you're setting up an assignment, you select External Tool as the submission type and type in the URL shown there. It also performs a little better if you check the 'Load this tool in a new tab' box that you see when creating an assignment. So that's how you set it up in Canvas. And there are a few activity types to choose from. There's the Classic type, which is the basic peer assessment activity. There's the Case Study type, where students can see different cases and give feedback to peers who did either the same case they chose or a different one. And there are some options for doing group work as well. When creating a Classic activity, you basically outline each of the stages. In the Create phase, you give students the instructions on what's expected of them. Likewise for the Assess phase, where you can provide the rubrics not only for how students are to assess one another but also for how they will be assessed. You set the due dates, and then the grading happens. And once you're logged in to peerScholar itself, this is roughly what the dashboard looks like.
So you can keep track of how your students are doing, who's at what stage, that kind of thing, and it gives you a few analytics to look at as well. Yeah, so it's a pretty easy-to-use interface. Okay, so that's the boring stuff. Now we're going to hand over to Pete and Amanda, who will talk a bit about their experience using the tool. Thanks so much, Nicole. So I'll go first. I'm in engineering, using peerScholar particularly in a large first-year class, though I've also used it in other classes. Here is our first-year cohort from several years back, about 800 students or so. It's a big group. Now, peerScholar works fantastically with these large classes, but it also works really well in smaller settings, and everything in between. Tim, if you can go to the next slide. This was the challenge we were dealing with in first year: the math doesn't work, basically. I know you're not supposed to put equations in presentations, but I figured this one could slide in. Tim, if you can advance the slide there. We had about two hours of spare TA time (our TAs were heavily booked doing different activities) and about 800 students, and if you work that through, that's nine seconds per student. That's not a lot of time to mark anything. So in the first year of our new program, we had an activity where we essentially just gave 100% to students for submitting anything. There was no follow-up; they didn't receive any feedback on their work. We just did quick spot checks to see what they were actually submitting, and that's where it ended. PeerScholar came along, and this was a great opportunity for us to really enrich the entire process along multiple dimensions. Next slide please, Tim. So, in our first-year introduction-to-engineering courses, we have three main activities that use peerScholar. The first one is a simple letter, where students write text directly into the peerScholar tool.
That one is there to get them familiar with the process, but also to provide some meaningful feedback related to transitioning to university. I'm going to talk about the second activity, which is a technical memo that students write. We also have a follow-up activity after that which is even more complex, but the technical memo gives a good sense, I think, of basic use of the classic peerScholar tool. This is a four-page formal written report that they generate, with media such as images embedded, so they attach it as a PDF. We're really looking to hit a number of goals. We want to improve professional communication, we want students to see how others have approached a technical problem and to learn from that, and the act of giving and receiving feedback is something that's certainly valuable and that we're hoping to develop. Next slide. So here is our basic workflow. You can see the create, assess, and reflect phases that Nicole talked about in the middle; that's where peerScholar comes in. Here we have about 800 students working in teams, and each team generates a technical memo. It's uploaded individually into peerScholar, so in the Create phase each student is responsible for uploading their team's work. Then we go through the Assess phase; we give about a week for the upload and about a week for the assessment. In the Assess phase, each student anonymously reviews two other teams' memos. Then there's the Reflect phase, where they look at the feedback they received and essentially evaluate it. And at the end, we do some post-processing. I love working with the numbers and data that come out at the end, and there are some reasons for doing that. I should mention at this point, before I forget, that peerScholar is set up so that if you don't want to get into a lot of post-processing, there are some really easy ways to just get each student's grade at the end.
That will save you from manipulating a lot of text boxes. So if you want to customize, the tool is great for that, and if you just want to use it out of the box, it's good for that as well. Anyway, next slide. So I'll talk a little bit about each of these stages and some of the nuances. I won't talk about the creation of the memo itself; that's particular to the class. In terms of using peerScholar, as I mentioned, we have each individual teammate upload their team's memo. The reason we do this is that it helps us track each student's engagement in the process, but it also means that each student is individually responsible for giving feedback and for evaluating the feedback they receive. That does create some challenges, because we're not using the team features in peerScholar. It means individuals could get their own team's memo back to review, or could get another team's memo twice, for example; that happens from time to time. PeerScholar does have group-work features. We haven't been using them yet because they're relatively new and we've heard of some kinks that are still getting worked out, but I see us switching over to them soon, which will address this issue of getting your own memo to evaluate. Next slide. So in the assessment phase, I'd say peerScholar is really very flexible in terms of the types of assessments you can use. We primarily use a multiple-choice, rubric-style evaluation with about five different criteria on a four-point scale, though you can set whatever you like. You can also put in whatever text you'd like; on the right-hand side is a sample of the text we use for one of our criteria. You can assign whatever point values you want for each response, you can allow multiple responses, and there are star scales and all sorts of other evaluation types, so it's very flexible on the assessment side. And then we also include some text boxes.
So that's the evaluation of a student looking at another team's work against the rubric. And if you go to the next slide, Tim: you also get all those same tools for the evaluation of the feedback received, if you want to include that stage (you can also turn it off if you prefer). Here we're again using four-point scales, with some text asking students to rate how useful and how accurate the feedback was. The nice thing here is that it provides a bit of a check: if a student does an evaluation of another team's work improperly, say they're completely unfair, it gets flagged at this stage, and so we know which ones to go in and spot-check and correct if necessary. It very rarely happens, though. Most often we find this just adds a whole layer of accountability to the assessments. All right, next slide. For our post-processing, I go through and do all of this myself, even though it can be done automatically in peerScholar. The reason I like to do it this way is that I want to know exactly what's going on at the level of individual criteria, one student evaluating one other student's work. There's a lot of data that comes out in the CSV file if you go this route. You can mark for participation, look at the rubric scores, and so on, and doing it this way allows us to identify outliers, find missing scores, and the like. But there are certainly some challenges if you go this route; it means you're dealing with a lot of data. And it's also a challenge for us, as with any peer assessment, to decide what to do when somebody doesn't complete their assessment. How do we treat, for example, the individual who doesn't get any feedback on their work and so has nothing to evaluate in the third phase? I'll just quickly wrap up with the last slide and then turn it over to Amanda. For us, we found it really easy to use peerScholar in both the remote and the in-person settings.
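As a rough illustration of the kind of CSV post-processing Pete describes (averaging each student's received rubric scores, then flagging outliers and missing reviews), a minimal sketch might look like this. The column names, the expected number of reviews, and the outlier threshold are all illustrative assumptions, not peerScholar's actual export format:

```python
from collections import defaultdict
from statistics import mean, stdev

def summarize(rows, expected_reviews=2):
    """Average rubric scores per reviewee; flag outliers and missing reviews.

    `rows` are dicts with hypothetical keys "reviewee" and "score",
    one row per (reviewer, reviewee) rubric score.
    """
    by_reviewee = defaultdict(list)
    for r in rows:
        by_reviewee[r["reviewee"]].append(float(r["score"]))

    report = {}
    for student, scores in by_reviewee.items():
        avg = mean(scores)
        # Flag scores more than 2 standard deviations from the mean.
        # Only meaningful with enough reviews; the threshold is arbitrary.
        outliers = []
        if len(scores) >= 3 and stdev(scores) > 0:
            cutoff = 2 * stdev(scores)
            outliers = [s for s in scores if abs(s - avg) > cutoff]
        report[student] = {
            "average": avg,
            "n_reviews": len(scores),
            "missing": max(0, expected_reviews - len(scores)),
            "outliers": outliers,
        }
    return report
```

In practice you would read the rows from the exported file (for example with `csv.DictReader`) and tune the expected review count and threshold to your own rubric scale.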
Essentially there's no difference; it works very well in both. The customization of the grading and of the submission types has been phenomenal. Another reason I do the CSV data file outputs is that I like to look at the different criteria, because we use that for accreditation purposes, so we can see how students are performing in different areas. And lastly, we've used it with all sorts of different types of assignments: videos, PDF files, text files, images, you name it, and it's worked well with all of those. We're really happy with it. It's worked extremely well in the large setting, I've used it in the medium setting, and Amanda is going to speak about peerScholar in the smaller setting. Thanks. Okay, thanks very much, Pete. After I've shared a little bit of our experience, we're both happy to take questions, and we'll have the remaining time for a breakout session. So it's lovely to be with you all this morning. Thank you for having me join you. I'm the director of the Bachelor of Medical Laboratory Science program, which is a tiny program in the Faculty of Medicine that almost no one has ever heard of, and I teach three courses in the program. Next slide please, Tim. Thank you. So as I mentioned, it's a small program, and by that I mean we have 14 to 24 students in a given cohort. I've used peerScholar for student peer assessment assignments in a fourth-year course that I teach, which introduces the students to research and also helps them to think about and further develop their job skills. The student peer assessment in the course uses the Classic activity type in peerScholar, and we use it for two different written assignments. The first I won't talk about, other than to say that the process is simple.
And as Pete has also mentioned, the advantage of having the first peerScholar assignment be relatively simple is that it allows the students to become acquainted with the tool itself and to better understand what is expected of them in the whole student peer assessment process for the course. The second assignment is more complex, and that's the one I'm going to tell you a little bit more about: a scientific abstract based on a progress report. Individual students write the scientific abstract, and the progress report is the same for all students; it's the core content from which they need to decide what parts to include in their scientific abstract. I will just say that typically in our cohort there will be anywhere from zero to perhaps two students who have any experience writing a scientific abstract before having to do it in this course. Next slide please, Tim. Thanks. So, there are of course a number of learning outcomes for this assignment. Some of them deal with the ability to summarize scientific reports and write a scientific abstract, and those aren't shown here. What I've shown here are the learning outcomes that focus on the peer assessment piece. Part of what the students will learn by completing this whole assignment, in all of its peer assessment parts, is to evaluate scientific abstracts according to a rubric, provide clear, constructive written feedback to peers, and consider peer feedback in order to improve and revise their abstracts. That last one in particular deserves some emphasis. In fact, it's probably one of the big motivators for including student peer assessment in this course: we recognized that although we had stated a program-level learning outcome that we wanted graduates to be able to use feedback to improve their work,
we didn't have very many opportunities for them to do so. So you'll see that embedded in this assignment. I put an asterisk beside the rubric with a little note so I could make a pitch: if you have not read this article by Linda Nilson, I absolutely, highly recommend it. She just makes so much sense and has done so much work in all of these areas, and I found her paper particularly useful for ideas on developing feedback prompts. So the rubric for this assignment has six criteria, each with three rating choices and a description under each one. But to make it a fuller kind of evaluation, the students also provide written feedback. As we all know, there are varying degrees of skill within any cohort. One of the criteria is clarity of writing, and certainly we know there's a variety of skill in terms of writing. One of the issues students have with peer assessment of writing is that they might feel they have superior writing skills to a peer who is assessing them and who may have poorer writing skills. With that in mind (again, Linda Nilson's paper really takes that to heart), I'll give you an example of the prompts we use for clarity of writing: 'The following one or two points were most confusing to me (copy and paste them)' and 'The following one or two sentences were particularly strong or effective (copy and paste them).' So it's something that anyone can do, and it provides feedback that is useful to the creator. Next slide please. Thanks, Tim. So what do students do from start to finish for this assignment? In advance of class, they read the progress report and submit a content list. That gets them thinking about what content they would include in the scientific abstract, and it's an accountability piece.
They just submit that as a document on Canvas, which makes it much more likely that they will come prepared to class. In class, they discuss their ideas for the content they would include in the abstract and practice using a rubric on other examples of scientific abstracts. And then everything in the box is what happens in peerScholar, as you've already heard. In our case, three students are assigned randomly and anonymously to assess; each student is assigned to assess three students' scientific drafts. When they see the feedback they received from their three peers, they rate the usefulness of their peers' feedback. We just use a simple scale, as you see here: how useful was the feedback, from very useful through useful and somewhat useful to not useful. And importantly, they need to write in a text box to explain why they chose that usefulness rating. They respond to the feedback they received (on the next slide I'll show you exactly what that looks like), and they revise their scientific abstract. Well, actually, Tim, can we go back to the previous slide? Thanks. Just to finish that off: they revise and submit a final scientific abstract, which the instructor, me, then marks, and then the students review their grades and the feedback they've received from their instructor. Thanks, next slide, Tim. So this is exactly what the students get in terms of their instructions for the response-to-feedback part. It's half a page to a page on how they used the peer feedback to improve their draft, specifically listing two or more revisions based on peer feedback and explaining their rationale. And if they chose not to use some of the feedback they received, they list one or two of those items and provide their rationale for not incorporating that feedback in their final version. And next slide please.
So I'll just finish with this, to show you the weighting of the assignment, noting that in this particular assignment the student peer assessment is formative. No grades actually come from the peer assessment average itself, because that is on the draft, which they then improve and submit as a final version. The final abstract, which is marked by the instructor, is worth 60%. The peer review process takes five different steps, and the students get marks for submitting all five steps on time: creating the draft abstract, reviewing and assessing their peers' submissions, submitting their final abstract, the response to feedback, and rating the usefulness of their peers' feedback. Completing those five pieces on time gives them 30%. And then the quality of the feedback, that is, the average utility or usefulness score they got from their peers, is worth 10%. And that's all I have to say in terms of sharing; I'm not sure how much time we have in the schedule for questions at this point. So we have a question for Peter from Jeanine in the chat: Peter, in the case of your class, is the final assignment grade solely based on the peer review grade? Almost. We have team assignments, two evaluations per student, and five students on a team, so each memo is reviewed about 10 times. We average those grades together and look for outliers along the way. But we also assign some marks for participation. The students get a small mark for doing their upload, which is a really simple one; they get a small mark for participating in the review process at both stages; and then they get some mark for the review of the feedback they gave. So there are two different reviews: one peer review of the technical work and the writing, and another review that comes from the feedback. Both of those contribute to the mark.
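As a rough sketch of the kind of tally Pete describes (an averaged peer rubric score on the memo plus small participation marks), one might compute a mark like this. The 20/80 weighting, the 0-to-1 score scale, and the function name are illustrative assumptions, not the actual course weights or peerScholar's built-in scheme:

```python
from statistics import mean

def memo_mark(rubric_scores, uploaded, reviewed_peers, reviewed_feedback,
              participation_weight=0.2):
    """Illustrative mark: ~20% participation, ~80% averaged peer rubric score.

    rubric_scores: peer scores on a 0-1 scale (each memo receives ~10 of them).
    The three booleans correspond to the participation steps Pete mentions:
    uploading, reviewing peers, and reviewing the feedback.
    """
    steps = [uploaded, reviewed_peers, reviewed_feedback]
    participation = sum(steps) / len(steps)  # fraction of steps completed
    quality = mean(rubric_scores) if rubric_scores else 0.0
    return participation_weight * participation + (1 - participation_weight) * quality
```

For example, a student whose memo averaged 0.85 across its peer reviews and who completed all three steps would score 0.2 * 1.0 + 0.8 * 0.85 = 0.88.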
Most of the mark is for the composite average team grade on their written document; I think about 20% or so is based on the participation side, and the rest is for the quality of the work and the feedback they gave. And I should mention you can customize that within peerScholar, and you can automate that entire process. I talked about using a CSV file, but you can set it up in peerScholar to say: submitting a piece of work gives you four points, this rubric is out of 10 points, engaging in the peer assessment of others is worth three points, and it will automatically tally all those marks. You can also have TA and instructor marking built right into peerScholar, and that will auto-tally as well. Sorry, I think I cut someone off. No, that's okay, I was just saying thank you. Jeanine, let us know if that answered your question. Thank you. Thanks. Are there any other questions that pop up right now that you want to address? There'll be another opportunity later too, so if you think of something between now and then, that's okay. And if there's not, we do have a little breakout coming up. Jim? Can I just ask a question about how you handle the student who misses one of the deadlines and desperately wants to get back into the process? Thanks, Jim. Amanda, I think we do something similar; did you want to start? You can set it up in peerScholar to allow for late submissions, and you can set it up at the same time so that a late submission automatically gets docked the mark for completing that step on time, but the student is still able to get back into the process. That does have some limits: you have to set the late deadline carefully, because there is a point in time at which the tool internally does the randomized selection of who reviews whom. In our case, we normally set a deadline, make the late-submission deadline 24 hours later, and tell students: your assignment is due Monday at 8 am.
Late submissions are accepted until Tuesday at 8am, but nothing thereafter, and that hard deadline is an important one, because at that point the tool has distributed all the work to others; the peer work has been distributed amongst the group. Tim, you may know more about accommodating these kinds of late submissions, but I think in any peer review process, once you've distributed things out and hit that hard deadline, it's really hard to get a student caught back up. And I also think that's fair. It's a hard life lesson: there are deadlines after which you don't get to do something. Yeah. That's really good advice. Yeah, I think it's stretching a lot to ask a peer assessment tool to be smart enough to figure out what to do if somebody submits after reviews have already started, and certainly if it's after reviews have already finished, that's definitely too late, no matter what type of tool you're using. I think peer scholar is the most advanced I've ever seen from a tool as far as being able to accommodate late submissions, so I would say that much. Yeah. Good points. Jeanine has another question: any concerns or challenges from an accessibility standpoint, for example, students who have accommodations? Have any students registered with the Centre for Accessibility come to you with concerns? Amanda or Peter? We certainly have lots of students registered with the Centre for Accessibility, but I cannot recall a single case related to peer scholar. I think it might depend to some degree on the type of assignment that's being used and the nature of the disability, but it hasn't come up for us. We try to stay really open to the students, and we certainly accommodate in all sorts of different ways throughout the course, but I haven't seen anything come up in peer scholar. I think it's the same with me. Yeah. Great.
And Jeanine, if you have any specific examples or questions, feel free to ask about that. Good to know. Thanks. Another question from Brian: are all reviews textual, or are there audio, video, etc.? So I know a little bit about that; I know that you can use several kinds of media. Amanda and Peter, do you just deal with text, is that correct? In our context, we want the data that comes out at the end; we want the rubric-based scores that we can then turn into grades and accreditation criteria and so on. But there is inline commenting on text, and it's possible for a reviewer to add an attachment. We've been playing around a little bit, not so much in peer scholar but in ComPAIR, with using Kaltura, so students could do a video response and then link to or provide that. But that's not the main way we use the tool; it's possible, but it's not our main use. In terms of annotations on the product: yes, it's possible. Certainly for text in peer scholar, we have used the feature where students can annotate a piece of text that's been uploaded. If it's a PDF or another file, I don't know for sure, but I don't think there's any mechanism within peer scholar to do that annotation directly; they would have to retrieve the file, do the annotation outside of peer scholar, and then upload the attachment back in. It certainly has that capability, but I'm not aware of any annotation other than directly into text that's been entered into peer scholar. I believe that's correct too, Pete, and that's why I have the students submit as text within peer scholar itself, because it's a way the students can give each other inline comments, and that seems to be really quite valuable. Great. Tim, do you happen to know if any other classes or instructors do anything with audio or video exclusively in peer scholar? Not to my knowledge.
I think most instructors I've worked with are using either the rubrics or the open-ended, text-based question and response to prompt the feedback from the students; that's the vast majority. And similarly, the vast majority are using the classic activities. I haven't worked with a lot of people on the group side of things yet, but we have been testing that on our end just to make sure it actually works, and it seems to be okay. Great, thanks very much. One quick note to add: the first few times we were using peer scholar, we had students uploading videos, and the videos could be quite large, so it seemed to handle large file sizes. I forget the exact cutoff we had, but it was 50 or 100 megabytes or something quite substantial. We've moved away from that, and there are many other options now for handling those large videos, but the system was capable of handling large amounts of media. Good to know. Awesome.