Hey, everybody, thanks so much for attending. We're just gonna give it a minute or two so all the participants can stream in. It looks like the room is filling up already. All right, folks, we're gonna give it another minute for people to come in, but thanks so much for attending. All right, so let's get started. Welcome everybody to Liquid Margins number 44, marginal notes with major impact: boosting comprehension with social annotation. Super excited to have you here. Also wanna let you know about a couple of events we've got coming this fall. On November 2nd, we've got our next Liquid Margins, social annotation in D2L Brightspace. And before the end of the fall semester, we're gonna have another session, leveraging social annotation in large classrooms. So if you've subscribed to this webinar, you will get an invite to the next one. I hope that we can see you there. And don't forget to go to bit.ly slash Liquid Margins to always stay on the list. If you have any questions, this is designed to be interactive. We'd love to hear them. So just make sure that you drop your questions in the Q&A section of the bottom navigation bar. Panelists will be answering your questions as we go, or you can also hold off for some Q&A at the end. So that's just a little bit of housekeeping. We've also got closed captioning enabled. So if you need it, just remember to enable it via the closed caption icon in the Zoom menu located at the bottom of your screen. And so thanks so much for attending Liquid Margins 44. I'm Joe Ferraro. I'm our VP of Revenue, so I oversee our commercial teams. And I'm super privileged and lucky to have Nick Denton from the Ohio State University's Pharmacy Department. Welcome, Nick. Thanks for joining us. Thanks for having me, Joe. Awesome. And we've got a lot of really great information that we're gonna share about Nick's experience using Hypothesis in some of these pharmacy courses in just a few minutes. And I actually will kick it off here.
So I'll let you take it away, Nick. All right. So I will start sharing my screen, or I think you need to end your sharing first, Joe. That's more like it. All right. We see the slides again. Excellent. So again, thanks everyone for joining. Happy to share some of our ongoing research on how we're leveraging social annotation to enhance undergrad student primary research literacy and professional identity. I am one member of a research collaboration spanning cognitive psychology, cancer research, and pharmaceutical science at OSU. So I think we can all appreciate as educators and content experts in our various fields how important the research literature is to us for being continuous learners, and how we can have our students benefit from that experience as well. Not only is that research literature necessary for us to stay current in our field, it helps us with our critical thinking skills, being able to support our ideas with empirical evidence. We've also shown in previous research that students that undergo inquiry and content mastery in our field also get a boost in their science identity, their ability to identify themselves as a pharmacologist or pharmacist and so on and so forth. And similar to how we use the research literature to formulate our own research investigations, students can also benefit from that modeling in launching their own investigations, their own innovations, in our field of inquiry. And bonus points if you're able to incorporate research literature that is from your own university. That way, if students run into a research paper that particularly interests them, they can then go upstairs and talk to the very researcher that performed that research and possibly get involved in undergraduate research experiences. Now, the primary challenge to introducing students to the research literature is that the primary literature is not made for content novices. It's, again, a scholarly conversation that is ongoing.
There's a lot of reference to previous research, with a lot of heavy jargon-filled language and a lot of unfamiliar methodologies that we just do not have the word count in our publications to properly explain to a content novice. And the result is that students engaging with a research paper in a field that they are not familiar with can spend an inordinate amount of time reading that paper, because as a content novice, you have a hard time distinguishing the most important points of that paper. Everything in that paper looks equally important when you're a content novice. So you can easily spend eight hours trying to read a paper from cover to cover if it's in a field you're not familiar with, and you can get through that with very little retention or recognition of the novelty that paper was trying to convey. Now, one solution is to get undergrads involved in research opportunities, so that they gain the expertise to better digest those research papers. But the problem is that there's only so much opportunity to go around for undergrad students. Undergrad research opportunities are often reserved for capstone projects or upperclassman experiences. And they are also disproportionately inaccessible to our underserved student populations, who may have travel barriers, financial barriers, and family obligation barriers that are very real, as well as perceived barriers around that science identity we mentioned earlier, which is disproportionately underdeveloped in underserved populations of students that haven't had the same opportunities to develop it. So the solution we're proposing, to introduce the research literature to our students in a way that is more efficacious, is to use social annotation.
And if you want to scan the QR code on the slide, that'll take you to an open Canvas page where you will be able to look at an example of the assignment that we made and also launch the Hypothesis platform, where you can look at some of the annotations that I copied from our project with one of our courses in cancer research. When Hypothesis social annotation is used, it's typically used to engage students informally, maybe having students make a couple of posts or something of that nature, maybe put a question on there for their peers to ponder. But we wanted to take a bit more of a structured approach, where we're incorporating not just the student annotations but also the content expert annotations. And we're trying to do three things with our socially annotated research paper. The first task is for students to answer an assessment question that is seeded by the instructor. And the purpose of these assessment questions is to help students focus their attention on a particular area of emphasis that will come up in the in-person lecture while they're doing this pre-reading on Hypothesis. And again, what we're trying to do here is narrow the gap in a concept just enough that students can then answer it with their own expertise, their own knowledge, without facing an insurmountable cognitive gap in comprehending what is going on in this research paper. The second task that we have students perform is to post their own question on the research paper. And we'll tell students to share, say, the murkiest point in the paper: something that they may be confused about, something they want some more clarification on. And that then pairs with the third task, where students have to reply to one of their peers' questions.
So what this allows us to do is, again, let the instructors add enough of their expertise to the assessment questions that students can then apply their own knowledge to answer that assessment. Students can then also exchange knowledge among themselves, because we might have a student in the group that has particular expertise in this area of research, or this particular methodology, or this particular conclusion that they're trying to interpret. And we want those students to have a chance to shine in that Hypothesis assessment, where they can help answer their peers' questions. And then the beauty of it is that with Hypothesis, this is all Canvas-integrated. So again, you could technically do this in something like a Google Drive, but you'd spend probably an inordinate amount of time trying to comb through those student assessments and then grade them and then integrate them into your LMS. So that's why I personally prefer Hypothesis for my social annotation work. And what you'll see here is an example of what this looks like. If you scan the QR code, you'll see even more examples. But this is what we're getting from the assessment here, where I first set a limited number of respondents for that assessment, so all 20-plus students in my class can't answer the same assessment question; I'm only going to accept the first two respondents to this assessment. And then we can already see some of the replies and conversation going on there. But we also have the students that are posting their own questions. And you can see in this interaction on the bottom right, one of the students may have been a little bit confused on how the bone marrow produces macrophages, or where the macrophages are coming from in the bone marrow. And then another student was able to clarify that confusion for them, saving the instructor a little bit of time and helping a fellow student overcome that bottleneck in their understanding of the paper.
But this is where the real beauty comes into play: when we then go from the pre-reading that's on Hypothesis and apply it to the in-person lecture, because now we can model a journal club. That's honestly the ideal journal club, in my opinion, because while I am going over the figures and the methodologies on my PowerPoint slides, I'm also able to put the Hypothesis annotations up on my tablet, and I can scroll down and warm call students into the conversation. So what that might look like is that I may say, hey, Joe, you had a really great answer on what a Western blot is. Can you share that answer with the class, and can you share with us how the researchers are using it to answer this question? That is a great deal more engaging for students than cold calling Joe and saying, hey, I need you to answer this question about this particular sentence of the paper, one you might have spent five minutes on while reading the whole paper cover to cover and didn't put a lot of emphasis on, and I need your answer now. That's not a very psychologically safe way to engage students in a journal club situation. So instead, by using Hypothesis to warm call students into the conversation, we're able to get pretty much 100% engagement, where by the time we get through that journal club, everyone in the class has contributed to that discussion. And this is also what's called a jigsaw pedagogy, because whereas on the pre-reading each student is really focusing on their own assessment question, now we're able to bring all those pieces together in our conversation in person as we go over the paper as a class.
And as the instructor, I also get an additional benefit from those questions, because they provide formative feedback on any murky points that I may not have realized were in the paper. As content experts, we may be blind to some obstacles, some knowledge gaps, that students may experience when they first read the paper and that we may not have thought to address in our lecture. But now those are very clear to me when I go over the annotations prior to lecture. Usually, say, an hour before lecture, I read through the student annotations, and that gives me an idea of, hey, there was a little bit of a holdup on what the ELISA assay is. So I'm going to spend a little bit more time covering that, because there was a little bit of confusion there. Now everyone's back on the same page, let's get the conversation rolling. And this, honestly, in my opinion, is the model for what a professional journal club should be. I have also done this in HyFlex environments, where I have students that are in person during the lecture and also students that are on Zoom, with a microphone in the lecture hall and also speakers for the students to report out their conversation. And this way of facilitating a journal club has worked seamlessly in those environments, where I want to be engaging students not only on our main campus but also on our regional campuses in this course. So that has also been a game changer. And so Nick, there's a question in the Q&A actually too. First is, would you consider a journal club the same as class discussion of an article, or do you frame it differently? So the way that I would probably frame that: previously, when we didn't use social annotation, we used discussion boards for the pre-readings, and we would give a very generic prompt to choose one of the figures and give us your interpretation of that figure, or something of that nature.
And it did not really work out when it came to the in-person discussion of that paper. If I put the slides up and said, okay, what's going on in this figure? What is the method that the researchers have used? What are the controls in this experiment? What are the results? What's your interpretation of those results? If I tried to ask those questions when I gave the paper as a discussion board, I would typically have the majority of the class giving the same interpretation of what they perceived as the easiest figure. Whereas with Hypothesis, I'm able to spread those assessments out, and I can then ask those questions of, okay, what is this Western blot the researchers are using, and how are they using it to answer this hypothesis? That's kind of the way that we're facilitating the journal club in our in-person lectures. And so, I mean, I hear this a lot from instructors across the country. There's a real disconnect between the discussion board and the actual discussions that are happening in class. So this is a good way to really thread that needle. Right, right. So one thing that I do in particular when I'm setting up my Hypothesis assignments is that I'll actually set those up in parallel with preparing my lecture slides. That makes it much more time efficient for me, because as I'm preparing the lecture slides, I may think to myself, okay, I want to make this part really clear to the students, but instead of making a mental note for me to explain it, I'll make an assessment question on the reading and have the students start that conversation. So I would say when we're in person, about 10% of the class time is me just warm calling the students into the conversation; 90% is the students sharing out their findings and interpretations of the paper. Great, thank you. All right, excellent. And if you haven't had a chance to go onto the Hypothesis assignment already, here's another copy of that QR code.
This is around the time where I would do kind of a waterfall reflection, which I think we can use the Q&A for. So how about this: let's take three minutes for our folks on the Zoom to reflect a little bit on where you see this form of expert-guided, peer-collaborative social annotation contributing to either your teaching as an instructor or even your scholarship as a research advisor. Let's take three minutes to reflect; write your answer in the Q&A chat box, but don't hit enter yet. And I'll give you all just three minutes. Guys, what educator would I be if we didn't have some active learning in this talk? All right, just one more minute to wrap up your thoughts, and then I'll give you all a three, two, one and we'll hit that enter button. All right, three, two, one, and enter. And now we've got a waterfall of responses. I particularly like this activity for those of you that teach online quite a bit and don't want a flood of first responses to a question followed by a whole bunch of, oh yeah, I agree, I agree, I agree. This way everyone has a chance to have their voice heard. Excellent, excellent. I kind of found warm calling to be really effective because once students realize you will read out their annotations in class, they invest more in them. Yep, absolutely agree there, Eric. I would also say there's probably a little bit of friendly competition in my class when I do that, because if Joe again has a really great answer and I call out Joe for having a great explanation, then that might make Amy here motivated to say, wow, I wanna really up my game so that Dr. Denton calls out my great answer. So it's more of a positive feedback loop as opposed to kind of a toxic rivalry, I would say. So I really like the warm calling aspect of it as well. All right, first-year undergrad students are going to read textbooks properly, get the main summary points through assessments. Excellent, excellent.
Yep, yep, definitely. Also kind of that confirming of insecurities, for sure. Particularly with those questions, I find students are surprised at how many of their peers were also stuck on that same question. So that gives them a little bit of validation that, hey, having questions, having points of confusion, that's all a normal part of reading the research literature. So I definitely agree with that too, Amy. All right, it could be helpful for involving students in research and familiarizing them with the background literature. Absolutely, Amanda. I appreciate that the points are awarded in different amounts, more for engaging with the professor's question, which somehow makes it higher stakes, and less for their own questions and responses to others. Excellent, excellent. Yep, I agree with that too, Leslie. I do weigh the points a little bit heavier on that assessment question, just so that, again, the students will invest more time in their contribution to the main discussion. I might not have time to get through all of the questions that students pose, especially if some of their peers have already answered a question, or it was a question that may have been very unique to one student and not a global misunderstanding in the class. So I agree with you there, Leslie. All right, learning the layout of peer-reviewed articles. Yep, I particularly like that too, Anonymous. It's part of your online teaching certificate. Yes, because we are also experimenting with this type of journal club not just in the undergraduate courses but also in some of our graduate seminars and some of our PharmD programming. So we're getting into that space as well, where it can be used as a professional development tool. Excellent, excellent. Nursing professor here: social annotation will be helpful for novice student readers of primary source research to inform evidence-based practice. I teach in an online EMP course and the research language is a barrier to learning.
We'd love to incorporate this method. Absolutely, Tracy. I'm also a big proponent of evidence-based practice in our pharmacy students, so trying to get them literate in the research, and able to give some objective reasoning for why they are using this particular medication or doing this particular practice, just empowers the students all the more. Excellent, excellent. And thank you all for participating in that waterfall. If there are any other questions, I'm sure Joe will help facilitate those as well. But you don't have to take my word for how it's going in the course, because we did a few different analyses of some of the data that we generated from our courses, which again span our Pharmaceutical Science Research Survey course, our cognitive psychology course, and also our Pharmaceutical Sciences and Cancer Research courses. And we wanted to see whether or not the students were really demonstrating a higher level of comprehension in those assessment questions when we put them on the social annotation platform, as opposed to when we gave a more generic discussion board. To rate this level of comprehension, we got together a panel of research advisors that had experience facilitating journal clubs with undergraduate students, graduate students, and postdocs. And we sat down and had a discussion on what the expectations were for research literacy comprehension as you move along the training pipeline. And we settled on a few main benchmarks there: being able to give a complete and correct answer for, say, an undergrad upperclassman; and then as we move on to the graduate student level, we really should see that some outside source material was used to facilitate that understanding, as opposed to fetching an answer from just that paper.
And then the postdoc level is where we really expect that high-level synergy, where they're taking information from the initial reading and incorporating it with other readings to better facilitate their understanding and come up with some novel insights on the paper. And what we found was a pretty dramatic increase in that comprehension level. Previously it was right around where we expected: in a class that was about half underclassmen, half upperclassmen, we got right around that level of comprehension where students were mostly fetching answers from the paper, more or less correctly, but not really applying that information elsewhere, and maybe about 25% of the class was hitting that upperclassman, maybe new-graduate-student, level of comprehension. Now it's the majority of the class, well over 50%, that consistently reaches that graduate level of understanding in those assessment answers, because it really became the norm for students to pull in an outside resource to better their understanding of the topic and fully and correctly answer those assessment questions. And where before we had no students consistently reaching the postdoctoral level of understanding, we now have about 3% of the class consistently taking the information from the paper, incorporating it with other papers that they've read in order to understand that assessment question, and coming up with new insights on that research. And because we took away a lot of the initial barriers of the research reading for the students, we were also able to better conserve their energy as we continued on with the course. So whereas in our previous courses most students were getting burnt out after the fourth or fifth paper that we went over, now we were able to incorporate research papers throughout the entirety of a full-semester course.
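An editorial aside for readers who want to replicate this kind of analysis: tallying what fraction of a class reaches each rubric level (undergrad, graduate, postdoc) is simple to script. The sketch below uses invented ratings that only loosely mirror the proportions described in the talk; it is not the research team's actual analysis code.

```python
# Hedged sketch: tally what fraction of a class reaches each comprehension
# level on a rubric like the one described (undergrad -> graduate -> postdoc).
# The ratings below are invented for illustration.
from collections import Counter

LEVELS = ["undergrad", "graduate", "postdoc"]

def level_fractions(ratings: list[str]) -> dict[str, float]:
    """Return the fraction of students rated at each rubric level."""
    counts = Counter(ratings)
    total = len(ratings)
    return {level: counts[level] / total for level in LEVELS}

# Toy class of 22: most students at the graduate level, a few above/below.
ratings = ["undergrad"] * 9 + ["graduate"] * 12 + ["postdoc"] * 1
fractions = level_fractions(ratings)
print(fractions["graduate"] > 0.5)  # True: a majority at the graduate level
```

Rating each answer against a shared rubric first, then aggregating, is what lets a panel compare a discussion-board cohort against a social-annotation cohort on the same scale.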
And as we continued the course, we saw improvement in that comprehension, reaching levels that we were not able to achieve with just a discussion board pre-reading, where students were getting burnt out much earlier on. And one of the questions that we're trying to answer currently: we noticed that the assessment questions where students are able to achieve that highest level of comprehension are the questions that seem to be engaging what's called causal mechanistic reasoning. These are the questions where we are, again, introducing the students to a phenomenon. So if we're talking about a figure in a paper, we might introduce them to it with, hey, describe what a Western blot is. Then we identify and unpack the factors, so students are thinking about, okay, what is the procedure of a Western blot? And then we bring the question back up to the target phenomenon. So the second part of the question could be, and how are the researchers using it to answer this question? You're not going to get the same outcome if you just say, hey, interpret a figure in this paper, and make that the assessment question; it doesn't work that way. To really engage students at this level of comprehension, you need to involve some kind of causal mechanistic reasoning. You need to introduce them to a concept, break it down for them, and then bring it back up into the paper. Describe what a Western blot is and how the researchers are using it to answer this research question: that is the question format that we've seen to be most effective in our assessment questions. And so Nick, this is, I mean, this is showing some really dramatic improvement, and I'm sure it enhanced the classroom experience pretty heavily. We do have a question from the group: were the students aware of the rubric and given these expectations, or were these results actually pretty spontaneous once you brought the tool in?
Oh, I am definitely a proponent of transparency in learning and teaching, or TILT. So I make it very clear on that assignment page, the same assignment page that the students are seeing. For every assignment, they've got the purpose, they've got the task, and they've got the rubric at the end. So it is true that students may be playing to the rubric, but if we design the assessment in such a way that the task has them go through the activity that we want them to perform, and that assessment matches the learning outcome that we want to see, then I'd say that's perfectly fine, in my opinion. And it was the same learning outcome that we were trying to get with the discussion board; it was the same activity of posting a discussion post, posting a question, and then answering a peer's question. But by moving it to the social annotation modality, we were able to get a lot more leverage out of that causal mechanistic reasoning and achieve these higher levels of understanding in the students. Okay. And I know there was another comment just about research from Wolf in 2000 about annotations. Do you think that pre-annotating some of these with questions is sort of leading the witness, or do you think it's just driving towards the learning outcomes that the course should have to begin with? That's a good question. I can see where, by focusing on some points of emphasis, we may be, again, focusing the student attention on one specific assessment. But I would say that if you frame the question in a way that is directed but open, then we're able to demonstrate that level of student understanding. So if I again use the example of describe what a Western blot is and how the researchers are using it to answer a research question, you're not gonna be able to just copy-paste that into ChatGPT or something like that and get a credible answer. Students might be searching what a Western blot is on ChatGPT or Google or what have you.
But in order to leverage that application of knowledge, they really need to dig into the paper a bit on their own to come up with a credible answer, in my opinion. And I know you mentioned specifically the discussion board itself and the pre-reading leading to burnout. And I can imagine looking at a bunch of these articles and trying to make it through when you don't have a ton of the background knowledge you might need. What did you do before? Was there an intro to Western blots before they started to do the reading, or was it expected that they would know about it? Right, right. So typically what we would do is break the paper down into two parts. The first part would typically be the intro and methods of the paper, where we really go into some of the background about, okay, what is the problem we're investigating? What are the methods being used to investigate this problem? And then for the second part, we would go into the results and the conclusions, where we would be looking at the figures, interpreting the data, and coming up with the conclusions and next steps for the research. So I would say the discussion board had a similar flavor to it, where we would have students, for that first half, go over the background and methods, and then the discussion board where they're interpreting the methods would come after that first lecture. Okay, thank you. So back to the research: after we first looked at that comprehension and saw increased comprehension in the students, we also wanted to get a better idea from the student perspective about what was working with the social annotation. So we first started off with, compared to reading research articles on my own, the social annotation strategy used in this course gave me a better understanding of the research literature. We gave that as a Likert question, and we got overwhelming agreement from the students.
So that was just under 90% either slightly or strongly agreeing with that statement. And then when we asked students to explain why they agreed or disagreed with that statement, we found that the majority of students really liked the exchange of information with their peers. So interacting with peers on those questions and replies really helped overcome those knowledge gaps in reading the paper. And we also saw some similar themes in a couple of these responses here, about breaking down the article into sizable chunks and having some expert guidance on focus points, plus that positive peer pressure I mentioned before, where Amy may want to ramp up her reply to the assessment so that she gets praised by Dr. Denton like Joe was, and so on. And honestly, the answers that we got from the folks that neither agreed nor disagreed were along the lines of, hey, you're the instructor, you should be telling me the information, I shouldn't have to be figuring it out on my own. Which I personally don't see as a problem, because I want my students to feel empowered to find the information in the research paper and not rely on being spoon-fed information. Now the real struggle lately has been figuring out how to assess the increase in engagement we see, because we again sat down with a group of experts and tried to figure out what level of engagement is expected at these different levels of training, from undergraduate to graduate to postdoc. And we definitely saw very much what we've seen previously, where engagement was just like pulling teeth with undergraduates when it came to answering these journal club prompts, or maybe there were just one or two students that provided the majority of the engagement in the course. A lot of these prompts were just met with silence until one of the few motivated students piped in. And now, for all of these journal club sessions, we get full engagement.
So we get full engagement at a graduate level of engagement in these journal clubs since we've incorporated the social annotation, and a lot of times we even reach a postdoc level of engagement, where not only are the students answering the instructor prompts, but we're also having student-prompted discussion in that in-person journal club. So that has definitely been one of the most dramatic changes, in my opinion, since we implemented social annotation into our research article readings: just how much engagement we're able to get when students are, again, being warm called into the discussion and then able to facilitate these higher levels of discussion among their peers as well. And I've included a quote from one of our peer assessors that sat in on one of our courses to give a teaching evaluation. They agreed that this was a level of engagement and understanding that they had not seen before in undergraduates. So that has been very promising to see in our Bachelor of Science in Pharmaceutical Sciences program. And then finally, the last research question that we wanted to answer was, after the students had gone through this course and engaged with the research literature, has that contributed to a sense of science identity that is predictive of student persistence in the sciences? So we used the experimentally validated, and unfortunately acronymed, PITS survey for Persistence In The Sciences.
At the beginning, before they had done any social annotation, and again afterwards, post-social annotation, we surveyed students on this questionnaire. The questions range from project ownership (content), whether students feel ownership of their engagement with the literature and their interpretation of it; to project ownership (emotion), whether they feel excited or energized about reading the research literature; their self-efficacy, whether they feel they have the skills to fully engage with the literature; their science identity, whether they see themselves as scientists; their scientific community values, whether they see themselves as belonging in the scientific community; and their networking, whether they are taking what they learned from the course and talking about it not just with the instructor but also with their peers, their roommates, their family, their friends. And across the board, in most of these categories, we see significantly higher scores on these factors that are predictors of student persistence in the sciences, from their undergraduate program to completing their bachelor's degree to going on to graduate or professional school and so forth. One of the other interesting findings here is that when we stratify the student demographics, we also found an equitable gain in these PITS scores for our underserved BIPOC (Black, Indigenous, and people of color) students, and our first-gen students likewise saw a narrowing of that opportunity gap in their total PITS scores, even though their white counterparts also experienced a significant increase. So everyone benefited in their PITS scores, but we narrowed the gap a bit for those underserved student populations. And the particular category where we saw the most gain was self-efficacy.
So students who may not have had previous opportunities to engage in scientific research or the scientific literature saw that gap narrow considerably in self-efficacy: they were able to catch up with their majority peers after engaging in this social annotation work, more so than we saw with the discussion board pre-implementation.

And I know you mentioned earlier students being able to see themselves as pharmacologists, for example, and I think these results really show that it helps them realize they're not the only person who struggles with understanding some of these advanced concepts. I think Amy in the Q&A mentioned imposter syndrome and normalizing the struggle as a whole. Would you say this really does drive that?

Absolutely. That's the main point I make when I'm facilitating the journal club discussions. I'm not asking, "Hey, does anyone have any questions?" I'm asking, "What questions do you have?" This is a normal process; you should have questions. If you do not have questions, then you probably didn't understand the paper well enough to know what questions you should be asking. So absolutely, normalizing that process.

And you may have said this already, but Leslie wanted to know: is there a sweet spot in terms of how many folks you'd have in a journal group to get this type of engagement, or can it work across large and small classes?

Excellent question. So previously we'd been using relatively small classes, maybe twenty-some students, and I find for an hour-long journal club that's about the sweet spot, where I can engage the entire class and make sure everyone is speaking during that time. However, we've also had larger enrollment in our courses lately, where we now have 40-plus students; I think word got around a little that people have enjoyed the course. So in Canvas, we've actually put the students into groups.
So we have a journal club group A and a journal club group B, and when we use Hypothesis integrated into Canvas, we're able to assign the reading as a group assignment. Now, when we have the paper, group A engages each other on their own Hypothesis social annotation layer of that paper, and group B is on a separate one. I find that's about the sweet spot where we don't run into the situation where, with a cap of, say, four respondents per assessment, you just get a whole bunch of "yeah, I agree with whoever answered first." And when we go into the in-person session, I'm finding the discussion is still very rich, because we're able to combine the insights of groups A and B in that in-person discussion.

Got it. So your recommendation would be, whether the enrollment is 20 students or 100 students, find ways to chunk them into smaller groups so they can get the maximum experience.

Yeah, I agree. If you can chunk your students into groups of 20 or so, I'd say that's probably the sweet spot.

Great, thank you.

All right, and I think that's all I had. So are there any other questions I can answer?

You know, I've got one from Heather: how do you pose a question and limit the number of responses in Hypothesis? She particularly likes the idea that it would reward some of the early readers.

Excellent, excellent. So if I'm understanding correctly, for how I pose the question: if there are multiple respondents, I might first call out, "Okay, Joe, you had a really great answer to this question," Joe gives his answer, and then I may say, "Leslie, do you have anything to add to Joe's answer?" So that's how I would pose follow-ups on assessments that have multiple respondents.
And absolutely, I agree that rewarding the earlier responders has been very nice, because when we did the discussion board, it very much rewarded the late responders: after one of the highly motivated students made that first post, there was just a flood of "yeah, I agree with so-and-so." But when we break things down into multiple assessments, each with a limited number of respondents, that really motivates students to go find an assessment they can answer before the maximum number of respondents is reached for that question.

All right, any other questions I can answer for y'all? If not, I just want to leave this as an open invitation for collaboration. Again, we have implemented this kind of social annotation design in cognitive psychology, pharmaceutical science, and cancer research courses, and we have some evidence that this causal mechanistic reasoning is a potential mechanism for facilitating student understanding, but we are still very interested in hearing from folks outside of OSU, in areas outside of the health sciences, to see how best we can further polish this teaching pedagogy. Thank you.

No, thank you. This has been super informative, and it's really interesting to see. These are pretty heavy articles, so for students to be able to come in and not just see an increase in comprehension but also want to stick with the program and not quit afterwards is significant. One of the stats I've seen most recently is that right around 40% of STEM majors either switch majors or drop out, and those numbers are even higher among female and underrepresented populations. That has big implications for the institution but also for the industry. So if you were to connect the dots, do you find that just getting them more involved with the content earlier helps them see themselves persisting?

Absolutely.
We've just done a few informal polls in the course about their level of engagement in undergraduate research, and at the beginning there might again be those two or three highly motivated students who are already in research when they enter the course. But by the time we end the course and survey them again, we've seen a significant shift: students move from "I don't really know what undergraduate research is" to being able to at least recognize some opportunities at OSU to get involved in undergraduate research, and a significant few have moved into research since they started the course. So again, undergrad research is one of those high-impact practices where we can really form that science identity and keep students persisting in our field of interest, whether that's finishing their bachelor's, going on to graduate school, or going on to professional school. We're still collecting that data.

And we did have one last question, interested in the reduction in the number of junk responses on annotation assignments. I can assume they're used to seeing "I agree," but by the end of the semester that's not gonna fly anymore.

Yes, absolutely. Again, when we had the discussion board, it was just that flood of "I agrees," which definitely did not demonstrate comprehension very well. But when the expectation is that you're going to be called on in class, you know you're going to be called on, and you need to have something ready, then I think that really motivates students not to put a junk reply out there.

Okay, it certainly would have motivated me; I was guilty of agreeing with everything.

Hey, me too.

All right. And I know the team shared a link to the case study in the chat. I want to thank you so much, Nick, for taking the time, after all the work you put into this research, to talk about it with our audience. We really appreciate that. And if you're not a customer, this is a great time to get started.
Mid-year is a great opportunity to test out new technologies, especially for your spring students, who may be transfer students or may be repeating some of the courses they need to complete their graduation requirements. We have a great spring starter promotion in which first-time users receive discounted pricing as well as workshops and training for faculty, and you have the opportunity to get certified in social annotation with our Hypothesis Academy. If you're currently a partner, our next Hypothesis Academy cohort runs in just two weeks: Social Annotation 101 teaches you the basics, and you'll actually be working in your own LMS, so you can get a jump on your spring courses and get some of your assignments set up. We'll be sending more information about this in the email with the recording.

With that, I want to thank everybody, especially you, Nick, for taking the time today and helping us learn more about how social annotation can support students in undergraduate research and really enhance engagement in the classroom experience overall. We will be sending out this recording if you had to drop earlier or weren't able to make it, and feel free to reach out to the education team at Hypothesis, or to Nick at the contact information he shared. And look at this, we've finished five minutes early, so we will let everybody have that time back in their day, and hopefully we'll see you at the next session in early November. Thanks so much again, Nick, and thanks everybody else for attending.

Thanks for inviting me, Joe, and thanks to everyone for joining in.

All right, fantastic. Have a great day, everyone.