Welcome, everybody, to this presentation at the annual conference. I'm really delighted to introduce our next speaker today, Dave Cattrell, who's joining us from Hong Kong, from the Polytechnic University. I actually visited your institution a few years ago, Dave, so what a beautiful part of the world to join us from. Please do say hello in the chat; we'll be monitoring the comments for Dave throughout the presentation, and there'll be time for questions and answers at the end. Dave is going to speak to us this morning about expansive learning in healthcare education: designing and implementing video-annotated peer feedback in a postgraduate nursing course. And I believe, Dave, you represent a whole team of authors for this presentation.

Yeah, that's right. Okay, so everybody, thank you very much for joining this session. I'm actually coming to you from Bath, where I'm now living, but I'm still working for Hong Kong PolyU. And as Marin said, this research is derived from a project I led with a team of researchers, educational developers, and academics at Hong Kong PolyU.

The context for our research is a course called Innovations in Learning and Teaching for Healthcare Education. It's part of an MSc in Nursing, and the students were all in-service professionals in the Hong Kong public healthcare system. Their main learning outcome was to carry out independent research, share their findings in a pecha kucha presentation, and engage in peer feedback to develop their ideas into a case study. Two years ago, when the course started, students did this activity in person, asking and answering questions in real time. But last year, when they tried the activity again, they used a video annotation tool to give peer feedback online. And it's this use of the tool that is the focus of our research.

Students can use video annotation tools to record themselves, engage in peer feedback asynchronously, and add timestamped comments at specific points in a recording. So they could pause the recording and add a comment which is linked to that point in the recording, and this can lead to a series of comments, or a discussion, around specific points in the recording (there's a small illustrative sketch of this mechanism below).

The main conceptual framework for the study is external versus internal feedback: external, where students use information from outside to enhance their learning; internal, where students evaluate and comment on other people's work to generate feedback on their own performance and develop the skill of evaluative judgment.

In the literature around video-based peer feedback, one theme is that feedback from peers is more effective if it's combined with tutor or self-assessment. Making the activity graded can motivate students, but this has to be done very carefully. Cognitive scaffolding in the form of rubrics, examples, or guidelines can be critical, as can socio-affective support and a kind of informal dialogue around feedback to build a sense of community. There are not that many studies of peer feedback using video annotation, but the studies that exist find that the approach is more evidence-based, more deliberate, and more solutions-focused. Students spend more time on task. There's less cognitive load, because they can stop the video and focus on different points one at a time. And it leads to more authentic collaboration, because the environment and the tool are less face-threatening than having to give feedback in person or making a summative comment at the end of the video.
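To picture the timestamp-anchored commenting described a moment ago, here is a minimal sketch of how such a comment thread might be modeled. It is an illustration only: the class and field names are invented, not Panopto's actual data structures.

```python
from dataclasses import dataclass, field

@dataclass
class Annotation:
    """A single comment anchored to a moment in a recording."""
    author: str
    timestamp_s: float        # position in the video, in seconds
    text: str
    replies: list["Annotation"] = field(default_factory=list)

@dataclass
class RecordedPresentation:
    presenter: str
    duration_s: float
    annotations: list[Annotation] = field(default_factory=list)

    def add_comment(self, author: str, timestamp_s: float, text: str) -> Annotation:
        """Pause the playhead at timestamp_s and attach a comment there."""
        note = Annotation(author, timestamp_s, text)
        self.annotations.append(note)
        return note

# A peer pauses at 2:35 and leaves a timestamped comment; the presenter replies,
# forming a discussion thread around that specific point in the recording.
video = RecordedPresentation(presenter="Student A", duration_s=400.0)
note = video.add_comment("Student B", 155.0, "The case selection here is very clear.")
note.replies.append(Annotation("Student A", 155.0, "Thanks, I refined it twice."))
```

The point is simply that every comment carries a playhead position, so replies naturally cluster into discussions around specific moments in the recording rather than into one summative thread at the end.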
But most of the studies that exist are experimental. They don't really look at socio-cultural factors and how these shape the design and implementation of feedback. And they rely on self-reported survey data: they don't really look at what students actually do with the tool, at the system data, or at the annotations that students make. So, for our study, what we felt was needed was a systemic analysis of how the tool changes the nature of peer feedback.

Our questions were: how can the design and implementation of video-annotated peer feedback remediate, or reshape, an entrenched activity system of giving a presentation and responding to a presentation? And what systemic contradictions are there in the design and implementation of this approach, and how could they be overcome?

For the theoretical framework, we used activity theory, and we used Scanlon and Issroff's five criteria for evaluating learning technology use, which look at interactivity, efficiency, serendipity, cost, and failure. I'll talk more about those later on.

So why is activity theory such an excellent tool for understanding complex uses of learning technologies? First of all, it looks at how a subject's (the students') work towards achieving a learning outcome is mediated by a tool. The tool could be something like computer software, but it could also be the use of language, or something like a rubric or an example video. Then, interactions between the subject and the community of instructors and peers are mediated by rules, which may be formal or informal. These could be something like the closure of the university and the move to online learning, or social rules about how people interact online or in person. So this is very important. And finally, the community's actions in working towards the object, the learning outcome, are mediated by the division of labor: who is expected to do what? Are students expected to be quite passive, or are they expected to be more active in giving feedback? And what is the teacher's role in the activity? These aspects of activity theory make it really suited to our research.

A related concept, expansive learning, was used to look at how the resolution of different contradictions at different stages of the activity leads towards better and more ambitious types of learning. The way we used expansive learning in our study was, first of all, to spend part of an interview with teachers and part of a survey for students questioning the traditional approach to the activity, to identify problems, and then analyzing those problems in terms of what students did in the traditional activity system. Then, in the interview, we modeled and examined a new system, a new approach to feedback using the annotation tool, and we looked at the possibilities as well as the limitations of the new system. Then, having implemented the activity during the first semester last year, we got the teachers and students to reflect on what they did and to look at how the activity could be embedded in future forms of the practice. Again, that was through interviews and surveys.

But we didn't just rely on these teacher interviews and student surveys to explore the challenges and the impact on student learning. We also looked at what students did. We looked at their video annotations using the tool: how many annotations they posted, but also the quality of their postings and how specific they were. I'll come to that in just a moment. We also looked at the trace data from the video platform to see how much time students spent watching the videos and giving feedback, how many videos they viewed, and how many times they watched each video. This data was really important to our research.
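As a rough illustration of the kind of trace-data summary just described, assuming the platform can export one log row per viewing session (the column names below are invented for the sketch):

```python
import pandas as pd

# Hypothetical per-view platform logs: one row per viewing session.
views = pd.DataFrame({
    "viewer":          ["s01", "s01", "s02", "s02", "s03"],
    "video":           ["v01", "v02", "v01", "v01", "v03"],
    "seconds_watched": [310, 95, 480, 120, 260],
})

# Total time each student spent watching, and how many distinct videos they viewed.
per_student = views.groupby("viewer").agg(
    total_minutes=("seconds_watched", lambda s: round(s.sum() / 60, 1)),
    distinct_videos=("video", "nunique"),
)

# How many times each student opened each video.
views_per_video = views.groupby(["viewer", "video"]).size().rename("times_watched")

print(per_student)
print(views_per_video)
```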
When we analyzed the historical activity system, we identified three main contradictions. First, there were limited opportunities for students to innovate. They just followed the cases that the teachers provided. They were afraid to take risks with their innovation because of the course grade at stake, so they played safe and made a few changes to the example case. This is a contradiction between the tools that were provided, the example case studies, and the object of the activity, which was to develop an innovation. Second, students had a very instrumental attitude to peer assessment. They focused on the scores that they gave and received rather than the quality of the feedback comments. This is a contradiction between the rules and the object: the rules were that students were graded based on the scores other people gave them, so they just wanted to give each other high grades in order to receive high grades themselves. Finally, there were unexpected time pressures in the historical system. It wasn't possible for all students to present live in one lesson. This took an incredible amount of time: it required two full lessons, and still some students didn't attend, so they didn't take part in giving or receiving peer feedback on their innovation. And when the teachers were looking ahead to last year's semester, they realized that with COVID, the community of peers wouldn't be able to attend very many of the classes. So there was a potential contradiction between that community and the rules: the rule of having to present in real time, but also the public health rules that everybody experienced, which meant it wasn't always possible to attend class at a fixed time.

With those contradictions in mind, we looked at how we could resolve them through a remodeled activity system. These were the main changes we made. First, around the tools, to scaffold original innovation: instead of being given case studies that came from the teachers, students were encouraged to analyze their peers' case studies and develop their own, using the video annotation tool as part of their analysis. So they weren't given example case studies to adapt; they had to come up with their own, working with their peers. Then the rules changed to make assessment more meaningful. Instead of giving students a grade based on the scores that their peers awarded them, they were graded on their participation in the activity and on the number of comments that they provided. And they were given maximum choice over who they gave feedback to and how they gave their feedback using the tool. Finally, to get around the problems with time constraints, the tool allowed the activity to become asynchronous. Students could spend several weeks reviewing five different recorded presentations, and this gave them a lot more time to respond to feedback and develop their own innovation.

So what happened when we implemented the activity last year? First of all, students and teachers felt that students' expectations about interactions were broadly met, and we mapped this using a relationship map. Twenty-nine out of the 30 students commented, producing 200 comments in total, and the darker and larger circles on the map show where students exceeded the requirement: they gave or received more comments than the minimum five they were required to. So this is a really good finding in terms of the numbers.
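The relationship map itself was a visualization, but the counting behind such a map can be sketched in a few lines. The student IDs and the (giver, receiver) pairs below are invented for illustration.

```python
from collections import Counter

# Invented (giver, receiver) pairs: one pair per posted feedback comment.
comment_edges = [
    ("s01", "s02"), ("s01", "s03"), ("s02", "s01"),
    ("s03", "s01"), ("s03", "s02"), ("s02", "s03"),
]

given = Counter(giver for giver, _ in comment_edges)
received = Counter(receiver for _, receiver in comment_edges)

MINIMUM_COMMENTS = 5  # each student had to comment on at least five presentations
for student in sorted(set(given) | set(received)):
    flag = "(exceeded minimum)" if given[student] > MINIMUM_COMMENTS else ""
    print(f"{student}: gave {given[student]}, received {received[student]} {flag}")
```

Sizing and shading the circles by these gave/received counts would reproduce the kind of map described in the talk.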
In terms of the quality of feedback, there was a lot more cognitive and socio-affective support through using the tool. The activity became more integrated: it became a hybrid activity, where students gave feedback using the tool but also discussed their ideas together in class, either virtually or physically in the same classroom. It wasn't bounded by the period of one lesson; it became an ongoing thing over those weeks, with really meaningful interaction. Students felt there was more open discussion than they would get with the traditional approach, and this was evidenced by the number of replies students gave: they didn't just write posts, they responded to each other online. They felt they had more time to think and understand what they were doing. Although, the teachers felt, this might be because the students were all mature, experienced practitioners who expected this kind of interaction.

The tool was very efficient. Students felt there was practical value in what they were doing, because there was a clear link with the final assignment. There was more flexibility, and using a timestamp tool helped them to be clearer when they were reviewing other people's work; this was shown in the comments, which were generally very specific and contextualized.

We were able to analyze the feedback annotations that students made. We found that across the 200 annotations, students gave 547 specific feedback units, and there's an example of what we mean here: this example post has one feedback unit about behavior and another about the effect (E) on the audience. Some of the better comments also included a discussion of the motive (M) behind what students were doing, as well as suggestions around specific goals in the presentation. Across all of their posts, students generally touched on four or five of these specificity criteria, and on average the posts contained at least two of the different units of feedback. We observed a good balance between the different types of feedback: not only looking retrospectively, with feedback on behavior, effect, or motive, but also prospectively, feeding forward with suggestions around goals. So the quality of the feedback was really strong (a toy sketch of this kind of unit counting follows at the end of this passage).

We found that all students watched at least one of the videos, they watched them a large number of times, and they watched more than 45 minutes of video on average across the whole activity. But that's a lot less time than they would have spent sitting through all the presentations in a traditional class. Not all of the students reviewed their own video and comments, which is surprising, although the ones that did viewed it a number of times and watched it for over eight minutes, so they were interested to read feedback from other people. Again, that represents a much better outcome than a traditional approach, where students might not really give or receive any feedback.

There were some positive accidental discoveries using the tool. Students were surprised by just how interactive it was, and they were really motivated. They made very purposeful choices about which videos to comment on, because they could choose any five or more people from the group. And they took part in discussions using the tool by replying to comments without being asked: they weren't graded for their discussions, but they spontaneously got into them.
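Returning to the feedback-unit analysis mentioned above: as a toy sketch, assume each annotation has been hand-coded with the unit types it contains, abbreviated here as B (behavior), E (effect on audience), M (motive), and G (goal); the single-letter shorthand and the example tags are mine, not the study's data.

```python
# Invented hand-coded posts: each set holds the feedback units found in one
# annotation -- B(ehavior), E(ffect on audience), M(otive), G(oal).
coded_posts = [
    {"B", "E"},            # e.g. the example post with two units
    {"B", "E", "M", "G"},  # a richer post that also feeds forward to goals
    {"E", "G"},
]

total_units = sum(len(post) for post in coded_posts)
avg_per_post = total_units / len(coded_posts)
balance = {code: sum(code in post for post in coded_posts) for code in "BEMG"}
print(f"{total_units} units, {avg_per_post:.1f} per post, balance: {balance}")
```

Run over the study's real figures, the same tally would give the 547 units across 200 posts reported above, roughly 2.7 units per post.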
Students also spent class time discussing feedback, so it became a hybrid activity, as mentioned. And it changed the dynamics of control: instead of one-way feedback from one student to another, it became a dialogue between them, helped by the students' use of social language in their comments, saying "good job" or "well done" or "I really enjoyed the presentation".

The costs of using the tool were around time. It took up a lot more of students' time and instructors' time, but against this there are a number of benefits. Students who spent hours and hours watching the videos did so because they enjoyed it. They didn't need to, but they consciously made the choice to watch more videos and spend more time commenting on them, and students who did this tended to make better quality comments. And by spending time outside class recording and reviewing presentations, students had more class time for interaction and discussion. So it switched from being a passive activity to a more active peer learning activity. Finally, because the instructors didn't read the student comments before posting their own feedback, students felt comfortable being more honest with each other: they didn't feel that their feedback would affect the grade they received.

The tool had a number of failings. First of all, Panopto discussions around videos are very lean: they don't allow for audio or video commenting, they're just text-based. And back when we did the activity, there was no notification function in the platform, so students didn't know when they had received feedback, and some of them didn't revisit their video. Also, one of the 30 students didn't use the discussion tool: by mistake, they used a different function on the platform. So in fact all 30 students did the activity, but one of them failed to make their comments public because they chose the wrong tool.

The activity had a couple of shortcomings as well. The feedback comments were not equally distributed: one person didn't receive any comments, while some people received dozens and dozens. The quality was sometimes variable, with some comments just focusing on things like fonts or the timing of the presentation. Some students clearly had different expectations of the purpose and nature of peer feedback.

So the contradictions we've identified allow us to think about how we could improve the activity in a future version of the course. First, the tool's shortcomings: there is now a notification function, so that solves that particular challenge. And even though students can't leave audio and video comments, since this isn't available in the platform, the limitation did encourage students to take the discussions off the platform and into the classroom, leading to this unexpected hybrid approach, which is very valuable and could be developed in the future. Lastly, cognitive supports: the fact that there wasn't a rubric may have motivated stronger students to exceed the requirements of the task, and their comments were very specific, constructive, and contextualized, but weaker students might benefit from a rubric. Also, all students could benefit from examples of good feedback, or from opportunities to practice commenting before they begin the activity. So those are things we can build into the next run of the course.

To sum up, this is the first project to have used activity theory to analyze video annotation and how it changes the system of formative assessment.
It suggests how the emerging activity can be consolidated. It contributes to a number of bodies of knowledge, from the literature around scaffolding to the use of a hybrid approach, which is something that hasn't been explored much in the literature. And it shows what students actually do with this type of tool, rather than just giving them a survey with a scale of one to five, which is how a lot of studies have worked in the past. It will also be useful as part of our project to study the design and implementation of video-annotated peer feedback in other contexts: for example, at graduate or doctoral level, in other domains of study, or with other types of presentation. If you'd like a copy of these slides, I'm very happy to share them, and they include some references around feedback, activity theory, and video annotation. But for now, I'll wrap up. If you've got any questions, please do post them in the chat or ask me. You can also email me at my PolyU address. So thank you very much again, and I hope you enjoy the rest of the conference.

That's great. Sorry, Dave, I hope you can hear me okay. We have a question from Vicky Dale, who says: really impressive study, with excellent use of the theoretical frameworks. And we've also got a comment and a question here from Yvonne. She was wondering, was there much resistance to learning a new piece of software? She finds that nurses can be quite nervous about using tech, particularly in assessment.

All right, thanks very much for the question, Yvonne. And it's interesting that you also work with nursing students. We found in the study that most students were quite interested to try the technology, but that was partly due to the nature of the course, which is all about innovation around technology in education. When we've tried to do the study with other nursing subjects, we've found a lot of resistance from nursing lecturers, which may reflect their students' resistance to using new technology. Generally, the problem people have with the technology is that it involves making a recording of themselves. Although, of course, one of the benefits of the technology is that you can record yourself a number of times and only share the recording when you're happy with it. It's not like a traditional approach, where the teacher records you and that's the recording you have to share. You can do the task a number of times and then just share the recording you're happy with. I think it requires a lot of dialogue with the students, as well as with the teachers you're working with, before you go ahead with this kind of software.

Thanks, Dave. We have a couple of other comments in the chat from participants. A lot of applause as well, from Shannon, Anna, Balem, Alex. Melissa says thanks so much. David Watson as well. So, is there anything else you wanted to add to your presentation? Any final words? We've got two minutes.

Well, something I would add is that we have studied a number of different groups using this tool. We've used it at doctoral level to look at how students can develop their presentation skills. We've used it at undergraduate level to look at group presentations, at how students work together in a group to give and receive peer feedback. And we've used it for some other types of skills, not just presentations: things like role plays for social work and psychology, working with students to prepare them to work in the community with older adults.
And we've used it with optometry students in the lab to practice procedures. So I'd be really interested to know if there are people working in the UK, or at other universities around the world, who've used this kind of approach or tool to develop feedback skills or the skill of evaluative judgment. It hasn't been used a lot at my institution, but I'm aware that there is research going on elsewhere. So if anyone's doing this as part of their work as an educator or educational developer, please reach out; I'd be very happy to discuss it with you.

Thanks, Dave. I can see more applause and appreciation in the chat for you. And I'd like to encourage you all to use the channels on Discord for today's sessions. Dave, maybe you could post a link to your slides in there, but also please do continue that conversation and pick up Dave's question to see if anybody else is using similar technology. For now, I want to echo the virtual applause I can see in the chat, Dave, and thank you very kindly for braving a new platform in this first part of the conference. Anna Williams has a lovely comment to close out this session, saying: you have inspired me to do some more research on this, really interesting. So from us here at ALT, enjoy the rest of your day with us, and a big thank you to Dave for a great presentation.