Thank you! Great, thank you very much Hannah. I'm delighted to be here on behalf of a team of three. My colleagues Colin and Peter unfortunately can't be here with me today. Peter is a colleague of mine in the School of Education at Aberdeen, and Colin works in the Centre for Academic Development. So we're here to give you a little bit of a flavour of what we've been doing. [inaudible] Professor Hilary Homans was the lead educator originally. Peter Mtika has been involved with the course from the beginning, and he was actually the coordinator of our on-campus course as well, so there's been some continuity there. One of the things we wanted to achieve with the MOOC was to reach out to learners in sub-Saharan Africa, to give them an opportunity to talk, to communicate and share experiences and ideas with other learners around the world. So it was really to try and reach a global audience, and it involves quite a lot of expert ideas and opinions, both in terms of videos from our own educators, but also interviews with key informants in sub-Saharan Africa, case studies and the like. 
So it's actually quite a traditional MOOC, and those two words don't quite go together, it's almost an oxymoron isn't it, but it's quite a video-heavy course: it has text, it has links to other resources for people to read. It has lots of discussion activities to try and encourage interaction and debate between people. And there's also a very big aspiration in the course to develop critical skills and critical thinking around the whole notion of sustainable development. So what does it look like? I've just got a couple of screenshots, and some of you may be familiar with the FutureLearn platform, but this is a screenshot from the very beginning of the course. The course has quite a structured framework. It's a six-week course, and this is just the introduction, the links through to the first few steps in the course, which were all about introducing yourself and really getting started. Along the way there are, as I say, lots of discussion opportunities. This is one from very early in the course, just thinking about how people look at different perspectives on development, how people define sustainable development and the like. But even here, in MOOC terms, it was a relatively small MOOC. I don't think we ever had more than 5,000 learners on it, normally about 1,000 or so. So a relatively small MOOC, but even here you can see 1,500 messages contributed to the discussion. So I'm going to talk now a little bit about the design of the blended course, and because it actually has the same name, Africa: Sustainable Development for All, I'm going to use the course code, which is SX1519. The blending of the MOOC is a complicated story. There was a sustainable development course for on-campus students, an interdisciplinary course for cross-year students. It's a way of expanding the curriculum, particularly for first and second year students. 
So this tends to be a course taken by learners across different subject areas, who come together across academic years to take it as a way of expanding their curriculum. So the on-campus course came first; the MOOC was then developed out of some of the ideas, and to expand on some of the ideas, from the course. And then Hilary saw the opportunity to blend it back into the on-campus course, in particular to give our on-campus students the opportunity to have these discussions with learners out there in sub-Saharan Africa. So we start with the MOOC itself, a six-week MOOC if you like, and here's the on-campus course blended around it. What happened, certainly in the presentation that the research looked at, was that the first week of the on-campus course was delivered by Peter as an induction. Students were told that this was the approach we were using, that they were going to have to study a MOOC, and how to access it. That was very important to make sure they all got registered at the same time. Then, for the six weeks while the MOOC was running, the MOOC replaced the traditional lectures that would have been in the course. So learners did the MOOC, and then they also came back and had face-to-face tutorials alongside it, where there were discussions of the content they'd been addressing and deepening of that, and then also the expectation of some independent study alongside that. So on campus they were doing a combination of things. Towards the end of the course, after the MOOC had finished, students got involved in group work and finished up with group presentations. In terms of assessment, the assessment was made up of the test that was there for all MOOC learners at the end of the MOOC, the FutureLearn test that they did, and then there was an additional e-assessment and an assessment of the group presentations as well. 
So a variety of things made up the assessment for the on-campus students. Moving on to the research that we've done: what we discovered was that at the same time as we were running this MOOC, the university was implementing the UK Engagement Survey with a wider cohort of on-campus students. First and third years were being asked to complete the UK Engagement Survey. So we thought, well, this is a great opportunity: we'll see if we can get the MOOC cohort, the SX1519 cohort, to complete this survey as well, and we can have a look and see what the differences are between the two groups. For those not familiar with the UK Engagement Survey, it's quite a big survey. It has, I think, about eight or nine different categories, different sections, and within those there are a number of questions on each topic, all Likert-scale questions, and then opportunities for students to comment as well. So we administered the same survey that was going out to our first and third years with our SX1519 cohort, just changing the introductory information so that they knew they were only supposed to be answering in terms of their experiences on this one course, rather than their whole experience as a student. We got a good response rate from our SX1519 students, but to supplement that we intended to do a whole stack of focus groups and interviews and so on, to get some more information about their student experiences. I don't know if anybody else has had this experience, but it proved to be very difficult to get students to participate in focus groups. We did get three students who came to talk to us, though, and that was extremely useful. They were three very diverse students; they had very different experiences, but there were also some common themes that came through from talking to them. So it's given us a bit of an insight into their experiences. And we had the data from the engagement survey. 
Colin is our survey expert at the University of Aberdeen, so he took away the numbers and crunched them, did some Mann-Whitney U tests, and looked at the distribution of the data and the patterns and so on. I'll just show you a couple of examples of what's come out of that. This is the graph from one of the questions. I hope you can read it from back there, but it's about working effectively with others. You've got here a graph which shows the SX1519 students in the darker blue and then the wider undergraduate cohort in the light blue. What we were looking for here were differences in the patterns, and to try to work out which direction things were facing in: whether it was the SX1519 students who were more engaged on this item, or the wider undergraduate students. For this one, I think you can see there's definitely a pattern that's different for the two groups, and the SX1519 students were definitely higher on this particular element of the scale. This is another one, contributing to a joint community of staff and students. Again, this is one of the significant bits of data from the analysis, and you can see there's a slight difference in the patterns there. So we looked for significant differences, we plotted these graphs, and then we tried to assess which direction things were moving in. Interestingly, when we did this, what we found was that there were a couple of areas where there was no significant difference between the two groups. In terms of engagement, between the big undergraduate cohort and the SX1519 cohort there was no difference in two areas: interaction with staff, and teaching on the course. So that's quite reassuring, actually, that there wasn't a difference in those. 
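As an aside, the kind of comparison described here can be sketched in a few lines of Python. Everything below is purely illustrative: the Likert responses are invented, not the study's data, and the function is a generic implementation of the standard two-sided Mann-Whitney U test with a tie-corrected normal approximation, the usual choice for comparing ordinal survey items between two independent groups.

```python
import math
from itertools import chain

def mann_whitney_u(x, y):
    """Two-sided Mann-Whitney U test using the tie-corrected
    normal approximation (suitable for ordinal Likert data)."""
    n1, n2 = len(x), len(y)
    combined = sorted(chain(x, y))
    n = n1 + n2
    # Assign average ranks to tied values
    avg_rank = {}
    i = 0
    while i < n:
        j = i
        while j < n and combined[j] == combined[i]:
            j += 1
        avg_rank[combined[i]] = (i + 1 + j) / 2
        i = j
    r1 = sum(avg_rank[v] for v in x)       # rank sum of the first sample
    u1 = r1 - n1 * (n1 + 1) / 2            # U statistic for the first sample
    mu = n1 * n2 / 2
    ties = sum(t**3 - t for t in (combined.count(v) for v in set(combined)))
    sigma = math.sqrt(n1 * n2 / 12 * ((n + 1) - ties / (n * (n - 1))))
    z = (abs(u1 - mu) - 0.5) / sigma       # continuity-corrected z score
    p = 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))
    return u1, p

# Invented Likert responses (1-5) to one survey item, one list per cohort
sx1519_cohort = [4, 5, 4, 5, 3, 4, 5, 4]
wider_cohort = [3, 3, 4, 2, 3, 4, 3, 2]

u, p = mann_whitney_u(sx1519_cohort, wider_cohort)
print(f"U = {u}, p = {p:.4f}")  # a small p suggests the rating patterns differ
```

With these made-up numbers the test reports a significant difference, with the first group rating the item higher. A real analysis would of course run over the full survey responses; if SciPy is available, scipy.stats.mannwhitneyu gives the same result.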
There were some areas where the big cohort of undergraduates appeared to be more engaged (the numbers in brackets here refer to the number of items on each of the little scales), and they were critical thinking, course challenge, and one aspect of skills development, which I found quite interesting: the aspect related to writing. So it was writing that they were more engaged with. And then for the SX1519 students there was a different set of areas where they appeared to be more engaged through the course, and, as you can see, they're slightly different. It's about learning with others, it's about partnerships, it's about research, it's about reflecting, and under skills development the issues that came up here were things like working with others and learning from other perspectives and so on. So there's very much a social learning angle, perhaps, to this one. Just to show you a few of the things that came through from the focus groups and talking with the students, I've picked a few examples here. One of the issues that came through very strongly from our students was the importance of induction, and I mentioned at the beginning that first week where we inducted the students into the approach for the course. They recognised themselves, or they told us, that there were students who'd found it quite difficult. Some of them were completely new to the idea of MOOCs; it wasn't something they'd encountered before, so they did need a little bit of information to start them off, but they got going quickly, so that was very helpful. Another aspect was flexibility and control, and our respondents certainly valued very much the opportunity for flexible learning, to do the MOOC in their own time. 
They all talked about how it was active learning when they were in the MOOC, unlike being in a lecture theatre, where they felt they weren't active participants. Interestingly, they all used very systematic approaches: they seemed to work through the MOOC in order, as they were supposed to, doing the activities they were asked to, which I think is slightly different from the pattern of engagement you might get from a non-campus-based MOOC learner. They liked the flexibility; they liked the opportunity, as one participant put it, to be able to switch off the lecturer. You can't do that, as he said, in a normal lecture theatre. The other interesting thing that came out was around commenting and actually participating. All of our focus group participants suggested that they were more likely to contribute in the MOOC than they would be in a classroom situation. That was particularly interesting. For me, as somebody who's been involved in distance learning for a long time, you can see findings in the literature suggesting that distance learning students actually don't engage quite so much. But in this context, they were engaging as distance learning students in the MOOC. I suspect there are a few things behind that: obviously the blending might be one of them, but it might also be the nature of the MOOC and so on. But they were all more likely to comment. Following on from that, how they commented and how they engaged with the discussions was very interesting. There are findings from other studies of blended courses with MOOCs in them which all suggest that the on-campus learners didn't contribute, or contributed only a small amount: no posts, little participation, little exchange. All of our focus group participants read things, and they contributed. 
So they all participated, and they all talked about different ways they had learnt from participating in the discussions, whether it be through comments helping them understand content, or through discussions with people in sub-Saharan Africa who could give them more examples, or in one case, the last one, actually having a more in-depth conversation with somebody out there who was practising in the field. So there may be, again, several different reasons behind this, and it's quite difficult from such a small group of participants to draw conclusions. But it does suggest that there is something around the blending, possibly around the platform design, around the pedagogy, and also, I think, around the teachers' actions, what the educators were doing in the MOOC and on campus, that may have influenced this. So just to draw a few conclusions, I'll put this slide back up, because for me this is the summary of what we've discovered here. It does seem that there are differences between the engagement of a group of undergraduate students and those who are engaged in blended learning with MOOCs involved. It seems interesting that it might be areas like critical thinking, course challenge and writing that on-campus students can be more engaged with, whereas it's some of the social aspects that come out when looking at the blended students. And I wonder if this is one model of what might be going on, where the big circle is the MOOC and the big wide world and all the learners who are in that space, who are there to offer different perspectives, to have conversations, to answer each other's questions. And then we've got the on-campus community of learners, guided by the tutors in a face-to-face setting, who can deepen things, who can offer support, who can motivate students to take part. It seems there's something akin to that going on. 
So what we've done, just to put up a final slide for you, is to put together a few ideas of our own about what might make blended learning work if you're trying to do this kind of thing and blend a MOOC into an on-campus course. These really follow on from some of the points I've made: things around induction, things around flexibility and control, and encouraging social learning, encouraging people to get involved to really make the most of the opportunities to work in this global context. I've got one more slide, which has a few references, and just to let you know, the slides are up on SlideShare if anybody would like to see them. But I think I'll stop there. Thank you. Okay, good afternoon. I'm Cherry Poussa. I work at the University of Nottingham; I'm part of the Health E-Learning and Media team in the University's School of Health Sciences. I'm very pleased to be able to share some of my PhD research findings with you today, and I just want to acknowledge the invaluable support that I received from my supervisors, Heather Wharrad and Shaaron Ainsworth. So, because it's mid-afternoon, I thought we could start with some questions. If you don't mind, a quick show of hands: how many of us have had the experience of introducing a new online system, like a VLE? Quite a few. How many of us have delivered training and/or developed an online training guide to show students how to use it? Brilliant. Bear with me, two more questions. How many of us wondered if the students were more confident in using the system following the training? And finally, how many of us wondered if the students' confidence was long-lasting? Brilliant. Okay, so we'll come back to those questions at the end. Now, a quick look at how I've structured my presentation for the next 15 minutes or so: a quick bit of context on why I was motivated to do my research; I should have said it at the beginning, but I hope you saw it in the title: 
Whether or not virtual peers can help web-based learning self-efficacy. I'll talk a little bit about the theories behind my research, what I asked in the research, the development of the intervention package that I used, a little bit about my study design and the data collection methods, what I found, and more importantly some conclusions and recommendations and what I hope to do next. All in the next 15 minutes. So, a few years ago now I was involved in a huge rollout of a VLE in the School of Nursing, as it was then. We decided back then that all of our modules, which is over 300, should all go on the VLE, which was WebCT at the time. As you can imagine, it was quite a difficult job to manage, with lots of things to do and lots of training sessions involved. I think at one point I was delivering maybe about 30 training sessions in a week, up and down the East Midlands, to pre-registration student nurses and post-registration nurses, the nurses who were already qualified and coming back to do CPD studies. 
And clearly it was unsustainable, but what was interesting was that I met hundreds of students with a wide range of skills. What stuck in my mind was that I encountered lots of students who were anxious, and even angry actually: they were anxious about using a new system, and they were angry because they felt they didn't get enough support. And in fact research shows that students who are using a new online system for the first time, or even more than once, if they're doing it on their own, feel isolated and think that it is more work, and in some cases students even drop out from online courses. They're feeling overwhelmed, they don't know what to expect: what skills do I need to have to do these things? Can somebody show me what to do? In terms of the students that we have, those student nurses were saying: we don't want computers, we're a caring profession, what has it got to do with us? And it's interesting that these feelings are still valid today; it's still being reported in the literature. So my options were: I could deliver more training, maybe a bit more effectively, and evaluate it, or do a PhD on it. And if I did want to do a PhD part time, obviously, what did I want to achieve from that? I thought long and hard about it. We're always going to be delivering training, but I wanted to do a bit more than that. I wanted change; I wanted to help students more. But what was it that I was trying to do? Did I want to change their confidence? Was it competence that I wanted to change? Yes, because clearly we know that there is a link between confidence and competence. Or I could do nothing, but the consequences of that would be, like I said before, that students would be affected in terms of their confidence: they might avoid web-based courses and activities within them, and particularly for post-registration nurses, because they've got to come back and do their CPD in order to maintain or carry on with their job, it might affect their CPD planning. And more 
importantly, the big picture is that it has a direct impact on their capability to remain a safe practitioner, so the real impact is on patient care. Personally, if I did have to go into hospital, I would want to be cared for by a nurse who is confident, and I wanted to do my bit in contributing to that safe patient care. Also, web-based learning is here to stay; it's not going anywhere. The government keeps telling us to do more online, use more technology, use technology appropriately to deliver continuous learning. So yes, I wanted students to be more confident. But actually, what I wanted was for them to have more belief in themselves, that they can succeed in accessing web-based learning and indeed as web-based learners. And self-efficacy fitted very nicely with what I wanted to do. Bandura defined self-efficacy as people's belief in their capability to organise and execute the courses of action required to deal with prospective situations, in specific domains. In other words, it's believing what you can do with the skills that you have. And the very specific domain is important, because you can be confident generally about everything but not have strong self-efficacy in a specific domain, for example using the web for learning. Self-efficacy can be measured, and if you do have strong self-efficacy, you view challenging tasks as something that you want to master rather than shy away from. So I wanted to do something with that. How do I do it? Where does self-efficacy come from? According to Bandura, these are the sources of self-efficacy. You strengthen your self-efficacy by having an experience of success: you undergo some training, you've done something right. For example, you manage to find your timetable online, and sometimes that's not easy to do. So you've done that, and it contributes to your sense of 
self-efficacy. Social support, from your peers, your facilitators, your tutors, can help as well, as can observing others, and also generally having positive feelings about the environment that you're in, so feeling positive about accessing the web for learning, for example. So, in the domain of web-based learning, as we know, students might access their course material online with others in the same room, or they might access it on their own at home, or on the bus, or somewhere else. As we've seen with the sources of self-efficacy, in a blended setting you can get social support from your peers and your friends and your facilitators, but if you're learning on your own you might have nobody. So I wanted to look at how to compensate for that in the absence of real peers. As a result I needed something else, and in the absence of real peers I found Computers As Social Actors, a theory or framework from the work of Reeves and Nass. They said that people respond to web-based media as social actors. We've done this ourselves: we respond to smileys in an email or a text, and a smiley face says that the message is a happy message; if there's a sad face, it reflects that there's something the person is not happy about. And there's a whole load of literature out there about web-based media flattering people, who respond to that as if they're being praised by a real person. I haven't got a lot of time to talk about it, but we can come back to it at the end if we've got a bit more time. So, moving on to the research questions: these were the questions that I wanted to answer in the research. The obvious question: do student nurses' knowledge and skills improve as a result of training? Do the student nurses' web-based learning self-efficacy levels improve as a result of training? Does the format of delivery, or presence of social 
persuasion, impact on nurses' web-based learning self-efficacy, in a blended or self-directed format? And finally, because of the absence of real peers in a self-directed situation, I wanted to know: if I put virtual peers in there, does it affect the improvement of nursing students' web-based learning self-efficacy? Because it's an experimental study I needed an intervention tool, and this is quite a big part of my study. I developed a bespoke training package using a user-centred design, and part of that was a consultation with over 200 students. I won't go into this because it's a separate topic really, but I just wanted to include it so you have an idea of the tool that I used. I put it in WebCT and developed it in conjunction with the students, and it had all of the sources of learning self-efficacy in there. Quickly, some screenshots here, which I'm just going to skip over. I had four groups in the experimental study and two outcome measures: knowledge and skills, and web-based learning self-efficacy. I did three studies, and these were the participants. More importantly, I want to go to the findings, because I don't have much time. Overall, the nursing students' knowledge and skills levels increased following training, so keep doing the training, it does work. And overall the nursing students' web-based learning self-efficacy levels increased following training, and more importantly their self-efficacy levels were maintained nine weeks later. I did a pre-, post- and delayed test, and the delayed bit was what I wanted to find out about, whether it made a difference or not. In terms of the format of delivery, for the post-registration nurses the virtual peers didn't make a difference, but for the pre-registration students they did, so that's quite an interesting finding. And finally, just going over this quickly, the virtual peers, the smileys, actually were sufficient to help with social support in the absence of real peers. So 
conclusions, that's it. Training is important, and web-based learning self-efficacy can be changed and, more importantly, is long-lasting. And in the absence of real peers, low-fidelity virtual peers like smileys, as opposed to other, more expensive approaches, can provide social support. I'm going to stop there. It's always a good start when you can get the technology to work, isn't it? Well, hello everyone. I'm Simon Starr and I'm a learning technologist from Canterbury Christ Church University. We're a post-92 university based in Kent in the UK with about 17,000 students, and as a learning technologist it's been my dubious pleasure and honour to run our Turnitin setup for more than a decade. But as a learning technologist I'm also more interested in the pedagogy than I am in the technology, so what I've come to talk to you about today are the findings from some research I did into students' use of feedback provided through Turnitin at Canterbury Christ Church University. My own interests in relation to feedback are around students' use of feedback to develop independence, so to become self-regulating learners. Over the next 15 minutes or so I'm going to talk a little bit about the research, but I'm going to linger on the findings, and then I'm going to offer some questions to you about the choice of presentation of digital feedback, which is the focus of my research, so you can investigate that in your own practice, or indeed support colleagues to do that. I know we had some technical problems earlier and I've got a couple of diagrams to show later, so just in case, there's a link to the slides up there. I must say, it's a pleasure to come somewhere and not have to explain to everyone what a learning technologist is. I know there's a few of us here, but I've given up saying it at parties and just getting that blank look that says I don't really want to know, so I just tell people I design fireworks now. So, a bit of background. Like many of us, we've had a growing use of digital feedback at 
Canterbury Christ Church University, driven in part by a requirement five years ago for all coursework to go through Turnitin for plagiarism checking, and since then there's been very rapid growth in programmes and markers choosing to use Turnitin to provide electronic feedback as well. As you can see, we have quite a high volume of that. So we know a little bit about students' experience of Turnitin. We know that they value the flexibility of being able to submit work electronically wherever and whenever they are, and also to be able to collect their feedback without having to drive in and pick it up. We know also that a growing number of our students value the reduction in paper that digital submission and return of feedback can enable. We also know, for staff, that providing digital feedback can reduce that all-important turnaround time. But what we don't know much about is whether there's anything about providing digital feedback that might be helpful for the learners in terms of developing their independence, and that's what I'm interested in. There is some literature around this, and there are some suggestions in it that using digital feedback, as compared to paper-based forms of feedback, can actually improve pedagogic outcomes. These studies here find, for example, that students are more motivated to go on and use their feedback to improve themselves as a result of receiving digital feedback, and also more aware of the all-important assessment criteria when they receive digital feedback as compared with non-digital feedback. But the studies you find in the literature don't tend to investigate in any detail why that might be: what is it about digital feedback that might lead to better outcomes for students? So what I wondered, being a learning technologist, is: is there anything about the digital-ness of feedback which might engage learners more than non-digital feedback? That prompted me to come up with this research question. This is some 
research I did for my master's dissertation a couple of years ago. What I was really looking at was: can I identify any ways in which the choice among the myriad different ways of presenting feedback through Turnitin might make a difference to whether and how students engage with their feedback? I've got the findings of that here today, and I offer those to help guide your own research. I'm not going to linger here, but just out of interest, how many people are familiar with the feedback tools in Turnitin? That's many of us, isn't it. Okay, so Turnitin enables a lot of different ways of presenting feedback, including on-script bubble comments, pop-up comments and also just writing on the script, red-penning if you like; a number of off-script ways of presenting feedback, including paragraphs of summary feedback, which can be written or audio-recorded; and also a number of assessment-criteria-based presentations of feedback, including rubrics, which are essentially grids, and this grading form here, which allows you to write ad hoc comments against each assessment criterion, whereas the rubric has preset comments for each level. So there's a whole bunch of different ways of presenting feedback through Turnitin, and my question was: does the choice of those make a difference? Before I could consider that, I needed to consider what engagement with feedback looks like. Going to the literature, there are various ways of engaging with feedback, from the obvious: do students read it, or listen to it, or watch it, do they think about it, and do they actually go and do anything with it, make a plan to improve their work as a result? To perhaps the less obvious: do students go and actually seek a dialogue with tutors, peers or friends as a result of having feedback, do they enter into a discussion once they've had feedback? To what I think is a really under-researched area: does feedback prompt an emotional response in students, and does that make a difference? So 
the design of the study used a purposive sample of undergraduate students across a range of levels of study and subjects. I asked them about their experiences of feedback through Turnitin, and I was looking for evidence of those different ways of engaging with feedback and whether there were any barriers or enablers in terms of how they engaged with their feedback. And the results came in that, essentially, yes: for those students, the choice of those different presentations of feedback in Turnitin did make a big difference to their engagement with their feedback. More so, it made a difference not only to their ability to use their feedback to improve; it made a difference to whether they wanted to read or listen to their feedback at all. So themes arose involving ability and motivation to use feedback, and if the technology holds up we can have a look in detail. What this diagram is attempting to show is, firstly, where presentations of feedback in Turnitin, namely the on-script bubble comments and the off-script recorded audio comments, had positive influences on students' engagement. On the right-hand side in yellow are some of those dimensions of engagement with feedback that I took from the literature, and the bubbles in the middle are the ways in which bubble comments and voice comments influence students' engagement. Firstly, if we take personalisation, or students' perceived personalisation of their feedback: bubble comments had a positive influence because, by highlighting individual pieces of students' work on the script, students felt that the tutor had really paid attention to their work. With voice comments it came through there being a perceived warmer, less formal tone compared with written feedback, and that in turn led to students feeling valued. Interestingly, this group of students all turned out to be quite independent learners, which is a shame because it's not 
something I sought in my sample. So actually, where students (as we'll see on the next slide) had a reduced sense of personalisation through their feedback, it didn't make a difference to whether they then went and used their feedback; it might do for less independent students. Next, another influence was novelty: where students hadn't seen bubble comments or voice comments before, they simply had a novelty effect, and that made students more likely to go and read the feedback. One student said, "If I see a bubble, I want to click on it and find out what's in there." Another influence was specificity. Students said that bubble comments, because they related to a particular part of their work, were more useful to them, and they were more likely to follow that bit of feedback up because they could see where it related to their work, as opposed to off-script comments, which we'll come to in a moment. And then finally, voice comments: because you have added tone, pace and inflection, and also a tendency to use simpler language, these students said that voice comments led to an enhanced clarity of meaning, which again helped students to use their feedback, simply because they understood what the tutor was trying to say a bit better. So those were the positive influences. As for the negative influences, the culprits were general comments, which are the off-script summary comments; rubrics, which are the assessment-grid-based feedback with preset level descriptions; and QuickMark comments, which are on-script bubble comments but which are preset, canned feedback. In terms of specificity, then, students sometimes found it difficult to relate off-script summary feedback to particular parts of their work, so that reduced its value for some of them: "I don't really know how to improve my work because I don't know which bits of my work those summary comments relate to." Clarity of meaning, students said, was a troublesome area when it came to rubrics, because of the tendency to use
the language of assessment criteria to say how well students have done within each level of those assessment criteria. Students also found rubrics a problem because, again, they found them hard to relate to their work; so rubrics were a real culprit with the sample of students that I spoke to. And then finally, both rubrics and QuickMark comments reduced students' perceived sense of personalisation of their feedback, which led to them feeling less valued. I just want to hop back a moment, because the one thing that's really come out of this for me, that maybe was new to me, was this emotional connection with the marker. This was one of the areas of engagement with feedback that I suggested is under-explored, and all the students that had received voice comments said the same thing: they just felt closer to their marker as a result of hearing them, rather than looking at what they'd written down, and that emotional connection with the marker in turn actually made them more motivated to go and do something with their feedback. Okay, so what? Well, the first thing is that in my institution, and I don't know about yours, there's been a big push to use rubric feedback, because engaging students with assessment criteria is a good thing; there's lots of good theory about that, which I'd support. It also happens to be quite an efficient way of giving a lot of feedback to a lot of students without a huge amount of effort, and there's a bunch of literature that supports this. But for these students, rubric feedback was problematic, again because the language of assessment criteria was difficult for them, and they also found the rubrics difficult to relate to their work because they kind of existed on the side; they weren't picking out areas of their script. A second challenge: if an emotional connection with the tutor or the marker through feedback is important, which it was for these students, then what price anonymous marking, is what
I'd ask. Certainly, personally, I've found, and I think it's quite a human thing actually, that I'm more likely to take feedback from people that I feel I have a connection with, and maybe a little less likely to take feedback from people I don't know or feel a bit distanced from, so I think that might be an issue. The other talking point, I suppose, is that although I've been talking about Turnitin, other digital feedback tools are available; there are lots you can go and look at in the exhibition hall. I don't really think this is about Turnitin at all. In fact, the influences on students' engagement really came down to: is the feedback in context, on their work, or is it off their script? Is the feedback canned, or do they perceive it to be personalised to them? Is it audio, with all its warmth and its extra tone, or is it text? Those seem to be the things that made a difference to students' engagement, so actually I don't really think this is about technology. Okay, so a few limitations. I didn't look at postgraduates; the reason for that is that I'm interested in the development of self-regulated learners, and undergraduates were a good place for me to go for that. I only looked at summative assessment on coursework, because that's by far and away what we use Turnitin for the most. That's not to say that that's how we should be using our digital feedback tool; I think there's actually a lot more value in using it for formative and peer assessment, but that's not what we tend to do at the moment. Thank you; so I'm going to take some questions now, and while I do that I'll leave some questions up here for you.

Hi, I'm over here, I'm Gabby from Loughborough. That was really interesting, about the audio feedback being so well received by the students in your institution. At Leicester University a few years ago we did a project where we got some academics to agree to give audio feedback, and some of them gave both audio and written feedback in the margins of the assignments, and then I did a linguistic
analysis of it with some colleagues. What we found was that the nature of the language used was very different in the audio compared with the written feedback. The written feedback, as you say, was very much in the language of the rubrics, very formal; the audio feedback was very natural, very human, and there was all the tone that you have in the human voice as well, which added to the warmth and the sense of this being a conversation between the marker and the student. The students responded really well to the audio feedback. I think they also liked the combination, because they liked to see specifically in the text where they were getting pointers to change things. The markers then started using voice recognition technology, because the external examiners wanted to see text; they didn't want to sit and listen to a minute or three minutes of audio feedback, so the markers used voice recognition technology to at least produce a transcript. So that really confirms what you're saying as well.

Thank you, yes. I'm aware there's some research going on at the University of Winchester looking into the same thing: do markers use different words when providing feedback on the same piece of work via audio recording versus in writing?

Thanks, that's James Ittle from the University of Sheffield. So you mentioned that you feel it's not about the tool, but more about the assessment and feedback method itself. Out of interest, have you approached any of the tool providers, like Turnitin, to ask whether, when they designed the tool, they had factored in any kind of process for deciding which tools to actually create in the first place?

That's a great question, thank you. I don't think I'm alone in having been asking Turnitin for about 10 years to do this. I find that actually the tool provider is very receptive to hearing from the HE community about what's needed from their tools; I think what's sometimes a bit different is the
pace at which those things can be developed. So, for example, I know that Turnitin understand that, for there to be greater adoption of the audio feedback tool through Turnitin, they need to do a couple of things. One is to make the audio downloadable for quality assurance, which is a real barrier to us at the moment; the other is to recognise that students like to see some feedback comments in context, on their work, so I know that they have talked about providing on-script audio comments. In the same way that you can highlight a piece of text and put a written comment on it at the moment, they are talking about being able to highlight a piece of text and attach a short audio comment. So in short, I find them very receptive to talking about it, but pace of development is a different thing, isn't it?

That was tense; I didn't think it was going to load for a second. Thank you all for not sneaking out; I know I'm giving the last talk, so thank you for not diving out through the doors. My name is Laura Orcham, I'm from Cardiff University, in the School of Healthcare Sciences. Just to give you some background first of all about the School of Healthcare Sciences: we have quite a few undergraduate programmes, the top four being Nursing, Midwifery, Physiotherapy and Occupational Therapy, and we also have postgraduate taught and postgraduate research students as well. As an outline of my presentation today, I'm going to talk to you about how we developed a robust system for summative assessments using capture technologies, the challenges and successes that we came across, our future plans, and also a little bit about physical learning spaces. So first of all: change, identifying a need for change, and initial ideas. In the School of Healthcare Sciences, for our nursing programme, we predominantly have objective structured clinical examinations (OSCEs), presentations and vivas. I'm going to focus on our presentations for the purposes of this talk, and I'm going to talk about
our summative presentations with an oral element. Historically, in the School we have had two members of staff present at each presentation: one member of staff would act as the marker and the other as the moderator. However, only the square root of the cohort needs to be moderated, plus any borderlines or fails; so we were moderating all presentations despite only requiring the square root of the cohort to be moderated. Historically the presentations had been filmed using camcorders for the purposes of external examination, but not for the purposes of moderation. So the key issues we had were: unnecessary work in moderating all students; staff time (we all know in higher education that we're constantly being asked to do more with less); and also unwieldy video files. The reason we weren't using the camcorder recordings for moderation was the storage, the time spent trying to compress the videos, and the inability to share them. So what we proposed was to run a pilot in a single module, using Panopto, which is our lecture capture system, for both a formative and a summative presentation. We used a single module because that particular module had roughly 230 students, which was actually the whole nursing cohort; it was a module all nursing students were required to complete. For those of you not familiar with Panopto, there are two sides to it: we had core capture facilities, meaning some teaching spaces where the audio and the on-screen visuals were recorded; we didn't have any enhanced rooms, which would also have a camera video feed, easily accessible to us. The recordings would be stored in Learning Central, which is our VLE, powered by Blackboard; the module would only be available to certain members of staff and would not be accessible to students. We would continue to have a first marker in each room, naturally, and the moderation would take place online in the VLE at a later date. So what did our pilot
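As a rough sketch of that sampling rule, here is how the minimum number of presentations to moderate could be worked out. This is an illustrative rule-of-thumb only; the function name and the example figures for borderlines/fails are my own, not Cardiff's actual process:

```python
import math

def moderation_sample_size(cohort_size, borderline_or_fail=0):
    """Minimum presentations to moderate: the square root of the
    cohort (rounded up), plus any borderline or failing students.
    Illustrative sketch of the sampling rule described above."""
    return math.ceil(math.sqrt(cohort_size)) + borderline_or_fail

# For a cohort of 230 nursing students with, say, 5 borderlines/fails,
# only 21 presentations need moderating rather than all 230:
print(moderation_sample_size(230, 5))  # 21
```

The point of the change was exactly this gap: the rule requires only ~16 of 230 presentations (plus exceptions) to be moderated, yet live moderation in the room meant effectively moderating all of them.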
look like? Well, first of all, before we started anything to do with the pilot, we had to look at training. Panopto is a fairly new system at Cardiff University, and certainly in the School of Healthcare Sciences it's still in the process of being rolled out to staff, and not all staff are familiar with it. So we sat down with all the markers and moderators to run a training session. I ran a group session and we recorded it using Panopto, so there were no excuses for anybody not being able to attend and not watching it; but we also created an online resource in Xerte, which was always available in situations where they couldn't watch the training video, so they could always refer to the online resource. The training set out the expectations of the markers: things like making sure the recording had started, and making sure you knew how to stop the recording at the end of the presentations. For emergency situations we had a "too long, didn't read" section at the end of the online resource, so if anybody for any reason hadn't got round to watching the video or reading the online resource in any depth, they had an emergency go-to section, and they also had the contact details for the learning technology team, my colleague Bex, and the university IT team. So, as I said, we used the nursing module with the large cohort. We only had core capture available, so for a video feed we used webcams on tripods, and we scheduled all the sessions ahead of time. We had quite a few members of staff ask us to provide individual recordings, but due to the size of the cohort this just wasn't feasible: we often have no-shows, which I'm sure everybody's familiar with, we'd have students going off sick and so on, so it wasn't feasible to keep the timings absolutely precise with the recording. So we split them into morning and afternoon sessions, and for the purposes of the pilot we conducted checks at the start of each session. I worked with the Learn Plus team (Learn Plus at Cardiff University oversee Panopto); we just checked there was a
video feed and an audio feed for each session, and we also provided each room with a camcorder and a memory card, just in case there was a catastrophic failure, which thankfully there was not. To start with, we filmed the formative presentations: the students were invited to come along and practise their presentations in groups, the videos were made available to the students in Blackboard, directly in their module, and the students were advised that the videos would be available to view, so they could go in and watch their own video. We asked the students whether they watched them and how useful they found them. We didn't have fantastic feedback: only 27 students responded, 6 of whom watched their presentations, and 5 of whom found them somewhat or very useful. Interestingly, though, and not so much for the purposes of this presentation but something we are going to look at in the future, one student watched nearly all of the presentations. That's a lot of presentations, and one thing we are going to look at in the future is how we advise students on moderating how much they watch, so they don't become overwhelmed. The summative presentations were held over five days, with up to six rooms going at any one time and, again, two sessions per day; in the end we had about 60 recordings, give or take a few for resits, extenuating circumstances and third attempts. Then it came to the moderation, which was the important bit. All the staff who were moderating were given training on how to access the videos in Learning Central. The assessments team in the school calculated the square root of the cohort, plus any borderlines and fails, and identified which students needed to be looked at, and it was up to the module leader to identify which videos those students appeared in, plus the timestamps, and feed that back to the assessments team. That was a real turning point for us, because we thought it was going to be really time-consuming, but actually the information
was all there, because we already had spreadsheets of when, approximately, the students were due to attend, and it was quite easy to work out. The moderation was completed online at the same time, so everybody was expected to be at their computers, wherever they wanted to be, to complete the moderation simultaneously, so there was support on hand. Looking at the impact: these are the real figures, and this is what's most important. We estimated that before using Panopto there would have been approximately 107.5 marking hours and 121.5 moderation hours; we based that estimate on moderation for the previous cohort. Then we calculated how many hours were actually required for moderation post-Panopto: after we completed the moderation online, it was reduced from 121.5 hours to 11.5 hours. I was going to say you can't put a figure on that; you probably could if you wanted to try and cost it, but that has saved a significant amount of staff time that can be freed up for other things: giving students tutorials, which they're always crying out for, and just spending time on other things that are important in the job. We did have some additional resource time: myself, my colleagues, the Learn Plus team and the IT team did need to be on hand during the pilot, but that is something that will be significantly reduced if the system is rolled out. So we went from 229 staff hours to 122 staff hours. We also used the videos for external examination: we ran a very similar process, identified the videos that needed to go to the external examiners, and the external examiners accessed the videos through Learning Central as well. Under the previous system the videos actually went out, I believe, on secure USB sticks to the examiners, and even though I was reassured that they were highly encrypted, if somebody really wanted to access those videos they could. I'm sure they wouldn't, but there's always that possibility, and why take a chance? So the main challenges we
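The savings quoted above can be reconstructed arithmetically. Note one assumption of mine: the talk's "122 staff hours" total is slightly more than marking plus post-pilot moderation (119 hours), which I take to be the additional support time mentioned; that reconciliation is my reading, not a figure from the talk:

```python
# Staff-hours figures quoted in the talk (all in hours).
marking_hours = 107.5       # unchanged by the pilot
moderation_before = 121.5   # every presentation moderated live in the room
moderation_after = 11.5     # only the sampled videos reviewed online

total_before = marking_hours + moderation_before   # 229.0, as quoted
total_after = marking_hours + moderation_after     # 119.0; the quoted 122
                                                   # presumably includes a few
                                                   # hours of support time
moderation_saved = moderation_before - moderation_after  # 110.0 hours

print(total_before, total_after, moderation_saved)
```

Nearly all of the saving comes from the moderation side, which is consistent with the point made above: marking still happens live in each room, and only moderation moved online.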
faced during this time were various technical ones. We had a problem with the audio volume in one particular case, because the microphone ended up being behind the students rather than in front of them. There were a few issues with webcam angles: in one case we had particularly tall students, and we only had them from about here down. Some of the webcams got moved: people would come into the room, move furniture around, and move the webcams because they were in the way. We also, during the pilot, had one member of staff who was a first marker drop out due to illness, but thankfully we'd already thought ahead, and there was a roving marker for these emergencies who stepped in. During the moderation process there was an issue with access to viewing the videos in the module, but it turned out to be a central issue, nothing we could possibly have foreseen, and it has now been resolved. We asked the moderators and the external examiners what they thought, as this has had the biggest impact on them, and we had a 100% response rate for both. In terms of the moderators, 100% of them found the new process quite or very useful, and quite or very easy to use, which is fantastic. Anecdotally, a lot of moderators approached us and said, "This is great, we want to see more of this; it is definitely a useful system, and it's better than what we've been doing before because it saves us so much time," and at this point they hadn't even seen the figures, yet they already knew how much time they were saving. Of the external examiners, three out of four said they preferred the new system; one didn't prefer it, finding it more complicated. However, there was also one external examiner in the survey who admitted they hadn't done any of the training, so I don't know if that was the same person or not; it might be, it might not, but it would be interesting to see what happens going forward. So looking at
future plans: what's next? Before I go on to the future, I'm just going to talk very briefly about our physical learning space upgrades. Since this pilot we've had a significant upgrade to our Panopto facilities in the school: we've gone from having about 15 rooms with core capture to 23 rooms with enhanced capture, so we have 23 dedicated spaces with video and audio feeds that can be used, not purely for assessments, but it will certainly help. This was partly done in conjunction with our Learn Plus team, who, when they saw what we were doing, offered us more equipment, and partly funded by an award that my colleague Bex and I won internally to have some upgrades done. For the future, our education committee in the school has stipulated that all summative assessments must be recorded using the new system; they were very impressed by the amount of moderation time saved, and they want this to be rolled out. Due to the system upgrades, we're going to run a second-phase pilot in the new academic year, so we've got an opportunity to iron out any problems that might occur with the new equipment, and we're also going to be working with the programme leads for each programme, just to see what they need for their specific programmes, because programmes like nursing and physiotherapy are going to have very different requirements. This is anticipated to be fully rolled out by the academic year 2018-19. Unfortunately it will not be me taking this forward: I've thoroughly enjoyed working on the project, but I am currently in my last week in the job; I'm off to pastures new next week, but my colleague Bex and my other colleagues in the university will be taking it forward, and I'm sure they will do a fantastic job. Thank you very much.

Hi, I was wondering, are there any other departments, other than nursing and healthcare, that have been interested in this or have been doing anything similar at Cardiff?
Our School of Law have been doing a very similar process. It's something I only found out quite recently, so I haven't actually had a chance to speak to them about exactly what they've been doing, but I've been told by colleagues that they have a very similar process for some of their assessments. I'm not aware of any other institutions externally, but we would love to hear from anybody who knows of anywhere else that might be doing this.

Okay, thank you. I'm not quite sure how so much time was saved for the moderators, because if the presentations were previously done face to face and are now being recorded, did they not have to moderate the same number of presentations?

No. The number of presentations to be moderated was calculated after the presentations had taken place, the square root of the cohort, and only those students were moderated. The assessments team would identify the students being moderated, and the team would only moderate those in Blackboard. Previously, the moderator would have been in the room at the time of the presentation, and they would have moderated all of them, because they wouldn't have known at that time which students needed to be moderated.

So that was a change in the process, really, not just because of the introduction of the recordings?