So, continuing with the theme of emerging technologies, we're moving swiftly from AI and LLMs to social annotation. We have here, also from Durham, James Udale, and I hope I pronounced it correctly. Yes, you have. And he's going to talk to us about mainstreaming social annotation in arts, humanities and social sciences: an evaluated pilot. Okay, thank you very much. So I don't think this was intended to be a Durham-centric session; I think that just happened by accident. But yeah, I'm James. I work in one of the two teams that Paul was talking about, the Durham Centre for Academic Development. And this session is kind of three things. In a way it's original research, so research that I've undertaken during the course of this pilot. It's part reflective practice. And it's also part critical perspectives on learning technologies. And to intersect that Venn diagram, it's also therapy, because I think sometimes it's good to talk about things that are challenging in institutions. And, you know, it's also good to talk about things that don't necessarily go to plan, because failure is a really, really good and important part of the learning process. But yeah, that's it really. So I'll discuss very briefly what I mean by online collaborative social annotation, first of all, just to make it really clear. What we're talking about here is what it says on the tin: the idea of students or learners being able to collaboratively annotate, highlight and manipulate a document, digital artifact or reading. This is distinct from discussion boards, where you may discuss around the subject; what we're discussing here is an in-situ, collaborative learning process on the artifact itself. This typically happens in an asynchronous fashion, but it can also happen face to face. But it's really quite important to say that this kind of thing isn't new.
This isn't really emerging technology. The idea of annotating a document collaboratively as part of the learning process probably goes back as far as writing itself: as long as there's been writing, there's been annotation. The idea of doing collaborative annotations on a text, pulling it apart in a seminar context, is as old as the hills really. And we think about social constructivism; that's something we think is generally a good idea. When it comes to doing this digitally, there's research going back as far as 1995 which discusses shared digital annotations. And since the mid-90s we've had tools like Adobe Acrobat, which have facilitated the idea of collaboratively annotating a document. So where we are now is that we have a number of big vendors who are designing specific annotation tools. These are typically LTI compliant. A number of those are on the market, such as Talis Elevate, Perusall and FeedbackFruits, amongst many more. Just to calibrate as well, to get a sense here: is anybody here from Talis before I continue talking? Okay, that's fine. They might have been here this morning. Well, you know, they can correct me later on if anything I say is a misrepresentation. So the idea is that there are all sorts of tools now that do this in a bespoke way, and there's also lots of vendor-supported literature. I'm not saying that as a criticism; I'm saying it's worth being aware that some of the literature around these tools has been supported or sponsored by vendors. But there is also plenty of research around these kinds of tools, about how they can improve reading comprehension, how they can perhaps improve metacognition, and various other things in an HE context. So, given all of that, Durham, being a bit slow on the uptake, really started discussing this kind of thing only in 2020, when the pandemic happened.
That's because the world changed in a big way, and departments who traditionally ran seminars had to think of different ways to do them online. While many departments decided to go for your typical online synchronous seminar via Zoom, certain colleagues in Classics and in Archaeology thought, let's try to do this asynchronously online, and they discovered Perusall, an annotation tool that claims to be free to use for educators. Colleagues found that generally these tools went down really well. Students used them an awful lot; there was more engagement than with discussion boards. And when I say engagement, I just mean participation. I know there are many more dimensions to that particular term, and it's a bit problematic. But they found that while there were many things about online education they were happy to ditch, this was actually something they wanted to retain. So, just to let you know: Perusall is an American company founded by Eric Mazur and partners at Harvard University. It claims to be free for educators and students, with the sole goal of making education better. It claims to be independent. And I'd say at this point it's quite highly featured and a fairly mature service. But it's got some quite interesting terms of use when you drill down into them, because the way these tools work is that you upload materials into them and you essentially give the people who own the tools the right to use that material as much as they want. Also, if you don't own the copyright of that material to begin with, that can be quite problematic. One of the things about Perusall, though, is that it requires students to sign up, to make an account and give their data to the vendor as part of the process. But it does offer an LTI integration. So our colleagues in Classics and Archaeology thought, let's request that our IT services integrate this with our VLE, Learn Ultra, which makes perfect sense.
Let's get over that hump that's stopping students engaging and perhaps causing a bit of a problem. I'm going to use this several times, and I'm also going to attribute it appropriately to CBS Studios. But what happened is a year went by and nothing happened, because, you know, universities are busy, IT services have many calls on their time. There are lots of things that stop people from engaging and doing the things that are asked of them. And in general, nobody wanted to hold the baby. No one really gave a crap about this, so it just sat there for a year. But under growing pressure, and with departments who didn't want to go away, IT asked myself on the academic development side, and also the library, to look at the tools Durham already has and find something else to provide this kind of functionality for students, to basically make these people go away, essentially. So I was involved in that analysis, where we looked at various tools. We looked at discussion boards and Class Conversations in Learn Ultra, which again is not in situ; it's not context driven, it's around the subject. We looked at our reading list software, Talis Aspire, which lets you surface readings in a copyright-adhering way, but you get no collaboration around that. We looked at video comments and discussion in Panopto, which is quite good for video annotation, but maybe not fully all the way there. And we also looked at how you can annotate documents in Word and in Adobe Acrobat. But while these tools offered some of the features we wanted, there wasn't the complete package. There weren't the analytics; there wasn't the right kind of integration with the VLE. And things like sharing permissions are really quite difficult on Adobe Acrobat documents when you've got a whole cohort of students.
And there's also the old chestnut of copyright, and finding a way to surface readings in a copyright-adhering way. So at this point our library said, well, look, we use Talis already for our reading lists; why don't we use that existing relationship to have a look at their annotation tool, Talis Elevate, which as far as I could see at the time was fairly nascent. It did annotation, but not much else, though it was heavily recommended by the librarians. Equally, this is a tool that you upload materials into directly, despite the fact that they have reading list software. It's a completely separate tool that you've got to load materials into, which didn't make any sense to me. And on copyright, they've got some web pages, but they generally say, talk to your subject librarians, which is a bit crap really, but that's essentially what they do. So, given that nobody wanted to hold the baby, I thought, what could my department do, given that we're academic development? We have these pockets of funding called Collaborative Innovation Grants, where we can basically offer academic staff £5,000 to do a small research project to look at something in depth, obviously from a pedagogic point of view. The projects must be innovative for Durham (and that's quite a lot of scope, by the way), must be evaluated with some kind of dissemination at the end, must involve academics and students, and must have the potential for the idea to be mainstreamed. And we thought, surely this annotation thing, if it actually has some legs, ticks all those boxes. So I crafted this wonderful bid, this wonderful research project, with colleagues from Archaeology and Classics. We were going to get a research assistant to do the grinding of the data for us. We were going to do educator interviews, have a student-facing questionnaire, and look at the analytics data.
It looked like it'd be a really tight project that would look at both Perusall and Talis Elevate across four modules, so we could compare the two. It's worth saying that we asked both vendors to come and talk to us; Perusall didn't show up at all, while Talis obviously wanted to sell their tool to us. So it was all looking good so far. We had project approval, we were going to get the money to pay for the licenses, it was all going fine. Oh no, my counter's gone up one, because at this point our IT services looked at both tools under the university's cybersecurity and data security office, and they said that basically Perusall is not fit for use for any university business. I can't really tell you why, but that's what they decided. They then decided to announce this, halfway through the academic year, to academic staff. So, as you can imagine: you're teaching and using Perusall, and you're told you can't do this anymore, which caused a wave of consternation and also unearthed many more Perusall users that we weren't aware of. So we had a real problem here. We had a project that had fallen through, we had a lot of angry people, and we were halfway through an academic year. At which point my knight in shining armor rocked up, which was a new governance committee called the Digital Learning and Teaching Group, which was meant to support innovative digital learning and teaching projects. So I thought, right, let's do an open pilot of Talis Elevate. We've got the approval to use it already; let's get the money, let's do a different, somewhat tacked-on evaluation, but let's do it anyway, across the whole year. We got the funding in April and we started in September, with a view to the final report being delivered in May. We got ethics approval shortly after that. The idea was that we'd have an open questionnaire submitted to all 26 academics involved in the pilot and collate the results. We'd also look at the VLE data to try and plot usage across the year.
The data was coded in March and the final report was composed in April. What did we find? We found that, roughly, academics found the primary benefit of using Talis Elevate was having the LTI integration, which seems really simple, but when you've been using Perusall for a long time and students have had to sign up, it gets over that hurdle, although there were issues with third-party cookies. We found that academic practice varied significantly. Some people just bunged their readings up into Talis Elevate and thought, job done. Equally, we found people who spent a lot of time developing specific activities, designing in-context structured prompts into documents. And, surprise, surprise, we found that those activities had much more student engagement, much richer responses and generally more satisfied academics. We also found the majority of student contributions were anonymous, which was slightly problematic because certain academics used this as the basis of an assessment, and there is no way in Talis Elevate to turn off anonymity for an activity. You've got to have some provision for that within the tool. In general we found variance in practice. Typically it was used for critical and social reading of set readings, but also, as I said, for some assessments and seminar replacements, for artwork and visual arts critique, which was quite interesting, for cinematography critique, and for fully online learning activities; but mainly it was the readings it was used for. Some academics said this was fantastic because, having spent the time to build the prompts, they found students were actually engaging with the materials well before the course started, which gave them confidence that the students were looking at the readings in the right way, and that their prompts were helping students to look at the materials in a way they were happy with.
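Just to make that anonymity gap concrete, here is a minimal sketch in Python of the kind of per-activity override the pilot found missing. This is a hypothetical data model of my own, not Talis Elevate's (all the class and field names are invented for illustration); the point is simply that an activity-level setting should be able to override the course-wide default, for instance for assessed work.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional


class Anonymity(Enum):
    ALLOWED = "allowed"      # students may choose to post anonymously
    DISABLED = "disabled"    # names always shown, e.g. for assessed tasks


@dataclass
class Activity:
    title: str
    # None means "inherit the course-wide default"
    anonymity_override: Optional[Anonymity] = None


@dataclass
class Course:
    default_anonymity: Anonymity = Anonymity.ALLOWED
    activities: List[Activity] = field(default_factory=list)

    def anonymity_for(self, activity: Activity) -> Anonymity:
        """Resolve the effective setting: an activity-level override wins."""
        return activity.anonymity_override or self.default_anonymity


course = Course()
reading = Activity("Week 1 reading")
assessed = Activity("Assessed annotation task",
                    anonymity_override=Anonymity.DISABLED)
course.activities += [reading, assessed]

print(course.anonymity_for(reading).value)   # -> allowed (course default)
print(course.anonymity_for(assessed).value)  # -> disabled (override)
```

With a provision like this, an academic running an assessed annotation task could simply switch that one activity to named contributions without changing how anonymity works anywhere else in the module.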
But we found in general that academics really thought this was a very poor replacement for Perusall. They felt they couldn't do things Perusall could do. You can't do in-built assessment. You can't add grades. It can't pull through the group data via LTI, which is a pain in the backside if you've got seminar groups and a massive cohort. Really, I think that in general it maybe did the job for them, but it wasn't quite what they had with Perusall, and they were a bit grumpy about having that taken away from them. When I performed the accessibility review, I found that even if you design the most perfectly accessible PDF in the world, with alternative text and structured headings, the second you put it into Talis Elevate it removes all that data and completely scrubs it out, which is really quite appalling. Equally, when you upload images, you can't add alternative text to them, which is, again, really crap given where we are at this time, five years on from the regulations. We found the zoom feature was limited, and equally that for things like pin-drop annotations and drawings, students couldn't add alternative text, which is a common problem with those kinds of interactions. Again, we asked Talis: why do these two tools, the reading list software and this annotation software, not integrate? Why don't they talk to each other, why don't they work together? Why can't Talis Elevate wrap around readings that have been surfaced in a copyright-adhering way? Why can't the annotations and structured prompts that people spend a long time building be rolled over when their modules are rolled over; why are they completely lost? Why can't comments include more than just plain text, if people want to include, say, clickable hyperlinks? Again, why can't the grouping information be pulled through from the VLE? Why can't students add alt text to drawings? And why can't you selectively turn off anonymity for certain things?
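For anyone wanting to reproduce that PDF finding: whether a PDF keeps its headings and alt text for screen readers comes down to whether its document catalog is flagged as tagged (a /MarkInfo dictionary with /Marked set true) and carries a structure tree (/StructTreeRoot), per the PDF specification. A rough sketch of the check, operating on a plain dict standing in for the catalog a PDF library would hand you (the function name and the dict stand-ins are mine, for illustration):

```python
def is_tagged_pdf(catalog: dict) -> bool:
    """True if the document catalog declares a tagged (structured) PDF,
    i.e. one whose headings and alt text survive for screen readers."""
    mark_info = catalog.get("/MarkInfo", {})
    return bool(mark_info.get("/Marked")) and "/StructTreeRoot" in catalog


# A carefully authored PDF: marked as tagged, with a structure tree.
before_upload = {
    "/MarkInfo": {"/Marked": True},
    "/StructTreeRoot": {"/Type": "/StructTreeRoot"},
}

# What we observed coming back out of the tool: structure scrubbed out.
after_upload = {}

print(is_tagged_pdf(before_upload))  # True
print(is_tagged_pdf(after_upload))   # False
```

Running the same document through a check like this before and after upload is exactly the kind of tyre-kicking an accessibility review can automate, rather than taking a vendor statement at face value.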
I guess they're working towards some of those things, but they have no answers to some of the bigger questions, I think. Broadly, I conclude from this particular evaluation that the practice is nascent but growing at the university. We know there are issues of growing student numbers, a very, very small city, and the need to do more online provision. The aim of having more active learning in classrooms is part of our strategy refresh. So doing more of this stuff, online asynchronous collaborative annotation, makes a lot of sense, especially given that we're seeing a snapback from the pandemic where the VLE is being used in a very repository kind of way. It's good to think about the VLE as more of an active learning platform, and this kind of stuff can really support that. So despite the flaws in Talis Elevate, and certainly there are loads of flaws in there, the report recommended that we should support those academics who want to do this by licensing it on a limited basis, on the agreement that Talis addresses some of the accessibility issues we're worried about and works towards some of the key features. But obviously we'd also keep an eye on what's coming, because we've got Learn Ultra, we've got Microsoft 365, and those kinds of rolling SaaS services have new features coming through all the time, which may give us a solution. So at this point everyone lived happily ever after? I don't think they did, because I found out that the governance group that was meant to be reviewing the paper was dissolved shortly before the paper was meant to be received. So I had to write a whole new report for the University Teaching and Learning Committee to go up to Education Committee, which is quite high up the chain.
So that was an awful lot of stress, but we got the recommendations endorsed. Things were looking pretty good at this point, having gone off the rails completely. So yeah, everyone lived happily ever after? But they didn't, because at this point we found out the software budget had been cut without telling us, which meant that despite the recommendations, despite the endorsement of the Education Committee at the highest level, we had no money to pay for the thing. So what I would say in summary is that this kind of practice of online collaborative annotation is really valuable. It's got obvious potential for mainstream use in both blended and online education. It's really a chance to emphasise active learning, as I mentioned, and there's a lot in the literature around metacognition, about students developing more structured ways to read. I think there's a lot there that has benefit across different disciplines, not just arts, humanities and social sciences. I would, though, throw a few punches at the vendors behind these tools, because the current crop of tools seem to be really contingent on educators making informed decisions about copyright, or they just farm the big questions off to subject librarians, which I think is a bit crap. And I think it's absolutely ridiculous for a vendor to develop a tool which is accessible as far as the actual framework of the tool goes, but which completely ignores the accessibility of user content and features. As a sector we've got to take a hard line on this and really put our money where our mouth is and push vendors really hard to make sure they don't ignore this, because it is just really crap. And lastly, you know, the terms of service on these tools are really problematic.
The vendors claim to be benign; legally they say, well, you know, you need to give us the right to use your materials in our tools. But I think it's an overreach: an unlimited right to use your lecture slides potentially goes a lot further than just showing them in one activity. And I think our copyright people in the university had problems with that. They raised it as a flag, but they were suitably, I guess, placated by the vendor throwing lots of documents at them, which I still didn't really think answered the core question. And universities like Durham are large institutions to work in; it can be very difficult to get things done. As I found, it's great to get senior management backing for your projects, but when you see a lack of willingness to engage and hold the baby, and gatekeepers who don't really want to look at something in the first place, that's probably a red flag when you undertake these kinds of evaluations over the course of a year. And lastly, a quick dig at no one in particular: I'd say that if you do care about innovation, whatever that looks like, you've got to support the people who want to do it, because ultimately it's educators, and then students, who have a bad experience as a result of it not being supported. Yeah, okay, those are just my overall thoughts on the entire process. And the last thing to say, actually: while this was going on, we have an undergraduate third-year software development module in computer science that asks every year for client briefs for their group projects. And I gave them the brief of an LTI 1.3 compliant annotation tool with copyright adherence and accessibility at its heart. And I had two project groups working on this this year. One was a browser extension, which was an interesting way of doing the work.
The other was an LTI tool, which was web based, which I would say is probably almost usable and in some ways superior to Talis Elevate. Not that we'd ever be able to use it in production, because, well, that's the way the world is. But it was a good process to go through, and it restored a bit of my faith after a bit of a difficult year. And that's it, that's all I've got to say. Thank you for listening. Thank you, James, for keeping to time. So do we have any questions for James? I'm going to give you the mic. Cheers. Yeah, that's some really good research there. I wish we'd had that work, because we've got Talis Elevate at our institution, and I wasn't aware of the extent of the inaccessibility. I was aware of the copyright issues, but not the inaccessibility, so that's quite interesting. Where are you now, then? Obviously there's a project there for some development, the browser extension and the LTI integration, but is there a prospect of Talis being used? Are you looking at other tools at the moment? So, basically, the funding was cut, so we have no money to buy the actual tool; that's not going to happen. Academics have been offered the chance to work with me, essentially, as a follow-on, to try and do the practice in a different way with the current tools that we've got, despite the fact that it doesn't quite all match up. So that's now part of my workload. I think what's probably happened is a lot of academics have said, you know, you've told us we can't use Perusall, you've spent a year wasting our time on this pilot that's gone nowhere; I'm going to go back to Perusall, sod you, sod the centre. That's kind of where we are.
So we're trying to support those who want to do it with the tools that we've got, but I imagine there's a large amount of attrition of goodwill, because that's the way things are, I'm afraid. Frank and honest, but that's what it is. Another facepalm there. James, is there any way that some of that student work could be sent back to Talis, to help their development team, and find some quid pro quo? Clearly they don't have the development time to commit to this; they'll be prioritising somewhere else. But use your expertise and your students' time to try and nudge them along a little bit. Coming from a software development company, sometimes a little bit of that can buy you some goodwill. Yeah, I mean, obviously the students' work is their own and the source code is their own, but certainly the things that those tools have surfaced I'm happy to share as examples with Talis. And obviously the people I worked with on the pilot at Talis were really quite good and engaged in hearing feedback, but whether they act upon it is another question, and, as you say, there may be issues with resource within the company to actually make these kinds of things happen and expedite them, and I understand that. But at the same time, yeah, Talis have certainly received all of the recommendations from everything that we've done; whether they act upon it, whether they can act upon it, or whether they're interested in doing so, is up to them really. But yeah, it was definitely good work by the students. Like I said, I would be nervous about letting Talis see too much of what the students have developed, for reasons, but it was a good project, and I think next year we might do this again, given that it's obviously been useful for them as well.
I mean, certainly software development students don't tend to think an awful lot about accessibility. They have to to some extent, but to the level we were requiring, you must upload a PDF that can be fully screen-readable and keep semantic data like headings, that was a challenge for them; it wasn't an easy thing to do. But we set that out from the outset and they did it. Any other questions, comments, follow-ups? Just on that: do you know if other universities are trying this in a similar way, so that you could use some kind of consensus to, again, influence Talis or other companies? Perhaps they don't know there is a significant problem; perhaps you've gone further down this than they're aware of or would really want to commit to. But if they see that there are a number of other institutions doing the same, it might really help it rise up their priority list. Yeah, possibly. I mean, obviously it depends on individual universities' focus and essentially what they want. And if you've licensed a tool already, well, I'm looking at you: are there things that you feed back to Talis? It's the same response: you know, the library can handle it, they'll give you some advice. And our access to it is linked in with SAGE, and that creates a problem, because it's seen as an overreach by SAGE to try and get their subscription continued by getting their product used by our teaching staff. So there's a political thing as well.
The thing I forgot to mention was around digitization. The university has a digitization process, but that's not compatible with the one that Talis Elevate uses, so we have an inherent kind of disharmony there, which obviously causes a bit of friction as well. But yeah, I'm happy to share what I've done with both universities if they want to take this further. As far as my work on this goes, I probably won't be looking at an awful lot more unless a magic money tree happens to appear; there's not much more scope to push this further, beyond obviously helping academics who want to try and do this kind of stuff with the tools that we have, which maybe don't quite go all the way that a full annotation tool does. My question to you, James, following up on that comment that we as a sector need to put some pressure on: what would be the mechanism by which you think we should do that? I don't think we should be licensing stuff without doing a full accessibility review, and I know that should be part of what we do now anyway, because, again, we're five years on from the actual regulations. But how many universities do that? How many universities really kick the tires on accessibility, beyond just looking at the statement and saying, okay, they've got a VPAT, and actually going through it, looking at it from a usability point of view, actually testing some of the claims? Maybe it's resource, I don't know. But I don't think we should be paying money for things without assurances that they're going to come up to scratch, because what's the point of telling academic staff that they must make their materials accessible when you're giving them a tool that they can't possibly make things accessible with?
So, you know, I think certainly money talks; that would be my pragmatic opinion. Oh, yes. A point on that, just an observation: the Austrian government are actually doing that. They're pushing back quite hard, using consultants to do really detailed accessibility reviews, and saying, this is what you have to complete now in order for us to do business with you. So other governments could follow suit, I guess. Yeah, definitely, and I think they probably should. I mean, really they should have been doing it five years ago, but that's the way things are. And the other thing I was thinking is open source. So do you encourage your students to publish that, so that organically, you know, with proper licenses and so on, Creative Commons or whatever, it gets improved, and then we don't need to pay tons of money for something that can be adapted by institutions? I don't know, maybe I'm too naive here. No, that's a really great idea. I mean, I certainly think our IT department wouldn't be too fond of it, because it'd be one more tool to support that's not being supported by a third party. They obviously want software as a service; they'd like things to be outsourced. I think the idea of supporting, updating and patching something themselves is really not attractive. But I think, yes, open source possibly will be the way to go. Yeah, definitely. A last question from anyone, or a comment? A massive thanks for your presentation, James. Well, thank you very much.