Start the recording. Good morning, everybody. Good morning. My name's Chris Morrison, and my name's Jane Secker, and this is number 68 of the webinars Copyright and Online Learning at a Time of Uncertainty. Absolutely. Do we need to rebrand on that? We might do. I'm not sure. I think, as we've been saying, the sun is shining today and spring is coming, so it's a time of optimism and hope, I would suggest. Maybe. We'll see how the year goes. So we're looking forward to a really great presentation today. I will re-share the slides; I deliberately turned them off so that we wouldn't end the recording talking to just a black screen. Or the title screen. And everyone can see our lovely periodic table. Yes, we're wearing our periodic table t-shirts. You've just got a new one. I have; it's nice and new and fresh. Mine's getting a bit faded. It happens. T-shirts are available from copyrightliteracy.org. There is, obviously, a merch site: you can get tea towels, aprons and all those essential things with our copyright periodic table on them. We may have an ethical discussion later on about whether or not it's ethical to use this as a platform to sell merch. Shall I put that in the chat? You can. You're under no obligation to buy, and other t-shirt stores are available. So what have we got today? We've got some copyright news, but the main event is our guests Natalie Lafferty and Sharon Flynn joining us to talk about the Association for Learning Technology's ethical framework: the FELT framework for ethical learning technology, which we're really looking forward to. So thanks to them for coming along today.
And we'll be talking about what we've got coming up next. We've got some webinars lined up, some possible topics as well, and we are looking for some speakers, aren't we? So hang around at the end, please, because we definitely want to share some of the plans for this year with you. Absolutely. Since we last met... Yes, I was preaching a sermon at the University of Oxford last week, wasn't I? Those of you who attended Icepops in 2022 will know that we had a last-minute room change to the Catholic chaplaincy in Oxford, and they were very welcoming to us. They were. And we were returning there. It was an interesting effort to turn that into a playful space back when we were at Icepops. So once again Cardinal Newman was looking down upon us. But you were preaching about information literacy. I was, yeah; my favourite subject to give a sermon on. So it was great. We got a little bit of copyright in there, but it was mostly information literacy, because there's a whole programme of work happening at Oxford. It's really exciting, actually: they're looking to develop a digital and information literacy framework, which copyright will be part of. Yes. The famously playful figure of Thomas More. Yes, Chris. So yeah, that was all very good. We're thinking frameworks today; today is clearly Framework Friday. Definitely. I love a framework; I've spent lots of time talking about frameworks. We have the archive of all of these webinar recordings and the slides from the previous 67 webinars. I'll just pop a couple of those links in the chat for people, but I think everyone knows where these are. So it's copyright news time. What's been going on in the world of copyright? We haven't had a webinar for a month or so. It's been a while. Well, I thought we ought to celebrate.
It's International Women's Day today, and there are some fantastically amazing women in the field of copyright and digital education. What I wanted to flag up were some of the amazing women we've spoken to on our podcast in the last couple of years. So: Emily Drabinski, who is the current American Library Association president. She's rocking the ALA presidency. She is. You can hear us chat to her, and you can hear the little walk-up music Chris made for her. Caroline Ball as well, at the University of Derby; we had a great conversation with her a couple of years ago about her work on copyright and fan fiction. We had Ignasi Labastida, who's at the University of Barcelona, and we talked about open access; and I think you got a sort of famous Spanish guitar player to soundtrack that one? Well, it may or may not have been Enrique Iglesias. And the first guest podcast we ever did was Eleonora Rosati. That was really great. And we don't have it out yet, but there is an upcoming interview we've already recorded with Professor Carys Craig, who's over in Canada, and her research is specifically around critical and feminist approaches to copyright law. So that would have been a good one for today if we'd actually got it out and edited it. And in fact her answer to the question "what's your favourite sweet treat?" is probably one of the best answers we've ever had. I'm not going to give it away; it depends on the editing. But anyway, that's all to come. It'll come, like all good things, like the spring. There'll suddenly be a flourish of copyright literacy, like blossom. It'll be wonderful. So the next item: we talked about the ALT CoOL SIG, the Copyright and Online Learning Special Interest Group.
And our time as chairs was up. Yes, we've done three years' service. So we asked whether there was anyone, or any group of people, interested in taking over, and we can now reveal that the new ALT CoOL SIG co-chairs are the same as the old ALT CoOL SIG co-chairs, in the same t-shirts. The way it worked out is that nobody else put themselves forward, so we agreed to continue on that basis. We've got an upcoming committee meeting, and we intend to continue with the webinars. We're carrying on, but not just doing exactly the same as what we've done. And we've got a couple of new people who've come forward to join our committee as well, which we're really excited about. We're looking forward to fresh ideas; we're not going to get stale. Definitely not. So watch this space, really, I guess. Oh, this is an exciting one. This is me, isn't it? Information Literacy and AI. This is proving to be a very popular event: a panel discussion taking place online on the 20th of March at lunchtime, so one o'clock. There are a few tickets available, and we'll probably make some more available because it is proving so popular. We're going to be talking about information literacy and artificial intelligence, and I'm chairing it, so this is a shameless plug from me with my Information Literacy Group hat on. There are some great speakers talking about AI in all different contexts and how it relates specifically to information literacy, so do have a look at that. I've got an inkling that this topic might come up later in the conversation today. I think it may well do. Yes. And this is to let you know that there is an upcoming webinar on controlled digital lending. This is something available to SCONUL members that we will be part of.
And in fact we will be talking about a briefing that we have written for SCONUL and Research Libraries UK. So this is going to be a closed meeting. And Jisc. And Jisc as well; in fact Jisc has been instrumental in commissioning and supporting this work. So this is something I'm sure will be of interest to many people. I'm assuming the briefing paper will probably be going out ahead of the webinar, though I don't know whether that's the actual launch of it. If you're interested in this topic, we will be sharing things on the various lists as well, like LIS-Copyseek. And if you're not already a member of the CDL JiscMail list and you are interested in controlled digital lending, I would recommend joining that. Absolutely. Are you going to pop the link to that one in? I can do that. Okay, I'll go on to our next item. Next item: just a reminder that coming up in May is the CILIP Copyright Conference. It's held online on Zoom, and it's again got the theme of copyright, AI and ethics, so there's an overlap with some of the topics we're talking about today. And there's a great line-up of speakers: I think Matt Lambert is speaking from the British Library, and I believe Caroline Ball is speaking as well, along with lots of other people. So do have a look at that; it looks like it's going to be a promising event. And the next item was just a news item I picked up, I think off Matt Voigt, who is still doing his fantastic copyright news from an international perspective. This is research by COMMUNIA, who do lots of research on the public domain, Creative Commons and all those sorts of areas. It's looking at researchers' perspectives on working with copyright across Europe, and I think it's based on a survey of things like how copyright exceptions work and what impact they have on the work of researchers.
I'm interested in this just because we've been working quite frantically on analysing our copyright anxiety data, haven't we? And it's causing a lot of anxiety trying to make sense of it. We're going to get it sorted today; we're going to get the post-it notes out later. We are. But I think there are some really interesting findings in this research that I'm sure we will want to reflect on as we write up our own findings as well. And just to remind you, COMMUNIA is an organization that came out of the work around Europeana, opening up collections within Europe and addressing some of those copyright issues. So I think that might be it. I think we're ready to introduce our special guests. Yes, absolutely. We're really, really pleased that Natalie Lafferty from the University of Dundee and Sharon Flynn from the Technological Higher Education Association are coming to talk to us about the work that they, with colleagues, have done on the ethical framework for learning technology, which has been a stream of the Association for Learning Technology's work. One of the really good things about creating a special interest group as part of ALT is that we see the great work that's going on elsewhere and we're able to link things together, or at least we hope we are. I think there are some clear links, and we've had it in our minds for some time, between copyright, how we deal with copyright law and all the questions that come up within the copyright specialist community, and the bigger, broader questions about ethics, law and the use and implementation of new technologies. So I don't think we really want to say much more. We had a chat with you at the start, but hopefully everything's working okay, and we will get your slides up to share. Hi Natalie. Hello. Good morning. Hello Sharon. Hi, I'm here. Hi, we can hear you both perfectly. Great. Okay, so your slides are up.
Hopefully they're still working for you to advance when you need to, and we'll hand it over to you. Yeah, so take it away. Thanks very much, Jane and Chris, and hello everyone. It's great to be with you all today. We are going to be looking at the ALT framework for ethical learning technology, and to start off I'm going to talk through a little of the rationale for developing it. I will then pass on to Sharon, who will talk a little more about how we're looking to apply it, how we're linking it into accreditation schemes like CMALT, and the potential to develop case studies. Then we'll maybe think about questions you might have, and also some points for discussion, so we can unpick some of this a little more. So, to kick things off, I think it's fair to say that our perceptions of, and perspectives on, technology have been evolving. Sometimes we talk about looking back in time through rose-tinted spectacles, and certainly my perspective on technology has gone through various permutations.
We started off probably quite optimistically back in, I don't know, the early 2000s, and as we moved into 2010, '11, '12 and onwards there were maybe a few concerns about some of the unintended consequences of technology, and we began to see some of the social impacts of technology in digital society, moral responsibilities, and some of the ethical dilemmas. I'm sure some of you may be aware of the book by Andrew Keen called The Cult of the Amateur, which I think was around 2007-2008, and his concern that, while the web and digital were very much democratising information, did that also come with some risks? We see now that people can pretty much write what they want, and we've got fake news and all the rest of it, so we have these unintended consequences. Even back in 2016 Spector proposed that we might need some kind of "educratic oath" in education: just as medics have the Hippocratic oath, do we need something similar in education? He came up with a model, a values hierarchy really, for learning environments, and if you look at the little steps in this hierarchy you can begin to see that thinking about learning environments and their accessibility links very much into concepts like universal design for learning, which came to the fore during COVID, and also the accessibility legislation which came into play across Europe in 2019. As we move up the hierarchy, thinking about sustainability and making sure that no harm is done, that also links into things like learning data ethics, GDPR and privacy, and some very high-profile cases recently, particularly at the University of Bristol, looking at what our responsibility is in terms of how we deal with learning data and analytics. I think ALT recognised very much the same challenges and issues emerging around technology, and so it decided to start a project in 2020 to develop
this ethical framework for using learning technology. So that work started, and Sharon and I had the privilege of co-leading that piece of work together with Bella Abrams, who was also a trustee at the time; she's the head of IT at Sheffield University. We had a number of open members' meetings, we met with the ALT Assembly, and we also invited members of the community to form a working group. I think Bella came to a meeting suggesting we start by looking at some of the research ethics frameworks, so we took those and some other models of ethics frameworks and began to discuss them, ruminate over them, and slowly draft our own first set of ethical principles. They were ready by about January 2021, when they went out for broader consultation, and we refined them a little more; then there was an open consultation from May to June 2021. We had 165 responses to that, not just from the UK but globally, which was fantastic to see. That feedback was distilled, and we were delighted to be able to launch the ethical framework in December 2021. Thanks, Sharon, for sharing the link. This is the framework, and you can see there are four areas that we think you need to think about when using digital in learning and education. The first relates to awareness, and for me the key is that whole thing of being reflective practitioners. I think that applies to all of us, whether we're librarians, lecturers or learning technologists, whatever our role: reflecting on our practice and thinking critically, being aware of our own limitations in terms of our knowledge, but also maybe our own unconscious bias, and having that awareness of the impact of digital on the ways that we practice, the things that we develop, and the services that we design and deliver to the stakeholders that we
work with, whether that's students, staff, or members of the public. Then there's the whole area of care and community: practising care of ourselves and others. Again, central to this, and I think we see it here in this SIG and in many of the groups that run through ALT, and ALT as a whole, is that sense of collegiality, collaboration and mutual understanding, wanting to learn from and support each other, but also recognising that we can influence beyond our own institutions and our own teams. That's the beauty of groups like this: coming together to share our experiences and expertise and disseminate good practice. Then there's professionalism, and coming from a medical education background, for me that's very much about role modelling and demonstrating accountable, evidence-led practice. It's that commitment to ongoing professional development, acting with integrity and honesty, and that awareness of complying with law and legislation. Copyright is the prime example: I'm always astounded by how many of my academic colleagues are completely oblivious to copyright law, and when we do audits of the VLE we find a sort of little shop of horrors hidden away in modules. So we're trying to make sure that we're aware of what our legal responsibilities are, and also being advocates for those ethical approaches. And then obviously values. Here again it's that sense of inclusion: being aware that digital poverty is a thing, that people with disabilities do have challenges using some of our systems, and so making sure that we're inclusive, that we make education open and accessible to all, and that we're accountable and transparent in the way that we work. So that is the summary of the framework, and I'm now going to pass on to Sharon. Thank you so much, Natalie.
Yeah, it's been a really interesting journey, I suppose. As Natalie said, we kicked off in 2020, and the framework was launched quite soon after the consultation. As Natalie indicated, a lot of thought went into putting that framework together and organising it, and the launch was really exciting. We got a lot of comments back, and we said, well, we'll have to see how this works in practice, and maybe in a little while we can come back and revisit it. But you put so much effort into getting the original document and framework together, and the beautiful graphics and so on. I'm just getting a notice that my laptop is low on juice, so I'm just going to make sure that we're okay. The next stage was really to look at embedding the framework, and to make sure that people could actually start using it in a meaningful way.
So Maren Deepwell within ALT had, I suppose, already anticipated some of this, and ethical considerations had already started to be embedded in the CMALT framework from 2019, before we even began talking about the ethical framework; there were already some prompts and, I suppose, requirements within CMALT to start thinking about ethical considerations. Once we launched the framework itself, it was very important to start actually embedding it into people's work, so a number of resources began to be developed. The first set of resources was the self-assessment templates, which you can find at the link that I posted earlier. There are two templates available. There's an individual template, because obviously we want to try to be ethical in what we do as individuals, as professionals, within our own job and our own place of work. But it's also very important to be able to come together as a team, particularly when you start talking about values and having the shared values of a team; it can be quite a useful thing to come together as a team and ask: are we being ethical in our work, are we taking these ethical considerations into our practice as a group? So both of those templates are quite important, and I'm going to speak a little more about them. The self-assessment template, or tool, can be used in a number of different ways. You could use it, for example, as a lens to reflect on a particular piece of work that you are focusing on; you could use it for a piece of work that you're about to launch into, perhaps as part of a project; or for a new tool or a new platform that you want to focus on. That flexibility was built into the templates, and I'll talk a little about the self-assessment in
just a moment. The two tools, the one for groups and the one for individuals, are pretty much the same thing; it's just that you complete one as an individual or the other as a team. The templates are available to download from the website, they are openly licensed and free to use, and they are also now mapped to the CMALT framework. There is, I understand, a microcredential, and I will come to a slide which says that you can go and fill out a form, but I haven't been able to find the online form, so I'm not sure it's fully available just at the moment; it's something I'm going to check on after today. The first step in the self-assessment (this is the individual self-assessment form) is to think about what the focus is. What we're doing with this form is using the self-assessment and the framework as a lens to look at a particular aspect of your work. As I said, it could be a particular tool or platform that you're using, launching, or recommending to students or to staff; it could be while you're developing or reviewing a particular policy or process; it might be a particular project that you're about to launch into, something you could do alongside the other aspects of project management; it could be a particular aspect of your own work; or it could be something else, so it's quite open. You would select that first and then begin the self-assessment. There are two parts to the self-assessment. Part A is where you reflect: essentially it takes the four areas that Natalie spoke about a few minutes ago, awareness, professionalism, values, and care and community, and you look at the various bullet points within each of those areas and reflect: does this apply, have we talked
about this, is this something that is of value to us, and so on, and basically fill in the boxes on the form. You are also encouraged to reflect on any barriers: it could be, for example, that you can't do things in a certain way because of an institutional policy or practice; there may be structural barriers, resource barriers, operational constraints and so on, and again it's an opportunity to reflect on that. There's a very simple scoring mechanism. Part B then takes that scoring mechanism, allows you to take a summary of your scores, and encourages you to think about focusing on some of those areas, on particular current practice and areas for development, so you can see how that might fit into the CMALT framework. Now, this is the slide where I would have said you can fill in the online self-assessment form, but I'm not actually sure it's available; I'm going to check on that after today. And there is an open badge from ALT which recognises not that you are an ethical superstar, but rather that you have done the self-assessment. The badge, and indeed elements of your own self-assessment (you don't have to share the self-assessment itself), can be used as evidence in your CMALT portfolio. So that is one area of adopting the framework, and one way you might be interested, after today, to take a look at the form and think about how you might apply it in your own work, either as an individual or as a group. Beyond that, there is now an ALT award for case studies of ethical ed tech, and we are looking to collect more examples of case studies from all areas across higher and further education, in terms of case studies and policies from individuals and institutions. So specifically we're looking for example policies from institutions, which would ideally be openly licensed, and case studies of professional practice from different contexts,
case studies from institutions, and we're also looking for case studies from vendors, though that's not as applicable today. We're also very interested in how the framework has been used and adopted, and what people are doing with it. Within the context of generative AI, this was a theme at the ALT Winter Summit; thank you, Natalie, for adding the playlist there. There were some really interesting talks, so if you weren't at it, or you haven't seen any of it, do go and have a look at the playlist. I'd certainly encourage you to do that. So, Natalie, that's my bit over; do you want to move on? Yeah, thanks, Sharon. In the context of what we've presented so far, I wonder whether the audience might want to think about how the framework might apply to their own practice, and maybe just put that in the chat, to see what kind of things have been provoked so far. We've got a few slides with some statements that might identify a few areas, but we're really keen to hear from you: where do you think the ethical framework is relevant to your own practice? That's interesting, Jane, so I'm just going to quickly move this. I was wondering what Chris was trying to say. Yeah, sorry about that; Chris is using another laptop, and something went weird when he was trying to type something in. We were just wondering: I know that quite a lot of people have been asked to think a bit, and I'm thinking specifically in a library context, about AI tools that you might use for helping you do things like literature reviews. Is one of them called Elicit or something like that? It will summarize papers. Yeah, Elicit, that's the one. And Research Rabbit and those kinds of things. So yeah, that's exactly what you were going to say, so I'll shut up and let you carry
on. No, no, that's fine. I think it's interesting because I've noticed that my library colleagues have started doing sessions for students on using some of these tools, and we were also contacted by an academic earlier in the week who has incorporated generative AI into an assignment. It's a very well thought through assignment, and again he mentions using some of the tools on the slide, but the one thing they've not really thought about is how these tools deal with everyone's data, and what the privacy issues are. I don't know if everyone is aware, but essentially since GDPR came into being, and even before GDPR, universities have had a duty of care around the tools which we buy and then use in our institutions. Whenever we buy or procure a new tool, the vendor, and even those bidding on the tender, will have to submit things like a web security form, so we can be sure about how they're processing data, about their security, and about whether third parties have access to their servers and to the data. We will also do things like privacy impact assessments, and equality impact assessments to look at how the tool may impact students or staff with disabilities. All of those things are considered, and then there's the whole information governance side and the data processing agreements, which look very closely at what's done with the data, how it's stored, and so on. All those things have to be looked at before we buy a product. In a sense, I think where we are with generative AI, and Sharon, you could probably comment on this as well, is a bit like where we were with social media back around 2010 to 2014, when lots of lecturers were incorporating social media into their teaching, asking students to sign up to Twitter and things, and now we look back in horror, I
think, particularly when we see what's happened to Twitter, or X as it's now known, and what's happening with the data. So one question is: are we looking at that? I know there's one tool, Otter.ai, that has been banned, certainly put on the blacklist, at Dundee, but are we doing due diligence on all these other tools before we let students have a look at them? Do we ever look at the privacy statements when we sign up to them? Can we make students use a tool that isn't an institutional tool? We also know that some students are choosing to pay for these tools, and that immediately raises issues around equity of access. It's interesting, because one of the sessions at the ALT Winter Summit was a student panel, and there were some students who were paying for ChatGPT, but not all students can afford to do that. We do have Copilot within Microsoft Edge, so certainly at Dundee what we're saying to lecturers and to our library colleagues is that if you want students to use these tools, you can't force them to do so, but Copilot is there within Edge, so students can use that; or, if you want them to use another tool, you could suggest that they sign up using a university email address rather than their own personal email address. So, Sharon, I don't know if you've got any other comments on that. I suppose the only other thing to add, and we're all familiar with this, is that when you do sign up to these tools, whether free or paid, you also sign terms and conditions which virtually nobody looks at. As well as their own personal data, many students are, I suppose, providing inputs to these tools in terms of their intellectual property, and we really have no idea where this may eventually end up. We've seen plenty of examples, particularly over the last couple of years, where companies have
decided that they are going to sell on data, or indeed potentially use student work to improve the tool, or other tools, into the future. So it's really as Natalie says: when a higher education institution signs a contract with a particular ed tech vendor, we have these safeguards in place, but if you're encouraging, or in some cases requiring, students to sign up individually, it's really difficult to control. So having that ethical lens, as a framework, to look at things through, and just being aware of the potential implications, is incredibly important. Yeah, and just as you were saying that, Sharon, what came to mind was when Canvas, the VLE, was sold by its original owners. I can't remember which company bought it now, but I remember they explicitly said, "we bought it for the data, because we've got all this data on students". So the students were very much the commodity in that, which links to what Sharon was saying. So I just wonder what this statement might provoke, whether anyone has any comments or thoughts on the ethics of even using generative AI tools. Do you want people to put something in the chat? Yeah, put something in the chat, and put their hands up as well, because we had a couple of webinars last year about AI and some of the implications of this, and lots of the copyright specialists are being asked to comment on it. I think it's a really, really interesting statement. For me, the thing is the use of the words "stolen" and "theft". Those of us who are copyright specialists and arch pedants would point out that infringement of copyright is not necessarily the same thing as theft, because if you're stealing something then you
are taking something from someone else and depriving them of owning it, whereas we're talking about copying and using things without authorisation. It doesn't necessarily deprive the original creator or owner; you don't take it from them, but you are taking something. I think it's really about the language that we use, and that is increasingly the case with AI and people's worries and fears about it. It means we are leaning on copyright law and intellectual property rights law as a sort of shorthand: it seems like a bad thing, it looks like a bad thing, and copyright is the thing that protects us from this bad thing of people taking stuff that isn't theirs. That kind of position isn't going to go away, and it's going to increasingly be there, but it's whether we can feed in more of the sense that there's a bit of balance involved. This is how intellectual property laws are intended: there's supposed to be a limitation on how much control the original creator has over the stuff they've created, because there's a broader cultural and societal benefit in this stuff being out there. Does any one person, or organisation, get to control how it gets used? Yes, and I think what you have to see is that in the creative industries there is a clear line of view that content needs to be protected, largely because it's big companies wanting to monetise that content and not wanting other people to use it. They will be very negative about copyright exceptions being used, whether by companies or by the rest of us; we're all kind of in the same boat. So it's really hard, because I think a lot of us feel quite nervous about the way that copyright is being used, I think, as a way to say "well, that's why AI is wrong, and here's the way we crack down on the fact that it's wrong and it's going to destroy humanity". I see Liesl's got a comment in the chat as well, which is interesting: from her perspective, her institution has gone quite conservative, asking staff and students to look at terms and conditions and not to feed copyrighted data into these tools. No, you wouldn't use the word theft, but you also want users to be aware that they could be giving away data that they don't want to, or that isn't theirs to share, and I think that is absolutely the key thing that most of us are focusing on. If you're advising PhD students or academics doing research and they're saying "oh look, great, there are all these free tools available that will shortcut my data analysis" — we talked about this, didn't we — actually, the ethics that we signed up to as part of doing research at the University of Oxford doesn't let us use third-party tools to analyse our data. I wonder how many researchers are just going ahead and doing it without thinking about it, though. We should definitely do a call back, and we're going to stop talking in a moment, a call back to the webinar that we did with Alex Fenlan from Birmingham, who I think is on the call today, and I'll put a link to that, because we covered some of the detail there. So yes, thank you, and back to you, Sharon. I think that's really interesting, and I think this also speaks to Sharon's point, because just this week it's been announced that wordpress.com and Tumblr, who are both owned by the same parent company, are basically giving all their data, everything that people have published on Tumblr and wordpress.com, to OpenAI to build their engine. So it does, I think, throw up some really tricky issues. As for this statement, I actually
saw somebody had written a comment on LinkedIn saying that if you used generative AI you were a thief, so I just made a little twist on that statement. But obviously there are these class actions going on in the States, with the New York Times suing OpenAI because they feel they've moved well beyond fair use, so I'm sure this is an issue that's going to be very hotly debated for some time to come. Sharon, shall I move along? Yes. So this was one that, I suppose, there have been rumblings of... oh, I think we've lost Sharon. Yes, I was just going to say I can't hear Sharon now; I think she's disconnected. So I'll leave that one until Sharon reconnects and maybe go to this one instead. I'm guessing Sharon's back? Oh, did I drop? Yes, we lost you for a second. Okay, sorry. I was just saying I came across a headline in a paper which mentioned staff using gen AI to grade students' papers. I think it was actually at school level, but at the same time it raised, well, I can feel the hairs rising on the back of my neck, and it's all tied up in this narrative around making academics' lives easier. I came across this paper last night, which is a speculative case study, I suppose, presenting the case of a university professor considering the implications of using AI to grade student papers. It's a thing that's being sold to academics: this is going to make your life easier, and you can spend your time doing other, higher-value things than basically grading papers. For me, I feel a little bit sick in my stomach when I think about this, and when it comes to ethics, there is a little section on ethics in this particular paper, but my first response is: why on earth would students go to the effort of writing something if they know that their lecturer isn't even going to read it? I feel very saddened that we've reached this point, and I just thought I'd include this to see what people thought, or whether you think that maybe I'm just overreacting, maybe I've just been around too long and maybe this is the way we're going, but I really hope it isn't. I know for sure, because my background is computer science, that there will be people working on this. There will be computer scientists working on this: how can we build a tool that will be able to fully grade a student paper? You see, Jane? Computer scientists. Yes. Just yesterday I spoke to somebody who is really interested in this because of really large classes, and I get it. I mean, I teach classes of up to about 25, and I find marking 25 assignments pretty hard going; I can't imagine having 400 undergraduates, although I know that typically you get a team of people to do it. One of the things we discussed was trying to make it a bit more standardised, using things like rubrics, some of the tools that are in VLEs like Moodle, and features of Turnitin. But I just wonder about taking it to the next stage. As you say, what would a student actually think if they thought for a minute that their lecturer hadn't read their paper and it had been marked by an AI? I think it's a really interesting question in light of the fact that you could potentially end up with students' papers that are written by an AI and marked by an AI, and then exactly what is the point of any assessment? I think you're right, Jane, because what this says to me, and I've been saying this for the last 15
years, is that assessment is broken, and this really breaks the back of it. This tells me that with our assessment systems, why are we doing it if we're going to get into this cycle of students using AI to write the assessments and then us using AIs to mark them? It makes a mockery of the entire system. It really does, and I agree with you entirely. It's where I end up really frustrated, thinking we have to get to the point of asking whether there is any point to the assessment at all. Well, what you might find is that the lecturer robots end up generating a whole new generation of really excellent essay-writing robots that advance scholarship much more than the humans ever have. Sorry, that's a glib statement. We've got loads of great comments coming up in the chat; I don't know if you two are following, or if you can pick up on any of the points. I think Nikki makes the great point: why be a lecturer if you don't want to read assignments? Some of the conversations I've been involved in recently have focused on the fact that in the UK we tend to have anonymous marking, which is already depersonalising the relationship between the lecturer and the student, and that doesn't seem to be the case in other countries; I know, for example, in Australia they don't have anonymous marking. But if you're then also going to have an AI doing the marking, that depersonalises it even more. Rachel's comment there, about AI helping to improve students' work, is interesting too. It's interesting speaking to disability colleagues, who do prescribe some of these tools to students to help them with proofreading and grammar, things like Grammarly, obviously helping students with their writing. In my own experience of teaching some students recently, we had several discussions about AI, and many of them are actually using it to submit drafts and get feedback on how to improve the writing, the proofing, the spelling, all of that sort of thing, so it is actually helping them to hone their work. It was interesting that their argument was: if you're trying to get formative feedback on a draft, the lecturer often hasn't got time to read it and give you that feedback, but when you're doing work and thinking about how to structure your essay or assignment, you'll be discussing it with your friends anyway. So they were saying, what's the difference between doing that with friends, who may give you some intellectual input, and using a generative AI tool? I think there's a sense in which AI has been there for years, but generative AI has exploded onto the scene, and we're all so busy and stretched that thinking time is really key. So I think discussions and conversations like this are really key to helping us find a place where we're happy with all of this, in a sense. I'm just conscious of time; I know we've had quite a lot of questions as we've gone on. Do you have further slides to present, or are we now in general questions? Because this is fascinating. I think general questions, given the time; the other slides are really just provocations for discussion, so let's take questions. I have a question, or sort of an area for discussion, and this is what prompted me to want to bring you onto the webinar in the first place. When those of us who are responsible for copyright get asked questions about what we should be doing, or what the copyright implications
are, there's often a feeling that our job is compliance. Some of us actually have "compliance" in our job titles, to make sure we're following the rules. But when it comes to ethics, what the rules say and what the right thing to do is are not necessarily the same. So I'm wondering, and maybe this is a question for the people on the call as much as it is for you, Natalie and Sharon: to what extent does the framework that you've created help us as copyright specialists and practitioners come up with the right answers? Jane and I have always had ethics in our definition of what we talk about when we talk about copyright literacy: it's about enabling the ethical creation and use of copyright material. We don't have "lawful", we don't have "authorised", we have "ethical", because we really don't like this view that a copyright person is all about compliance: doing, as you said, the Little Shop of Horrors audit of the VLE and then having to go around and tell everybody "no, no, no, take all that down". The question arises: okay, there's stuff on the VLE that arguably infringes copyright, but is that really a problem if students are getting high-quality resources, if we're supporting the mission of the institution to help its students? We were also talking about some of the wider ethical considerations: copyright is not the thing that should override accessibility, should it, or some of the other concerns that we have. I think that is the problem: sometimes it becomes the thing that dictates above all else. So what I was hoping we would do, in having this discussion within our community, is think about how, when we're brought into those conversations, we can be aware of the broader ethical picture. We are thinking about the broader ethical considerations all the time, but is this a useful framework for us to use? Is there something that we as the copyright community can contribute to the framework? What might the case studies look like where we're being asked these questions and balancing what's lawful against what's ethical? I think the key thing, because one of the criticisms that was levelled at us when we were developing the framework concerned individuals' agency to raise ethical concerns, and this is why, as Sharon was outlining, there's the ability to use the toolkit as a team: work through it with a team and look at different issues. Copyright is one, but there will be many others. The big thing that's taxing a lot of us at the minute is how digital tallies with trying to be a net-zero university; that's a real dilemma. So I think it's about instilling, in our teams, those values and that culture where we can talk about it and have those conversations with our academic colleagues, thinking about what the issues are and what our shared set of values is, so that when you're dealing with senior people in the university there's a sense in which you've got collective thinking to bring to the table, because I think it's very difficult if you're just a lone voice. Sharon, what do you think? Yes, I'm just noticing Simon's comment there; he said his starting point is that he's always wanted to say yes, and the compliance angle can be a barrier to that in many ways. So I agree with what Natalie's saying: I think there's an opportunity to use the group self-assessment for particular scenarios, for particular projects, for particular dilemmas that come to you. Because the
difference with copyright is that it's obviously a legal thing: it's a yes or no, it's black and white, mostly. Not always, but mostly. Whereas with ethics there's more of an opportunity to explore, and I would see the self-assessment exercise as a way to reflect on and explore the issues and to find a balance. That could then become an enabler, or perhaps even go so far as to become an enabling guideline, or set of guidelines, in a particular area. I think there's a huge role for people like you who are working in that type of area, often within a library situation, to use it in that way and provide those enabling guidelines. Thank you; I think that's an excellent point, and a good point to end on. What we're going to take away from this, and certainly take back to our special interest group, is that there's a real opportunity to build on what Simon's saying: if our starting point is wanting to say yes, whereas with compliance we already have the "computer says no" situation, and if we're guided by the ethical framework as to what we're saying yes to, then our job as copyright specialists is to find a way to make that work. I think that is the approach we've been taking for quite some time, but it might be worth actually making that an explicit statement, and we can take that away and think about it. Thank you so much; that's been absolutely brilliant. Hopefully we can get some people to contribute some case studies as well. Just one other final thing: quite a lot of people on the call are probably librarians, and there is an ethical framework that CILIP has as well, so I don't know if it's worth looking at, or whether you looked at it when you were drawing up the framework, because professional bodies obviously do work together sometimes, and I think there's an overlap between those two frameworks as well. It's not to say that librarians don't think about ethics, because they do, hugely, when they're making decisions about people's data and what tools to purchase and things like that. So yes, it's been fantastic, thank you so much. We are at 12 o'clock, so we know lots of people have to disappear, but we just want to thank you, Sharon and Natalie. I hope this is going to be the start of some further interesting collaborations between our group, the two of you, and the people working on the framework. So thank you, huge thanks, thanks very much. Very briefly, coming up next: the 19th of April is the next date we've got, which is going to be the return of the ever-popular "becoming a copyright specialist" session, where we want three people to come forward. We've got people in mind, so we will be picking on you, but if anyone wants to do this, please let us know; it's always fascinating to hear people's journeys and the approaches they take. There have been three of these webinars to date, so if you want to see the kinds of things people talked about, do have a look at the earlier ones. We'd really love some people to come forward and volunteer, otherwise we're going to ask; we've got our eyes on a few people, so watch out, particularly those of you putting very intelligent and thoughtful comments in chats and on the discussion lists. We'll definitely be in touch. We will rearrange the session on third-party copyright with Claire and Emily. The UKRI guide, we haven't got a date for yet. And the copyright anxiety thing: no, no, no, we will finish this research project. Cross-border
licensing is a topic where I think we do have something lined up, and we're going to make sure we frame that well. We have a volunteer: Liesl. Excellent. Thought you might have picked up on that rather pointed reference. And controlled digital lending: we've spoken about it, there is a separate event, but I suspect we'll probably return to it at some point. Any other topics, please let us know; we're always open to suggestions for webinars. So we've just got one last thing, haven't we? We're not going to play the music, but one last thing. This was another part of our exciting couple of days working on our research in Oxford: we went along to a new exhibition that has just opened, called Write, Cut and Rewrite. It's at the Weston Library in Oxford, and it's running until the 5th of January, so you've got loads of time if you want to go along and see it. It's really about the kind of work that authors do when they're drafting their texts. There are loads of famous writers, loads of amazing manuscripts, and the editing process: what you can see from the manuscripts shows the creative process that goes on. The head of Faber and Faber was there, making the point that publishers do editing, but they don't really do what the writer does in terms of that whole process of working out what does and doesn't work. We then had a really interesting conversation, because a lot of these were obviously handwritten manuscripts, or they were typed and you could see the corrections the author had put on, and I was speculating about how that is ever going to be preserved in the digital world: version history, and some of the things we did when we were writing Copyright and e-Learning to keep earlier versions, because we thought somebody would want to do a big exhibition of that one day. But we found something there that amused us, or amused me no end, that we wanted to share. Anyone who might have played our game, The Publishing Trap, and got towards the end, will know we have a character in it called Simon who is studying Jane Austen, and there's a little scenario where he finds an unpublished Jane Austen manuscript, which Chris rather amusingly called "The Long-Lost Trousers of Mr Bingley". Well, there actually is a long-lost manuscript of Jane Austen, although it doesn't mention Mr Bingley's trousers, and it has nothing to do with Colin Firth's shirt, which also just popped up recently. So we might have to make some amendments to The Publishing Trap, and I've got a photo of that manuscript as well. Apparently lots of her manuscripts haven't been preserved, because authors didn't keep them then, and this is a very rare thing because it's in her handwriting. So please, if you get the chance to come to Oxford, I would say go and visit that exhibition, and if you're around, drop me a line, as I might be there as well. That's it; thank you so much everyone, thanks to everyone who's turned up, thank you Natalie and Sharon once again, and we will stop our recording and let you all get your lunch.