Hello and welcome to this session at OERxDomains21. I'm delighted to introduce you to Emily Coller-Johnston, who is here with us sharing lessons from the front line and talking about challenges and strategies for inspiring a shift from surveillance to open practices. And Emily, this is one session I've been looking forward to, because I'm really interested in how we can move against surveillance, and I feel fantastically delighted to have you with us. So I'm gonna hand over to you to start the presentation, and I'm gonna be monitoring the chat for comments and questions. So if you are joining the session, do post a welcome in the chat, but for now, over to you, Emily. Wonderful, thanks, Maren, and thank you everyone for being here. As Maren has said, my name is Emily Coller-Johnston and I'm a research and scholarly communications librarian at the University of Western Ontario. In this role, I support faculty and scholars with open publishing, whether that's working with faculty to make their research openly available or supporting them in adopting and creating open educational resources and practices. Today, though, I want to talk about open and open educational practices as a response to surveillance technology. But I wanna frame it around the question: how do we inspire a shift to open educational practices, and the ethics of care that ought to come with it, in those who believe surveillance practices to be good or necessary? Now, that's a big question, and personally, I don't have all of the answers, but to reflect on why I'm asking this question in this way, we have to back up a little bit to last year.
Now, I don't have to tell all of you about the ways that courses were disrupted when most everything shifted online, nor do I have to tell you about the many instructors who initially considered moving their normally in-person exams to an online format, because it was experts like yourselves who called on them to redesign their final exams into new formats, citing reasons like the importance of leading with care by eliminating undue stress and accommodating individual contexts and circumstances, and then, of course, to also avoid any perceived need for surveillance in order to maintain exam integrity. Now, at that time, and certainly as the year has passed, there has been no shortage of alternative assessment strategies that have been shared as potential options, including a number of open educational practices, or practices that include the creation, use and reuse of open educational resources as well as open pedagogies and open sharing of teaching practices. So things like renewable assignments, Wikipedia editing, public blogging and scholarship, or collaborative social annotation, many of which have been explored at this conference and all of which center students as agents and creators to some degree. That's of course in contrast to surveillance technologies, which remove students' agency or authority over their experience and environment. To my point, though: as open advocates and practitioners, we know this. We know why surveillance tech is harmful, and we don't have to think twice about why we choose open practice over surveillance. But I also know that even if I want it to be the case, and even if it should be the case, this isn't the case for everyone. What about those whose environments continue to uphold narratives and ideologies of traditional academic rigor? Or those who truly believe that practices like remote proctoring are flexible solutions, since they allow exams to be carried out from anywhere?
So again, I ask the question: how do we move these individuals from a mindset of surveillance to a mindset that is more receptive to open educational practices? And can we do that? Now, I'm interested in this angle because so much of my work, from where I'm currently based in the library, is open advocacy. Getting to support open practices is definitely the best part of my job, but I have to advocate for open and influence behavioral and cultural change to even get to a point where there's a need for my support. And because so many of us work in systems and structures that continue to uphold traditional narratives, ideologies and practices, I'm willing to bet that that's the case for many of you as well. Now, in my case, sometimes the greater-good, values-based approach is an easy sell. People get it, and it clicks, and it's great. But in many other cases, it doesn't. People want to know about the practical implications of open. They wanna know what it means for them. They wanna know how it changes their work. And more particularly, one of the concerns that's frequently raised about open practice is, as you may have guessed from this slide, labor. So that's time, but it's also expertise. Their concerns sound like: well, I don't have time to redesign my curriculum to align with that open text, or I don't have the people to mark all of those assignments, or I don't know how to use that open tool, is there someone who can handle the technical pieces for me? I'm sure you've heard similar phrases too. And if you have, I invite you to share your own examples, or phrases and concerns that you've heard, in the chat. Now, we don't have to look very far to see labor as a concern confirmed in research studies. A study on my own campus in 2018 revealed time and support to be the main factors influencing adoption and creation of open educational resources.
The reports on this slide, on faculty perceptions of open education at different colleges and universities, are further examples that point to time and personal labor as a barrier to uptake. So on the one hand, we've got open educational practices being recognized by many as labor intensive, sometimes to the point that the labor required is seen as too big of a hurdle to overcome. But then on the other hand, we've got surveillance tech companies promising to eliminate labor for faculty through artificial-intelligence-driven products. They promise solutionism with minimal effort, and given faculty's concerns about labor, it's obviously compelling to some. And yet what I wanna convey is that despite these narratives that are put forth, open educational practices and surveillance tech are not exactly in binary opposition when it comes to the labor expended. Going even further, my experience suggests that highlighting the labor that is involved in deploying surveillance tech deconstructs these myths of tech solutionism and opens the door for faculty to be more receptive to seemingly more labor-intensive practices like open education. So that's what I'm gonna focus on for the rest of this presentation. Thinking back again to last year's first COVID-19 lockdown, I was working at a different higher-ed organization that was tasked with deploying a remote proctoring tool to a consortium of colleges and universities so that they could carry out their final exams with minimal disruption. I'm not going to give any more context about the project than that, except to share my own learnings and insight as someone who was in a non-negotiable position of delivering training on the remote proctoring tool. Personally, as you might imagine, I was conflicted about my own positioning in this project, but my strategy was to be upfront about the intricacies, the limitations and the challenges of the tool so that others were in a position to make as informed a decision as possible.
So between myself and a colleague, we delivered demos and Q&A sessions to more than 200 faculty and support staff at many different schools over a period of about a month. And I wanna share with you the top three topics that we were questioned about by faculty. The first: how much work was required to set up the tool? And the answer was: a lot. Setting up configurations, integrating it with the exam, communicating information to students ahead of time, having students take an onboarding test, finding alternatives for students who couldn't or wouldn't use the tool. The second: whether they could make the tool do a very specific thing that they thought they needed it to do. And unless that thing was possible with the tool's settings, this question either required compromise or more work for the instructor as they developed a workaround outside of the system. And finally: how much they personally had to do to determine if a student was cheating. Once they found out that they would actually have to watch the proctoring videos to assess behaviors that were flagged by the AI system, their next question was where they could find others to review these proctoring videos for them. You might notice that the trend in these questions is the labor required. For many of these faculty, there was also the additional concern that they'd have to build their exam in their learning management system first, which meant learning yet another tool that many were previously unfamiliar with. Now, the people who attended these training sessions came with full intentions of deploying remote proctoring for final exams. It seemed like a logical and straightforward way for them to finish their course, similar to how they had originally planned for it to go, and the tool was available to them. So it seemed like an easy option.
But the more that people learned about the tool and envisioned their concrete use case for it, the more they realized it wasn't actually the automated, plug-and-play solution that they thought it was. And as they became aware of the work and technical aptitude required even to set up the tool, they became more receptive to other alternatives that they hadn't initially considered or really wanted to implement. So this meant that many of the instructors and departments that attended that first round of training in the spring semester of last year ended up pursuing alternative assessment measures. In the cases I was most aware of, that looked like using open-book exams, since they were a solution that could be implemented pretty quickly. Now, it's not lost on me that the tight timeline and panic of that first lockdown may have contributed to the implementation of the tool feeling especially overwhelming. But fortunately, even with more time to prepare for this academic year, the faculty that I was aware of who reversed their decision to use the proctoring system in spring of last year didn't use it this academic year either. Again, they explored alternative options, some even exploring open educational practices. For example, I worked with a few faculty members who had replaced their final exam with collaborative annotation activities. And my colleagues spoke of other faculty members who had been at the remote proctoring training in the winter, but by the summer had joined an open education community of practice. Now, certainly other factors came into play to influence their decision to ultimately explore open, but realizing the labor required to implement what they thought were simple surveillance tactics was an initial catalyst that sparked their receptiveness to alternatives.
Now, I'm not suggesting that these examples are generalizable across the board, but especially when talking about open advocacy, where change resulting from advocacy efforts can be slow going, these stories matter. And the way that these behavioral changes were influenced matters, because it gives us insight into tactics that we can employ in similar contexts as we continue to advocate for open and against surveillance. Now, as I conclude, I want to be clear that behavioral change at the individual level is only one piece of the puzzle. The other piece is cultural change: changing the systems, structures and ideologies that uphold traditional and harmful practices and that also influence behavior at the individual level. Here, I want to stress that it wasn't just the labor of individual faculty members being called upon to implement this proctoring tool. It was also CIOs and CTOs, technology service departments, cybersecurity officers, and in some cases, learning technologists and e-learning specialists. That's a lot of labor and expertise being deployed in service of surveillance, which is a pretty clear signal of what these institutions value. Now imagine if all of that time and expertise from these individuals were instead devoted to supporting open. That's where my mind goes when I hear administrators flag concerns about the labor open requires as a roadblock to institutional buy-in. Labor is a valid concern, but less valid when those same institutions are willing to pull significant resources towards surveillance practices like proctoring and data analytics. To change this culture, our advocacy efforts need to be asking, at every opportunity, how we can redirect the resources and expertise that uphold harmful practices towards ones that serve care, like open.
So in closing, I hope that this short presentation has at the very least given you space to think about how you approach your own advocacy when it comes to both surveillance and open, and whether there might be any positive effects from having similar conversations about labor in your own context. And at the same time, recognizing that this presentation has focused largely on the angle of labor as one piece of the puzzle, I want to urge you to think about what other strategies you employ to shift others away from surveillance practices and towards open. And I invite you to share those strategies in the chat, but also to take this time to ask any questions if you have them. Fantastic, thank you so much, Emily. That was an inspiring talk. And I can see there are quite a few comments coming up in the chat. So we're going to start with one from Tom Farrelly, whose question is: are you aware, or did you try to find out, how many people changed their plans as a result of going to the training sessions? Any thoughts on that one, Emily? Yeah, that data was collected. Now, I am not at that organization anymore, so I don't want to speak out of turn and misquote the data, but I would say that definitely in that first flurry of one month where we were teaching these sessions, it was a good proportion. A lot is what I can say. I can see a lot of applause in the chat as well, which is fantastic to see. And I wanted to pick up, there is a question from Roger Emory a little while ago about reflecting on whether the problem starts much earlier, like at school, in terms of the different education stages. So I wonder if you have any thoughts on the culture of exam sitting and surveillance that you touched upon? Sorry, I'm reading the question right now. So, does the problem not start at school? Yeah, sorry, this seems like a comment. I guess the question is: does the problem not start at school? So I would say, yes.
Do you want to rephrase that question? I got lost. I guess you're right, it is more of a comment rather than a question, but I think it was interesting to reflect on, you know, how students coming into the system that we work with, their expectations are to be invigilated during an exam from early education onwards. And I think that's maybe where the sort of rebalancing of resources that you spoke about comes in, from, you know, the technologies and activities that focus on surveillance to alternatives. Yeah, yeah, OK. So absolutely, then, thinking back to that question where it says, does the problem start at school? I think, yes, it does. And I think that's exactly the problem that I was getting at, as you just said, with the culture. That is the culture that I think students come to expect, but it shouldn't be. It doesn't have to be. Well, we have time for one more question, and we have one in the chat from Autumn. Hi, Autumn, and thanks for your question. Autumn says: Emily, can you talk about the emotional labor that goes into making this switch from surveillance to authentic assessments? So that's our final question. Autumn, thank you for bringing that up. That's something that I was thinking about throughout as I was writing this presentation. And I know that there's also, I don't know the exact time, but there is another presentation later today dedicated to emotional labor, so a plug for that session. There's a lot of emotional labor. And I think that's what kind of inspired this talk about open advocacy, because so much of open advocacy, especially when it comes, again, to thinking about it as a response to surveillance tech, is emotional labor. It's tiring and it's draining to just be trying to influence behavior and culture.
And maybe not seeing a lot of results immediately, or not even being able to track these results and know what happens after your conversations, which is, I think, why it's important to talk about it and to share those experiences. Because it can be frustrating, especially when, as I said at the beginning, we know that open and an ethics of care is what people should be doing, but not everybody gets it and not everybody resonates with those same values, and that can be so frustrating when you're having those conversations over and over again. So thank you, Autumn, for raising that. Well, we are coming to the end of the session now, and I want to thank all of our participants for actively engaging with the session, for all your comments and questions. There's a lot more in the chat that we haven't had time to cover, but I'm sure Emily can head to the session page and keep chatting, and you can also go to the hallway chat. I want to thank you all, but most of all, I want to ask you to just put your hands together to say thank you to Emily once again for a wonderful talk here at OERxDomains21. Thank you very much. Thank you, Maren, and thank you all.