We're going to talk about something which I think is an objective that most of us share: making all of the services that we provide online accessible to all members of our public, even those who are using adaptive technologies of various sorts. And I hope it's useful, because it's a slightly different approach than we have taken with other library projects and initiatives, and so I thought it was worth talking about. Oh, I should also say who we are. Sorry, I'm Suzanne Wones. I'm Associate University Librarian for Digital Strategies and Innovation at the Harvard Library, and this is Claire Marco. I'm Claire Marco, Associate Director for the same thing at Harvard Library. We're here to talk about a task force that we just recently wrapped up, focusing on an evaluation of the accessibility of all of our public-facing systems. This is an objective that we've had for many years. We are also trying very hard to make sure that all of our physical spaces are as accessible as they can be to all of our users. But we recently got a real injection of motivation from the university, which in the spring passed a new university accessibility policy for online materials. And this was simultaneously surprising and not surprising. It wasn't surprising because this has been a concern for all of us on campus, and many of us were already taking strides to make sure that our stuff was accessible. The library had been playing a fairly strong leadership role on campus in this area. In 2015, we opened a user research center on campus, one of whose main goals was to allow for the testing of products using the adaptive technology equipment that we have in the lab, and to provide an accessibility testing service that anyone on campus can use and that the library has been promoting very strongly.
Likewise, Harvard University IT, the central IT for the campus, had adopted an accessibility policy for their work and had created a contract rider around accessibility standards for all third-party vendors. So there was a lot of strong interest in making our stuff good and usable by all. But Harvard remains a very decentralized institution, and so adoption of these standards and policies and contract riders was very much school by school; people could decide whether or not to uphold these standards in the way that they chose. So it was surprising, in a very good way, that the university decided to pass a campus-wide mandate that all of our online services should attain certain accessibility standards by December 1st, 2019. That gave us a nice, kind of short window to test all of our stuff, which, you know, is good, because everyone needs a deadline. So that was great. And for me, it was particularly great that I have someone on my staff who's very good at working to deadlines that I could turn to, and she came up with a really great and innovative way for us to very quickly, in an action-biased manner, evaluate all of our systems in a way that would hopefully make it easier for our technical staff to really understand what changes need to be made, and build up a community of people who really understand what good accessibility practices are overall. So Claire, I'm going to turn it over to you. So as Suzanne said, we all understand that this work is really critical, and I know it's been a topic of interest for some time at this particular meeting. But we wanted to get to a place where we truly treat accessibility as a first priority, not a secondary one. And while auditing and documenting accessibility issues sounds boring, like it's going to result in a long list of to-do items that will sit on a backlog and be prioritized at some point, that doesn't really feel like putting accessibility first.
That doesn't feel like setting a priority. So the idea behind this project was: what if we modeled the way we treat accessibility remediation after the way that we do digital product development at Harvard? We think that starts with the people. We wanted to look at the people who are actually making digital content for our users. That's not just the library IT group; it's also reference librarians who are creating LibGuides, academic technologists who support faculty, and experts from special collections who are making digital exhibits. We wanted the task force to be as diverse as those groups, and then, as Suzanne said, help establish a culture and a group of people who understood the importance of this work and how to do it in an actionable way. So these are the folks that were on the team. We chose motivated people. This wasn't about representation of different groups across the library; it was who's interested, who's motivated. And we communicated a clear goal for them, but then let them self-organize as much as possible. We wanted to acknowledge that we were trying something new and that that might be scary. So we partnered team members together based on what we knew about them and their expertise, to help build autonomy, so that they felt empowered to make decisions, and self-efficacy, so that they could make those decisions and then iterate on them throughout the course of the work cycle. And despite the large scale of this project, we actually had a pretty narrow focus. We were only targeting public-facing user interfaces supported by the library. Staff-side interfaces are not part of what's required under this policy. That's not to say that we aren't going to iterate on this approach and apply it in the same way for our staff, but in order to prove this as a proof of concept, we kept that narrow scope. And we grouped our common types of user interfaces together.
So things that were built in common coding languages, things that had similar vendor backing: we grouped them together and assigned them to these teams, two or three people in each team. And we gave them the goal of documenting all of the issues, but not all of the instances of those issues. And that becomes important in a moment. Now, we didn't have an existing framework for accessibility remediation, and we did a landscape review and found many great VPATs, that's the Voluntary Product Accessibility Template, out there. But since we wanted to think differently, not just about the outcome of the work but about how the work was done, we turned, as I said, to the way that we've been doing product development. So we looked for agile principles that we could apply here, for these types of policy changes or non-technical changes in the library, and we wanted to use a Scrum framework that would give us a time-boxed approach. For anyone in the room who's not familiar with agile or the Agile Manifesto, the first principle I've already articulated: build projects around motivated people, give them the environment and the support they need, and trust them to get the job done. That's why I keep referring to the team first. A couple of other agile principles that we brought into this project: delivering frequently. Here we aren't delivering working software, but we were delivering in terms of the issues that we were documenting, with a preference for a shorter timescale. So we did two-week sprints. Again, not development work, but time-boxing. Bringing business people and developers together daily: we did a daily scrum. We had a one-hour public demo that we invited other people to, where people could show the issues they found, talk about particularly confusing issues, things that were tough for them to document, and guide each other on how they approached that work. And then we did a team retrospective at the end of each sprint as well.
We wanted to reflect on how we could be more effective, what was working, what wasn't. And we didn't have strict timelines for people as they were looking at different products, no "you have to do this one in sprint one and this one in sprint two." Again, we let them self-organize. Sometimes some of the teams worked very quickly through maybe a smaller product and others would take longer, and we allowed the retrospective to guide our conversation about that. So we did have some lightweight tools. Although we allowed people to self-organize, we didn't want them to start from scratch; we wanted to make it easy to get started, that actionability piece. We had done a previous review of our main library catalog, HOLLIS, so we used the workbook from that as a template for keeping track of and documenting the issues. But we remained open to changing and adapting that framework across the project. So we had these workbooks for issue tracking. We allowed teams to set which page types or which parts of a page they were looking at, and to document very few things: the area of the page, the issue description, the WCAG criteria, the person or group that they thought would be responsible for the fix, and then any notes or screenshots. And all we were asking them to do was use the Siteimprove extension plus a manual review, just of keyboarding, zooming in to increase the page size, and checking for captions or transcriptions. That's by far not the only way to have done this, but by telling people that was all that was required, we actually found that some people who had more expertise would go beyond it. We had a non-sighted person on the team who did a JAWS review of a lot of things, not because that was a requirement for this work, but because it would help enhance the findings.
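The workbook fields and the baseline manual review described above can be sketched as a simple record. This is a minimal illustration only; the field and variable names are assumptions, not taken from the actual Harvard workbook template:

```python
from dataclasses import dataclass

@dataclass
class AccessibilityIssue:
    """One row in an issue-tracking workbook, mirroring the fields the
    teams recorded. Field names here are illustrative, not the team's."""
    page_area: str       # e.g. "search results facet panel"
    description: str     # what a user actually experiences
    wcag_criterion: str  # e.g. "1.4.3 Contrast (Minimum)"
    responsible: str     # person or group expected to make the fix
    notes: str = ""      # optional notes or screenshot links

# The baseline manual review each team was asked to perform, alongside
# the Siteimprove browser extension's automated checks.
MANUAL_CHECKS = [
    "keyboard-only navigation",
    "zoom / increased page size",
    "captions or transcriptions for AV content",
]

issue = AccessibilityIssue(
    page_area="item record page",
    description="Focus is lost after opening the request dialog",
    wcag_criterion="2.4.3 Focus Order",
    responsible="library technology services",
)
```

Keeping the record this small is the point: a reviewer can fill one in quickly during a sprint, and anything beyond these fields (like a JAWS review) is a bonus, not a requirement.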
So we thought strategically, and with that kind of mindset, about what is really required to make this actionable, not what, in the best-case scenario of whatever true accessibility means, we could do. About halfway through the project, we moved from just trying to be action-oriented and getting started to setting priorities. We wanted to think about our findings and how we would translate them for the people who would actually be carrying out the remediation. As Suzanne mentioned, some of those people were on this team, but they work with, and are colleagues with, other people who were not, and so we wanted to think about how the work that we were doing would translate for those folks. Again, my priority is the team here, so I won't go too deep into this, but we used a series of high-performance team protocols developed by someone named Richard Kasperowski. If you're in the agile space, his work is based on what Jim and Michele McCarthy had done in terms of the Core Commitments, and the focus of that is really psychological safety in a team environment. It starts with positive bias, the concept of freedom, self-awareness, connection, productivity, and error handling. I have a whole other talk about that that I can do, but the reason I mention it is that I was keenly aware of the emotional labor involved. Accessibility work has a significant impact on people's lives, and that can feel very inspiring and make people want to get involved, but it can also create a lot of fear and anxiety about "am I qualified to be the expert in the room saying whether or not this is a major accessibility issue?" So I wanted to give the team a sense of support and autonomy. As far as Harvard Library was concerned, there were no better experts among our staff than the people in the room, who, again, were motivated individuals that we brought together to do this work.
One of them is the head of UX and digital accessibility, who reports to me. So yes, there were experts in the room, but we wanted everyone to have that same sense of setting priorities. Our prioritization framework was very light and, again, was developed by our head of UX and digital accessibility. She found support in the work of Karl Groves, if you're familiar with Karl Groves. We only had three categories: high impact, medium impact, and low impact. High impact means users will be unable to perform important system tasks or unable to understand important content if the item is not repaired, all the way down to low impact, which is an inconvenience, but users will be able to accomplish all tasks. Oh, I should say we also looked at some great work that's being done at Duke; they have some great web accessibility guides as well. So when we were looking at this high, medium, and low, we also looked at: is it a barrier or is it a usability issue? What is the feature that we're talking about in this issue, and is it really important to the user's experience of this page? Is the content that's being conveyed essential? And then looking at conformance level: is this a single-A or double-A WCAG issue? That was critically important for our vended solutions. If the audience that we're communicating back to in terms of remediation is a vendor, we want to make sure that we can say: here is the criterion that we're talking about, and here's the conformance level that we're not meeting. And as Suzanne said, HUIT, sorry, our central university IT, had done a lot of work to try to get these accessibility riders into all of our vended contracts, so we want to make sure that everyone's holding up their end of the bargain. And then we wanted to continuously integrate our work with ongoing efforts. The world didn't stop just because we were reviewing our systems for accessibility issues.
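One way to read that lightweight rubric as code. This is a sketch under stated assumptions: the talk names the inputs (barrier vs. usability issue, whether the affected feature or content is essential, and the WCAG conformance level) and the high and low endpoints, but the exact decision rules, especially for "medium," were not spelled out, so the logic below is illustrative, not the team's actual rubric:

```python
def priority(is_barrier: bool, essential: bool, conformance: str) -> str:
    """Rough illustration of a high/medium/low impact rating.

    is_barrier:  True if users cannot complete the task at all;
                 False if it is 'only' a usability issue.
    essential:   is the affected feature or content important to the
                 user's experience of this page?
    conformance: the WCAG level not being met, "A" or "AA" --
                 critical when communicating back to a vendor.
    """
    if is_barrier and essential:
        # High: users will be unable to perform important system tasks
        # or understand important content until this is repaired.
        return "high"
    if is_barrier or (essential and conformance == "A"):
        # Medium tier is an assumption: a barrier on a non-essential
        # feature, or a Level A failure on essential content.
        return "medium"
    # Low: an inconvenience, but users can still accomplish all tasks.
    return "low"
```

The value of a function this small is that remediators and vendors see the same inputs the reviewers used, rather than reconstructing each issue from scratch.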
We had ongoing enhancements and development work happening, but we wanted to make sure that people, again, had that sense that they were examining issues across page types, that they were looking at what the user experience of a particular system would be. And we didn't want the remediators to have to re-review and reconstruct every issue in order to understand why it was a problem. So that was how setting our priorities transitioned into how we communicated about the issues. When we communicate about our product development, we use a content strategy framework where we highlight who the audience is, what the goal is for a particular page or system, the value proposition, what's needed to educate the user, validation, and a components list. Again, we could do a whole series about content strategy, but we wanted to apply that same thing to our communications artifacts. We wanted to be strategic about how we talked about the work. So this is an image, on the left, sorry, on your right, of a cover sheet. The first communication artifact we generated was a cover sheet, and we imagined this as a one-pager that a developer or content creator might use as a motivator for the work and a reminder of the impact of the work. This was not a document that in and of itself would give everyone everything they needed to fix the problems. But because it was focused around a user story, it would draw your attention to things that were critical and important for users encountering this system. I just want to highlight a couple of pieces of this beyond the accessibility user story. The next thing is the frequently occurring issues list. Again, with our priority on making this actionable, we wanted the folks conducting remediation to know not only what the critical issues are, what got that high level in the rating, which is down below in the summary.
But if something is frequently occurring, in many cases it could be one fix that would have a broad impact across the system, so bringing those to the attention of the remediator was really important. Again, for that feeling of self-efficacy: if I fix this one thing and it fixes it in 20 places in the system, I've just improved someone's life. The other area of interest here is the quote from the reviewer. And this, again, is to maintain the human connection with someone who took time out of their work. This was one of those "other duties as assigned" pieces of work for people who otherwise have full-time jobs, to say: I looked at this system, and I looked at it not just to say that you did a good job or a bad job, or to document a list of issues and give you more work to do; I did it in a context. And so it really gave that opportunity for reflecting on the broad findings, maybe not something completely specific or something borne out clearly by the individual issues as a list. And I should say also, in terms of the psychological safety here, a lot of the people on the team doing the review are the same people who built these systems initially. So I have members of my staff who were reviewing their own work and reviewing the work of other people in that room. To say "hey, we're all human, we're all doing our best, and we're trying to make these products more accessible": that was a common mantra for all of us. And then lastly, as we moved on from that first design artifact, we had the idea that we needed to communicate the value proposition to our leadership audience.
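The "frequently occurring issues" idea, surfacing the one fix that pays off in many places, amounts to a simple aggregation over the workbook rows. A minimal sketch, assuming each recorded issue carries a WCAG criterion string (the sample data here is invented for illustration):

```python
from collections import Counter

# Each tuple: (WCAG criterion, page area) from a hypothetical workbook.
rows = [
    ("1.1.1 Non-text Content", "exhibit gallery"),
    ("1.1.1 Non-text Content", "home page carousel"),
    ("2.4.3 Focus Order", "request dialog"),
    ("1.1.1 Non-text Content", "finding aid images"),
]

# Count how often each criterion appears. A criterion that recurs
# across page types is likely one template-level fix with broad impact,
# exactly what the cover sheet wants to surface for the remediator.
freq = Counter(criterion for criterion, _ in rows)
for criterion, count in freq.most_common():
    print(f"{criterion}: {count} occurrence(s)")
```

This also reflects the earlier decision to document issues but not every instance of them: frequency by criterion is enough to spot the high-leverage fixes without cataloguing all 20 places a problem appears.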
So this is a draft of a dashboard, or a placemat, as for some reason that has become the term we're using here, that was born out of sharing the cover sheets with management in Library Technology Services, responding to their feedback, and then doing another content strategy and sketching session around what leadership would want to know about this project as a whole, not about an individual system. As you can imagine, the early sense that we got, that Suzanne was able to communicate from leadership, was that people wanted a score. How accessible are we? Who's to blame? What's the bad system? Right? And we wanted to give them some of what they wanted, some of what they anticipated getting out of this work. But really, we wanted to put forth the principle that accessibility is never done, and therefore the impact of this work was more about how we did it, how it aligns to our product development lifecycle, and how we can put accessibility first, than about any one finding or any one system. It will obviously help us to allocate resources in the future and figure out how staff need to swarm on a particular system. And Suzanne can talk a little bit more about this: when you make those communication artifacts that communicate your value proposition, you also want to make them actionable in terms of the next steps for the work. So that, yeah, you got to see a draft of the placemat that library leadership has actually seen. Yeah. So that's exciting. But yeah, it is doing the work in a transparent way. Yeah, exactly, which is great. So we are in the process now of developing an action plan. As Claire said, the Library Technology Services managers who will be overseeing the staff doing all the remediation have now met with Claire, and with Amy as well, and so they have seen the cover sheets and have signed off on the work being added to the backlog.
That university accessibility policy gave us a very important grandfather clause, which was basically that all of our systems that were live before December 1st are OK, but any new systems we bring up will need to be in compliance. We're trying to go beyond that, though, and really act on the spirit of the policy rather than just the December 1st date for new systems. So we are going to be remediating the high-priority and medium-priority action items. And then we also just learned a lot about using this model going forward for other types of library work and university work. With that university accessibility policy, a lot of departments who haven't given as much thought to this as the library has were kind of feeling like they didn't know how to even get started. So we are sharing this model with the University Accessibility Office and the, what is it, Digital Accessibility Services office, which is a brand new office, so that they can use it as a way to communicate with others and as an approach that they could use to evaluate their own systems and services. And in the library, I think Claire did a great job of showing how we can use this as we create our new digital products, to make sure that accessibility is front and center from the start and not an add-on. But also, beyond that, for non-technical projects, I think it really is a great way to show team projects, committees, dare I say, that there is a way they can be a little bit more action-focused. Using some of the agile principles, like having a daily stand-up just in the morning, we did that on Slack, super easy, just very quickly: here's what I'm going to be able to do today, or I'm not going to be able to look at this at all today. Just so people constantly had it on their radar, and were constantly accountable for what they could or could not get done in any given sprint.
And then the demos: instead of people coming to a committee meeting a month after they had said they would do something and saying, oh, I never got to do that, in the demo everyone would go around and talk about what they did and show it. And it was a very non-threatening way to do that. When people said "I couldn't get to it," it was like, OK, we understand, that happens to all of us. So I thought it was a really great environment and a really great way to show people that that's a way to get work done. And then, Claire mentioned a couple of things that she could talk about for hours. The other thing that we could talk about for hours is the content that we have digitized. Not all of that is accessible yet, so it wasn't part of this evaluation. Again, anything that we digitized before December 1st is kind of grandfathered, but obviously we would like to make as much of our content fully accessible as possible. So for the past two or three years, any AV materials that we have digitized have been captioned, for example, and the university has put aside some money for retroactive captioning as well. We're also going to have a process by which end users can request that things be captioned, so we're going to be launching that, and then we will just be gradually chipping away at making more and more of our AV materials captioned and fully accessible. With that, we have, how many minutes for questions? Five. Five. We're probably at time. So thank you so much.