Hi, my name is Angela Zoss. I'm the assessment and data visualization analyst in the Assessment and User Experience Department at Duke University Libraries. And I'm Tim McGeary. I am the Associate University Librarian for Digital Strategies and Technology. Thank you for your interest in our presentation on building expertise and defining values through the data privacy and retention task force we formed at the Duke University Libraries. We carefully considered the value of investing in this work. We believe the return on this investment should be broad and far-reaching. First, we wanted to educate ourselves as ambassadors of data privacy and retention. To effectively develop ambassadors, we recognized the team should be diverse in experience as well as representative of the different parts of the library. We envisioned the work itself as an incubator, particularly preparing ourselves to train other staff on privacy issues. We believe this is a measure of sustainability, ensuring a commitment to privacy and appropriate data retention as an organization. We understood that the spectrum of data privacy and retention is broad, yet we recognized from the start that we needed to balance our obligations to protect our users while also using data to improve our services to those users. Finally, we wanted the report of this work to be actionable. We intended the report to set up a model of actionable value statements that would define discrete work that could continue our progress. We aimed to create and charge subteams to develop policies implementing data retention and privacy actions, based both on priorities previously defined and on priorities determined through continued progress. Three of the precursors to our work were activities directly related to data privacy. I was invited to participate in developing the SeamlessAccess entity categories and attribute bundles. 
Duke created a data privacy officer position, and one of that office's first engagement projects was with the libraries, given our reputation for patron privacy. We were engaged in a difficult negotiation with a library vendor while we were seeking additional privacy protections and assurances that the vendor was compliant with GDPR. This experience emphasized the tenuous nature of ensuring data privacy in the face of user data being a valuable commodity. In 2018, the Duke University Libraries participated in the National Forum on Web Privacy and Web Analytics. This IMLS-funded grant brought together librarians, technologists, and privacy researchers to produce a practical roadmap for enhancing analytics practices supportive of privacy. DUL's participation in the web privacy forum, as well as the recent developments with GDPR and the CCPA, provided momentum for a deeper look at our own systems. In the fall of 2019, the Assessment and User Experience Department, inspired by some of the recommendations in the web privacy forum's white paper and action handbook, conducted a data privacy and retention audit of DUL policies and procedures. The goal was to conduct a judgment-free assessment of our current practices across various systems that collect and retain personal data about patrons and donors. We identified 70 systems that might be candidates for such an audit, and we selected 24 for interviews because they were confirmed to collect and retain personal data. The resulting report, which is linked in our slides, summarizes the similarities and differences across the systems and ultimately concludes there are enough issues to warrant the creation of a new team to address them. 
One of my main goals in sponsoring this work was to empower a wide range of people within the library, from multiple divisions, to collaborate on how we retain data, what should remain private and why, and what good can be accomplished through appropriate use of the data collected. We sought individuals whose professional experience would be valuable to our work: representation from the divisions that interface with users and user data, legal expertise, and experience with accessibility services, special collections, digital curation, licensing and acquisitions, open source and the priorities of open design, and using data. And most importantly to me, I wanted to empower others to lead the libraries in this work and to enable and amplify their collaboration for others to follow. Our first output as a task force was to create a value statement, but we quickly realized we were all coming from different places in our understanding of privacy issues. Task force members felt unsure of where to start, and we all wanted to make sure we could speak from a place of authority. To get everyone on the same page and help everyone feel comfortable with their own understanding of privacy issues, we decided to develop a series of primers on various privacy-related topics. Members of the task force volunteered to identify introductory texts for different topics related to privacy in libraries. Each meeting we reviewed materials on a particular topic and discussed the connections to library values and the implications for our work. The topics we reviewed were the benefits of data analysis, web analytics, the EU's General Data Protection Regulation (GDPR), SeamlessAccess, authentication, anti-surveillance, and privacy in archives and special collections. After building and reviewing the topical primers, we turned our attention to creating our value statement. Of course, the library community already has a variety of value statements, ranging from the very broad to the very specific. 
One example of a broad value statement is the ALA Code of Ethics. Value statements like this are important for organizing the community, but having a broad value statement means that it is hard to translate into specific services or policies, and the values themselves often conflict. For example, we want to provide useful services, and to evaluate those services we often collect data; however, to protect users we need to support their right to privacy and confidentiality, which is easiest to do when we do not collect data. We also have value statements that are more specific to patron privacy, like the NISO Privacy Principles and the IFLA Statement on Privacy. More specific value statements can go into more detail about our values around patron privacy, but there is still room for interpretation. Institutional differences and priorities can play a role in how to translate these values into services and policies. Our task force needed to combine what we learned about patron privacy with both the values of the library community and the priorities of our institution and departments. The task force realized that DUL already has a guiding document, the 2016 to 2021 strategic plan. More than that, we noticed that the guiding principles and priorities within the strategic plan cover many of the same topics as our privacy primers. DUL's guiding principles are value statements that cut across library departments and projects. They are high-level goals that we can keep in mind as we organize our work and make decisions about where to devote our resources and energy. Each area has broad but useful implications for our work to educate patrons about privacy concerns and protect patron privacy in our own systems and services. We design and deliver user-centered services. We believe that data collected and analyzed in a responsible manner helps an organization make better decisions about how to serve users. 
Even with the best of intentions, however, our desire to collect and analyze patron data to improve services comes into conflict with our patrons' right to privacy, and we must acknowledge that tension and be thoughtful about how our data practices affect our users. In addition to keeping privacy needs in mind when making decisions about data collection, analysis, and retention, we must also keep our users in mind when communicating about our data practices. We know that simply making information available does not make it findable, understandable, or actionable. As a library, we are especially sensitive to the importance of promoting literacy in areas related to information, especially in the midst of a pandemic that has pushed even more activity into our digital environments. Staff development leads to innovation. Staff development in areas related to data privacy is crucial because of the rapidly changing privacy and security landscape. Our individual and collective understandings of best practices need to be regularly tested and updated. Staff members need time and resources to take advantage of learning opportunities, and the institution needs to incorporate any lessons learned to guide future decisions and improve practices. Furthermore, regular staff training guards against staff practices that either unintentionally or intentionally compromise data security. Diversity strengthens us. The diverse community we provide services for inherently creates a rich source of data with unique challenges to ensuring its equitable and meaningful use. Our users are multifaceted, bringing distinctly valuable perspectives and experiences to their interactions with the library. As stewards of this data, we must be mindful of how data are utilized and whose voices are included in the decision-making processes. 
Without careful attention and engagement of our full user community, we run the risk of introducing and perpetuating bias in our data, our data policy, and our decision-making efforts. We cultivate and connect communities. With thriving global research activities and an international campus at Duke Kunshan University, Duke University is part of a large, complex network of scholarly activity. This network includes our local community, which is itself increasingly global. Participation in such a network complicates attempts to comply with privacy regulations and stay up to date with best practices for data collection, protection, and analysis. Ultimately, however, we see globalization as a strength in information work and strive to take into account the important legal and cultural ramifications of our work. We break down barriers to scholarship. The library's commitments to open access, open source, and open standards encourage widespread sharing of academic work, tools, and infrastructure in hopes of allowing everyone to benefit from the products of research, which are often publicly funded. These commitments have the potential to undercut for-profit companies that offer free services in exchange for predatory surveillance practices, compromising the safety and privacy of our users. We recognize, however, that as an institution reliant on stretched budgets and staff, we are often faced with our own tough choices about using low-cost or free software that trades our users' data for convenient features and long-term stability. We used our strategic plan to frame strategic priorities and goals for our work on this task force. Each of these statements was crafted through collaboration and consideration of the context of each goal. Our libraries create platforms for scholarly engagement. As platform creators, we have the opportunity to actualize our values in software and services. 
We develop platforms that can serve as alternatives to those trying to monetize user data, and that can embody our commitment to open sharing of research sources and products. When we implement, modify, or design platforms, we make choices about what we require from our users, what we offer our users, and how well we communicate what is happening in our platforms. The choices that we make can encourage a high level of engagement from a wide variety of people, or they can restrict activities in ways that are frustrating and hurtful to our users. For some systems, our interests as service designers and data collectors will come into direct conflict with our users' right to privacy. Our libraries teach and support emerging literacies. Information exchange about individuals and groups occurs automatically in the background of our library systems, often going unnoticed and uninterrogated by the subjects of that information. This exchange occurs both to provide convenient and easy-to-use services and as a good exchanged in the economy of behavioral data. Our users can only make informed decisions about their privacy when they understand this exchange of information that occurs when they use our systems. Our libraries advance discovery. While fulfilling a role in providing access to new technologies and emerging opportunities that enable more sophisticated discovery and access of information, we acknowledge that with that role comes a charge to understand how these technologies and partnerships make use of users' information. As with other areas related to privacy, our role as an access provider will come into conflict with our users' right to privacy when we consider using tools or services that offer useful features for us or our users in exchange for problematic data practices. Our libraries partner in research. Libraries are especially well-suited to partner in research related to data privacy issues. 
The library field has a long history with privacy protections, as evidenced by state laws protecting the privacy of library records across the country. Libraries are often intimately involved in the development of industry standards, privacy policy language, and privacy-protecting software solutions. Library research partners include a broad range of individuals, groups, and institutions, from local patron communities to consortial and global library peers. Libraries have a vested interest in research on ways to better protect patron privacy, whether through software improvements, regulatory change, or user education. Our libraries transform the information ecosystem. The work we do at the Duke University Libraries can benefit from, and be of great benefit to, other library partners around the world. Far from being a neutral party, libraries can engage directly with social and political forces to promote privacy protections. By working in concert with the broader library community, we can leverage our collective power to effect real and lasting change on privacy issues. For each section of the strategic plan, applying a privacy lens resulted in a set of value statements specific enough to guide real decisions in our organization. Our final values document has been nicknamed the P and P document, for principles and priorities. For each of our values outlined in this document, we used a specific set of terms related to our institutional priorities and sphere of influence. For actions that are within DUL's sphere of influence, we have statements at two levels of priority: obligation and responsibility. For areas outside of our direct influence, we use the word commitment. One example from the P and P is guiding principle number five: we break down barriers to scholarship. In this section, we identified three major values related to patron privacy, one at each level of the hierarchy. We have an obligation to use open source, privacy-protecting tools whenever possible. 
We have a responsibility to contribute effort back to open source projects we rely on, and we commit to advocacy and support around open tools and scholarship in academic communities that resist such openness, including efforts to challenge incentive structures that require the use of proprietary software or publication in closed journals. The final report has been posted publicly on DukeSpace and shared in a companion blog post. We hope that this report will serve as a useful model to others in the library community. The final goal of the task force was to recommend specific projects that should be undertaken at DUL to make sure that our actions align with our values. Two of the first projects we'll be undertaking are a privacy-by-design project, which will develop a new workflow to assess privacy concerns for new technology projects, and a project to update our public documentation, pushing beyond the traditional privacy policy statement to a more findable, understandable, and actionable set of public explanations of how we collect and use data. The goal with both the report and the next-step projects is to make sure that our values around privacy are well embedded throughout the libraries, and that we work toward consistency and sustainability for our privacy work. We hope that the data privacy and retention task force serves as a model of how to build expertise across a diverse group, outline priorities for future work according to our sphere of influence and institutional goals, and share work broadly, both within and outside our organization. Thank you very much for your time. Please feel free to reach out to me or to Tim directly with any questions.