 Welcome everybody to the spring 2022 in person, yes, in person CNI member meeting. It's wonderful to be back with you. I'm Cliff Lynch, the director of CNI and I have a few just sort of welcoming things to do and then we'll get on with our opening panel. This kind of closes a cycle because you may recall in 2020 we were scheduled to be here for spring 2020 and we canceled that meeting and moved it virtual on very short notice. That was one of the first meetings to cancel, not the first but one of the first in our sort of circle of higher education organizations. So it's really wonderful to be back in person. There's been an awful lot that's changed in the world and this meeting is no different than many of the other things that have changed. We're now doing a lot of, we did a lot of things virtually. Now we are finding our way back to try and come up with the most effective mix of virtual and in person activities and that has been a continuing work in progress ever since the pandemic. You've probably seen various announcements about us moving our executive roundtables virtual permanently about our prerecorded project briefings that we're going to release every two or three months throughout the year and various other things and you will be hearing more of these kinds of things as we go forward. I hope that when we gather in December I'll be able to update you in more detail on those plans but until then please bear with us. We're all learning as we go along. Now I'd like to extend a special welcome to a number of groups who are with us. I think we have a very small number of international colleagues here and my hat is off to them. International in person travel right now is still challenging. I believe we have with us a couple of the new ARL leadership fellows, some clear fellows or former clear fellows, some ARL leadership and career development program fellows and in particular some leading fellows. 
LEADING is the LIS Education and Data Science Integrated Network Group, and you will meet some of those folks when they give a series of lightning presentations on their work just before our reception. I'd also like to welcome a new member, Tind, which is a spin-off from CERN that is doing some very interesting things, and I'm delighted that they have joined us here today. I will just say, in terms of safety protocols and things like that, nobody knows quite where we are with this whole situation at this point. That's probably the most honest thing to say. There is no indoor mask mandate. There is a mask recommendation. Your guess is as good as mine. I would say the following things. People are in different places in terms of what they're comfortable with. Do what you're comfortable with. Recognize that other people may be in different places than you are, and give everybody a little room. There are more people here than you think, but these rooms are deliberately set to be spacious and somewhat cavernous, so there should be plenty of room for people to spread out. But I don't really know what else to say beyond that. You will note that at our various events with food, we will make provisions at all of them if you want to take things to eat and take them somewhere else rather than be seated tightly with a whole lot of people. That will be doable. With that, I just want to say I'm really glad you're here. I think this is going to be a wonderful meeting. We followed the same practices as we did in December of 2021 of really deparallelizing and loosening up the meeting to allow lots of time for networking and for people to greet and chat with colleagues they haven't caught up with. We're going to have a very nice reception tonight, and I urge you all to join us for that. With that, I'm going to sit down, introduce our panel, make a few framing remarks, and then mostly you're going to be hearing from our panel and not very much from me.
So, over to the panel. Let me make some introductions, going from the far end of the table back towards me. I'm delighted to welcome Kent Wada from UCLA, where he is the Chief Privacy Officer. Next to Kent is Cheryl Washington from the University of California at Davis. She is the Chief Information Security Officer, or CISO, there. You may remember Cheryl and Kent. They are very graciously reprising roles in a session that we did a year ago at a virtual-only meeting, and I think you will find their insights really, really interesting, and I'm just thrilled that they're able to be with us in person so that you have an opportunity to chat with them informally as the day goes on. We're also joined by Lisa Hinchliffe from the University of Illinois, Urbana-Champaign. Lisa is with the library there and I think is pretty well known to this audience for her writing on both privacy and broader issues around scholarly communication and scholarly publishing, which certainly is going to figure into this discussion in some complicated ways. So here's what I'm hoping we can get at a little bit in this session. Privacy has been a sort of a central value of libraries for a very, very long time, and they have been fierce defenders of the rights of their patrons to read privately and engage with the literature privately. At the same time, over the last, I don't know, 15 years, maybe a little longer, there has been a rising emphasis on privacy and security issues at a really structural level within our institutions, and one of the, you know, sort of key measures of that has been the increasingly common creation of an explicit chief information security officer position and, a little more recently, chief privacy officer positions. Those positions have institution-wide remits and interact in very interesting ways with lots of different activities and policymaking throughout the institution.
How do they connect up with the kinds of privacy issues that are emerging for libraries, or perhaps it would be more accurate to say, re-emerging for libraries, as we move into an increasingly digital environment and the sort of old ways of doing things don't work anymore, where all of a sudden privacy is tied up not just with what you do at your institution but with a whole mass of third-party players? What can libraries learn from chief privacy and chief information security officers and their perspective? How can the activities of libraries and the activities at an institutional level be most effectively coordinated? How can they mutually support each other? Where are the friction points? That's where I hope we can go a little bit during the next hour. I've got a whole bunch of leading questions. They are short questions that will have long answers in many cases, and we will open it up for your questions as the program proceeds as well. So I would invite you to be thinking about what kinds of questions you would pose for this wonderful panel we've got here. And I think maybe to get us started, I'm going to ask Cheryl and Kent to say just a few words about what a chief information security officer and a chief privacy officer are, what they do, where they report in their organizations, because I know that those roles may be a little less familiar to you than some of the other roles, such as Lisa's. Do you want to start us off perhaps, Cheryl? I was going to be gracious and defer to my colleague. That didn't work. That's a really great question. Let me set a little context. Kent and I have known each other for God knows how long, which means a very, very long time. And interestingly enough, once upon a time he was in my world, and once upon a time I was in his world.
So to set it a little bit differently, or say it a little bit differently, early in my training and development as a security officer, I spent a number of years studying and becoming, I hope, an effective privacy officer. And once upon a time I actually held the chief privacy officer title, along with the chief security officer. It was interesting that the organization I worked for decided to combine the two titles. It was truly a trailblazing move; I've never heard of any organization doing it before. But to answer your question more directly, as the university's chief information security officer, my responsibilities are defined in a lot of frameworks. I protect confidentiality, meaning that in terms of personal information, my job is to identify controls, tactics, techniques, practices, and even help craft policies to protect individuals' personal information. I also protect the availability of information. So as you can imagine, most of our organizations are large, diverse, complex. We have lots of information under our stewardship. And my job also is to make sure that data is available as needed or when needed. And the third component or dimension of my role is to protect the integrity of information. As you probably are hearing in the ether, most of our organizations are under a lot of attacks, cyber attacks in particular. So the other component of my job is to make sure that the organization, again, has the right tactics, techniques, controls, practices, and policy to protect the integrity of our information. Now, I don't do it alone. Most of our campuses in the University of California have privacy officers, chief privacy officers, privacy officials; they go by a number of different titles. There's no way, as an information security officer, I can do the work that I do by myself. And so I more often than not partner with the privacy officer, hence my earlier comments about my training. I have a high degree of respect for what privacy officers have to do.
It's a delicate dance in many ways. And unlike, say, my frontier, the security space, there aren't a lot of directives or frameworks or practices that they can follow with the same degree of ease that I can as a security officer. Nevertheless, most of the privacy officers I've met, including my colleague here, Kent, are trailblazers in their own right. They do some incredible things. But I'd like to end by saying that we are partners more than anything else. Kent? Thanks, Cheryl. And indeed, we are partners in so many ways. Let me start, actually, in terms of contrast. You said that your job as a security officer is to protect the confidentiality of personal information, and what you described was confidentiality, availability, and integrity. Say it the right way around: C.I.A. The contrast is that you are actually responsible for protecting the confidentiality of all information, not just information about people, whereas privacy really, if you think about it, is about people. So I am concerned only about data about people. There are lots of other kinds of data that are important and confidential, for example, where we have controlled substances on our campuses, but that's not my purview typically. It's really only data about people. So in some sense, you might think that privacy is a smaller space or a subspace of what the security officers deal with, but it's really, I think, more of a Venn diagram where we each have our own jobs. Privacy, on the one hand, does work hand-in-hand with security in terms of what I call data protection. So making sure that, well, in a soundbite, no more breaches; no one likes to hear about another breach. And both the privacy side and the security side work together to do that. There is another part of privacy, though, that I think we are recognizing in the larger world in society, and that's really thinking about not so much unauthorized access to data, but around surveillance.
Big brother, the monitoring of behavior, the kinds of things that we are always worried about by some of the big tech companies, by government, by whomever it is, including often our own community worried about our administration actually looking at their communications. And so I'm always also thinking about that kind of privacy. What does the surveillance mean? What does it mean when we're aggregating data, profiling people, watching their behavior and learning from them, for legitimate reasons? But again, there's a space where the law tells us what we can and can't do. Those are the easy cases, because that's just, you know, we can or can't do something. But in the middle is this gigantic area of ambiguity where it's entirely up to us, you know, at our discretion, whether we choose to do something or not. And that's particularly when we're, of course, when we're talking about information about people that I get involved. And it's really in some ways a values-based issue. Some people like to talk about data ethics. That's become a very popular term these days. But what's the right thing to do? Should we do something? We know we can do it, both technologically and by law. But in both cases, I would say those are the floors. They're not the ceilings. They tell us, again, what we must do, not whether we should do something or not. Thanks. That really puts it in excellent perspective, I think. And it's very helpful to focus on people, rather than the much broader information landscape. And I think that's probably something that resonates with the library perspective, which is also strongly about people. Maybe we could start a little bit with what you identified as the sort of surveillance class of threats, as opposed to direct collection by the institution of information about what people are doing, and whether they should do it and what uses it should be put to.
Certainly there's pervasive interaction with various services and content that potentially comes with a surveillance aspect. How do we best deal with that? I invite all three of our panelists to reflect on that a little bit. And I know, Lisa, at least, you'll probably emphasize the library case. So maybe we could start with you as a little bit of a frame for that. Sure. Thanks, Cliff. I think we're very familiar in the library setting with the fact that most of our information provision in a digital era is now provided not by the library directly. At the beginning of my career, in the print era, and even for a little while after that, with locally loaded databases, all the information provision and access, if you will, was sort of contained within the walls of the library. We occasionally let people leave with something, but mostly it was contained within the walls of the library. And so we had a fair bit of understanding of what data was collected about people's use of information or resources. We often struggled with not being able to collect enough, and we would put up signs saying, please don't reshelve the periodicals, right? We were trying to figure out if anyone was using these things. And the best we had was, is it on the cart? So even then we felt a certain kind of lack of data in that print environment. So fast forward: essentially at this point, very little of information access is place-based. And certainly we saw during the pandemic that very little was place-based in many of our institutions. And even our place-based information, the indexes, the OPAC, as we would say, is not place-based. So the searching for even those things that we hold physically happens on third-party platforms. So we are really, in most cases, quite unaware of the information architectures or where that data might be flowing. We are certainly not the ones managing it or controlling it.
And I think one of the things that we may have lost track of as this sort of happened over time, you know, we held fast to that notion that the circulation record would be deleted when the person returned the book, and sort of didn't notice that when there was no checking in and out, that didn't mean there was no tracking. And so at this point I think we have to assume that any time you're on a third-party platform you're being tracked, and not just by that third party but also by all the additional parties that it uses in order to provide the information that you want. One thing I like to sort of say, too, is we actually want the back button to work on the browser. And so at its most fundamental level we need a certain kind of tracking, and that's just quite different than the print era. So we have any number of concerns about where that data might be flowing. Questions, too: if we are correct that information use can create a risk when other people know that you are using that information, then the user should also know where their information is flowing. And one of the things that will be said, and this is an interesting contention, right, librarians will say, well, I'm concerned that if somebody knows that they're being tracked it changes their behavior. People who track are often like, yes, that is the point, to change their behavior. So there's also this internal little tension around what this might do. So we worry about where this data is going, and it includes library websites as well; most of us are using Google Analytics or some other product. So I really kind of take heart from a training I went to six, eight years ago now, where somebody reminded librarians that our privacy policies need to be factual, not aspirational. And too many times we have policy language that sort of states our aspiration. I can point to any number of library policies that will say things like, your library use is private, no one will know what you are using but you. That is not true.
And so we have to think about: if that's not true, then who does know? And this will, I'll turn this over to my colleagues here. The other question is whether other people on campus also know. So historically, the library circulation record, kept on a piece of paper filed in those cards, was physically located in one place, and that was in the library. And so this will be an interesting thing for us to discuss too, of how librarians conceptualize the notion that they're part of a larger institution, and that a larger institution might have some thoughts about what access it might like to library data. Well, we will forge ahead. With the soundtrack. Yep, exactly. Getting it going here. I wonder if perhaps, Kent, you have a reaction to that or further thoughts on it? I'd like to think that were this two years ago, I would be handling it better, but I'm still tickled pink I'm here at all. So this is actually throwing me off a little bit. Yes, this actually brings to mind several thoughts. You know, the whole third party thing is fascinating. I think UCLA and many of our sister campuses and other higher education institutions have put together fairly mature programs for doing what we're calling third party risk management. And really, in this context, it's typically looking at security reviews of our third parties to make sure that they're actually going to be able to protect our data the way we expect them to. For UCLA and other UC campuses, we also include privacy reviews and accessibility reviews, as well as security. And if you think about it, the reason we do this isn't just to ensure, to the best of our ability, that the third party is actually going to do what they say they can do to protect our data. It's also about how they're going to use our data. You know, we have an expectation about how our data will be used. We don't sell data to third parties, for example, about our students. At least I hope we don't.
You know, we have a whole bunch of things that we simply don't do. Or that we do do. And that's not going to be true of our partners necessarily, because if they're faced with a question, should I do this or not, they may come up with a different answer than we would. And it's because, fundamentally, the mission of their company or their organization is not the same as ours. And so, you know, we're answering a question like should we in the context of our mission. And if they're different, we may come to different answers. And that's why contractual language is so important. Because that's where we actually try and agree, this is how we're going to handle things, and align our expectations, so that to the extent possible I can trust that the third party will make the same decision I would make, or UCLA would make, in the same situation. That said, you know, contractual language is one thing. What actually happens is much, much trickier: for example, the use of our data to train algorithms. You know, we are now suddenly part of an interesting experiment where we are helping a company to develop, you know, their algorithm to become better. In the end, it helps us, because of course the service we're paying for becomes better as well. But at what point does that turn into, say, a new line of business where, you know, the taxpayer dollars of students who come to UCLA have been part of that algorithm training? You know, you can spin off all sorts of questions like that that are really fascinating. But I think almost the most interesting case is not the third parties. It's, as Lisa was saying, about us, because we are now starting to really realize just what an asset we are sitting on. You know, it's a goldmine of data, and we all want to use it for very important reasons, whether it's looking into diversity, equity and inclusion, whether it's looking into how we, you know, just administer the institution, optimize and become more efficient, or how we do better academically.
And all of these things fall under, typically, the title of institutional research, which implies a lot of analytics. You know, we're using some very sophisticated methods to actually analyze ourselves, and none of it goes through an IRB. There are no controls, really. Again, this sits at the heart of the discretionary space, the should we. And different people have different ideas about what's reasonable. I want to also pick up on both Kent and Lisa's comments. One of the things that strikes me as you were talking is taking myself back to my own space. As Kent pointed out, we have a pretty robust third party or vendor risk assessment program. But as you were talking, it dawned on me that the first evolution or first iteration of that program is sort of where you are right now. And it's not enough. Simply assessing a third party doesn't really answer the questions that, you know, matter particularly to all of us. Question number one: who owns the data, and what happens to it? That's a really fundamental question. The risk assessment doesn't really answer that question. What drives us towards an answer is being real clear in our contracts about what we expect. But sometimes that too is not enough. And one of the things that strikes me is, I've been working with our own campus on some equally big challenges. And that is, we went through the process of assessing the vendor and, I think, wrote a fabulous contract covering the problems in terms of who owned the data, where the data resided, etc. And it occurred to me that the missing element, and this is a hard one, is we have to monitor our vendors. We have to be not only clear but also clear with intent that if you violate our contracts, violate our rules, we are going to do something about that. And this is where it gets hard, because I think in your space there are not a lot of players; it's as if your hands are tied. Five players at the table, four do not want to behave. What do you do?
And I think that, as a body, we have to be real clear with all of our service providers that we will not take this for very long, because we can't afford to. As Kent pointed out, we hold the keys to the kingdom. They want our data, yet they're making us feel like we're doing them a favor or they're doing us a favor. And that's not the case at all. And so as we go through, and my team actually participates in contract negotiations, we're trying to push that message forward: you're not here doing us a favor, you're a service provider. Your job is to do what we need you to do, and it's in the contract, and if you violate the contract, you'll be talking to our attorneys next. I mean, we're trying to be a little bit more forceful about this, because I think it's time for us to pivot. Maybe it's long overdue to start making that pivot, but it starts, again, with the assessment, contracting, and then monitoring, and then being prepared to take whatever steps are necessary as the next step. It's really interesting that you focus on the contracts there. Some folks know of my project that I'm currently in the midst of, called Licensing Privacy, which is a look at how we can use our library contracts to leverage that relationship for better data protections and the like. Released yesterday were two white papers from that project. One reports on interviews that Danielle Cooper, who's here in the front row, from Ithaka S+R, did with a number of library leaders about where privacy is in the priority stack during negotiations, and so how high we will push that up in the priority stack against some of the other priorities we have, like access to journals. And as you alluded to, we usually have one provider for those things, and we are often not doing an RFP or any other processes that we have available.
The other that was released was the rubric for doing an assessment of your existing contracts and the privacy policies of those vendors, so you could at least go through and say, like, we developed a concept of minimal viable privacy to say, okay, this is really low stakes, but are you falling below that? Because then we've got some real concerns. But at the last December meeting, the conversation that we did have at CNI as well was, okay, enforcement, especially when there's not a competitive marketplace. There's not a "well, we won't get it from you, we'll get it from someone else." How we leverage our contract when we have that kind of dependency is a real issue. So I appreciate you kind of bringing all these things forward, but ultimately it will also rely on people, individually or in consortium, being willing to put these kinds of data management, privacy, and ownership issues on the table and negotiate for them in the way that we negotiate for other things, such as preservation, access for people with disabilities, all the other values that we as a library profession have that we insist on. So hopefully giving people some language to work with will help, but that's really, like, step one of what is a much longer journey if we're going to use our contractual tools. And just a little footnote: I will say this is going to be even more difficult when we are paying for publishing services rather than reading access. So as we move to those pure publishing transformative agreements, I have not seen one yet that has a privacy clause when we're paying for publishing services. So let me ask a very specific question and also a somewhat more general question. I'm curious whether either Cheryl or Kent get involved at your institutions in the library's negotiation of big publisher agreements and the language in those contracts? I haven't, but our librarian is in the room somewhere. Mackenzie?
There you are. Maybe she could speak to that at some point. But the way that the process unfolds at UC Davis is my team starts the process: we conduct the assessments, looking for these hot-button issues, where is the data, who owns the data, what will be done with the data, and we try to illuminate as much as we can in a report, to sit down with Mackenzie and others to talk about what we have found and determine what we want to put in the contracts. UC has standard terms and conditions; we have the option to amend or augment those Ts and Cs to meet the needs of our requester or our sponsor, which would be our librarian in this case. Then my legal fellow, I have one on my team, sits down with our contracting officer, and they go through the negotiations. So to a degree, we are part and parcel of the process, as long as the requester allows us to participate. The key here, and this is, I think, the tough part, is that there are not a lot of providers to go around, so it's not as if we can say no to one because there are five or more knocking on the door. And so I think that that's going to be the tough haul, and that's where organizations like this sort of continue to push the envelope and bring the service providers to the table and try to force the change. But what we're trying to do, at least on my end, is to identify as many of these pain points as we possibly can, to see what our librarians and others are willing to accept or not. I might just add a little wrinkle to this. Over the last year or so, working with our bookstore, who deal with textbooks, we've come across the fact that of course now the textbooks are typically electronic as opposed to being physical books, and they're no longer just electronic texts, but they come with electronic quizzes and extra exercise sets and all these other things where students interact directly. They have their own accounts, and suddenly it's like, well, wait a second, where's the agreement? We didn't review an agreement. You didn't need an agreement just for the textbook,
because that's typically about pricing controls and that kind of stuff. And we have in some cases found there was no agreement to look at, because they've never had to put one in place. So suddenly we're faced with a situation where there is no agreement to even review, let alone to try and amend. This is, I think, sort of the state of the art right now. We're going to have to work with publishing companies on this. Some of them have privacy policies, for example, but it's not directly negotiated; the negotiation is about price, it's not about anything else. The other thing that I might add here, again a little twist, is that, for example, the University of California has a small business initiative in purchasing, and we have many such initiatives, for really good reasons, to try and tilt the balance in favor of small businesses where we can. Small businesses, though, I think as Cheryl probably has encountered as well, are probably the least able to meet our needs in terms of data protection and the many other things we can require. Well, one way I could say this is that we have a high bar of expectations; the other is we are a pretty challenging partner to work with. Small businesses often don't have a chief information security officer, or a dedicated privacy officer, or any of these other things that we're looking for. And so on the one hand we want to work with these companies, and on the other, it's like, from a risk point of view, it may actually be much more difficult. So there are no answers to this; it's just, we're faced with having to work through this every day. We have to get creative with each of these. To your point, they often do not have a CISO or chief information security officer, or even a security plan, or in some cases we have found we have to have a conversation about what is security. And I don't mean that to be facetious. I know how this feels, by the way, right? But nevertheless, we do it.
We want to encourage small businesses to develop, and so part of my team does just that. It's a bit more bandwidth on our part, but it's worth the time and effort to bring some of these smaller businesses to a level where they can work successfully with the University of California. There's no easy answer to any of this. I think that's probably a good concluding comment; we could just be done now. I wonder if you could comment a little bit, in dealing with the vendors, on the mix between contractual language and actual audit, perhaps, of what the vendor is doing, to ensure that they are following contractual terms around, say, the amassing of data or purging of data, that kind of thing. This is the complicated frontier, to be perfectly frank with you. We've written any number of contracts; for a handful, we've gone back to ask the question again of our vendors, are you doing X, Y, or Z, and found out they're not, they're not meeting our contractual needs. At that point it's out of my hands and into the hands of the requesting department and our legal counsel. What I said before is a description of an evolution of our program, and that is, we can't rely on just one assessment and the first contract to sort of say what the world looks like today. And we are reassessing some of our vendors, particularly those that are dealing with high-value assets, or what we would call high-value assets, and trying to see, are they indeed meeting our needs? Did they indeed do what was spelled out in the contract? And as I said, in some cases we're finding it's not the case. So it's a real challenge, and a growth of our program; it's a necessary growth. One of the biggest challenges is just finding the resources to keep up with the pace. I think what we're hearing, and what we all can agree to, is that so much of what we do as an institution is being done for us outside of the institution. We have a lot of data sitting elsewhere with a lot of different service providers. I can't even begin to tell you how many service requests I receive per week for
sending data and engaging with a third party. The numbers are staggering. So it's a reality that we have to face, and again, I have to embrace this idea of expanding this risk assessment program to include that ongoing monitoring, and again, when we discover something, we say something.

Do you want to add anything on that?

Thankfully, I don't actually have to get into that compliance and auditing aspect. But an important piece of this, if we rewind a little bit to the contractual piece: one of the other differences between privacy and security, and Cheryl, I don't know, you probably won't argue, I think you'll agree with me on the privacy piece, is that from a security point of view what you're really trying to do is protect the institution by protecting our assets. People can't hack into them, people don't go and change data or steal data, whatever it is; we're trying to protect the institution. From a privacy viewpoint, which again means we're talking only about data about people, it is also that: no more breaches, we don't want to let our data about people out there in the wild. But there's another aspect as well, and that's the point of view of the people whose data it actually is. Thinking about a typical contractual negotiation, we say, here's what we at UCLA expect you, company X, to do to protect our data about, say, students, data that we're going to hand over to you so you can provide a service to us, whatever that is. And we can go back and forth about exactly what that means. Then there's a piece of the contract that always says, okay, we've agreed on how you're going to protect our data, and here's the language about what happens if something goes wrong anyway, a breach: here's how we agree we're going to do things, you're responsible for this, we're responsible for this, we have to coordinate, and we both have cyber insurance to pay for things, so that when something goes wrong, both sides have insurance. We agree, we sign the contract, and we go, okay, we have reduced the risk to something that's acceptable. But if you think about it from a student's perspective, it's their data. If there's a breach and their data is out there now, on the dark web, out in the wild, there is nothing they can do about it. Once it's out there, it's out there; there's no cure, you can't undo it, and credit monitoring won't help. They will look at this risk equation and go, what do you mean it's okay? Just because you can pay, just because the money part between the university and the third party is okay, doesn't make it okay for me, whose data is now out there in the case of a breach. So there's a privacy piece which is really thinking about the risk to the individual as opposed to the risk to the institution. It's both; I mean, I'm paid by UCLA, I have to protect the institution as well. But sometimes I also have to think about, I wouldn't go so far as to say being the voice of the people who aren't at the table, because I can't possibly represent them all, but certainly raising the questions: if I were involved in this, how would I like to be treated, in the case of a breach or in this negotiation?

Actually, I wouldn't disagree with you at all; in fact, I strongly support your comments. What I would probably keep in mind is that what we're talking about is what happens after a decision is made: I want to use provider X, or I want to send my data here. What I would suggest, with respect to what you just described, is to take it a few steps back and ask, should I do this, whatever this is? Should I take this repository of student data and do this at all, regardless of the provider? I think that is a more fundamental question that I hope that we
can get into when we're dealing with departments and individuals who have this information under their stewardship and are making some decision on what to do with it, whether that's sharing it with a provider or using it in some fashion.

I'm also one of the people who has probably become well known, and is thus now on a bunch of committees, for flagging these kinds of things. But as an employee, one also might think about one's own privacy. We definitely have a responsibility of care towards students in particular, I think, but every year I'm mandated to take an incredible amount of online training: Title IX, sexual harassment reporting, minors, there's a lot. And every time, I'm the person who's actually reading through the privacy policy and gets to the point where it says, if you have a question you can contact person X. What's interesting, in the juxtaposition I'm hearing here, is something I've encountered a number of times, where they say, don't worry, the campus attorney has reviewed this and it's fine. Like, you are not my attorney. I noticed, for example, in the training related to mandatory reporting, that if somebody discloses to certain people who work at the university that they have been sexually abused, you have mandatory reporting requirements, and as a state employee you have liability if you don't follow them. Well, the way the training is set up, it's an attempt to be interactive, so it asks you questions like, please describe any time where you've seen this kind of thing happen. And I'm like, okay, first of all, if I have seen it, I have reported it. But you're asking me to type something in that actually creates civil liability for me, and when I can see that Google Analytics is running in the background, I don't have a whole lot of confidence that this is something I want to type in. So that's fine for me, who just types three periods and moves on, but I really worry about people who have not spent an incredible number of hours of their life thinking about this kind of thing, and the degree to which we're exposing them to risk, not just by the system itself but by what it prompts, when they don't have a thought process to ask, is this something that's good for me to type in or disclose, maybe about something that happened to themselves? Which is certainly something we're seeing on websites like Chegg, in the way students are sharing papers they've written, papers in which they have discussed being abused or harassed, not realizing that this then goes into an algorithm, into search, all these sorts of things, and in some cases is then breached with their name on it. So I wonder too about our educative role, particularly relative to students and employees.

I think it plays a dominant role. One of the topics we're not talking about is the management of data, and maybe you would agree with me that in that same scope we should also talk about privacy by design and security by design. What we sort of started off talking about were the vendor relationships that we have to build and sustain. Again, I would suggest that we take several steps backwards and ask ourselves, how best should we manage data? How best should we design websites and systems so that we're not forcing people to disclose more information than is absolutely necessary? That's a different discussion, and in my book a much more important discussion, than which service provider you select this week.

Yeah, I think that's really basic. I would probably add, we had an interesting discussion not very long ago within the group of UC privacy officers around training about privacy, and what we're really focusing on, because training tends to imply you're going to learn some specific skill or ability or understanding of something, versus more general awareness of privacy. And I think we came to the agreement that we really need just to raise the overall level of privacy
literacy, even if it's by a little bit, so people have an understanding of what they really need to think about, even just, when should I ask someone about something? This comes up in so many different guises. Because again, as Cheryl and Lisa are saying, there's a huge piece that's just thinking before you enter something and hit return: what am I disclosing to people? Do I really want to disclose this? But then there's the other half. Once it's disclosed, how can it be used, even if it's legitimately used, not a breach kind of situation, and how can people take advantage of the data you've now provided? A lot of our laws about privacy, and we have a lot of them, we tend to think in this country that we don't have a lot of privacy law, unlike the European Union for example, but we do, we have a ton of privacy laws, particularly right now at the state level, tend to be very specific around certain kinds of data. And so if you aren't careful about how data is being used, then people will use it. Things like using our data to train algorithms are now becoming a really big issue. What does that mean, and how can you take that back once it's used to train an algorithm, say, to recognize your voice? How can you undo that? Is there any way to undo that? Can you actually get rid of the original data set after the algorithm has been trained? There are just such a huge number of questions facing us today, both about unauthorized use, the disclosure piece, where laws typically are pretty good at saying you can't disclose it, or it's illegal if you do, and then a much more squishy piece around appropriate use: if you have authority to access data, what can you actually do with it? That's a really interesting question.

And I might add to that: what recourse do you have? At any level here, we can say, here's what's acceptable, but if something is done that's not acceptable, the data is out there, and we also might have a notion that you're owed something for that harm. But how do we assess that harm? Whose liability is it: the institution, the third party? Where does that fall?

So I think, as you said before, it's complicated, fundamentally. Think about so many of our laws, like our security breach notification laws, where if we suffer a breach of personal information, we have to let you know, the theory being, first of all, that you should know so that you can protect yourself. If you're at risk because of a Social Security number breach, you can get identity theft protection, you can put a freeze on your credit report, for example, or get credit monitoring. But increasingly our laws, and in California, in fact, I believe as of January this year they've added genetic data, say that if we breach genetic data, we have to notify the people whose genetic data has been breached. There's nothing you can do about it; it's out there, and there's no credit monitoring for genetic data. So why do we tell you? What's the theory now? You can't prevent identity theft, you can't protect yourself, so why do we tell you? Well, for transparency, for one thing, because if we don't tell you and you find out, you will probably go, wait a second, what's up with that? There's a trust issue, which I think is particularly important for public institutions, but really for everyone. And I think we are heading in this direction where it's not just about financial or tangible consequences; we're really talking about intangible privacy harms, back to the values thing. I personally would like to know if my transcript was breached. There's nothing I can do about it, and I don't know if anyone can harm me by looking at my transcript from however many decades ago now, but I would certainly like to know about it, even if there's no law that says I have to be told.

Well, you've already navigated to my next question, which really was about
privacy education and support of the members of our community as people, not necessarily as employees, but just as people, and what they need to know. I want to open this up for questions from the audience, but maybe while they're getting ready to do that, you could make a final comment on this education perspective: who's the best group on campus to be doing it, or is there really no best group and it should be everywhere? How do we actually get this education out, particularly and most importantly to the students on our campuses?

I can start, but I'm going to actually defer a lot of that question to Kit. As I think about information security, and I dare say privacy protection as well, it has seemed to me over the years that one of the missing elements was the human element. If you think about the other triad that encompasses information security, people, process, and technology, I'll be honest, we haven't done nearly enough on the people front. A 30 or 45 minute PowerPoint presentation on information security really doesn't cut it, in my book. So as part of the growth of the program that I manage for UC Davis, I've added a whole new component called outreach, and we're looking at it from multiple dimensions. Outreach to students: last year a number of our students were the victims of scams, it was happening at a really fast clip, and we were trying to get ahead of it. It seemed to us that we needed to talk about what was going on, but unfortunately our voices are a little too old, so we put together a group of students to talk to students about these issues that impact students. We're seeing at the federal level a number of directives impacting researchers, so another component of the program is to talk to researchers about cyber in particular; privacy eventually will come into the fold as well. Long story short, we have not done enough in my industry to reach out to individuals, to educate them, to train them, to help them. So as part of our program we have, again, this brand new component that we're just launching, where we're going to get out and spend more time talking to people. And again, it's going to be more than just the bread and butter once-a-year presentation, and here's why we're doing it this way: we need to learn to sit down and talk about which vendor we're going to select; I need to understand how you're going to use this information, how you're going to use the data; how can we craft a website, going back to our scenario, where we're not asking for more than is necessary in order for you to accomplish your goal? I want more of that kind of engagement, and that's sort of what we're trying to do in terms of outreach and training.

I think you took the words right out of my mouth. I would say there is a need for a general elevation of knowledge, but really I think that works best when it's contextualized in the community of practice that you are in. So I think, for example, about students and how they react when they learn how valuable their data is as an asset; or about employees in this COVID era, when our campuses have collected just enormous amounts of data for campus pandemic response, and how that's being used, data that was previously typically only collected in your health care context, between you and your primary care physician. These are ways in which privacy is now expressed. AI is probably the other one. You're thinking about privacy, but what you're really saying is, in this context, this is why it's important. And people tend to respond better, I think, because that's their area of expertise, if we learn about what they're doing, as opposed to forcing them to learn about what we want them to know. So for the past couple of years now I've been serving on our campus data privacy advisory committee, which is more of a governance and policy committee, but through that I've also been partnering with our chief privacy officer on building out, in particular, an
outreach and education program. Our primary events so far have been around Data Privacy Day each January, trying to bring in some students as trainers, actually, on laptop check-ups and that sort of thing. To say that there is an infinite amount of work to be done in this area is really understating it, I think.

If I can just turn to the library audience for a second, one of the things I think we're grappling with is that our historic notion was, we will protect your privacy, we will take that responsibility as the library: we'll delete your circulation records, we'll do this and this, and so we were the ones taking the action. That is not an approach that works in this distributed, third-party kind of environment. We do have strong education programs in the library, and this is where my information literacy background is serving me well. I was saying, okay, if we don't do it for you, just like way back when we went and got the book for you out of the stacks, and now we can't protect your privacy for you, can we educate and help people make the choices that they want to make about whether their data will be disclosed to a third party or not? But it's a real switch for us, from taking the responsibility for protecting privacy to helping people see the environment that they are in, so that they can make their own decisions about it. Which even comes back to, I'm going to say, some library employee education. With Kyle Jones at Indiana University–Purdue University Indianapolis, we've done some research on practicing librarians' knowledge around user data and data ethics, to inform a project called Prioritizing Privacy, which is helping librarians understand learning analytics and how library analytics might play into that. What we found is that librarians themselves are not as fully educated around this data landscape, how data is flowing, and the privacy concerns, and the way I would interpret the results we saw is that as a result they revert to that mindset of, we'll just shut it all down and make sure it doesn't happen, which again is out of alignment with the realities of today's web-based, cloud-based, distributed, algorithmically driven environment. So I think as a profession that's our internal challenge: this switch in mindset, but then also really understanding what's happening with the tools we ourselves are using on a daily basis when we're working with patrons.

Thank you. I'd like to open this up now for questions for the panel from folks here. There are mics in both aisles, please.

Hi, Todd Carpenter with NISO, a comment and a question. The first is related to a project that we have been working on at NISO, Seamless Access. We've just released draft contract language for attribute release, for protecting privacy in attribute release when you're using SAML for authentication purposes. That is open for public comment through the end of the month, so you have four or five more days; we'd really appreciate additional comments, there have only been a handful. The question I have for you is, how do we operationalize some of the things that you've been talking about? A lot of these issues are very specific to their context, like attribute release for subscription services. Are there other areas where you could see model contract language or community best practice helping to operationalize some of these privacy goals?

I probably have a quick comment, and again I'll defer to Kent. One of the things that I'd love to see, to your point of operationalizing the things that we've discussed, is one, for the community to work as a community, first and foremost, but two, to take some of these themes and turn them into tools. Case in point: EDUCAUSE created the HECVAT, which is one of the primary tools that many of us use. Why did that come to be? Because many of us were moving our repositories into the cloud, and the community came together and said, is there a way to create a common
instrument that many of us can at least leverage? The answer was yes. What we haven't done much of, listening to a lot of this conversation, is incorporate privacy as an element or a theme or a body of questions in that same instrument. It should be done, and so maybe that's one step we could take. I believe you're still part of the chief privacy officers' group; perhaps that group can combine its efforts with the security officers' group and amend this tool so that we can incorporate some of these themes and make sure that we're all asking very similar questions. Why is that important? Because most of our vendors are getting the same instrument, and at some point they're going to get the message. That would be one idea that floated into my head as soon as you mentioned the attribute issue.

It's great that you mentioned the HECVAT in particular. Kent, I'm not sure if you were there when I met with the chief privacy officers last month. Brian invited me to the chief privacy officers' group of EDUCAUSE to talk about licensing privacy, for this very reason, this question of the discussion that was already bubbling up around the privacy aspects missing from the HECVAT: do we create a second one, do we integrate it? They also wanted to hear about what the librarians were interested in seeing, so I was really excited to have that, because I think vendors don't need a library form and a campus form. The more we can partner across our professions, or however we would say that, the more powerful it would be. Brian encouraged me to propose something for the EDUCAUSE privacy conference in Baltimore in May. I still had to compete for it, but I did get in, so there will be a session at that EDUCAUSE security and privacy conference in May around these licensing privacy issues. And as a member of the Seamless Access group that developed the model language: please do comment.

If I can just speak to Todd's comment there as well, I guess I'll just add there has been work actually underway to add a privacy dimension to the HECVAT, but it's not quite as straightforward as we originally might have thought. Nevertheless, this morning over breakfast I had a chance to glance through the two white papers; they were very interesting, and it did seem to me you had a long list of resources at the end, many of which were various frameworks. We're not lacking for work in this space; it's just going to take some time to converge and become really operational. The one thing I might add, which I think would be very helpful in anything that you do, is to think about the transparency piece: describing to your community, whoever your community of users are, what is going on with their data, and just being upfront about it. Even if it's not what you want to be able to say, you should be able to tell them what the data is doing, where it's going, and who's taking advantage of it, if indeed that's the case, because again, you're trying to establish trust, and you can't do that unless you're disclosing what it is you're doing as a steward of their data.

Jen, I think you were next.

Yeah, thank you. First of all, I'm so excited that we're having this conversation, and the reason I'm excited about it is because it feels like the conversation that was happening in the academic technology space several years ago, when we were starting to license software-as-a-service tools and our learning management systems, and we were really asking the questions: who owns the data? What are we going to do with these student analytics? We know they can help students and others understand student learning behavior, yet there's a real concern there, and I'm just hearing echoes of those conversations from what feels like so many years ago. I just want to look to the question of where the natural alliances are in the library world, and I would like to raise my hand and say, in the
academic technology space, to shine a light on vendor practices and continue to raise this as a concern. I would say that especially in Cheryl's space, in the security realm, all of our resources and people are going to breaches that require notification, that are triggering, that require us to give notice, and there's very little space and time and energy for much else in the resource world. So it can't just be the campus privacy officer and the chief information security officer; it has to be a coalition. My question to you is, where else do you see the natural collaborators? I'm saying academic technology; we're licensing a lot of tools right now. I happen to be a CIO in the collaborative-tools space, where people are putting a whole bunch of information on whiteboards and collaborative tools, and I keep thinking to myself, oh god, where is this going, and is it identifiable, and how long are they keeping it? So where else are the natural collaborators, and how do we keep this in the forefront of our conversation?

I'm going to start, and then I'm going to ask Mackenzie, I'm sorry. Years and years and years ago, a long time ago, before I became what I'm told is a seasoned security professional, I was really focused on data, and I had said early on that we may be going off track by spending so much time talking about how to secure something and not nearly enough time talking about the data itself. There used to be lots of conversations regarding data management, and you may remember these discussions, people coming together to talk about the data itself. I wonder if this is a moment for Mackenzie to talk about the data management program that she has spearheaded at Davis, because I think that is representative of the answer to your question: the alliances, who should be brought to the table to talk about the data itself.

Sure, thank you so much, Cheryl, for calling on me to do that. I'm so impressed, and Cheryl, you've been such a wonderful partner in this. What she's talking about is something we call the Institutional Data Council at UC Davis, which is a forum where the library, the CISO, other people from our IT office, the chief privacy officer, and other people from academic technology all get together on a council that discusses these issues, with senate faculty and student representation. It's actually kind of a big committee, and I co-chair it with a senate faculty member, but it has provided a place for these conversations to happen. It can get ugly sometimes, but you really do get to hear everybody's perspective; you get to discuss the tradeoffs between transparency, and all of California's wonderful sunshine laws, and privacy expectations. I wouldn't say that it has revolutionized the campus, but it has provided us with a place to have those conversations, and I will give the library credit for really coming up with the idea that we needed a council, a standing committee charged by the provost to keep it going, so it wasn't a one-off. We actually got the idea from Berkeley; theirs just didn't last very long, and it needs to be brought back.

I'll just add, Jen, I don't know directly about alliances, thinking about things like the learning or educational technology space; as you say, they've been grappling with this for a long time now. It feels to me that a lot of the conversations we're having right now are operationally very complicated, in terms of doing what we want to do. To me, security is conceptually very easy to grasp: just secure it, don't let the wrong people into our systems. But it's exceptionally difficult to actually achieve that, if not impossible. In the same vein, a lot of what we're talking about right now are operational challenges, the fact that we don't have enough resources, that our contractual negotiations are not perfect, all these things that make it complicated, but it's conceptually pretty simple. I think there are spaces that are really
much more unknown, where we just don't know what we don't know yet. The research enterprise is one of those spaces, I think. It's not clear just how you apply privacy to the research enterprise, other than through the institutional review boards, who deal with human subjects research, and I think we're increasingly understanding that that's necessary but not necessarily sufficient. One other thing I might say is that our colleagues in the health sciences, when they think about data release for medical records, for patient records, are probably quite a bit more mature than we are on the campus side, in part because they have a much sharper, much more laser-like focus on what they do, what their mission is, and what the data types are. Nevertheless, I think there are some good lessons to be learned.

But the one comment I really do want to make is that privacy officers, like security officers, like auditors, attorneys, and risk management people, tend to be seen as the no people. Cheryl might be the exception; no, she is the exception that proves the rule. But we tend to be seen as the no people, or if not the no people, the slow people: we take our time to figure out what to tell you, we can't just say yes immediately, because we have to do the analysis, and that takes time. And legitimately so; we have to worry about the risk to the institution. But there is something to be said for thinking about whether we are optimizing for risk or optimizing for opportunity, and the two are not the same thing. We on our side of the world tend to see things through the lens of risk and reducing it, saying, well, is the benefit sufficient that we will take this risk? But in fact that's not the same question as asking, is the risk too high for us to be able to do this thing that's really important to us as a university, this research, this program, whatever it is? If you only ask the one question, it's sort of like having a single lawyer argue one case on both sides; that doesn't work, you can't give the best argument on both sides. I just would like to make sure people are always asking the question about opportunity as well, because it's always a balance. Privacy is not absolute, and neither is anything else. We are always trying to balance; well, we're not balancing laws, those we have to obey, but putting aside the stuff we have to do, we always have a bunch of obligations, whether they're administrative or legal or policy based or contract based, and a bunch of values, like privacy, that we have to weigh in any given situation.

Hi...

Oh, Cliff, this will have to be the last question, I'm afraid.

Thank you. I'm Eleni Castro from Boston University Libraries, and I was thinking another place of synergy or connection could be with data scientists or computer science. For example, at Harvard they're working on differential privacy: how can we do machine learning and AI in a way that combats the privacy issues that algorithms are creating now? So that's just another place where we can start working together with other folks. Thank you.

Oh, I love that comment so much. We just launched an ML project that's embedded in the security program. It's collecting a vast amount of information and using some really sophisticated algorithms to tell some kind of compelling story, but there is a missing element, and you're absolutely right: we're not looking at it at this stage through the lens of privacy, and we're going to have to. I'm not touching AI just yet, because I think we need to do that first, that is, look at privacy, and then AI. But ML is such an attractive feature for us because I just don't have enough people to do what machines can do. You're absolutely right, I'm going to have
to add a privacy layer on top of what we're doing. Thank you for your comment.

That would be privacy by design. One of the most interesting efforts I've seen on that kind of differential privacy by design is the most recent census, which used it very explicitly for the first time, and that's a really interesting story you might look up.

We are unfortunately at time. I only had about 15 more questions I wanted to ask these folks, and I suspect you have a few too. I hope that they'll all be around for at least some of the rest of the conference and you'll have an opportunity to chat with them, but right now, would you please join me in thanking this wonderful panel.