Hey, welcome back everybody. Jeff Frick here with theCUBE. We're in downtown San Francisco at the Twitter headquarters for the Data Privacy Day event. It's a full-day event with a lot of seminars and presentations, really talking about data privacy, something that's getting increasingly important every day, especially as we know RSA is coming up in a couple weeks, with a lot of talk about phishing and an increased surface area of attack, et cetera. So privacy is really important, and we're excited to have Lisa Ho, campus privacy officer at UC Berkeley. Welcome, Lisa.

Thank you. Glad to be here.

So what does the campus privacy officer do?

Well, really anything that has to do with privacy that comes across, so making sure that we're in compliance, or doing what I can to help the campus keep in compliance with privacy laws. But beyond that, also making sure that we stay aligned with our privacy values. And when I say that, privacy is really important. It's critical for creativity and for intellectual freedom. So at the university, we need to make sure we hold on to those when we're dealing with new ideas, new scenarios that come up. We have to balance privacy with all the other priorities and obligations we have.

Yeah, I don't know if Berkeley and Stanford get enough credit as really being two of the real big drivers of Silicon Valley. They attract a lot of smart people. They come, they learn, and then more importantly, they stay. So you've got a lot of cutting-edge innovation. You've got a ton of open-source technologies that have come out of Berkeley over the years, Spark, et cetera. So you guys are really at the leading edge. At the same time, you're an old, established academic institution. So what role do you have formally, as an academic institution of higher education, to help set some of these standards and norms as the world is changing around us so very, very quickly?
Yeah, well, as I say, the environment needs to be set for creativity and for allowing that intellectual freedom. So when we think about the university, the things that we do there are pretty much what we want to have in the community as a whole, in our culture and environment. Some of the things we think about: first, if you think about school, you think about grades, or you think about the letters of evaluation that you get. Learning, when you come down to it, is a personal endeavor; you're developing internally, and it's a transformation that's internal. So the kind of feedback you get, the kind of critical evaluation, those need to happen in an area where you have the privacy to not have a reputation to either live up to or live down. Those are things that you keep private. That's why school information and student data is something we've agreed, as a society, needs to stay private. That's one area: learning is personal, and that's why the university is so important in that discussion.

Secondly, as we talked about community, creativity requires time to develop, and it requires freedom for taking risks. So whether you're working on a book, or a piece of art, or if you're a scientist, a formula, an algorithm, a theory, those are things you need time to set aside and be in your own head, without the eyes of others, until you're ready, without judgment before it's ready for release. You want to have space for creativity so that you can move beyond the status quo and take those risks to go to the next space and beyond.

And lastly, I'd say, and this is not specific to the university, but what we hold dear, particularly at Berkeley, is the fundamental rights that we have, and privacy is one of those fundamental rights.
And as Ed Snowden said so famously, saying "I don't care about privacy because I have nothing to hide" is like saying "I don't care about freedom of speech because I have nothing to say." Just because you may not have something to say doesn't mean that you can take away the rights of someone else, and you may find that you need those rights at some point in your life. And no one has to justify why they need a fundamental right. So those essentials that come out in our university environment are applicable beyond just the learning space of the university, to the kind of society that we want to build. That's why the university is in a position to lead in these areas.

Because Berkeley's got a long history of activism, right, and this goes back decades and decades. Is privacy starting to get elevated to the level where you're going to see more kind of active, vocal points of view and statements, I don't want to say marches, but, you know, potentially marches, in terms of making sure this is taken care of? Because unfortunately, I think most privacy applications, at least historically, though that may be changing, are really opt-out, not opt-in. So do you see this becoming a more important kind of policy area, versus just an execution detail in an application?

Yeah, we have a lot of really great professors working on these ideas around privacy and cybersecurity; those that are working on security and other things also have privacy in their background and are advocating in that area as well. As far as marches, we pretty much rely on the students for that, and you can't dictate what the students are going to find important. But there's definitely a cadre of students that care and are interested in these topics.
And when you tie them together with fundamental rights like free speech and academic freedom and creativity, that's where it becomes important and people get interested.

Right. One of the real sticky areas that this bounces into is security. And unfortunately, there have been way too many instances at campuses over the last several years of crazy people grabbing a gun and shooting people, which, you know, hopefully won't happen again. And that's really where the privacy and security thing runs up against, you know, should we have known, should we have seen this person coming? If we had had access to whatever they were doing, maybe we would have known and been able to prevent it. So when you look at the balance, but really the conflict, between security and privacy, what are some of the rules coming out? How do you guys execute that to both provide a safe environment for people to study and learn and grow, as you mentioned, but at the same time keep an eye out for, unfortunately, the bad characters in the world?

Yeah, well, I don't want to create a false dichotomy of it's either privacy or security. That's not the frame of mind that we want to be in; both are important. Security is clearly important, and preventing unauthorized access to information, or your personal information, is clearly a part of privacy. So security is necessary for privacy. And the things that you would do to protect security, the two-factor authentication and the antivirus and the network segmentation, those are all important parts of protecting privacy as well. So it's not a dichotomy of one or the other.
But there are things that you do for security purposes, whether it's cybersecurity or personal security, that may be in conflict with, or have a different purpose than, what you would do for privacy. And monitoring is one of those areas, specifically when you're monitoring for attacks. Particularly now, we have continuous monitoring for any kind of attack, or the use of that monitoring data as a forensic place to look for information after the fact. Those things really lie in contrast with the idea in privacy of least perusal, of not looking for information until you need it, of having that distance and the privacy of not being under surveillance.

So what we've come to at the University of California is a privacy balancing analysis that's necessary for these kinds of scenarios that are new and untested, where we don't have laws around them, to balance the many priorities and obligations. What you need to do is look at what the security provides, look at the benefits together with the risks, and do that balancing. So you go through a series of questions. What is the utility that you're really getting out of that monitoring, and not just in the normal scenario, how you're expecting to use it, but what about the use cases that maybe you didn't expect but can anticipate, where it'll be wanted for those reasons? What about when we're required to turn it over for a subpoena or another kind of legal request? What are the use cases there? What are the privacy impacts in those cases? What are the privacy impacts if it's hacked, or of abuse by an employee, or of sharing it with partners? So the utility together with the impact, you need to balance those and look at the differences, and then also look at the scope.
If you change the scope of what you're monitoring, does it change the privacy impact? Does it change the utility? You take those kinds of factors and keep them all in line, not just looking at the utility of what you're trying to do, but at all the other impacts in the privacy analysis. And then, what are the alternatives that could do the same thing, and are they appropriate? Do they give you the same kind of value that the proposed monitoring provides? And being transparent and staying accountable are really, when it comes down to it, the key once you've done that analysis: making sure you've looked through those questions. Are you doing the least amount of perusal necessary to achieve the goals you're trying to accomplish with that monitoring? And on transparency and accountability, whatever your decisions are, are you making them available to the community that's being monitored?

Wow. Well, one, you've got job security, I guess, for life, because that was amazing. Two, you know, balance is the word I was looking for before, so that is the right word, but you're balancing on so many axes. And even once you get through the axes you just went through, and that list was phenomenal, you still need to look at the alternatives and do the same kind of analysis for each. So really, that was a great explanation.

I want to shift gears a little bit and talk about wearables. I think you're going to give a talk later today about wearables. Wearables are a whole new kind of interesting factor now that provide a whole bunch more data, really kind of the cutting edge of the Internet of Things with sensor data. And people are things too, we like to say on theCUBE.
So as you look at wearables and the impact of wearables on this whole privacy conversation, what are some of the big gotcha issues that are really starting to be surfaced as these things become more and more popular?

Yeah, I think a lot of the same kinds of questions: what kind of monitoring are you doing, what's the utility, what is the privacy impact, and how do you balance those in the various scenarios and use cases that come up? Really the same kinds of questions apply as with cybersecurity monitoring. What we're finding, I think, is that in college athletics, the university-sponsored use of wearable technology is really just in its infancy right now. It's not a big thing that we're working on yet, but it ties in so much with, it very much parallels, the other kinds of questions we've been talking about around learning data. How you jump, or how your body functions, is very private, very intimate; how you think, how you learn, that's right up there on the spectrum of privacy and intimacy. So we've been talking quite a bit in the university space about learning data and how we protect that.

Some of the questions are: who owns that data? It's about me, the student, for example, right? Should I have control over how that information is used? With learning data, for the average student, there may not be outside folks interested in that information. But when you're talking about student athletes, potentially going pro, that's very valuable data that people may want, that people might want to pay for. Maybe the students should have some say in the use of that data, in monetizing that data. Who owns it? Is it the student? Is it the university? Is it the company that we work with to provide the monitoring and the analytics?

Right, right.
Even if we have a contractor right now, if it's through the university, we'd hopefully have made really clear who has ownership, where the uses lie, what kinds of things we can do with it. But then we move into the consumer space, where you're just clicking the box, and students may be asked to use this technology because it's free, because, of course, how much it costs is important.

Give you free slices at the pizza store.

Right, well, once we get into that consumer realm, where it's not even a matter of clicking the box, the box is already clicked and you just say OK, that's where students may be giving away data for uses that they didn't intend and that they're not getting any compensation for. And in particular cases, when you talk about student athletes, that could be something that would be very meaningful for their career and beyond.

Yeah, or is it the guy that's come up with the unique and innovative training methodology that they're testing? Is it Berkeley's information, to see how people are learning so you can incorporate that into your lesson plans and the way that you teach? I mean, there are so many angles, but it always comes back, as you said, really to context, you know, what's the context for the specific application, and should you or should you not have rights in that context. It's a really interesting space, a lot of interesting challenges. And like I said, job security for you for the foreseeable future. We're not going to run out of new and exciting applications and things to be thinking about in terms of privacy. It's just nonstop, right, because these are not technology questions, right? These are policy questions and rules questions. And we were at a thing last night with the center.
And one of the topics was that we need a lot more rules around these types of things, because the technology is outpacing, you know, the governance rules and the thought process around the ways that these things can all be used.

It's a culture question, really. It's more than just what you allow or not; it's how we feel about it. And the idea that privacy is dead, it's only true if we don't care about it anymore. So if we care about it and we pay attention to it, then privacy is not dead.

All right. Well, Lisa, we'll leave it there. Lisa Ho, UC Berkeley. Fantastic. Thank you for stopping by, and good luck at your wearables panel later this afternoon. All right. I'm Jeff Frick. You're watching theCUBE. Thanks for watching.