Hey, welcome back, everybody. Jeff Frick here with theCUBE. We're in downtown San Francisco at the Twitter headquarters for Data Privacy Day, an interesting collection of people coming together here at Twitter to talk about privacy and the implications of privacy. And I can't help but think back to the classic Scott McNealy quote, privacy is dead, get over it. That was in 1999, and oh, how the world has changed, most significantly, obviously, with mobile phones and the release of the iPhone in 2007. So we're excited to have the spearhead of this event, Michael Kaiser, executive director of the National Cyber Security Alliance, in from Washington, DC. Michael, great to see you. Thanks for having us in. So for the folks that aren't here, what is the agenda today? What's the purpose, the mission? Why are we having this day?

Well, Data Privacy Day actually comes to us from Europe, from the Council of Europe, which recognized privacy as a human right back in 1981. We've been doing it here in the United States since around 2008, and NCSA took over the effort in 2011. The goal is really to help educate people, and businesses as well, about the importance of respecting privacy and safeguarding people's personal data, with the end goal of building a lot more trust in the ecosystem around the handling of personal data, which is so vital to the way the internet works right now.

And it seems like companies figured out the value of this data long before individuals did. There's a trade for service, you use Google Maps, you use a lot of these services, but is the value exchange equal? Is it at the right level? That seems to be the theme of some of these privacy conversations: that you're giving up a lot more value than you're getting back in exchange for some of these services.

Yeah, and we actually have a very simple way that we talk about that.
We like to say that personal information is like money: you should value it and protect it. So we're trying to encourage and educate people to understand that their personal information does have value, that there is an exchange going on, and that they should make sure those transactions are ones they're comfortable with in terms of the information they give and what they get back.

Which sounds great, Michael, but then you get the EULA. You sign up for these things and they don't really give you the option. You can kind of read it, but who reads it? Who goes through it? You check the box and you move on. Or you get this announcement: we changed our policy, we changed our policy, we changed our policy. So, I don't know if realistic is the right word, but how do people navigate that? Because, let's face it, my friend told me about Uber, I want to get an Uber, I download Uber, I'm stuck on a rainy corner in DC, I hit go, and here comes the car. I don't really dig into the meat. And is there an option? There's not really a way to opt in to privacy settings one, two, and three and opt out of five, six, and seven.

Yeah, I think we're seeing a little bit more granular controls for people on some of these things now, but that's what we'd advocate for more. When we talk to consumers, they tell us mostly that they want better clarity about what's being collected about them, better clarity about how that information is being used and how it's being shared. And equally importantly, if there are controls, where are they, how easy are they to use, and can they be made more prominent so people can tailor the services to their own privacy profile? I think we'd like to see more of that for sure: more companies being a little more forthcoming.
Yeah, you have the big privacy policy that's a long, complicated legal document, but there may be other ways to create interfaces with your customers that make some of the key pieces more apparent.

And do you see a trend where, because you mentioned in some of the notes we prepared that privacy is good for business and potentially a competitive differentiator, people are surfacing privacy more prominently so they can gain the customer, gain the respect and the business of the customer, over a rival that has it buried? Is that really a competitive lever that you see?

Well, I think you see some extremes. You see some companies that say, we don't collect any information about you at all. So that's out there, and I think they're marketing to people who have extreme concerns about this. But I also think we're seeing, again, some higher-profile ways to control some of this data, even in the mobile setting, where sometimes you'll get a little warning saying, oh, this is about to use your location, is that okay? Or, your location is turned off, you need to turn it back on in order to use this particular app. I think those kinds of interfaces with the user of the technology are really important going forward. We don't want people overwhelmed; every time you turn on your phone, you shouldn't have to answer 17 things in order to do X, Y, and Z. But making people more aware of how apps are using the information they collect about you, I think, is actually good for business. Sometimes consumers get confused because they'll see a whole list of permissions that need to be provided, and they don't understand how those permissions apply to what the app or service is really going to do.

Right, right. Yeah, that's an interesting one.
We were at Grace Hopper in October, and one of the keynote speakers was talking about how mobile apps have really changed this, right? Because once you're on your mobile phone, an app can use all the capabilities that are native to the phone: geolocation, the accelerometer, et cetera. A lot of people probably didn't know those things were different in the mobile Facebook app than in the desktop Facebook app. And let's face it, most usage is mobile these days, certainly with the younger kids. So as you said, I think that's an interesting tack. Why do you need access to my contacts? Why do you need access to my pictures? Why do you need access to my location? And then the piece I'm curious to get your opinion on: will some of the value come back to the consumer, in terms of, I'm not just selling your stuff, I'm not monetizing it via ads, I'm going to give some of that back to you?

Yeah, I think there are a couple of things there. One quick point on the other issue: without naming names, I was looking at an app and it said it had to have access to my phone. And I'm like, well, why would this app need access to my phone? Then I realized later, it needs access to my phone because if the phone rings, it needs to turn itself off so I can answer the call. But that wasn't apparent, right? So I think it can be confusing to people. Maybe it's innocuous in some ways, sometimes it might not be, but in that case it was like, okay, yeah, if the phone rings, I'd rather answer my phone than keep looking at the app.

Right, and the degree of the access, too, is very confusing.
Yeah, and in terms of the other issue you're raising about the value exchange on data, I think the internet of things is really going to play a big role in this. In the current world, it's about data delivering ads, those kinds of things, making the experience more customized. But in IoT, where you're talking about wearables or fitness devices or thermostats in your home, your data really drives the product.

Right.

And so in order for those devices to really work well, they have to have data about you. That's where I think consumers will really have to give great thought to whether that's a good value proposition. Do I want to share all my data about when I come and go every day just so my thermostat can turn on and off? Those can be conscious decisions about when you're implementing that kind of technology.

Right. So there's another interesting tack I'd love to get your opinion on. We see Flo from the Progressive commercials advertising the stick, the USB device in your cigarette lighter, and we'll give you cheaper rates because now we know whether you stop at stop signs or not. What's funny to me is that the phone already knows whether you stop at stop signs or not. And it already knows that you take 18 trips to 7-Eleven on a Saturday afternoon and you're sitting on your couch the balance of the time. As that information gets exposed, it potentially runs into, say, a healthcare requirement mandated by your company that you must wear a Fitbit. So now we know you're spending too much time at 7-Eleven and on your couch, and that impacts your health insurance and such. And that's going to crash right into HIPAA.
There just seems to be this huge collision coming: I can provide better service to people at the good end of the scale in, say, aggregated risk models, but then what happens to the poor people at the other end?

Yeah, well, I think that's why you have to have opt-in, right? You can't make these things mandatory, necessarily. People have to be extremely aware of when their data is being collected and how it's being used. So in the car insurance example, they really should only be able to access that data about where you're going if you sign up for that. If they want to say, hey, Michael, we might give you a better rate if we can track your driving habits for a couple of weeks, then that should be my choice, right, to give that data. Maybe my rates might be impacted if I don't, but I can make that choice myself and should be allowed to make that choice myself.

So it's funny, that opt-in, opt-out. Right now, from your point of view, what do you see in terms of the balance of opt-in versus opt-out on these privacy issues? Where is it, and where should it be?

Well, I would like to see more granular controls for the consumer in general. I would like to see, as I said a little earlier, a lot more transparency and ease of access to what's being collected about you and how it's being used. Outside of the formal legal process, obviously, companies have to follow the law, they have to comply, they have to write these long EULAs or privacy policies to reflect what they're doing. But they should also be talking to their customers and understanding: what's the most important thing you want to know about my service before you sign up for it? And then help people understand that and navigate their way through it.
And in a lot of cases, consumers will click, yeah, let's do it, but they should do that knowingly. If opting in is the model, it should be done with true consent, right?

Okay, so before we let you go, share some best practices, tips and tricks, at least the top level of what people should be thinking about and doing.

Yeah, so in this space we look at a couple of things. One, personal information is like money: value it and protect it. That really means being thoughtful about what information you share, when you share it, and who you share it with. Two, own your online presence. This is really important; consumers have an active role in how they interact with the internet. So use the settings that are there, the safety and security or privacy and security settings in the services that you use. And then a lot of this is behavioral; what you share about yourself is really important. So share with care, right? Be thoughtful about the kinds of information you put out there about yourself, and about your friends and family. Realize that every single one of us in this digital world is entrusted with personal information about other people, much more than in the past. So we have a responsibility to safeguard what other people give to us, and that should be the common goal around the internet.

We'll have to have you back for the conversation about bullying and harassment down the road. Great insight, Michael, really appreciate it. Have a great day today. I'm sure there's going to be a lot of terrific content that comes out, and for people to get more information, go to the National Cyber Security Alliance. Thanks for stopping by.

Thank you for having us.

Absolutely. He's Michael Kaiser, I'm Jeff Frick, you're watching theCUBE. Thanks for watching.