Hey, welcome back everybody. Jeff Frick here with theCUBE. We're in downtown San Francisco at Twitter's World Headquarters for Data Privacy Day, a full-day event of sessions and breakout sessions, really talking about privacy. And although we heard back in 1999 that privacy is dead, get over it, that's not really true, and certainly a lot of people here beg to differ. And we're excited to have our next guest, Jules Polonetsky, CEO of the Future of Privacy Forum. Welcome.

Thank you, great to be here.

Exciting times for data, exciting times for privacy.

Yeah, no shortage of opportunity, that's for sure.

The job security in the privacy space is pretty high, I'm gathering, after a few of these interviews.

There's always a researcher coming up with some new way we can use data that is both exciting, curing diseases, studying genes, but also sometimes Orwellian, microphones in my home, self-driving cars, and getting that right is hard. We don't have a clear consensus over whether we want the government keeping us safe by being able to catch every criminal, or staying out of our stuff because we don't trust them. So, challenging times.

So before we jump into it, the Future of Privacy Forum: tell us a little bit about the organization and your mission.

We're eight years old at the Future of Privacy Forum. We're a think tank in Washington, DC. Many of our members are the chief privacy officers of companies around the world, about 130 companies, ranging from many of the big tech companies to new sectors that join us as they start becoming tech and data businesses. So the auto industry, which is dealing with self-driving cars, connected cars, all those issues. Wearables, student data. So about 130 of those companies. But then the other half of our group are advocates and academics who are a little bit skeptical, a little worried. They want to engage, but they're worried about an Orwellian future. And so we bring those folks together and we say, listen, how can we have data that'll make cars safer? How can we have wearables that'll help improve fitness, but also have reasonable, responsible rules in place so that we don't end up with discrimination or data breaches and all the problems that can come along?

Right, because it's really two sides of the same coin, and it's always two sides of the same coin. And typically with new technology, we race ahead on the positive, because everybody's really excited, and lag on understanding the negative impacts, or on creating the rules and regulations around the new technology. Very hard to keep up.

You know, and the stakes are high. Think about ad tech, right? We've got tons of ad tech. It's fueling free content, but we've got problems of adware and spyware and fake news, and people being nervous about cookies and tracking, and every year it seems to get more stressful and more complicated. We can't have that when it comes to microphones in my home, right? I don't wanna be nervous that if I go into the bedroom, suddenly that's shared across the ad tech ecosystem, right? I don't know that we want how much we sweat, or when it's somebody's time of the month, or other data like that being out there and available to data brokers. But we did a study recently of some of the wearables, the more sensitive ones: sleep trackers, apps that people use to track their period. Many of them didn't even have a privacy policy to say, I don't do this or I don't do that with your data. So the stakes are high. This isn't just about ads tracking me and whether I find that intrusive.
This is about making sure that if I'm driving my car and it's helping me navigate, giving me directions, making sure I don't drift out of my lane, or self-parking, that data doesn't automatically go to all sorts of places where it might be used to deny me benefits, or discriminate, or raise my insurance rates.

Right, right. Well, there are so many angles on this. One is, I got an Alexa Dot for Christmas for the family to try it out. And it's interesting, you think that she's listening all the time.

So she's not. Let's talk about that.

Unless you push the little button, right? Or is she not?

Well, this is a great topic to talk about, because a sheriff recently wanted to investigate a crime and realized that they had an Amazon Echo in the home, and said, well, maybe Amazon will have data about what happened. Maybe there'll be clues, people shouting, who knows. And Amazon's fighting it, because they don't want to hand it over. But what Amazon did, and what Google Home did, and the Xbox did: they don't want to have that data. And so they've designed these things, I think, with actually a lot of care.

So the Echo is listening for its name. It's listening for "Alexa," and it keeps deleting. It listens, right? It hears the background noise, and if it didn't hear "Alexa," it drops it, drops it, drops it. Nothing is sent out of your home. When you say, "Alexa, what's the weather?", the blue light glows, it opens up the connection to Amazon, and now it's just like you're typing in a search or going there directly. And so that's done quite carefully. Google Home works like that. Siri works like that.

So I think the big tech companies, through a lot of pain and suffering over the years of being criticized, and with the realization that government comes to them for data, don't want that. They don't want to be fighting the government, and they don't want people being nervous that the IRS is going to try to find out information about what you're doing, which bedroom you're in, what time you came home.

Although the Fitbit has all that information, even though Alexa doesn't, right?

Well, so the wearables are another exciting, interesting challenge. We had a project that was funded both by the Robert Wood Johnson Foundation, which wants wearables to be used for health and so forth, and by a lot of major tech companies, because everybody was aware that we needed some sort of rules in place. So if Fitbit or Jawbone or one of the other wearables can detect that maybe I'm coming down with Parkinson's, or I'm about to fall, or something else, what's their responsibility to do something with that? On one hand, that would be a bit frightening. You get a phone call or an email saying, hey, this is your friendly wearable company and we think you should see a doctor, or somebody shows up at your front door saying you should seek medical help. You'd be like, whoa, wait a second, right? On the other hand, what do you do with the fact that maybe we can help you?

Take student data, right? EdTech is very exciting. There are such opportunities for personalized learning. Colleges are getting in on the act. They're trying to do big data analytics to understand how to make sure you graduate. Well, what happens when a guidance counselor sits down and says, look, based on the data we have, your grades, your family situation, whether you've been to the gym, your cafeteria usage, data we took off your social media profile, you're really never gonna make it in physics.
I mean, the data says people with your particular attributes rarely succeed at graduating with a degree in four years. You need to change your scholarship. You need to change your career path. Or, you can do what you want, but we're not gonna give you that scholarship. Or simply, we advise you otherwise.

Now, did we just tell Einstein not to take physics, right? But on the other hand, don't I have some responsibility, as the guidance counselor who would be looking at your records today and sort of shuffling some papers, to say, well, maybe you wanna consider something else? So, you know, we talk about this as privacy, but increasingly, many of my members, again, who are the chief privacy officers of these companies, are facing what are really ethical issues. There may be risk. There may be benefit. And they need to decide, or help their companies decide, when the benefit outweighs the risk.

Consider self-driving cars, right? When does the self-driving car say, I'm gonna put this car in the ditch because I don't wanna run somebody over? But now it knows your kids are in the back seat. What sort of calculations do we want this machine making? Do we know the answers ourselves? If the microphone in my home hears child abuse, if Hello Barbie hears a child screaming, or hey, I swallowed poison, or my dad touched me inappropriately, what should it do? Do we want dolls ratting out parents, and the police showing up saying, Barbie says your child's being abused? I mean, my gosh, I can see times when my kids have thought I was a big grinch, and if the doll was reporting, hey, dad is being mean to me, who knows?

So these are challenges that we're gonna have to figure out collectively with stakeholders, advocates, civil libertarians, and companies. And if we can chart a path forward that lets us use these new technologies in ways that advance society, I think we'll succeed. If we don't think about it, we'll wake up and we'll learn that we've really constrained ourselves and narrowed our lives in ways that we may not be very happy with.

Right, it's a fascinating topic. And like on the child abuse thing, there are very strict rules for people in occupations that deal with children, whether it's a doctor or a teacher or even a school administrator, if they have some evidence of, say, child abuse.

They're obligated.

They are obligated. Not only are they obligated morally, but they're obligated professionally and legally to report it. I mean, do you see those laws just getting translated onto the machine? Clearly, you could even argue that the machine's probably got better data and evidence, based on time and frequency, than a teacher who happens to see maybe a bruise or a kid acting a little bit different on the schoolyard.

You can see a number of areas where the law is gonna have to rethink how it fits. Today, I get into an accident, we wanna know whose fault it is. What happens when my self-driving car gets into an accident, right? I didn't do it, the car did it. So do the manufacturers take responsibility? If I have automated systems in my home, robots and so forth, again, am I responsible for what goes wrong? Or do these things, or their companies, have some sort of responsibility? So thinking these things through is, I think, where we are first. I don't think we're ready for legal changes. I think what we're ready for is an attitude change. And I think that's happened.
When I was the chief privacy officer at AOL many years ago, we were so proud of our cooperation with the government. If somebody was kidnapped, we were gonna help. If somebody was involved in a terrorism thing, we were gonna help. And companies, I think, still recognize their responsibility to cooperate with criminal investigations, but they also recognize that it is their responsibility to push back when government says, give me data about that person. Well, do you have a warrant? Do you have a basis? Can we tell them so they can object? Is it encrypted? Well, sorry, we can't put all of our users at risk by cracking encryption for you because you're following up on one particular crime.

So there's been, I think, a big sea change in understanding that if you're a company, there's data that you don't wanna have to hand over. Data about immigrants, today: lots of companies in the Valley and around the country are thinking, wait a second, could I be forced to hand over some data that could lead to someone being deported or tortured or who knows what, given that these things seem to be back on the table? And again, years ago, you were a good actor because you participated with law enforcement. Now people participate, but they also recognize that they have a strong obligation to either not have the data, like Amazon will not have the data that this sheriff wants.

Now, their smart meter, and how much water they're using, and all kinds of other information, frankly, about their activity at home, since so many other things about our homes are now smart, may indeed be available. How much water did you use at this particular time? Maybe you were washing bloodstains away, or the like. That sort of information is gonna be out there. So the machines will be providing clues that in some cases are gonna incriminate us, and companies that don't wanna be in the middle need to think about designing for privacy, so as to avoid creating a world where all of that data is available to be used against us.

And then there's the whole factor of the devices being in place, not necessarily the companies using them or not, but bad actors taking advantage of cameras and microphones all over, and hacking into these devices to do things. And it's one thing to take a look at me on my PC. It's another thing to take control of my car, right? And this is where there are some really interesting challenges ahead as IoT continues to grow and everything becomes connected. As the security people always like to say, the attack surface grows exponentially.

Well, cars are gonna be an exciting opportunity. We released today a guide that the National Automobile Dealers Association is providing to auto dealers around the country, because when you buy a car today and then you sell it or you lend it, there's information about you in that vehicle: your location history, maybe your contacts, your music history. We would never give our phone away without clearing it, and you wouldn't give your computer away, but you don't think about your car as a computer. And so this has all kinds of advice for people: listen, your car is a computer. There are things you wanna do to take advantage of new services and safety, but there are also things you wanna do to manage your privacy: delete your data, make sure you're not sharing your information in ways you don't want.

Jules, we could go on all day, but I think we gotta let you go to get back to the session. So thanks for taking a few minutes out of your busy day.

Really good to be with you. Thank you.

Absolutely.
Jeff Frick, you're watching theCUBE. See you next time.