From theCUBE studios in Palo Alto and Boston, connecting with thought leaders all around the world, this is a CUBE conversation.

Hey, welcome back, everybody. Jeff Frick here with theCUBE. We are getting through the COVID crisis; it continues, and it's impacting the summer. I can't believe the summer is almost over. But there's a whole lot going on in terms of privacy and contact tracing, and this feeling that there's a conflict between personal identification and your personal privacy versus the public good around things like contact tracing. I was in a session last week with two really fantastic experts, and I wanted to bring them on the show. So we're really excited to have back, for I don't even know how many times, Michelle Dennedy. She is the former Chief Privacy Officer at Cisco, and now she's a CEO in her own right. Michelle, great to see you.

Good to see you as always, Jeff.

And, for the first time, Ann Cavoukian. She is the Executive Director of the Global Privacy & Security by Design Centre, joining us from Toronto. She has worked in government and is not short on opinions about privacy.

Hi, Jeff. Thank you.

So let's jump into it, because one of the fundamental issues we keep hearing is that this is a zero-sum game. And I know it's a big topic for you, Ann, that there seems to be this trade-off, this either/or. Let's go specifically to contact tracing, because that's a hot topic right now with COVID. I hear things like: you're telling everybody where I'm going, and you're sharing that with all these other people. How is this even a conversation, and where do I get to choose whether I want to participate or not?

You can't have people traced and tracked and surveilled. You simply can't have it, and it can't be an either/or, win/lose model. You have to get rid of that dated zero-sum game, where one party can only win if the other loses and it all sums to zero. Get rid of that; it's so yesterday. You have to have both sides winning, positive-sum, meaning yes, you need public health and public safety, and yes, you need privacy. It's not one versus the other. We can do both, and that's what we insist upon.

So the contact tracing app that was developed in Canada was based on the Apple-Google framework, which is actually called exposure notification. It's totally privacy-protective. Individuals choose to voluntarily download the app, and no personal information is collected whatsoever: no names, no geolocation data, nothing. It simply notifies you if you've been exposed to someone who is COVID-19-positive, and then you decide what action you wish to take. Do you want to get tested? Do you want to go to your family doctor? The decision lies with you. You have total control, and that's what privacy is all about.

But what about the person who was sick, who's feeding the top end of that process? Isn't the sick person's personal information obviously part of that transaction?

Well, the COVID Alert app we developed, based on the Apple-Google framework, builds on manual contact tracing, which also takes place; the two complement each other. Manual contact tracing is when individuals go to get tested and test positive. Healthcare nurses will then speak to that individual and say, please tell us who you've been in contact with recently: family, friends, and so on.
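To ground the exposure-notification design Ann describes, here is a minimal sketch of the decentralized idea behind the Apple-Google framework. It is an illustration only, not the real GAEN protocol: the key sizes, rotation schedule, and cryptography are simplified, and the function names are ours, not the framework's API.

```python
# Simplified sketch of decentralized exposure notification (illustrative only).
import hashlib
import os

def new_daily_key() -> bytes:
    """Each phone generates a fresh random key every day; no identity involved."""
    return os.urandom(16)

def rolling_ids(daily_key: bytes, intervals: int = 96) -> list:
    """Derive short-lived proximity IDs from the daily key. The real
    framework rotates IDs roughly every 15 minutes."""
    return [hashlib.sha256(daily_key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(intervals)]

# Phones broadcast their rolling IDs over Bluetooth and remember IDs they hear.
alice_key = new_daily_key()
alice_ids = rolling_ids(alice_key)
bob_heard = {alice_ids[10], os.urandom(16)}  # Bob was near Alice once

# If Alice tests positive, she may choose to upload only her daily keys:
# no name, no location. Bob's phone re-derives the IDs locally and checks
# for overlap, so the match never leaves his device.
published_keys = [alice_key]
exposed = any(rid in bob_heard
              for key in published_keys
              for rid in rolling_ids(key))
print("Bob exposed:", exposed)  # True -> Bob alone decides what to do next
```

The privacy property Ann points to falls out of this structure: the server only ever sees random keys from people who volunteer a positive diagnosis, and the matching happens on the phone.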
So the two work together, and by working together we will combat this in a much more effective manner.

So shifting over to you, Michelle. There are a lot of conversations about personally identifiable information, but medical data has this whole other class of privacy restrictions and level of care. And I find it really interesting that on one hand we're trying to do contact tracing, and on the other hand, my wife works in a public school, and if they find out that one of the kids in her class has been exposed to COVID somehow, they can't necessarily tell the teacher because of HIPAA restrictions. So I wonder if you could share your thoughts on this crossover between privacy and health information when it gets into a public crisis, and this inherent conflict with the public's right to know. Should the teacher be able to be told? It's not a really clean line with a simple answer.

No, it's not. And Jeff, when you're talking about student data, you're layering on yet another layer of legal restriction. I think what you're putting your thumb on is something really critical: when you talk about privacy engineering, privacy by design, and ethics engineering, you can't simply start with the legal premise. Is it lawful to share HIPAA-covered data? A child telling mommy "I don't feel well" is not HIPAA-covered; a child seeing a doctor for medical services, where some sort of infection or illness is found, is covered. So the exact same bit of information, am I ill or not, depends entirely on context.

So you have to first tackle the moral issues. Have we decided that it is a moral imperative to expose certain types of data? And I separate that from ethics intentionally, with apologies to true ethicists. The moral imperative covers the things we find so wrong we won't do them: we don't want a list of kids who are sick, or conversely, once the tipping point comes, a list of kids who are well, so that they are called out. That's the moral choice. The ethical choice is: just because you can, should you? And that's a much longer conversation.

Then you get to the legal imperative: are you allowed to, based on the past mistakes that we made? That's what every piece of litigation or legislation is, particularly in a common-law construct like the US. It's very important to understand that civil-law countries, like much of the European theater, try to prospectively legislate for things that might go wrong. The construct is thinner in a common-law economy, where you use test cases in the courts of law; that's why we are such a litigious society, and it has its own baggage. So you have to ask: is this legal structure attempting to cover past harms so bad that we've decided as a society to punish them? Is this a preventative law?

And then you finally get to what I call stage four of every evaluation: is it viable? Are the protections you'd have to put on top of these restrictions so onerous that they either cannot be maintained, because of the culture or the process or the cash, or that they just don't make sense anymore? Is it better to simply feel someone's forehead for illness rather than drawing a blood assay, sending it away for three weeks, and then maybe acting on it? You have to look at this as a systems problem-solving issue.
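Michelle's four stages, moral, ethical, legal, viable, form an ordered gate, and the ordering is the point: legality is deliberately not the first question. Here is a toy encoding of that framework; the class and field names are hypothetical, ours rather than hers.

```python
# Illustrative encoding of the four-stage evaluation Michelle describes.
from dataclasses import dataclass

@dataclass
class DataSharingProposal:
    description: str
    morally_acceptable: bool  # 1. Have we decided exposing this data isn't a moral wrong?
    ethically_sound: bool     # 2. Just because we can, should we?
    lawful: bool              # 3. Do HIPAA, student-data rules, etc. permit it in this context?
    viable: bool              # 4. Can the required protections actually be maintained?

def evaluate(p: DataSharingProposal) -> str:
    # The stages are ordered: failing an early gate ends the analysis,
    # so "is it legal?" is never where the conversation starts.
    gates = [(p.morally_acceptable, "moral"),
             (p.ethically_sound, "ethical"),
             (p.lawful, "legal"),
             (p.viable, "viability")]
    for passed, label in gates:
        if not passed:
            return f"rejected at the {label} stage"
    return "proceed, with protections engineered in"

print(evaluate(DataSharingProposal(
    "tell a teacher that a student in class was exposed",
    morally_acceptable=True, ethically_sound=True, lawful=False, viable=True)))
# -> rejected at the legal stage
```

Context is what flips the flags: the same bit of information, "am I ill?", can pass or fail the legal gate depending on whether it came from a doctor's visit or from a child telling a parent.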
So I want to look at it in the context of this increased politicization, and of exposure outside of what's a pretty closed world. And I'll bring up AIDS and the adult film industry, very frankly, where the behavior of the business risks a life-threatening disease, one for which I still don't think there's a vaccine. Why is it that suddenly we can test for that, and it's okay to test for that, and there's a legitimate reason to, versus all the other potential medical conditions I may or may not have that aren't necessarily brought to bear when I come to work? And we might be seeing this very soon: as you said, if people are taking our temperature as we come in the door to check for symptoms, how does that play with privacy and healthcare? It still fascinates me that certain things just pop out into their own little bucket of regulation. I wonder if you could share your thoughts on that.

You know, whenever you make it privacy versus fill-in-the-blank, especially in the context of healthcare, you end up turning it into a lose-lose, as opposed to even a win-lose, because you will have fewer people willing to be tested or to come forward, for fear of where that information may land. If it lands in the hands of your employer, for example, or your landlord if you're renting, it creates enormous problems. So regardless of what you may think of the benefits of that model, history has shown that it doesn't work well: people end up shying away from being tested or seeking treatment or any of those things.

Even now, with the contact tracing apps that have been developed, if you look globally, the COVID-19 contact tracing apps that identify individuals have failed, in the UK, in Australia, in Western Canada. That's how those efforts started out, and they've completely dropped them because they don't work. People shy away from them; they don't use them. So they've gotten rid of that and replaced them with apps based on the Apple-Google framework, the one that protects privacy and will encourage people to come forward and seek to be tested if there's a problem.

Look at Germany. Germany is one of the strongest privacy and data-protection countries in the world, and its privacy regulators are highly trusted there. Germany based its app on the Apple-Google framework and released it about a month ago, and within 24 hours 6.5 million people had downloaded the app, because there is such trust there, unlike much of the rest of the world, where there is very little trust. And we have to be very careful of the trust deficit, because we want to encourage people to seek out these apps so they can be tested if there's a problem. They're not going to use them, they're just going to shy away from them, if that trust isn't there.

In fact, I'll never forget, I did an interview three or four weeks ago in the US on a major radio station with something like 54 million listeners, and I was telling them about COVID Alert, the Canadian contact tracing app, actually an exposure notification app, built on the Apple-Google framework. And people called in, in droves, saying they wouldn't trust anyone with it in the US. They just wouldn't trust it. So you see, there's such a trust deficit; that's what we have to be careful to avoid.
So I want to hold the trust thread for just a second, but first I want to go back to you, Michelle, and talk about the lessons we can learn post-9/11. Ann keeps going back to this over and over: it's not a zero-sum game. And yet that's the way it's often positioned, as a way to break down existing barriers. If you go back to 9/11, probably the highest-profile example is the Patriot Act, where laws were put in place to protect us from terrorism that allowed things that were not normally allowed. And I'd bet, without checking exhaustively, that most of those provisions are still in place, because once laws are written, they often don't go away for a long time. What can we learn from what happened after 9/11 with the Patriot Act, and what should we be scared of, or careful of, or wary of, in using that as a framework for what's happening now around COVID and privacy?

It's a perfect example, and it's not even an analogy, because we're feeling the shadows of the Patriot Act even today. Until recently we had an agreement between the United States and the European community called Privacy Shield. Basically, if companies and organizations that fell under the Federal Trade Commission's jurisdiction (there's a bit of layered legal process here) agreed to protect data about people present in the European Union at the same or better level than the Europeans would, then that information could pass through this Privacy Shield unencumbered, to and from the United States. That was challenged and struck down, I don't know if it was a month ago or if it's still March, it's COVID time, but very recently, on the basis that the US government can, some would say improperly, look at European data through some of these Patriot Act-era FISA courts and other intrusive mechanisms, which absolutely do apply to data under the jurisdiction of the United States. So now companies and private actors are in the position of having to somehow prove that their systems and processes are immune from their own government's intrusion before they can do digital trade with other parts of the world. We haven't yet seen the commercial disruption that will take place.

So that's the unintended consequence of sticking, not to the observations and intelligence in the actual 9/11 report, which said we had the information we needed, we just didn't share enough between agencies and didn't have the decision-making will to take action in that particular instance, but to the Patriot Act, which all but, I believe, two members of Congress voted for. And you see the hot mess that is the US right now: when everyone but two people in the room votes for something on the quick, there's probably some sort of psychological gun to your head; it's probably not a well-thought-out thing. We fight with each other; it's part of being an American, dammit. So I think having these laws that say you've got to have this one solution because the boogeyman is coming, or COVID is coming, or terrorists or child pornographers are coming, there's not one solution. You really have to break this down into an engineering problem. And I don't mean technology when I say engineering.
I mean looking at the culture. How much trust do you have? Who is the trusted entity? Do we trust Microsoft more than we trust the US government right now? Maybe that's your answer for who should hold the contact data. How are you going to build people, process, and technology not just to avoid a bad thing, but to achieve a positive objective? Because if you're not achieving that positive objective, understanding when it's safe to move about without masks on, for example, then stop. Just stop.

You know my favorite analogy, Jeff, I think I've shared it with you before: we don't sit around and debate the merits of the viscosity of water for protecting people from concrete holes, making sure that when you leap into the concrete hole there's enough water in it. No, you're building a swimming pool. What kind of swimming pool do you want? Is it commercial? Is it for toddlers? Is it for grown-ups? Then you build in the chlorination, the protection, and so on. But if you start looking at every problem as how to avoid hitting the bottom of a concrete hole, you're really going to miss the opportunity to build and solve for the thing you want, and to avoid the risks you do not want.

Right, right. And I want to go back to you, Ann, on the trust thing. You had an interesting comment in that other session about working for the government, not working directly for the people voted into power, but for the larger bureaucracy and agency. The Edelman Trust Barometer is really interesting; it comes out every year, I think it's in its 20th year, and it breaks down media, government, and business: who do you trust and who don't you trust? What's so fascinating about the time we're in today is that even within government, the direction coming out is often completely diametrically opposed between the federal, state, and local levels. So what does this breakdown of trust, getting two different opinions from the same basic authority, do to people's ability or desire to participate, and to actually share information that may or may not get reshared?

It leaves you with no confidence. Basically, you can't take confidence in any of this. When I was Privacy Commissioner, I served for three terms, and each term there was a different government, a different political party in power. Before they became the government, they were all for privacy and data protection, believed in all of it. Then once they became the government, all that changed: all of a sudden they wanted to control everyone's information, and they wanted the power. No, I don't trust government. People often point to the private sector as the group you should distrust on privacy. I say no, not at all; to me, far worse is the government, because everyone assumes they're there to do a good job and trusts them. You can't just trust; you always have to look under the hood. I always say trust but verify. So unfortunately we have to be vigilant about the protections we seek for privacy, both with the private sector and with government, especially with government and its different levels. We need to ensure that people's privacy remains intact, preserved now and well into the future. You can't give it up because there's some emergency, a pandemic, a terrorist incident, whatever. Of course we have to address those issues, but you have to insist that people's privacy be preserved. Privacy forms the foundation of our freedom.
You cannot have free and open societies without a solid foundation of privacy. So I just encourage everyone: don't take anything at face value. Just because the government tells you something doesn't mean it's so. Always look under the hood, and let us ensure that privacy is strongly protected. Emergencies come and go; the pandemic will end. What cannot end is our privacy and our freedom.

So it got a little dark in here, but we're going to lighten it up a little, because as Michelle said, if you think about building a pool versus filling a hole, you can take proactive steps, and there's a lot of conversation about proactive steps. I pulled up your framework, Ann: Privacy by Design and its seven foundational principles. I'll have the guys pull up a slide, but what's really interesting is how specific and prescriptive it is. Proactive, not reactive. Privacy as the default setting, so you don't have to read the EULAs. I'm not going to read all seven; we'll share them, and people can find them. But what I want to focus on is that there is an opportunity to get ahead of the curve; you just have to be a little bit more thoughtful.

That's right. Privacy by Design is a model of prevention, much like a medical model of prevention, where you try to prevent the harms from arising, not just deal with them after the fact through regulatory compliance. Of course we have privacy laws, and they're very important, but they usually kick in after there's been a data breach or a privacy infraction. When I was Privacy Commissioner, obviously those laws were in force and we had to follow them, but I wanted something better. I wanted to prevent the privacy harms from arising in the first place. That's what Privacy by Design is intended to do: instantiate and embed much-needed privacy-protective measures into your policies, into your procedures, bake them into the code, so that they have a constant presence and can prevent the harms from arising.

Right, right. One of the things I know you love to talk about, Michelle, is compliance, and whether compliance is enough. I know you like to talk about the law, and one of the topics that came up in your prior conversation is whether there will be a national privacy law. GDPR went through on the European side a couple of years ago, and then came the California Consumer Privacy Act; a lot of people think that might become the model for more of a national rule. But I tell you, when you watch some of the hearings in DC, I'm sure 90 percent of these people still print their emails and have their staff hand them over. It's really scary. That said, regulation always lags the point where it needs to be put in place, because people get abused or go places they shouldn't. So I wonder if you could share your thoughts on where legislation is going, and how people should see it playing out over the next several years.

It's such a good question, Jeff, and I think even the guys in Vegas are having trouble setting the over-unders on this. Pam Cameron said, I think it was in December of 2019, which feels like 15 years ago now, that in the first quarter of 2020 we would see a federal law, and I participated in a hearing at the Senate Banking Committee, again in October or November, in the before-times, talking about the same thing. And here we are.
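Before picking the legislative thread back up: Ann's "bake it into the code" point, and the "privacy as the default setting" principle Jeff cites, can be made concrete with a small sketch. This is a hypothetical illustration with field names of our own, not any product's API: every sharing option starts off, and data minimization lives in the code path rather than in the legal terms.

```python
# Sketch of "privacy as the default setting" (illustrative, hypothetical names).
from dataclasses import dataclass
from typing import Optional

@dataclass
class UserPrivacySettings:
    # Maximum privacy requires no action from the individual:
    share_analytics: bool = False
    share_location: bool = False
    personalized_ads: bool = False

def collect(event: dict, settings: UserPrivacySettings) -> Optional[dict]:
    """Minimization baked into the code: with untouched defaults nothing is
    collected, and opted-in events still shed fields the user never enabled."""
    if not settings.share_analytics:
        return None  # default outcome: no data leaves the device
    if not settings.share_location:
        event = {k: v for k, v in event.items() if k != "location"}
    return event

print(collect({"page": "home", "location": "Toronto"}, UserPrivacySettings()))
# -> None: by default, no data is collected at all
```

Flipping share_analytics on while leaving share_location off would yield {'page': 'home'}: the individual, not the system, decides what is disclosed, which is Ann's "total control" point.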
Will we have a comprehensive, reasonable privacy law in the United States before the end of this president's term? No, we will not. I can say that with such faith and fidelity. But what does that mean? Katie Porter, whom I'm starting to just love, is the congresswoman famous for pulling out her whiteboard and saying, stop fudging the numbers, let's talk about the numbers. There's what she calls the 20-percent legislative flip-phone caucus: 20 percent or more, on both sides of the aisle, of the people in the US who are in the position of writing our laws are still on flip phones and aren't using smartphones and other kinds of technologies. There's a generation gap. And as much as I can chuckle at that a little, wink wink, nudge nudge, isn't that cute, because my dad, as you know, is very, very technical and he's a senior citizen (I hope he doesn't see that), it's not old versus young, and it's not let's get a whole new crop and start over again.

What it is instead, and this is my constant tome, I sound anti-compliance but I'm not anti-compliance: you've got to put your underwear on before your pants, or it's just really hard. I would love to see anyone who is capable of putting their underwear on afterwards. You have to follow the process, and it's that basic. It comes down to: do you want the data that describes, or is donated by, or is observed about human beings, whether it's the performance of your employees, people you would love to entice onto your show as a guest, people you'd like to listen to and consume your content, people you want to meet, people you want to marry? Private data, as Ann says, forms the foundation of our freedom, but it also forms the foundation of our commerce. So on compliance: if you have stacked the deck proactively, with an ethic that people can understand, agree with, have a choice about, and feel has some integrity, then you will start to see the acceleration factor of privacy being something that belongs on your balance sheet. What kind of data is high-quality, high-nutrition, in the right context? Once you've got that, you're in good shape.

I'm laughing at privacy on the balance sheet; we just had a big conversation about data on the balance sheet, so that's a whole other topic. We could go for days, and I have pages and pages of notes here, but unfortunately I know we've got time restrictions. So, Ann, I want to give you the last word. You've been in this for a while, from the private side as well as the government side, and you've mentioned lots of other scary things on the horizon, like what I think you call surveillance creep. What advice do you give to citizens, and to leaders in the public sector, about framing the privacy conversation?

I always start by telling them: don't frame privacy as a negative. It's not a negative; it's something that can build so much. If you're a business, you can gain a competitive advantage by strongly protecting your customers' privacy, because it will build such loyalty. Make it work for you. As a government, you want your citizens to have faith in you; you want to encourage them to understand that, as a government, you respect their privacy.
Privacy is highly contextual, and it's only the individual who can make determinations about the disclosure of his or her personal information. So make sure you build that trust, both as a government and as a private-sector business, and gain from it. It's not a negative at all. Make it work for you. Make it work for your citizens, for your customers. Make it a plus, a win-win. That will give you the best returns.

Isn't it nice when doing the right thing actually produces better business outcomes, too? It's like diversity of opinion and women on boards and all the things we cover these days. Well, ladies, thank you very, very much for your time. I know you've got a hard stop, so I'm going to cut you loose, or else we'd probably go for another hour and a half. Thank you so much for your time, thank you for continuing to beat the drum out there, and I look forward to our next conversation, hopefully in the not-too-distant future.

My pleasure, Jeff. Thank you so much.

Thank you.

All right. She's Michelle, she's Ann, I'm Jeff. You're watching theCUBE. Thanks for watching. We'll see you next time.