Good morning, everyone. Thank you for joining us. I'm Nancy Gibbs. I'm the editor-in-chief of Time, and your docent for this morning. And I think that this is a particularly good topic for us to be addressing in the round, and I would have you take seriously the geometry of this room. This is a subject that is constantly evolving, which touches all of us and the stakes of which could not be higher. And so I'm very eager to have such an interesting circle of people to help lead our conversation. But I really want this to be a conversation in which everyone feels invited to engage: those of you in this room, those of you who are watching the live stream on Facebook. Because there are few topics that touch each of us more deeply, individually and collectively, institutionally, nationally, internationally, than a topic like privacy. There was a time when we could assume that privacy meant, I assume that no one is looking. And we have very quickly come to a point where our default is to assume that someone is always looking, is always watching. We have created in a very short amount of time an entirely new economy of privacy, an economy of information, in which billion-dollar corporations have risen up to monetize and take advantage of something that was once impossible to turn into a product. I also think it's striking that we are here the day before the inauguration of a new American president who has talked about why he does not use email, and who has said that if you really want to have something communicated securely, you should use a courier, perhaps a pigeon, some other technology that we have forgotten. This is a very interesting moment in this conversation. And so to help us explore it, I am joined by André Kudelski, who is the chairman and CEO of the Kudelski Group, a global leader in cybersecurity and convergent media systems; Michael Neidorff, the chairman and CEO of Centene Corporation, a Fortune 500 company that provides health care services, in-home health, software, benefits management, and telehealth services; Ken Roth, the executive director of Human Rights Watch, a former prosecutor and an author on topics having to do with security, counterterrorism, the NSA, and US spying in recent years; Evelyn Ruppert, professor at Goldsmiths, University of London, an expert in the sociology of data, and the founder and editor of Big Data & Society; and Jean-Yves Charlier, the former head of France's SFR and now the CEO of VimpelCom, the world's sixth largest mobile carrier. So before each of them gets a chance to weigh in, I'd like all of you to have a first stab at our presenting question. So get out your phones, or we could do this the analog way with a show of hands, though I can't rotate quite that much. You get to vote at wef.ch/vote on the question: By the year 2025, will we see a gap between the privacy rich and the privacy poor? By 2025, will we see a gap between the privacy rich and the privacy poor? We did a version of this poll on time.com in the days leading up to this session, and addressed a few other questions as well. And the response was remarkably strong for something that is this much a matter of public debate. We asked whether people agree that people are increasingly becoming products to be sold to advertisers. 69% of the respondents agreed with that proposition.
When we asked whether the concept of privacy itself would gradually fade away because it required too much time or money to protect and preserve it, 77% of people agreed with that statement. When we asked whether your appliances, your refrigerator, or your baby monitor should be allowed to report criminal behavior or be subpoenaed, 72% thought that that would be a bad idea. And on this question about the prospect of the opening of a privacy gap between rich and poor, 91% believed that that is going to occur. So do we have the results from this room? So not quite as strong as the general public, but still a significant conviction that this gap is going to be opening up. So I would like to put this question to each of you, because we have a great range of perspectives on this subject. Starting with you, André. Thank you. So first I would ask another question: what do we mean by privacy rich or poor? And I don't think that there is one single answer to that. You have the economic part, you have education, but you also have to take into account that if someone wants to have privacy, he can sometimes be left out of a network that he really appreciates. Don't forget that, for example, some people use WhatsApp, use Facebook, not because they just love it, but because they want to be in the same place to communicate with the people they appreciate. It's a question of community. Now, even if you have some positive and negative elements, take another example: smoking. People know that smoking is maybe not the best behavior for your health, but that doesn't mean that everyone stops smoking, because it may also be a question of community. Now, when we look at the question of privacy, imagine that you have some people who invest to secure a maximum of their own privacy. Now I will take the question from a different angle. Imagine that you have someone who is overweight and who, during the day, eats nothing. You come to the conclusion that he must eat during the night. So fundamentally, if someone is trying to protect himself with some of the parameters of privacy, people will basically say, oh, is this someone who is asking for privacy just on principle, or someone with something to hide? So basically, this behavior can even trigger more questions from people who are interested in knowing about the profile. Take another example. If you have a very big house in the middle of, not Afghanistan, but Pakistan, and there is no mobile phone traffic, no internet traffic, some people may ask who might be living there. So this is just to say that on the question of privacy, with the evolution of technology, it is extremely difficult to imagine all the ways there are to measure what people are doing. So if someone is trying to look after one or two parameters, he may forget other elements. For example, the refrigerator that is giving away some information that you are not thinking about, which can reveal other information. Now, the last element I would say is that I'm sure that if there is a business model where you can get value out of really preserving privacy, some company will come in to fill the gap and come up with something creative. Right. Michael. Thank you. I have a sign in my office that says the secrecy of my job prevents me from knowing what I'm doing. So when one thinks about privacy, it kind of comes into that. And I look at it, and it's going to vary by area. And when I first heard the topic, it seemed relatively simple: people are entitled to privacy.
But as you think about it, there are many aspects of it. You start off with those areas where there's sunshine laws and there's the transparency that's demanded. And that by definition, you give up privacy by being involved in something. There is that which is stolen, which is the more difficult area. And even when you think about trying to protect privacy, there are the two elements, and you probably know as well as anyone, when it's at rest and when it's in motion. And those are two different elements of protecting the information. And it's a question in my mind, is the technology going to grow faster that protects it or that makes it available? Now, I'm coming at it for a moment or two from the medical area, in which it's our area of the world. And in that particular place, there's serious financial consequences to how it's treated. Medical data, medical information, ID numbers and things have become more valuable than a credit card number. The individuals who get this data and get this information file false claims with fraud. They buy illicit drugs. There's a whole series of things. So I think when you start thinking about in the medical realm, I have difficulty thinking about the difference between one's wealth impacted it. In our field, everybody has to be entitled to the full privacy of their information. But there's also a corporate need, as I highlighted, to protect it from a fraudulent standpoint. Because once that data's out there, it becomes very difficult. So as we look at this, the other question that comes to mind is how easy is it to bifurcate information and data and privacy just based on wealth? Because for individuals, companies in any area, governments, to try and maintain two separate systems. One for those that can afford to pay more than the other. That in itself, I think has a lot of consequences that make it very difficult. So I'm not sure that wealth, I'll be a little contrarian, and say I'm not sure that one's wealth will necessarily provide them with a better opportunity, definitely not in a medical field. The question is in other areas, it becomes a matter of, and it's really interesting, it becomes a matter of what is one entitled to and what are they not entitled to? So I'm looking forward to hearing everyone's comments today. Thank you, Ken. Well, implicit in the question as I read it, it suggests that privacy is somehow a commodity and that if you're wealthy enough, you can buy the commodity. We shouldn't think of it that way. We should think of privacy as a right and it doesn't matter how much money you have, everybody deserves this right to privacy. The problem is that the right to privacy is incredibly poorly protected today. You know, Snowden showed that to us, but it's actually gotten worse, not better than after Snowden. I mean, there are some modest reforms. We've seen Europe make some efforts with respect to data protection, but at the same time, there's been huge efforts with respect to counterterrorism that have justified the mass sweeping up of data. We've seen private companies move toward encryption, partly by popular demand, but encryption only deals with the content of communications. It doesn't deal with the metadata. You can't encrypt the metadata because the metadata has to tell you where the communication goes. And that metadata can be very revealing. Basically, this is a tracking device and governments, companies, the police are able to follow me around and that can be very incriminating. 
Who you call, who you email tells a whole lot about your life. So I think we have an enormous need to have legal protections of privacy that are not there right now. So far the answer to this has been, oh, but you consented to this. When you subscribed to your internet service, you consented; you remember, you read that form very carefully and then you signed at the bottom. But of course, there's no real choice here. There are a handful of companies in any given field. They all have the same gobbledygook in their consent forms. And unless you're gonna opt out of private life, which isn't an option, there's no consent here. That's why we do need a regulatory regime, not a consent regime. And it's just gonna get worse. I mean, that's why I think it's important that we're having this session here. Because when you talk about privacy, people sometimes say, oh, I have nothing to hide, I don't do anything wrong. But as we move toward an internet of things, where, as your question suggested, your refrigerator or your baby monitor or of course the computer sitting on your desk or your TV, I mean, there are many, many devices in your home that basically are gonna serve as data vacuums that just scoop up all this information and send it off to some corporate entity. And that's not even counting how the Chinese may be putting into your baby monitor some kind of virus that picks up things that were never intended. So we are all gonna be vulnerable to having personal, very personal information sent out into some or multiple mass databases. And then what happens? It's not a question of, did you violate the law? Did you see a psychiatrist? Maybe your next employer can go and buy this data and figure out you're not a good bet. Or were you seeing a doctor a bit too much? So maybe that life insurance company is gonna increase your premiums or perhaps not even give you a policy. I mean, there are lots of ways in which just your day-to-day life, having nothing to do with wrongdoing, is gonna be affected by the inability to protect your privacy. So there is an urgent need for laws and regulations here. Businesses should be doing the right thing, but you can't just trust business here. There's an urgent need for laws. I have so many questions. This is just great. Well, to add to the mix, and I think it's great that I'm following this perspective, because I think about digital citizens and rights. But I think it's important to start by thinking about how privacy is not something possessed. It's something that involves an ongoing negotiation. It's not a settled thing that, like a commodity, as you suggested, assumes some kind of solidity. We know that technologies are changing vastly and quickly over time. What privacy is today may change tomorrow. So we have to think of it in a processual kind of way. And one aspect of the question that's been posed, I think, that's really important, and that was suggested by André, is what we in the social sciences think about in terms of the different forms of capital or resources that are necessary to secure something like privacy as a good. Not only does economic capital mean you can buy it, but there is also technical capital, the understanding of how these things operate and the kinds of resources, whether it's cryptography or others, that you can have access to in order to secure your privacy. Or even your social capital, the kinds of networks you're part of and can draw upon in terms of learning about and also securing your privacy.
So if we're thinking of the future, it's gonna be stratified along all of these different forms of capital. And there's not gonna be just about rich, poor, along one dimension. But as in society, more generally in terms of how societies are organized, it will be stratified across all these dimensions. There will also I think be really uneven effects about whether privacy is something that is secured or not. People who have fewer choices and perhaps are more marginalized will be more susceptible perhaps to the effects of not having privacy. That's also something to think about in terms of the kinds of data that we're talking about because it's very hard to generalize and generalizations generalize. So your search engine queries is that a matter of concern? Your Facebook posts, your metadata and communications, or your health data, your genome. I mean, these are very different forms of data and have very different consequences. So when talking across them, privacy is not one thing. Then we have to sort of open up and discuss how it means different things across these various forms. I think the other thing to watch is that people are voting not just through voting as people have done today and through the time poll, but they're voting through their digital actions. They are encrypting their communications. They are opting out of certain platforms. They are creating alternatives. They're maybe not using email anymore. I think that we can also listen to and observe how people are opting out and voting already about this question by saying unless this can be resolved, I don't want to be part of it. So I think those are the kinds of questions that maybe echo a few of the comments already raised, but I think they raise more questions, I'm sure. Well, let me start by saying that this is probably an immensely complex topic and we're probably going to still be discussing the topic of privacy, I think, in 2025. I think there is no radical regulation or legislation on the matter. It's going to be ever evolving as the internet evolves and as innovation evolves. I think there are two scenarios that we need to think about when we think about privacy. I think one that's been raised is that privacy matters. But let me start with a contrarian view. Maybe privacy doesn't matter. And yes, while consumers around the world are concerned about privacy matters, when you look at what they're actually doing, in fact, they're trading off today privacy for convenience, right? And convenience is a big thing in this digital age and new services are a big thing in this digital age. And the new generations that are born today might not be that concerned about privacy, right? Because the trade-offs and the benefits are such that they're willing to give up privacy for new services and convenience, right? And maybe what we're discussing is an issue of the industrial age and not an issue of the digital age, right? So I think we've got to contemplate that scenario. Then the other scenario is obviously that privacy does matter, right? That ultimately there will be a backlash. There's obviously a major concern around the internet of things, right? Because as we've exposed, the internet of things is going to expose a lot more data than what social media is exposing today. And under that scenario, there's a high probability, in fact, that regulators, and most probably in Europe in the next three or five years, are gonna step in and are going to create an individual's bill of rights on the matter of privacy. 
And there's a high probability also under that scenario that new business models are going to emerge, companies that vault data, companies that create innovation that allow those consumers that want to regain a degree of privacy will emerge. And that innovation cycle is accelerating as we speak. So I think an important theme is certainly whether privacy matters or doesn't matters. But behind that, I think the theme of security is absolutely critical, right? Because if privacy doesn't matter that much, then security is at the heart of the debate, right? And so I think these two scenarios are gonna play out in very different matters. As it comes to the private sector, I think the private sector has a role to play today, right? And that role is ultimately about the issue of security, and certainly giving consumers choice. And I think that's an important dimension of the whole debate, ensuring that consumers across the world have equal choice, right? And so that the debate is not about those, a digital divide on luxury or non-luxury, but the debate is around security and choice. So you've touched on exactly, I think, an interesting fault line between right versus the notion that maybe we, this is a right we're willing to trade away, which normally with fundamental rights, we don't like to think about the idea that we might value something more like convenience or like my willingness to, tell me how you react to the way. No, let me jump in, because the, I mean, to be honest, I think it's an oversimplification, what you're saying. We, in limited senses, do sometimes give up our privacy for convenience, but at this point, it's very difficult to have a partial consent. I mean, when I go shopping online, you know, the grocery store we use, the online server, you know, I like the fact that it pops up and tells me what I bought last week, because it's just easier. And I can, you know, click things off rather than having to find them and enter in again. But do I want the grocery company to then send that data to a life insurance who is gonna say, oh, you're eating unhealthy food, we're gonna raise your premiums, you know. And so you've got to look at the nature of the consent. And I may well be consenting for one particular retailer to use, you know, my preferences for convenience. But what are the protections from that spreading? You know, if I go and use a search engine, I like the fact that they remembered what I searched for last time, it just makes it a little bit quicker, a little bit easier. But should they then be sending that off to the government, because I've been spending too much time looking at up Isis or whatever it is. But isn't that you wanting to have it all, where you want the benefits, you want the convenience, but you don't want to have any of the things that make that a business for that grocery store? Well, for me, this shows the importance of regulation. In other words, just because you have consented for one company, for one purpose to have a bit of information, doesn't mean that it's fair game. And the current regime is, once you've disclosed anything to a company, it can sell it to anybody else. The government can buy it. You basically have no right whatsoever. You've given it up forever. And we've got to get past that. We've got to think of, you know, the partial consent that happens here as limited consent. And the presumption should be that information stays with the retailer for the purpose it was given, period. 
And there should be an explicit consent required for that to be sold for another purpose. Still, the government can come in with a formal warrant. If they think there's criminality there, they can make a probable cause case and argue for that data. That's fine. But to be able to scoop up all of our preferences, or to have any company be able to sell those preferences to anybody else, is an invasion of privacy. And I didn't consent to that when I, you know, agreed to shop there and click off what I bought last week. André, you? I would try to consider this element from a slightly different angle. As with piracy, the internet has no borders. And one of the issues that we have is that the values in different countries or different continents are not the same. Take an example regarding freedom of speech. It is quite absolute in the US, much more limited in some countries in Europe. And in the US, the protection is granted through freedom of speech. In some continental European countries, the freedom is guaranteed by privacy. And we really have a fault line between the two. And as the internet is not limited to one country, if you have someone who is under a regulation in one country but doing things in another country, it will be very difficult to address this issue through regulation. And now I think that there is a need to revisit some of the fundamental rights. We have had until now the freedom of speech, but I think that in the internet age, we need to think about the freedom of thinking. Because one element that has not been said so far is that through analyzing personal data, you may now find not only what people are saying, but how they think and what they think about. And if we are not granting the appropriate rights in the face of such tracing, then we may face a real issue. How do you possibly come up with regulations when you have so many different cultural attitudes, national attitudes, traditions around these things? We're talking about competing values in many of these cases. So given how global this question is, how global the internet is, how do you come up with a set of regulations that can operate in that kind of a way? Well, you certainly need some culturally specific possibilities and opportunities, but let me approach that first by responding to the point you made earlier, that the young generation does not really worry about this. Well, I think we have responsibilities to the young generation. We have responsibilities to children and youth in other areas of life and social life. To teach them to care about something they don't care about? Well, also, we regulate lots of things that our children are exposed to, whether it's toys and the regulation of the toy industry so they can't be harmed by them. That's our obligation to protect children too. And so just because they don't see it as a problem, we do not leave it at that. Parents know this too and are becoming more sensitive about posting images of their children on the internet, because we know those images stay forever. And we have to be more cognizant of what kinds of rights for the future generation we are perhaps giving up by not attempting to protect them. So let's think about cultural differences too, not just across borders and boundaries of nation states, but within countries as well. These kinds of issues are extremely diverse and we cannot come up with just one solution, but we can come up with fundamental principles and rights.
And I think that's where we need to start: at the sort of fundamental level, which gets negotiated and translated into specific national regulations, or regulations that are specific to the particular kinds of platforms and data that we're talking about. So we can't come up with one answer to that question, and it would be inappropriate in the diverse world that we live in to try to do so. So I think that means we need to really worry about this also in the ways, and I'm glad you raised this point, of data sharing and consent, which are really critical and key to thinking about privacy. So yes, convenience, I like the same things that you have mentioned, but I want the opportunity to consent to how that data is shared or not across different producers. So let me apply that point of view to health, because you could argue that one of the most promising aspects of healthcare research now is the promise of big data, of looking at all of this data that we're now able to collect. I mean, this knows more about my health than any of my doctors ever did through my life until a year ago. The promise of big data in analyzing health outcomes potentially has enormous benefits for research and the development of treatments. So does the individual's right to keep their data private supersede the much broader public benefit of researchers having access to that health data? Well, I view the data... we are doing a lot with big data right now. We can look in five minutes at a million files and identify that somebody's potassium is increasing and they're at risk for a heart attack, and notify their doctor. So there are things that can be done on the analytical side of it. But I view the medical data that we collect as personal to that individual, and we have an obligation to safeguard it. On the financial side, if I can come back to that a little bit as we're talking about it, it takes a company of a given critical mass and size to afford what it takes to protect it. And so you might find some companies charging more for their premiums or for their services because they have the scale and size to buy the protections and install the systems that are expensive. I mean, I've had our CIO come in at the drop of a hat and say, I need $30, $50 million to improve this aspect of the cybersecurity. And you say yes. Now, not every company can do that. And so that's where the cost and the financial future come in. But to definitely answer your first question, I believe medical data belongs to the individual and must be protected. So, André, you've written that companies need to become more security literate. What are you talking about there, and what is the implication of that for this? Well, fundamentally, through digitalization, you have more and more critical functions in our society that depend on digital technology. Take an example: if you have self-driving cars and you have a cyber attack, you can create a really big accident. So the more you have digitalization, the more you have the capability to do new things. Take medicine: all the personalized medicine to treat cancer is something that gives you incredible possibilities. So fundamentally, not allowing personalized medicine means you cannot be treated in a very efficient way. So fundamentally, to be able to do new things, you will need some data, and you will need to use data in a way that is very personal, that is extremely high performance.
Now, the only way you can reconcile the two is not only to protect much better the data and especially everything that is related to health, everything that is related to things that are extremely personal, must be extremely well protected. Having said that, one element that we have not maybe talked about, there is a big difference between keeping the information, how they say grabbing the information and what is the use of the information. If for example you collect some information about your DNA for example, and if it's used to give you a better treatment, I don't think that people will argue about that. They will argue if this information is used to adapt your insurance premium in a way that is unfair. So fundamentally we have to ask ourselves, is a question of just privacy as such or have we to think about how data can be used even the example of the information like DNA. If the information about you can be useful to improve the way you treat cancer. And for example someone has a cancer today that has been treated, it's not saying that he cannot have something bad happening 10 years from now as a consequence of the first cancer. And fundamentally the data used in an aggregated way may help to solve some issue and be beneficial for the same person. So we have really to think much more about what is the way to handle data in a unique way or aggregated way. And I think that we have really to think more about that rather than just consider privacy as a yes or no question. In August, Tom Wheeler the head of the FCC warned against the discount for data business model and said that if you provide discounts to customers who agree to share their data that he thinks that that is going to drive us in the direction of privacy becoming a luxury that people just can't afford. How would you answer? I think fundamentally what I was raising was that there's two scenarios that could play out. I think that there is a requirement for regulation as we've just talked about here, that there is a requirement for enhanced security. But a point that we haven't touched upon that I think also relates to this is the point of education and the point of consumer knowledge on the matters. And whether our education systems also need to step up to educate at the youngest age also are consumers as to their rights, as to regulations, but also as a matter of understanding what is potentially done with their data. So I think that to Ken's point there's no oversimplification. This is a complex matter on many points. I don't think that we can go to a model where we over-regulate because regulation will always be behind. So hence why I put on the agenda also the issue of education. Well, that feels like it touches on what you were saying, but I do think it's an interesting question about if the 20th century or pre-21st century view of privacy as something that we value as a fundamental human right as a good that deserves to be protected. If that notion were to fade away what would be lost? Well, I wouldn't assume it's gonna fade away, but I mean, what's lost? But it certainly is under, it's not in a certain way. I mean, one thing we've seen is that there's a direct relationship between the right to privacy and the right to free expression. That people are more willing to talk if they feel they have a degree of privacy because you don't always talk with a megaphone for the government to hear. You often just talk among a circle of friends. 
But if you fear that the government is gonna intrude, and particularly if you live in a place where the government may not be so respectful of dissent, a lack of privacy means a lack of free expression. So I think privacy is a key element of self-actualization overall. Now, I mean, I'm all for education. If you educate kids, don't put photos on your Facebook page of you drinking because the employer's gonna see it and you're not gonna get the job, that's smart education. But that's very different from saying education is a substitute for regulation, because education ultimately is premised on a consent regime. We're gonna educate you to be a good consumer and then you consent to share all your data. But we can't really consent. I mean, it's just, you have to bow out of modern life these days if you really are gonna try not to have your data out there. So you need protection of it. Now, we talked about big data, and I think it's useful to distinguish between different types of big data. Big data, you know, anonymized, can be very useful. You can get a sense of the public health of a broader population using anonymous data on health. I mean, just, you know, driving with your Google Maps. How do they know where there's a traffic jam? It's because the phones aren't moving. So, you know, as long as it's anonymized, I'm fine with that. But what I have a problem with is, if you look at what the U.S. government used to justify its surveillance, they said, we need a haystack to find the needle. You know, how are we gonna find the needle, being, you know, some terrorist plot, if we don't keep the haystack, which is all of your communications? So therefore, you know, we're gonna de-anonymize it. You know, we're gonna know who is talking to whom, who is emailing whom, and we're gonna keep all that data in our computers. And then that became problematic. And indeed, I mean, everybody should worry about the U.S. government's approach, because at least the U.S. recognizes some nominal right to privacy for Americans or for non-Americans in the United States. But the official U.S. government view is that it owes non-Americans outside the United States zero right to privacy. It has complete power, in its view, to sweep up your phone calls, the contents of them, to read your emails, and to share those with your governments. And that is the official U.S. government view. And of course, then everybody else takes the same view. So there's this so-called five-eyes arrangement between the U.S., UK, Canada, Australia, and New Zealand. And so if the U.S. government can't snoop on its citizens, they just ask the Brits to do it, and then they share it back. You know, so the current law is abysmal when it comes to protecting our privacy. And there's an urgent need. You know, yes, it's hard maybe to do universal laws with real specificity, but we have to focus nation by nation on improving what we have, because what we have right now is virtually nothing. You know, I think you said something that is so important. That is, somebody's willingness to share information depends on whether they believe it's gonna be protected and held private, or whether they think it's gonna end up in the public domain. And that's true of medical issues, that's true of legal issues, so many things where, you know, what happens once you give it up is open to question. The single biggest issue is we will have less information available that can be very impactful, whether it be health, or lifestyle, and so many other things.
There is an importance in teaching education and educating people how to protect their privacy, how to protect their credit cards from being stolen, their identities from being stolen. There's things that have nature, that's also education. But you really touched on an important point too. Can I comment on that? Because I think on the education question, I agree it's very important, but I also feel that when we think about citizens and rights, we also have to have mechanisms for people to claim rights and not just know about what existing rights we have. If we look at the history of rights and rights claiming whether it's civil rights or political rights, it's a capacity for people to make demands and make claims on new rights, ones that don't currently exist. And to me, that's what citizenship is about. It's not just being educated to perform as a good citizen, but to have the right to claim rights. And how can we build that into the very workings of the technologies that make up the internet? I think that's really a critical challenge and that's not just for children, but it's also for us as adults. So it's beyond regulation, it's beyond education, all of those important. But what mechanisms exist within the current configuration of the internet that enables someone to make a claim or a demand that my data does not get shared or I consent only to this? Those are the mechanisms, the active ones that we need to incorporate into the workings of the internet. Again, back to my first point, it's a process. It's an ongoing thing. We cannot settle it through educating somebody today. What's the evidence that there's a demand? I mean, you touched on this at the beginning, but that there's a demand for this, that my cell phone contract should have terms of service that I actually can read and understand as opposed to that endless... How much demand are we seeing from, again, if this is something that citizens in countries are not pressing for, asking for, demanding from, whether it's from the government or from private companies, then how is that going to happen? I think on the production side, we can see, especially in the invention of new apps that secure privacy and allow one to communicate through encryption and cryptography and their take-up is one sign or one piece of evidence about this happening, or people who are opting out from consenting to cookies and saying, no, I don't consent to that, even if it's gonna make my platform work better. So I think this is a growing movement. As in all historical movements of claiming new rights, these things happen slowly, and we're still the very early days of people coming to understand this. So we need to look at these actions as the emergence of new claims, and that's how we start to study and start to think about ways that we can enlarge the possibilities of that. And I think that's a very important point, the timeline on all of this, because ultimately, the point I was making is that consumers today around the world are basically opting for convenience, right? And we're seeing that. And even if these new services are growing, they're still very limited, right? I think it's 10% of the world population is sort of downloading to block ads on their browsers, right? So it's still very limited. But I think as more revelations appear, more scandals appear, as we ask fundamentally these questions, regulators potentially step in, I think that there is going to be more and more concerned. This agenda is gonna evolve ultimately. 
And I think it's ultimately about finding the right balance between new services, convenience, but also the right levels of protections. And I think we've gotta be careful to ensure that that balance always exists. And I'm sure the pendulum will swing one way and then the other. Over-regulation could kill innovation, right? But obviously, a completely free internet with no privacy could create major problems for consumers down the road. So it's a balancing act that we're gonna have to achieve in all of this. And maybe the starting point is around the point that you raised, which is about first looking at key principles, and for our governments to agree on key principles of privacy. Now, we are mainly speaking about the balance between privacy and some economic value. I would just like to make a short flashback on security. Everyone was thinking about economic impacts or some terrorist action. Now, if we look at the last US election, cybersecurity has been invited to the table in all the discussions. And now I'm just wondering if we should not also look at what the trade-off is between privacy and the possibility of influencing some votes, with the impact on democracy. So just to say that one element is to think about the economic element, because in the end you can opt in or out of buying something or not. But if you use elements of private information to target people to make them change their minds... let me just try to make an analogy with a microwave oven. Basically you send some waves, microwaves, at exactly the frequency that makes the water resonate and start to heat. You may imagine that if you have some private data, you can find the exact emotion that will make that individual react in a certain way. So I'm just saying that we need also to consider this part of the equation in this question of privacy. Where does that take you? I think you're talking about the importance of security, first of all. And I agree that every corporation that has data has a fundamental responsibility to secure it. So I think we can probably all agree on that, and we can probably also agree that it's a neglected duty. So I'm on the same page as you there. I think that we should think about what principles there could be broad agreement on at the level of governments and at the level of corporations, because you can't micro-regulate in a forum like this, but you can come up with principles that would have an effect. And we should recognize that there is a trade-off between privacy and money, because the lack of privacy is big money. The corporations are selling our data and whole businesses are being built around the data. So there is huge resistance to limiting access to that data. But principles that I would like to see would be that there's a presumption that you didn't consent unless your consent is explicitly given. I would like to see a presumption that if you do consent, it's for the narrow purpose of the use of the data, so I can do my shopping more conveniently, but not so it can be given to a life insurance company. And then I think if you had presumptions that required explicit consent to broader uses of your personal data, that would provide enormous protection. We should recognize there's gonna be big-time corporate resistance to that. How would you feel about a sliding scale: if you charge me this much for the service, you can have this much of my data, but if you give me this much of a discount, you can have this much more?
Which is essentially letting people monetize their right and sell it consciously in return for whatever economic. Yeah, I mean you could make that argument about any right. Do you wanna pay for a trial or do you wanna just have the cheap version of taxation and if you get arrested we'll deal with you. There's certain things that shouldn't be bought and my inclination would be to say we should not make privacy something just for the rich. We should come up with standards that protect everybody without regard to money then you let companies build around that and if it's valuable to them they can advertise and encourage you to consent. But they shouldn't be taxing you in essence to be able to preserve your right to privacy. I think if you did that you might decrease the value of the data that they get. Cause then the next question is gonna be well what can you tell me about the person that won't let me have it? Versus the person that took the money to do it. So you know when you start bifurcating things one result impacts another. So if you did that all of a sudden I think you'd find nobody'd want the data. Cause they couldn't be sure. Cause they're interested in the law of large numbers. I think there's also beyond the economic value we should also put some attention to there's great public good and public value of data. I think health is in one really potential good area but a lot of governments see this as a new opportunity to better understand and know services and populations in ways that they could never have done before for the purposes of better public policy for the purposes of addressing inequities, et cetera. And I think we need to also recognize that it's just not for commercial good but it's for broader public service goods that a lot of this data could be of value. So in a consent model can we have a consent to allow for public authorities to be able to use some of that data for the purposes that it could be put to in terms of better transport services or better social welfare services. But a lot of that could be done I think through anonymous data. I'm much less concerned about anonymous data. And so if the question is building better roads and you wanna know where people are traveling you don't need to know the names of the people traveling. I completely agree and I agree it's gotta be anonymized and I think governments are completely clear on that but we have to also be careful that anonymized data does not necessarily mean it's protected in terms of against re-identification especially when data is joined up. But also there are pernicious problems of anonymized data being used to profile particular groups and then making them targets of particular interventions and they can make groups vulnerable too. And there needs to be mechanisms for addressing the ethics of that which just gets into a whole other problematic area. Even a bigger issue because fundamentally if you can extract a rule on the group of people and to consider that these people are at risk for one reason or another one it's even worse than a specific individual because a specific individual you have some probabilities that can be pretty low but if you look at the group then you may have something that is much more dangerous. Yes. I mean it seems to me that the social service provision problem is exactly one of the places where you go to a gap between rich and poor. 
If you want access in many communities to food stamps or transportation vouchers or housing vouchers from public service then you have to give away a huge amount of your personal information whether you have a criminal history or social security number all of that in order to have access to those things. Absolutely. That brings us back to who's privacy. The amount of digital traces that each individual leaves is highly stratified in terms of the kinds of interactions or in government transactions that they have no choice to engage in. And there are particular groups who become targeted ones of digital data collection. So we can't speak of this again as everybody has the same sort of data traces and volumes of data collected about them. So it's highly unequal. We are running very short on time and I don't want to miss the opportunity for people to ask questions. We don't have a lot of time for a lot but I welcome hearing from you. Hi I have a question surrounding the privacy rights around this new Alexa and Amazon Echo. In the United States in the last month there has been a murder trial going on where prosecutors are trying to get information that might have been recorded on the Amazon Echo because she's always listening. And the prosecutors are saying that the Amazon Echo may have information that will help them put someone behind bars for a murder. And there's been a lot of discussion about whether Amazon should be forced to give the information because it's in their cloud. And it raises tremendous privacy. Ken I'd love to hear you on that. I mean it recalls what Apple went through with unlocking iPhones. I mean this is now familiar territory. Where do you come down on that? Well I mean in principle I don't object to targeted government inquiries. If there's a showing of probable cause that a crime was committed and a reason to believe that evidence is gonna be here or there. You know that's classic search warrant areas. So I mean I think that scheme is less problematic. What this raises though is the question of how much data are companies gonna retain? Because if the company retains the data it does invite the government to snoop after it. And I think we've been too little focused on data retention. Companies like to retain data because it's valuable. They can learn more about who you are. But there's a cost to us in terms of our privacy because if the data is sitting there it invites somebody to come in and seek it. But what if Amazon argues look if Alexa can be subpoenaed then she becomes less valuable and this is damaging our business model. Does that? Well I mean obviously you say the same thing about your email is less valuable if it can be subpoenaed or your phone calls are less valuable. I mean I think it's just a modern day version of an old problem. I think if when somebody puts this in their home if they understand that could happen and they made that conscious decision then it's fine. But you have to let them know that that can happen. But if you disguise that and they get surprised by it. And the difficulty is a lot of us don't have a choice. I mean I think people don't realize that this is, this monitors everything happening in the room. And it can be turned off it doesn't matter. And that's why if you ever visit an embassy they make you stick your phone in a vault before you go in to see the ambassador. Or if that's not possible you take out the battery but you can't take the battery out of these things. 
So the only way to ensure that this is not a listening device for whomever is tapped into it is to get rid of it. Or give it to somebody else and tell them to walk around with it. Yeah, no, but I mean that is the difficulty of day-to-day life, because most of us go home and you plop your phone down on the desk or at your bedside and you don't think about it, but it still is monitoring everything in the room. And that's a problem. And we're gonna have more and more of these monitoring devices, because it's not just an issue with Amazon Alexa. The smart TVs in our homes are in fact listening devices. They're connected to the internet. So with the internet of things, we're gonna have more and more of these devices that potentially are tracking exactly what we're doing. So I go back to the point: there is still a fundamental issue about consumers understanding the benefits, but also the issues that these devices potentially create for us. And on the issue that we've debated of consent, for example, I think there really needs to be a close look at the structure of the consent forms that are being used by enterprises today, because ultimately private businesses are producing consent forms of three, four pages that no consumer can understand. So I think that's a fundamental issue that will need to be addressed also. Consent forms need to be, I think to Ken's point, simple enough for the average consumer anywhere around the world to understand, and they can't be these three, four, five page legalistic documents that protect the enterprises without being understood by the consumers. Yes, ma'am. Deborah Estrin, Cornell Tech. Do you think it would be helpful and viable if individuals were able to see the data that is held through their consumer devices, if I had a personal data API to be able to see it in some legible form, in the same way that I can look up now in a credit report or on a credit card bill what is being charged to me? You'd be appalled. You know, but it would be. I wouldn't be, but it would be. But yes, it would absolutely be interesting. It would be very instructive, because I think it would also wake people up. I mean, that's part of the education. Let that be a right that we ask for. Yeah, I think there are proposals to have an app that can tell you what all your apps are doing and what all your apps are collecting. In one simple interface, without the legalistic language of 23 pages, you could have an app that says, this is what is happening with your phone, these are the possibilities of what can be collected as you are using it. That can be made through a very simple interface where one could say, oh, I'm turning that one off. Or, I have a right to the data that's being collected and I should be able to request that in database format and analyze it for myself. I wanna see what's being collected about me. So I think these things are technologically really possible and available to us. And it should be a right that when you buy a phone, that app is part of it. I mean, it does raise the question of whether regulation and a government role are likely to be more effective versus technological solutions, or ones that the private sector essentially makes available because there's a demand for them. I don't wanna miss the people behind me, and forgive me for having my back to you this whole time. Yes. Yeah, my name is Leonie Haimson. I'm an education advocate from the United States. And one of the things we're very concerned about is student privacy.
Because right now the position of states is that once you put your child in a public school, you lose all right to their privacy and their data. So already we're seeing people pull out of the public school system to homeschool their children or put them in private schools in order to protect their privacy. So adults that may not feel very sensitive about protecting their data are passionate about protecting the privacy of their children, especially when it comes to disabilities, their health information, which goes straight into their education records. And now there's a push to overturn the ban on the federal government from creating a comprehensive student database. And we've already seen this in England where they have started creating this comprehensive database, which was promised to be only used for research. And now it turns out the home office has been asking for data for immigration control to be able to identify children who are undocumented to push them out of the country. And for the first time since the Trump election, there's concern that the federal government will ask for state data because some of the data that's being collected on children in public schools is their nationality and whether they're immigrants. So this is an ongoing concern because it turns out that student privacy is even less protected than medical privacy or health data. And yet health data goes straight into your education records. We are, unfortunately, we are out of time. And this is one of, in classic Davos tradition, one of those debates that I am confident is going to continue over your lunches and coffees and conversations through the day. But I appreciate the perspectives and the insight that each of you have brought. Thank you very much for joining us and thank you all of you. Thank you. Thank you. Thank you.