Thank you for squeezing into this intimate setting on time. So I'm going to honor your commitment by starting on time. Thank you very much indeed. And my apologies to those of you who are looking at the back of my head. But I have eyes there, not least through that one-way glass. So don't think you've been forgotten in any way. We are hyperconnected. And there you can see the hashtag. So please use that, because it's a way for you to reflect, as you listen to our guests, on what you think they're saying, whether you like it, whether you don't, and new points as well. And I can inject them rather than necessarily coming to you at some point during the discussion. But I do want to come to you pretty soon, because in the end many of you will have questions about trust, or not. And you will have your own views, and maybe your ideas of the direction of travel that we can take this in over the next 58 minutes or so. Let me introduce the panel, because it's not quite as you expected. And I hope, actually, that means you'll have even more incentive to stay, because we have five guests, as you can see. And let me introduce them. Marc Benioff, who is chairman and chief executive of Salesforce. Then we have Tim Berners-Lee. So Tim Berners-Lee, welcome, Tim, a professor of engineering at MIT's Computer Science and Artificial Intelligence Laboratory. Then we have Mike Fries, president and chief executive of Liberty Global, 28 years in the cable and media industry. Marissa Mayer, welcome, who is president and chief executive officer of Yahoo. And we're delighted that we've been joined by the new European commissioner for the digital economy and society, Günther Oettinger, welcome indeed. Now, this is about trust. I can summarize, but I suspect all of you know where the issues of trust are in this. And as we are hyperconnected, please do use that hashtag if you want to, to get your voice and your thoughts over at this point.
Of course, we ask the question: in tech we trust? And how can trust in the hyperconnected world and company be preserved? The questions come up of who can be trusted, what can be trusted, what is the problem. And that's where I'd like to get to initially. How significant is the problem? And Marc Benioff, can I come to you first? Because you have just written an article, The Digital Revolution Needs a Trust Revolution. You're accepting there is an issue here. Why are hyperconnected companies worrying about preserving trust? And what have they got to do? Well, I think that we're in an incredible shift in the technology industry. We've seen the world move, of course, into cloud computing. We've seen the world move into social networks. We've seen the world now move into mobility. How many people here have a mobile phone with them? Everybody? Why even ask the question? And they should be using it, because they should be using the hashtag. So don't switch it off. And we're about to move into the world of data science and artificial intelligence as well. And what that's yielded for our industry is that we've gone from where we started with systems of record. We've moved into these major systems of engagement, which include consumer systems of engagement. And now we're about to move into systems of intelligence. But none of these things will maintain their form, and none of them will have referential integrity, unless customers trust them. And that's really where we are today. And what we see is different organizations taking on different characteristics of trust and also different levels of transparency, which will ultimately determine where we're going with trust. And ultimately, only through radical transparency are we going to get radical new levels of trust, which is where we have to get to to make this new world really work.
Given that you've written about it, do you think the industry and those in it, the big corporations, the smaller corporations, those moving very fast, accept and understand the level of the issue that has now been created? In other words, trust is a serious problem now. Trust is a serious problem. You can see it across the world. I don't have to go through the stories. Everyone here has got their own personal story. Everybody has seen a societal story or a cultural story. The reality is we all have to step up and get to another level of openness and transparency. And that's not necessarily comfortable for everybody, especially the vendors. So whether you're an enterprise vendor, which is where we are, or you're a consumer vendor like Marissa, we all need to open up a lot more, to be able to say exactly where the data is, what's going on with the data, who has the data. And if there's a problem with the data, whether it's a security problem or some other issue with the data, immediate disclosure, complete and total transparency. That is, no secrets. Because only through that transparency are we going to get to a higher level of trust. And that is not where we are today, as we all know. Marissa, as a social media website, news provider, video and content provider, and search engine, where do you think the issue of trust is? How much under pressure do you now feel? I think that what it comes down to is that personalized technology is better technology, and a personalized internet is a better internet. And for you to have a personalized internet, you need to store your data in the cloud. The question is, what do we need to do to get people comfortable with that? And ultimately, trust is about someone weighing trade-offs. How much privacy do I have? How secure do I feel? What are the benefits that I get in exchange for that? And I agree with Marc. You need to have transparency in that world. I think you also need to afford the individual choice and control.
It comes down to being able to make a statement: the users own their data. They should be able to examine it, take it with them, bring it to other sites, bring it to other vendors that they trust more; basically, have a system and a market that helps people make these trade-offs and these decisions. But they should have control over how they use the system and whether they use the system at all. Is the state of that trade-off clear at the moment, in the minds of the public out there? It's difficult to generalize about the public, but you know where I'm taking you. I think that overall, people sometimes have a difficult time making these trade-offs, because arguably some of the vendors are not being transparent enough, not providing enough controls and choice. And I think that as we address that, it will make the types of trade-offs people want to make much clearer. Because of course, when you look at mature industries, for example, we all tell our governments where we live and what we look like in order to drive. We all tell our governments exactly how much money we make and how we make it in order to partake in civil services. So there are a lot of areas where people already give up a lot of information about themselves. Without knowing it, often. But they ultimately get a lot of benefit. And I just think that... But without knowing it? Well, I think sometimes you know it and sometimes you don't. And so if you start to look at various communications companies, payment companies, they do have a lot of data. But I think that if you know where that data is being stored, how it's being used, and you're given choice around it, that actually goes a long way. Mike Fries, what's clear in your company about where this is going on rules and responsibilities? You serve tens of millions around the world. And you know what they watch. You know who they call.
You know how they use what you put into their houses or onto their smartphones or whatever. What are the rules and responsibilities? Are they clear to you at the moment? Are you under pressure? So today, with all the data you described, let me give you some facts. We probably on average have access to 50 billion hours of viewing from our 27 million customers, and 30 billion clicks a month. And today we do nothing. What do you mean you do nothing? We generate zero revenue from all of that information. Let me tell you where I'm going here. I think there is a big problem with trust today. I think we've all seen this train wreck coming. And why is that? Well, consumers have shared everything about their personal lives, of course, on the web. 90% feel like they've lost control. 85% have tried to do something to protect themselves. But 60% know that they're in trouble when it comes to sharing that information. And secondly, you've got a big disconnect between data protection and data retention for purposes of the government, for example. So on one hand, governments are saying we've got to protect consumers, we've got to make sure consumers' data is protected, in Europe and elsewhere. On the other hand, we have to retain that data because we might need it for government or security purposes. So there's a disconnect there that has them worried. And then, thirdly, you have things like what's happened with Sony and state terrorism, and all these things have gotten them nervous. Net neutrality has them convinced that someone's out there to screw them. It's surprising to me consumers aren't even more worried about what's happening. Big data is big business for a lot of people. We're not one of them. We'd like to be. What's holding you back? It's this issue. It's a concern about whose data it is. Let's start with that. So you've taken an active decision. Well, we're starting to work our way into it.
I mean, we're gonna find a way to monetize this, but we have some principles. First principle is consent. We won't do anything in terms of personalized viewing, or use your data for any other purpose, unless you approve it. That's the small print, the terms and conditions no one reads. When you sign up, when you put our new advanced TV box in your household and you log on, the first question is: do you want us to use any of your data for personalized viewing? Yes or no? 70% say yes. And they sure like the fact that we're asking them. It's not as if we use it first and ask them second. We ask them first and use it second. And that's a big difference from social media or other aspects of their internet experiences. So I think there is a big trust problem. I think if you're transparent, that's the right word, if you ask their consent first, if you're open about what's happening, whether it's lawful intercept or what you might be doing with the data, then there's a chance of continuing to monetize. It's a $150 billion business, big data. But let me be absolutely clear before I go to Tim and the commissioner. When you say you're doing nothing, that is an active decision at the moment, to do nothing until you're absolutely sure about the implications. Absolutely right. Correct. At the moment, you reckon you're denying yourself cash flow? Well, I've heard there's on average $60 a year per internet user generated from data. I don't know if that's the right number. I just heard it somewhere. Okay. That's a big number. We would take a small portion of that if we could. Let me just check, Marc. Restraint, self-restraint. Do you restrain yourselves on opportunities? Well, you know, we're the enterprise cloud. So what that means is that for the customers we work with, which are the General Electrics, the Philips, the BMWs, of course it's their data. There's no question. But on doing nothing, is that an option for you? Of course.
We can't do anything without our customers saying what we can do. Because first of all, it's their data. They tell us where they want it, how they want to use it, what applications they're using it in. We can't see it. The data is black to us. It's encrypted. But I think that very much is a model for where the consumer companies are gonna have to go. The enterprise companies are in a place where we recognize we can't do anything without our customers saying okay. That's our agreement with our customers that we sign with them. In the consumer world, it's a little bit different. And I think that's what Mike is getting at, which is that sometimes you know what's happening and sometimes you don't. I, for example, use a consumer email service. I'm not gonna use any brand names. And I don't know where my email is. I don't know what country it's in. I don't know what laws are regulating it. I don't even know if the vendor knows exactly where the email is. And that's gonna change, right? You can't just be searching on the internet, using consumer services, doing various things, and not know what's going on. You're gonna have to have complete and total disclosure. I'm just going for a data point here as well. And that's why I think the enterprise cloud, if you will, is a model for the consumer cloud. But Marissa, can I just ask you as well: do you often take a decision to do nothing? Let me just quote you what Mike has just said: 90% of consumers feel they've lost control of their data, despite what all of you have said. Well, I would say there are other statistics too: 70% of people like the internet being personalized. 80% of people expect their smartphone to understand where they are. 90% say they've lost control. Et cetera. And so I think this is why I brought up the principles of transparency, choice, and control. Control, consent, you can use either word.
But the idea is that you are very actively acknowledging what you're doing, and we're being very open about how the data's going to be used and where it's going to flow. But you take active commercial decisions at the moment not to enter into a certain kind of arrangement. Certainly. So, for example, we don't sell your personal data from Yahoo. You can't say, oh, I'd like to understand exactly what this person did. We might say, look, we've done an analysis on this group of users, for example, in audience advertising. And we may, for example, target ads, while the data is still retained at Yahoo. But we definitely have principles that govern what we will and won't do, and we don't transfer your personal data to third parties. All right, we just wanted to check with all of you on the commercial side. Tim Berners-Lee, we just heard from Mike that this was a train crash waiting to happen. Is this the kind of train crash you always expected? Trust. I think that's a broader question than the one you've been asking everybody. But I think behind this train crash feeling there's a model that we all have. Partly it's been painted by the press, partly it's true: here we are, and I use these services; I have to, because that's how I live my life. And then all these companies that provide the services, they're taking all my data and they're doing stuff with it and they're locking it away.
And they're selling it to other people, and they're concluding all kinds of things about me that may not be true, and they're selling those conclusions to other people, and it's got out of control. Now, to a certain extent, things are out of control. At one end of the spectrum: before the iPhone, to mention a brand name, had the ability to easily turn on the flashlight, you could get thousands of different apps which would do it. And depending on which one you picked, you'd download the thing just to turn on the flashlight on your phone, and it would ask for access to your calendar and whatever. These apps are deliberately made in order to steal your data and build a profile of you, and these companies are sort of nefarious in what they do; their whole business model is stealing data and trying to build profiles of people, and not helping them in their lives at all. Then there are people in the middle, like a lot of people around this table, who are providing a service. They're holding the data, and in fact they're not being nasty with the data; they're having to do some amount of analytics in order to do a good job. Yes, a personalized service is a better service; it means that you can buy your clothes more quickly if somebody remembers what size your body is, and things like that. So in general, there, one can have quite a lot of trust. And then there are some places where the app is completely serving you. This used to be the case; in a way, perhaps you'd like to put Salesforce in that category, because it's an enterprise app. There's a model there that says, hey, if I use this, I'm prepared to pay for it. In the old days, I used to pay for apps running on my computer, and they would be written by somebody who wanted the app to do good for me.
Some people have been trying out the word "beneficent". When you do experiments on the web with people, one of the things I have to convince the committee at MIT of is that what I'm doing is beneficent, that it's basically good for users. I'd like to have a brand, and in fact I've talked to a few people who are quite excited about starting this: suppose we have a brand where we say this is a beneficent app. That means, while I'm writing the app, you're gonna pay me for the app, and every time I think about what it's gonna do with your data, I think about what you'd want. So that business model is one which has been almost completely lost, and I think we will end up moving back to it, with people making strong commitments to do what the user wants. And by the way, it involves creating a new architecture for how we store our data and how we control access to it. Okay, well, let's come back to that. We'll get back to it. Commissioner, what's your view? You've heard how they accept there's a problem. Do you accept in Brussels that there's a significant problem? Are you going to be reactive, or are you going to be able to work in advance of the problems, given what Mike said about a train crash which was predictable, and given that it's probable the next generation of problems in hyperconnectivity can be predicted, even if they're going to come incredibly quickly? We are in a digital revolution, and we need a data revolution in parallel, a smart and pragmatic revolution. Let me compare. For decades and centuries: take the first railway systems and trains, and there were so many concerns. Or take Henry Ford, Karl Benz, Gottlieb Daimler, who built their first cars as entrepreneurs, as startups, and many said, no, we don't need cars, we have horses. Or take the first computers.
And people were saying, it's not good for the eyes, for the health. Take biotechnical products for medicine or for food. So there's a huge potential on one hand, and there are some risks on the other hand, and we have to balance it out. And... Is trust something that you think Brussels now has to get involved in, to try and make sure that the trust levels are higher than they appear to be running at the moment? No doubt. In Europe, we have not so many digital natives as the U.S.; we are an older citizenry. And take the Sony case, take Snowden: there are some concerns, and people don't trust. And so transparency is a clear first advice. And the second point is, we need a convincing, global common understanding. And we need a UN agency for data protection and data security. And what we are doing is to Europeanize data protection regulation and data security rules. Are you moving? Because we have 28 fragmented regimes, and that's not convincing. If anybody from outside Europe wants to do business in Europe, he goes to a member state with a low level of data protection, and he can get all data from all over Europe. So our data protection regulation, which is in the Council at the moment, and our network information security directive for data security, are our offers to the industry, to IT companies, telecom companies, to the whole industry, take the Internet of Things, take our Siemens, BMW, Mercedes companies, and on the other hand to our citizens. And when we are ready, then we would like to come into contact with the US, with Japan, South Korea, and others, to come to a global understanding, a global culture of a pragmatic, balanced data protection system. Now, let me be clear. In your mind at the moment, are you moving? You used the word understanding. You also used the word regulation. Are we moving towards understanding, regulation, legislation? What is going to be the framing, in your mind at the moment, of how this can somehow be addressed to create a higher level of trust?
You talk about technologies which frankly took a long time to evolve. And what we're talking about is hyperconnectivity, which is progressing at a rate which often is mind-blowing, because of both the capacity and the impact. Is it clear? Pragmatic, market-based regulation. Additionally, it's the industry: codes of conduct, or certification mechanisms, or privacy by design. So it's both. It's a public-private partnership. We need our parliaments and we need our industries and all players, and we have to come to a common understanding for legislation and for practices. Right, is this achievable, Mike? Market-based regulation. That's the big question. Well, I'm giving you the question. What's your answer? Not in the near term. How long is the near term? I think it's going to take several years. That's my view. Despite the commissioner's objectives, and I think they're the right ones. I was just meeting with Angela Merkel. She has the same objectives. Everybody's intentions are good, but working in Europe for as long as we have, 25 years now, we know that getting 28 countries to agree on one framework, one set of standards and policies, is not an easy thing. So I think you're going to find local parliaments, local governments taking a stab at it, which is not great, but something needs to happen. And I hope with your and your colleagues' support, we can start some broad standards, because that is definitely needed. All right, well, let me just go around all of you very quickly. Marissa, market-based regulation. Comfortable with that? Achievable? I actually like Tim's idea better, of the beneficent marketplace, where you basically will just make a trade-off around who you trust, how they handle your data, and actually probably monetize your data yourself. And I think that type of market-based force is likely to get us to a better, more forward-leaning solution that comes up with advantages to end users more quickly.
I'm tempted to ask you, Commissioner, are you now going to go back to Brussels and ask for the translations into multiple languages of the word beneficent? Because I'm sure on Yahoo we can look it up. But Marc, what's your view about market-based regulation? I think actually what the Commissioner said has proven to be the right answer, which is that we need... Achievable? And achievable. We need a public-private partnership. And we've already seen that the tech industry can get out of control, the entrepreneurs can get a little bit wild. Tim gave us a great example already, but we all have to go back to our friends here in Europe, Neelie Kroes, and Viviane Reding and her Right to Be Forgotten concept. What's that? The Right to Be Forgotten concept. And both of them have added a huge amount, actually, to the conversation. And I think that's a great example, in that you've got Viviane and you've got Neelie, and they have added tremendously to our industry, period. And one aspect is the Right to Be Forgotten. That is not something that any entrepreneur back where I'm from, which is San Francisco, was ready to implement, because they want to keep that data in perpetuity, harvest it, do whatever they want with it, and not be held accountable by any user, current or former. And that's why for the government to come in, specifically the European government, and to say, no, this is something that we need, that is the right role, where government needs to be looking out for our rights and provide a safety net to our industry, and that can only happen through partnership. All right, let me now move to all of you. I've got a lot of tweets as well. Who'd like to come in? Jonathan Zittrain, do you want to come in? Anyone behind me? Please. Sit down, it's okay, just get a microphone. All right, I tell you what, while the microphone's arriving, has anyone got a microphone they could connect us with? If you could get the microphone over there.
Let me just ask one question, if I may, to Marissa, about Tumblr, another company that you acquired. How does Yahoo manage trust within companies that it's acquired, like Tumblr? They conform to our standards. So we know that when someone comes to a Yahoo property, even if it's not branded as Yahoo, they're expecting that property to behave with the same principles that all the other properties we have pursue. And so we've worked to actually standardize their terms of service and their privacy guidelines to ours. All right, fine, that from Vibo Kipan. Jonathan. Thank you for a great discussion so far. And Nick, I think you're really kind of on the right path in talking about self-restraint as one of the themes. And I think it's right that Marc has it pretty easy, because his self-restraint is defined by customers that can negotiate with him somewhat on par, because he's in the enterprise space. And I wonder if Marissa and Mike have it maybe a little bit harder, as do the rest of the consumer-facing industries, because they tend to push the envelope as far as they can, and then they step in it in one way or another, real or perceived, and have to kind of retreat a little bit. It's kind of the sort of feedback a donkey gets when it's hit across the face with a two-by-four. So what's your challenge? My challenge is: is there a way to get ahead of that cycle and think about a set of principles, of declarations, kind of akin to the one that Marissa already invoked, about you own your own data, that first would say, these principles are meant to be enduring? It's not "you own your own data until we figure out how to monetize it". But when you hear Mike say it's going to be several years. That's going to fly by, and then I don't want to... That was about governments agreeing, all agreeing. Yes. This could happen tomorrow, what you're describing. Yes.
So I'd love to see principles that the companies unilaterally announce, possibly modeled after the kind of Magna Carta that Sir Tim has called for and, with the Web We Want campaign, has asked people to weigh in on. And I think those principles would include the sort of you-own-your-data stuff. That's about the noun, people's data. And there should also be principles about the verb. And by the verb I mean the algorithms. And Marissa talked about personalization. People do like personalization, but as soon as you're into that, those are choices that the company is making about what to present to you. Now it's advertising. I think we know more and more it's native advertising. Native advertising means advertising that doesn't look like advertising. And it would be awfully nice to know that the company is representing you, rather than an advertiser, in sort of slipping in those suggestions. Come on, Marissa. Well, here's the thing. You don't really have to go any farther on this issue than right here at the World Economic Forum, because we're a strategic partner of the World Economic Forum, so they use Salesforce. And they are managing all of our information. So of course they know when we walked in this morning. I don't know if you're aware of this: you're wearing a badge. And that badge, of course, there's a reader, and they're collecting where you're coming in. You didn't take that for granted. Hold on. Marc and I had a fascinating conversation last night. What are you gonna reveal, Marc? Well, I'm just gonna give you a general idea here. There's ambient awareness of where you are throughout the conference, what sessions you're going to, what you're interested in, what you've clicked on on the website, what you've clicked on on the mobile application. What you're tweeting. They're capturing all of your tweets also. They're bringing all of that in together.
All your knowledge: these coordinators who are in the room, at the end of all of the sessions, they write up and capture all of the key knowledge from the session. It all gets tagged and gets linked to the participant. You can find that if you go to the central staging area; you'll see the graphical tools they have that let you navigate that information. You can go to weforum.org and then get that knowledge that's coming out. Very useful for all of us, right? So I'm very passionate, for example, about the oceans. And I'm very interested in what's going on in the oceans. And I went into the ocean section yesterday on weforum.org, and I learned a huge amount about the oceans, because the system is really working. They're building knowledge, they're building networks of people, they know who you're like, who you're not like, who you're connecting with, whose profile you're browsing, and they're trying to provide a better experience for you here at the World Economic Forum. And by the way, I think this year maybe it's running the most smoothly it's ever run. Would everybody agree who's been here before? I've been coming since 2002. And they're doing a great job. And now... Did it tell you how long you queued to get through a security control? No. Let me ask you a question. Do you trust them? Well, I'd like to just ask: is anyone worried on the trust issue? If you all want to come back next year... I don't think I can see any hands go up. But is it implicit, in signing up and accepting the invitation to your white badges, that you all knew this level of trust would be required, and understood that this would be charted digitally in the way you've just described, Marc? Has anyone put up their hand expressing worry? You will come back next year if you put your hand up. Well, do we trust Klaus Schwab? We mostly do actually trust Klaus Schwab. Do I have to ask that question too? The stakeholder theory. We like the conference. He's a good guy.
He seems like he runs a quality operation. We like the people he hires. You know, we're looking around at their eyes. They know a lot about us, though. Okay, but you know what? You don't have that opportunity with all vendors, though, right? You're not looking at everyone's. Tim, come in. Tim. For a long time they've had online meetings; they'd like you to meet on TopLink outside Davos. Originally, they had some kind of early video conferencing system. The idea was, once you've come to Davos, you'd be available to all your peers, and this power group would all be connected through leading-edge World Economic Forum technology. Now, in fact, maybe it was because it only allowed you to connect to that particular group. Maybe people didn't trust it. But there was a play there to say it's not just that we're going to run a conference really smoothly with very dinky little badges. It's a social network. They were saying, well, meanwhile, let's provide you other things. Sure. So, suppose Davos also would say: you know what? We're gonna give you data about the forum. We're also gonna give you some space on our server that you can trust us to keep. And you can keep your fitness data there. You can keep your health data there. You can keep all kinds of data, not just data coming from the forum. So then maybe, if he could establish the trust, people would like to use it for all kinds of different data, because they could stash it away somewhere in Zurich. I feel we're moving this debate into a new debate, which is: in the World Economic Forum we trust. And I don't think we need to go there. But it is a nice metaphor, don't you think? Sure, it's well worth raising. But I think there are wider issues here. Commissioner.
That's the balance between security and business or other services. We are trusting Klaus Schwab, the police, the service, the hospitality, the security system as a whole. Or we wouldn't be here. And no doubt, we have to pay or not. But we need an allowance to come in, so we need a control system. The second point: Klaus Schwab is checking how many people are in a panel, in a forum. And so next year, he can say we need more energy panels, or more digital panels, or more panels looking to terrorism and security, or looking to the eurozone and to currency, or to Draghi and the ECB. So I think it's a smart balance between security on one hand and effective services on the other hand. All right. Is Matthew Prince in the room, having registered? Yeah, Matthew, you've just sent a couple of tweets. Would you like to repeat them? Hi, Mark. I think that transparency is clear. And so I'd love you to talk about your data.com project and Jigsaw, which you acquired, which actually pays people to upload other people's contact information. That seems a little lacking in transparency in terms of Salesforce's data policies. And you also made a point about selling data. Yeah, that's what data.com does: it sells customers data. Okay, what we have is a service called data.com. And with that service, it's kind of like a Rolodex. You can go into that service and fill in somebody's name, their address, their phone number, look up other people, like you would with the white pages and the phone numbers. And then when you're in Salesforce, if you're inside one of our contact pages, let's say we're looking at the contact page for Mike Fries at Liberty Global, you can say clean or update this data, and update the data. So we crowdsource the data, and then we use that data and let our customers update it. Are you concerned about trust on this, Matthew? Well, I don't know about anyone else in the room, but I get a lot of sales calls that I don't necessarily appreciate.
And a lot of that is because a service like Jigsaw has, without my permission, aggregated my contact information and made it more difficult for me to avoid those sales calls. Explain what that means in terms of the word trust. I'd actually like Mark to explain what that means in terms of the word trust. Mark? Well, okay, that service, which is primarily available in the United States, is exactly what I said: it's basically a giant Rolodex of contact information. So I could look you up, and I can see your address, your phone number and all those things. You can also go to that service and opt out and say, I don't want to be in this service, and then there is no record of your information, if that's what you choose. But our customers, for example, are sales and service and marketing organizations, and they do want clean records of data that's important to them. And through this crowdsourcing ability, we have given them the ability to update their data. Anyone else want to? Well, everybody in the room has the option of going to data.com and opting out. Of course, the planet has got the option of... but wait a moment. Does that scale? If everybody in the... or I as an individual, I have to go to data.com, and then to the UK, and to Finland, and so on. How do I know? How do I know all these places that I'm supposed to opt out of? How do I hear about them? Because he tells me so. Well, it's just like if you get an email. At the bottom of the email, and let's say we're the largest email provider in the world, we own a company called ExactTarget, at the bottom of all those emails you get a link. It says unsubscribe, opt out. I don't want to see this anymore. I don't want to get this information. And I believe in do not call, do not email. You know, do not contact me. And that's how we handle it. It's a fundamental question of whether it's opt-in or opt-out. I mean, that's what you're raising.
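An editor's aside: the "do not call, do not email" handling Benioff describes can be sketched as a simple suppression list that filters opted-out recipients before any mailing goes out. This is an illustrative sketch only, not Salesforce's or ExactTarget's actual implementation; the function name and addresses are made up.

```python
# Illustrative opt-out ("do not contact") suppression: anyone who has
# unsubscribed is removed before a mailing is sent.
def filter_recipients(recipients, opt_out):
    """Drop every address on the opt-out list, comparing case-insensitively."""
    suppressed = {addr.lower() for addr in opt_out}
    return [r for r in recipients if r.lower() not in suppressed]
```

The opt-in model the panel contrasts this with would invert the check: start from an empty list and mail only addresses that have affirmatively subscribed.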
And our principles, hopefully, moving forward, will be: consent number one, transparency number two, and ideally anonymity. So it is helping you personalize, but we don't really know you. We're not selling your name, your address. Those are not easy things for everybody to do. They're easier for us to do, because we're already getting $50 a month from you for broadband, for television. And we honor that. You're paying us to provide this service. We're providing the service. Other business models are driven by advertising; there's no other source of revenue, really, other than monetizing that data. So those are two worlds. I have to object to that, because, for example, we run a communication company, and you're storing someone's personal data, right? I mean, by definition, if someone addresses an email to you, we're storing it. We have to actually hold on to that information. So it's not possible in some of these mediums to simply personalize but not actually know that it's you. If you're storing someone's emails, if you're storing someone's client records, you actually need to know which account that was addressed to. And people actually want you to, because they want you to keep it secure for them. The question is how you monetize that. I agree with you. You have to know things about them, just as we have their credit card. But how are you using it? To the issue of trust, you have to be able to store that information. You can then create a level of abstraction when you're, for example, targeting advertising, which is what we do. That's what I mean by that. But you still do need to... you can't simply abstract it all, because you'll just lose the... Given Snowden and also the counter-terrorism problem at the moment, how much has that raised much more questioning about your storage of, say, emails?
In other words, how would you say Yahoo now stands on what you might call the trust index, when it comes to people now worried about how much they leave with you, and whether they've actually shut down, or created dormant accounts, simply by not using what they've used up to now? Well, the first thing that happened when we heard about Snowden's allegations is we changed the way that we store data. We changed the way that we communicate data. We went to entirely secure connections, HTTPS, on all of Yahoo's major properties. We changed the way we did encryption between the data centers, to basically get a more secure environment for our end users, because we realized that's what they wanted. So we changed all of those things in response to those allegations. And what was the impact on the trust? We didn't have a measurement necessarily before, but the measurement afterwards shows that people's trust and confidence in the service has rebounded as a result of it. They understand now that we're using more secure protocols to communicate and transfer their data. Okay, let me come to you, please. You were going exactly where I was going. So I was surprised that nobody... We'll take it further. It's been 18 months, but nobody's actually mentioned that the trust issue is coming from the Snowden effect: that basically the major culprit, the entity that proved to be the least trustworthy, was actually the government. And now we're actually counting on the government to figure out how we should regulate data privacy. And I was directing the question mostly at Mike and Marissa, just because I can imagine that almost regardless of any step that you take to establish trustworthiness, there's still a question: does somebody get to pick the pocket and understand, essentially, what data you've collected, even if you're not monetizing it? It's a great question, and the answer is yes.
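The move Mayer describes, forcing every connection onto HTTPS, is typically done by redirecting plain-HTTP requests to their TLS equivalent and setting an HSTS header so browsers refuse insecure connections afterwards. A minimal sketch of that policy follows; the function names and header values are illustrative, not Yahoo's actual code.

```python
# Sketch of an "HTTPS everywhere" policy: rewrite http:// URLs to
# https:// and answer insecure requests with a permanent redirect
# plus a Strict-Transport-Security (HSTS) header.
from urllib.parse import urlsplit, urlunsplit

# HSTS tells browsers to refuse plain-HTTP connections for a year.
HSTS = ("Strict-Transport-Security", "max-age=31536000; includeSubDomains")

def upgrade_to_https(url: str) -> str:
    """Rewrite any http:// URL to https://, leaving other schemes alone."""
    parts = urlsplit(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)

def redirect_response(url: str):
    """Build a 301 redirect that moves a plain-HTTP request onto TLS."""
    return 301, {"Location": upgrade_to_https(url), HSTS[0]: HSTS[1]}
```

Encrypting links between data centers, the other change Mayer mentions, happens a layer lower (TLS or IPsec on the internal network) and is not shown here.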
And the history of cable and telcos, and almost anybody in the infrastructure business, is a partnership with government. And it's not optional or voluntary. It's obligatory. So we do have lawful intercept relationships with every government in which we operate. We don't have the right to object; we don't have the right to say, sorry, Cameron, you cannot put people here, and you don't get access to that. We don't have that right. But I'll tell you what we do have the duty to do for our consumers. We have the obligation to make sure that whatever the government is doing is abiding by the law. We have the obligation to give some sense of proportionality: are they really doing what's necessary? So we can protect our consumers. But in the end, you're right. There is a disconnect between data privacy and this issue of lawful intercept and data retention for the purposes of government. They're sort of above it on some level. I would just make the observation that protection and trust really come as a function of security and privacy. But there is a tension between those two. For example, in reaction to Snowden, you saw a lot of people get very concerned about their privacy, and it became a lot more about encryption. But when things become about encryption, inherently it makes it much harder to keep things secure, right? Because then, for example, if you're a government and you want to monitor for cyber crimes, terrorist threats, et cetera, you need to be able to actually see that data, to be able to monitor it, see patterns, et cetera. So there is going to be a tension in terms of: are you communicating in an encrypted way, are you storing it in an encrypted way? And it's not always obvious, but I sort of see a pendulum swinging back and forth in public sentiment, where you saw Snowden really swing that pendulum very heavily towards privacy.
Now, with some of the recent issues that have arisen, you're starting to see that pendulum swing back towards security. Particularly after the recent incidents, with much worse to come. I mean, I come from a country where the director general of MI5 said two weeks ago that it's inevitable there's going to be an attack, but as a result, we need more ability to track within the algorithms and everything else that GCHQ has got. Can we escape from the pendulum? I think one of the things which is a shame about this whole thing is that it's seen as a pendulum, as just being more police power, less police power. But in fact, let's look at it. Tim, is it really only seen in that way? In a lot of the discussions I've been in here, people have mentioned the pendulum; in lots of other discussions, it's been talked about as a pendulum, as though it's a one-dimensional thing. I'd like to break out of that pendulum. For example, you say that when they come and ask you for some data, you can see that somebody's gone through the appropriate court process, and you give them the data. You can tell that they have gone that far legally at that point. But you have no way of testing what they do with it then. So from that point, what we have not done is manage to build accountability. So instead of just pushing one way or another, more power, less power, let's go in the direction of accountability. Say, well, if you want more power, then we need to build a system where you're accountable. So yes, you can have the data, but I'm going to talk to the people who watch over you, and I'm going to see... Well, we have set up a global council within our company. We do oversee these processes. We have standards, and so we just don't open the door, the back door, the front door, and say, come on in. There's some proportionality and oversight.
Commissioner, the pendulum: what kind of architecture is in your mind at the moment for how this can proceed? Tim has just highlighted what he sees as this pendulum between more power, less power, police intervention and so on. Where do you see it going at this stage, in your analysis? Let me say, I think trust and transparency. Additionally, we need to speak about a lack of information. Many people are not informed, and many media like scandals, or, as they say, it's a scandal. And so we have a lack of information. It's a question of education in our schools, our high schools, in our media. We need more competence from all our citizens. And then we have to divide the public sector and the business sector. Businesses need data, and we have to check which data to store is in our interest for new businesses. Take healthcare, take automotive mobility, to optimize our roads and highways and so on. And on the other hand, the public sector. And I'm sure if our police, our politicians are declaring we need data retention, then we can get majorities in the parliament and in our citizenship, and so we need deeper information. For two days I have thought a panel like this should be a panel for millions of citizens. There are people out there. It should be not just in Future Web, but on ARD and ZDF and in all broadcasting programs. We need a higher level of competence for everybody. All right, but it's also about perceptions, and perceptions can be made very quickly. And interestingly, the Edelman Trust Barometer, published two days ago, said very clearly that large numbers of people are not going to the traditional media organizations; they're going to the search engines first to get their information, then going to traditional media. Please. Yes, hi, I'm Yasmine from Egypt. I guess from what you're saying, we can all assume that our governments have rights to our information whenever they want it.
But for governments: we assume the West would be looking just for terrorism. What do you do about governments who would abuse that just to silence opposition or freedom of speech, knowing from past events that that can serve their own interests? Marissa, are you under pressure on that? Well, I think that what we're seeing from the nation-state attacks, from the Snowden allegations, is that whether or not they're coming through the official system to get the data, they are, in fact, getting data, right? And so we do what we can do in order to try and protect our users, which is usually through encryption methods and the like. But that said, I think it might be the wrong question in terms of... If a foreign government asked for permission to access your users, I assume you would give it to them too, right? I mean, it's regulated by the law. So it really comes down to: we look out for the individual user. We assess every request that comes in as to whether or not it's reasonable. And we at Yahoo actually have a very good track record of standing up against what we think is not reasonable. In fact, when the FISA law originally changed, allowing for what we considered to be an invasion of privacy, it turned out to be kind of ironic: we felt that the users would ultimately want to sue on their own behalf and say, wait, this is an invasion of our privacy, except the law was secret and it was tried in a secret court. There was a corporation, Yahoo (this was before I was there), that actually went in and said, okay, we'd like to sue on behalf of the users and say this feels like an invasion of privacy that they're not going to know about. So we've done a lot there to try and make sure that those government requests aren't overly invasive, and where we felt that our users couldn't stand up for themselves, we've assumed the voice of the user in order to try and achieve a better result for our users.
Can you put a quantity on that, to give us an idea of the scale at which you've been able to resist and say no? Well, every request that comes in, we scrutinize; we look at the law, we look at the request. And I would say, when we publish the numbers, it's more than 1,000, less than 10,000. It depends on whether you're looking at local civil crimes or at FISA, et cetera. And so we're allowed to publish ranges. So I think for a lot of people, it was actually less than they thought. That said, it's still a large number. What about... so I think maybe the question was about what happens if you have a nasty, oppressive regime. You've talked about America. Do you have... No, that's our policy around the world, no matter who it is. We will always say, look, every request that comes in needs to be a well-formed, proper request. So if there's anything wrong with the request, we send it back and say this was improper. And so there's often a lot of back and forth there. And we will say, within the law, how can we stand up for the user? And do we think this is a reasonable request? And is it in line with our terms of service and our users' expectations? And we frequently push back. Let me, in the last few minutes, just try and push you forward. Say we meet again this time next year, if we're all invited, having had the discussion a little earlier about our trust in the World Economic Forum system. If we're all here next year, are we going to be debating this again, with trust having eroded even more, or with a new kind of plateau? What's your expectation, Commissioner? I think it's a mid-term issue, not an increasing issue, but the level... Not increasing? I don't think so. Younger people are better informed than my generation. So we have one more year of digital natives here in Davos, I think, and outside as well. And there is some momentum in our legislation.
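An editor's aside on the "we're allowed to publish ranges" point: under the disclosure rules Mayer alludes to, providers may only report government-request counts in coarse bands rather than exact figures. A sketch of that bucketing follows; the bucket width and function name are illustrative, not the actual legal formula.

```python
# Illustrative range reporting: round an exact request count down to a
# disclosure band, e.g. 1,234 requests is published as "1000-1999".
def report_bucket(count: int, width: int = 1000) -> str:
    """Map an exact count to the coarse band a transparency report may publish."""
    lo = (count // width) * width
    return f"{lo}-{lo + width - 1}"
```

The point of the coarseness is that readers learn the order of magnitude ("more than 1,000, less than 10,000") without the provider revealing operationally sensitive exact numbers.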
In many states in our European Union, my expectation is we are ready, by a decision of the Council, before the end of this year, with our European Data Protection Regulation. And the outline of it will be what? The outline will be: we have one level of data protection. So informing our citizens is much more feasible than if you have 28 fragmented regulations. And it's mainly this proposal from Viviane Reding, you mentioned, and I think it's a balanced solution between privacy, between citizens' priorities, and our businesses, our industries, being able to work in our European Union. Mark, what's your definition of the way this will move? Are we going to be having the same debate if there were to be another Snowden-type series of revelations, either from him or somewhere else? Do you think that would once again make large numbers of people hyper-suspicious? Well, I think we have to get back to one of Tim's earlier comments. I don't know if this is exactly where he was going, but when Tim was creating the World Wide Web, the internet was designed to be inherently insecure and unreliable. And today we have an internet that is mostly insecure and unreliable. It was a design characteristic. And I think the question on the table is: do we need an internet 2.0 that maybe doesn't have the same kind of protocols, doesn't use DNS, doesn't use TCP/IP, or has a different type of network capability? And that would be something that would, I think, create a different level of trust. But I'd really like Tim to finish the thought that he started in that area, because I'm a proponent that we need an internet 2.0 for certain networks that require high levels of security, that are not anonymous, where every bit is known, and where we are able to have the kind of authenticity on the network that I think a lot of the applications that we want and have require. What's your thinking on 2.0?
The IP structure underneath is fine, because we've always built stuff on top of it. So we've always built secure things on top of it; that's what HTTPS does. It says, yes, the underlying net there is insecure, but we've always built security on top of it. The security has not been very good, and it's also been a bit patchy. So there's this big move to encryption everywhere. Certainly at the World Wide Web Consortium, W3C (go to w3.org), the Technical Architecture Group has spent the last week or so talking all about bringing in encryption everywhere. Some of the spooks are maybe terrified. Maybe even where it isn't called for, where there's no S on the HTTP, the code might end up just encrypting stuff, because there's this big push in the technical community. I think the trust issue led them to say, let's just encrypt things, because we can't trust who's going to be looking at it. So there are going to be a lot of things like certificates. I hope that if we come back here next year, we're actually talking about things much more positively. So rather than just worrying about the niggling fears that people are going to be spying on us and doing the wrong things with our data, instead I hope that we've woken up to a completely exciting possibility when it comes to our data. I hope we've realized that the value of my data to some cloud-based thing, which is figuring out things about my demographic and using it to affect my or somebody else's... that is really boring. But the value of my data to me is really exciting.
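Berners-Lee's point, that the transport is insecure and security is layered on top of it, is exactly how TLS works in practice: open an ordinary TCP connection, then wrap it in an encrypted, certificate-verified channel. A minimal sketch in Python's standard library follows; the host and port parameters are illustrative, and this is a generic pattern, not any panelist's system.

```python
# Sketch of layering security over an insecure transport: plain TCP
# underneath (the "fine" IP structure Berners-Lee describes), TLS on top.
import socket
import ssl

def make_tls_context() -> ssl.SSLContext:
    """Policy layer: require certificate verification and hostname checks."""
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocols
    return ctx

def secure_channel(host: str, port: int = 443) -> ssl.SSLSocket:
    """Open an insecure TCP connection, then wrap it in TLS on top of it."""
    raw = socket.create_connection((host, port))       # the insecure underlay
    return make_tls_context().wrap_socket(raw, server_hostname=host)
```

The certificates Berners-Lee mentions enter here: `create_default_context` loads the system's trusted certificate authorities, which is what lets the client decide whether the server at the other end of the insecure pipe is who it claims to be.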
So I hope we're in a world where we have beneficent apps, where we have our data. I hope it'll be a situation where I can store data wherever I like: I can store it with Klaus, I can store it with Marissa, I can store it, if I want, on Salesforce, but it's stored there in a way where it's treated as mine. I can store my data on Salesforce, and you can just run all your apps and store it on there; I'm in control, I get to control what happens to it. And in this magical, really exciting sort of future, I get all my data in one place, so I can do what industries all know how to do: application integration. Any industry does it; it has to integrate all the data from different places. You'll be able to live as a human being; I will be doing my personal data integration. And that's what we'll be talking about, hopefully, this time next year. Okay, is Barrett Brooks in the room? Is there a yes somewhere? Yeah. Could I come to you, Barrett, because you've just tweeted two or three rather interesting and rather negative views about the way this is going. If I may quote: sounds like the back and forth between government and a company like Yahoo is probably a huge waste of time and resources. Yep, definitely. Why are you saying that? Well, it seems like it's probably pretty wasteful, and if the government's not taking the time to submit through the proper channels in the proper manner, then they're wasting business time, and that's annoying. What about your other skepticisms? About digital natives knowing about data privacy and things like this. I don't think that digital natives know about data privacy any more than anyone else. I think we're more familiar with the tools that we use that collect our data, but not necessarily with how they're collecting our data. Tim? About digital natives? I thought I saw a sort of expression of scorn towards that. Not at all, and I think I misread.
Maybe it was just a slightly raised eyebrow, I don't know. But on digital natives: I think you should never misjudge or underestimate digital natives. Yes, of course, there will be all sorts of people out there. There will be people in the digital-native generation who understand these things very well, and the people who don't understand them very well hopefully will learn from the people in the family that do. Marissa, and Mike, finally, your sketch, if you can, of where the architecture might be, or the pressures additionally that you might be under? I'm pretty optimistic; I agree with Tim. I think that as users and companies and governments become more fluent in all of these ideas, they get more comfortable with them, and that has certainly been shown in other industries, tested over time. And I think that, for example, today companies are communicating more clearly about these concepts. A lot of companies are laying out privacy principles and privacy guidelines, putting the most important terms up front, not buried in the fine print, et cetera. And users are on that learning curve, and they're coming up that learning curve in terms of what data is being collected about them, where it exists, how it's being used. And I think as both of those forces go to work, there ultimately will be more trust and acceptance and a better understanding of the trade-offs that are being made. Mike, a comment here from Ryan Heath: opt-out systems are much less effective and fair than opt-in systems for users and their data. Couldn't agree more. And I think we're ending where we began, because Barrett's comment was that digital natives don't know either. And really, people aren't aware. They might think they're in control, but they're really not in control. And I think the things we've talked about today around transparency and opt-in and control are really the right directions for all of us.
Hopefully next year we're talking... maybe, I hope that you're correct, that we are looking at a framework that gives businesses and consumers clarity. Because today there is nervousness, there is anxiety. This entire panel has been about anxiety. And hopefully next year we're not dealing with anxiety; there is some resolution around what governments will or won't do. And if there isn't general resolution in Europe, I worry that some countries will take it into their own hands, build their own internets, Balkanize the internet, which is not a good thing, I don't think. China's already indicated it wants to do that. You don't want Germany to have its own internet. I mean, how does that work? Is that any worse a cure? Yes, it is. Yeah, I mean, these are not the right... this is not the right direction. You know, I think in the end, if we have standards and clarity, then I think the anxiety subsides. You will act in the interest of all Europeans, though, won't you? Okay, he nodded agreement there. But you say control; is that the right word, that they're losing control? Confidence. Isn't it having a handle, at least, on what is happening to the data, even if they accept they don't control it? I think they're okay giving control to somebody else if they're making that decision, as long as they understand what they're doing. Okay, well, look, thank you to all of you. I fear we will be back. Well, I hope we're all going to be back. But thank you all very much indeed. And thanks for all your tweets. Thank you.