from London, England. It's theCUBE, covering AWS Summit London 2019. Brought to you by Amazon Web Services.

Hello and welcome to the AWS Summit here at the ExCeL Centre in London. I'm Susannah Streeter and Dave Vellante is my co-host on theCUBE today. So much to talk about. This summit is immense, thousands upon thousands of attendees, talking about everything to do with the cloud, of course, AI, machine learning, but privacy keeps coming up again and again. And I'm pleased to say that Bill Mew is here. He's a privacy campaigner and tech consultant and is now CEO of crisisteam.co.uk. Now, Bill, we have talked a lot about the growth of AWS and also the startups using the public cloud. And it's interesting that that growth has intensified even though the GDPR regulations came into force. And right now, lobbyists are hard at work, aren't they, particularly in the United States, trying to limit the impact of coming regulations. Do you think they'll be successful?

Well, I think there was a big argument, when we first looked at the introduction of regulation around privacy and ethical issues, that it would be a big restraint on innovation. And I think what we're seeing here at this AWS Summit is that innovation is alive and well, it's really healthy, and there's a great deal happening. We just need to be careful with what we do with people's data. And there's a very good reason for this: it really matters to people, you, me, people in the street, consumers. The number one issue now for most people is security and privacy and how their data is handled. It's interesting that only six or eight months ago, if you surveyed the same group of people, they might have said diversity or sustainability. Now, because of a number of horror stories around data breaches, the number one issue out there is how their data is handled. And therefore companies need to take it very seriously.
And obviously AWS has got an enormous infrastructure and it's claiming that it's GDPR compliant in the way it handles its own data. But there are a lot of people that host on its platform and they're sometimes vulnerable.

So what I'm doing is helping to influence where some of the regulation is going, to try and head things off, to ensure that we have the right balance: meaningful protections, because those need to be in place, while ensuring that those protections don't hinder innovation or economic and social value. But at the same time, I also work as part of the crisis team with some of the top lawyers in cyber law and a whole bunch of crisis management experts, ex-UN and others. And we help step in when things go wrong for companies, not only helping them come together with a legally defensible position and communicate it effectively, but also, across our social media campaigns, our reach and some of the other channels like this that we use, we help to counter some of the hysteria and misinformation that is often inevitable in that type of situation. So there's a whole spectrum there and an enormous scope for debate.

So you're talking there about fake news in particular, are you?

Well, I think when a story breaks, there can be a lot of misinformation about exactly what happened. Things can get a little bit out of hand and hysteria can take off. You can talk about alternative facts, you can talk about hysteria, you can talk about fake news. What we try and do is not only help companies formulate what is likely to be the most realistic defensible position that they have, but also make sure that they're countering some of the really terrible hysteria that can occur at a time when, typically, their own credibility in the market is at an all-time low.
And maybe, if you've got some credible privacy campaigners, some real thought leaders in the market who can step in and say, hold on guys, there's a little bit of reality we need to touch on here. This isn't quite what happened. This may have happened and this is what they're doing to try and address it. Then maybe we can counter some of that hysteria. We can help people who might be unduly concerned, and also we can help protect some brands out there that are sometimes facing a lot of reputational harm.

So Scott McNealy, the former CEO of Sun Microsystems, a very successful company that was sold to Oracle, famously said one day, there is no privacy on the internet, get over it. Now that was before social media took off. Social media obviously has affected this discussion, but for years people put, and still put, stuff on the social channels that is absurdly private. Yet it's open for the public.

Yeah, but I think there was a level of naivety once upon a time. So if we were to ask a number of questions a while ago about privacy, I think people would not really be too concerned, but they've seen some of the breaches, like the Equifax breach, where some really very sensitive information was made available, and sometimes that has led to very real concerns among people. But also we're looking at new technologies that are going to come along. We've got AI on the horizon, we've got facial recognition. These kinds of technologies are actually going to dominate our lives in the future. And we're already seeing it in countries like China, where they're using facial recognition to score people: a bit like you have a credit score, you have a citizenship score, how good a citizen you are, whether you jaywalk, whether you do all sorts of other things, and your access to credit, your access to travel opportunities, your access to a whole load of services is based on your score.
And I think there would be a lot of people in democratic Western societies who might see that as a little bit Big Brother.

And yet you are still seeing some states and cities already bringing in regulation to limit some of the advances we see here.

Yeah, and it's interesting. I think in Washington state in the US, there have been a number of different proposals put forward in terms of how they introduce the sort of privacy regulations we've already seen in California and elsewhere. And some of the proposals there would be nigh on banning facial recognition entirely, because the biometric constraints were really quite severe. And part of what I've been doing, I've worked with a lot of privacy campaigners, but I also work with other corporates, is to see how we can strike the right balance. We want meaningful protections, absolutely, because there's some really sensitive data out there and the way it is used can affect our lives. But at the same time, we don't want to stifle innovation, the type of innovation we're seeing here at the AWS Summit. And we want to maximize the economic and social value. And that's a really delicate balance.

It is, it's tight, right? And it sounds good, but think about the cloud, how it has enabled small businesses to have access to IT infrastructure of the same quality as large companies. In a way, doesn't this stack the deck for large companies, who can actually afford the compliance officers and all the infrastructure necessary, the software and the people, to comply with these new regulations?

I think there is some truth in that, because there is absolutely an overhead. But I don't think we can get away from the fact that that data is really important and it needs to be protected. And I don't think we're just looking at privacy here. We're also looking at data protection. And I don't think you should underestimate the vulnerability that we now see.
I mean, we are a more interconnected society than we have ever been. The number of attacks is growing exponentially. We're also seeing that the number of opportunities, the threat landscape, is increasing. We've got massive numbers of IoT devices and other things. It's going to be very, very difficult. It's going to be a full-time challenge. Indeed, it's a sort of AI arms race: either side uses AI to discover vulnerabilities, whether to launch attacks or to introduce patches.

We hear a lot about just how valuable our data is. I think we were even discussing at one point that it's more valuable than oil for many companies. Do you think that consumers have really woken up to just how valuable their data is? And could you foresee a time whereby the consumers actually say, you want my data? You've really got to pay me for it.

I think there have been some proposals along that front in terms of how we separate private data and give people control over it. The right to be forgotten was a step in that direction. But if we can have some sort of infrastructure that allows people to separate out their own private data and allow access to it on a permissioned basis, then that could provide a future internet. And there's been a lot of discussion along that front from Tim Berners-Lee and a number of other really top thinkers in that particular arena. But the value of that data was possibly overlooked in the past, and so was the vulnerability. And therefore, I think people are waking up to it now. That's why they care so much more about it now than they ever have in the past.

Well, there's certainly a lot of talk in the blockchain and crypto world about using that technology to allow users to own their own data, control their own data. I mean, take Facebook, for example. There's a built-in incentive for them to appropriate our data so they can sell ads to us.
But what if, as the theory goes, the user could control it? The user could monetize his or her own data. So there is some discussion going on there. There's some technology development going on there at the low-level protocol. What do you think about that?

I certainly think that technology will provide the answer. Exactly how we do a sort of new version of the internet that allows that sort of control is still open to discussion. And there are a lot of opinions on either side here. Interestingly enough, blockchain has been put forward as a possible solution, but there's a slight irony in the fact that blockchain's immutability is actually at odds with GDPR's right to be forgotten. So the two are actually mutually incompatible. So there are some really difficult issues for us to address. Technology got us into this problem; it can potentially help us get out of this problem, but maybe not. It's not entirely straightforward. And actually, if we're going to be moving in a direction where we give users more control over their data, it's going to have to be an internationally adopted standard. At the moment, GDPR has come forward as a standard here in Europe, but it has set a sort of golden benchmark against which other regulations are now going to be measured.

And are you seeing signs of that? Do you expect the US to adopt a model which is very much based on the model you have here?

It may not be exactly GDPR-like, but there will be things in common. And I think many of the organizations worldwide really care about their users' data, and I told you earlier about the attitudinal surveys that are out there. Companies are very wise if they wake up to this and actually take proactive steps to change the culture in their organization, to have a digital ethics culture.
It means not only that they're going to care for data more often and more carefully, and be less prone to the type of inadvertent leaks as well as the sort of hacks, but at the same time, a culture of that nature helps them to deal with a situation if it ever does occur. It's about having the right culture. And those companies that have a truly digital-ethics-oriented culture have not only adopted GDPR in Europe, they've chosen to adopt it globally.

Well, I think there's a sentiment in the US that, look, we're doing this for European consumers. We might as well adopt the same standards globally.

Yeah. We've got the processes in place. They seem to be working for Europe. Why not use them? It's just more convenient. It's going to be lower cost to do that. So it just makes sense. And that's why GDPR has emerged as a global benchmark. And many other countries, in India and America and elsewhere, are measuring their potential regulations against GDPR.

Well, I've heard it criticized on this show as a socialist agenda, but it seems to have quite a bit of momentum, and there are a lot of sensible parts of GDPR.

Well, I'm not sure that I'd call it socialist or whatever.

Not my words. I'm just quoting somebody, literally.

What I think we've seen is a change in the balance, where previously people's right to privacy wasn't recognized at all. And we had the tech revolution, where people didn't really care and Facebook were talking all about a sharing culture, and that was their orientation. We've seen the tech backlash, where Facebook and others have all been punished, and there's been a sudden switch, a pivot, towards privacy. What we need to do is look beyond those, because we need to have a level playing field.
We need to have an equilibrium where we're absolutely balancing the right protections, meaningful protections, with actually maximizing the sort of innovation you see here and the economic and social value that's going to underpin our lives.

Self-governance is not likely to work, though. Let's face it.

I think we've seen, and Facebook is an example that will be oft quoted in this respect, that self-regulation doesn't necessarily work in this way, because it's just too tempting to use data in the way that you see fit. Unwinding some of the mistakes they've made in the past is not going to be easy for them, but we'll see how well they keep to the new promise of their pivot to privacy.

I think it'll define their legacy, personally, yeah. Well, Bill, it's been fascinating having you on here because you've really been at the forefront of all of these changes. So it's great to hear your thoughts. Thank you very much, Bill Mew, CEO of crisisteam.co.uk, and you've been watching theCUBE at the AWS Summit in London.