Okay, ladies and gentlemen, if I could ask you to take your seats, please. My name is Ian Wallace, and I'm the Director of the Cyber Security Initiative here at New America. I want to say three quick things and then get out of the way. First of all, I want to welcome you to New America. New America, as I hope you're aware, is all about American renewal. And that means tech in support of public policy, or increasingly public policy in response to technology. We are also increasingly aware that not all of that technology is necessarily created in America, nor indeed are the rules which are going to govern that technology in the future necessarily created by Americans. Quite a lot of what we do in our Cyber Security Initiative is track those developments and look at what implications they're going to have for the future. Think of it as cyber and security, if you like. We believe this is really important work, and we are extremely grateful to have some great partners and supporters with us in that work. They include our partners at Florida International University, our colleagues at JPMorgan Chase, and, particularly for this event, our partners at Future Tense, which is New America's collaboration with Arizona State University and Slate magazine, who are perfect partners for this kind of conversation. One thing I want to say is that I am very excited that today is really a celebration of the new project that we're launching on data and great power competition, led by Samm Sacks, whom you will hear from very shortly. Samm is one of the few people who speaks Mandarin but also finds GDPR interesting. I acknowledge that there may be other people in this room who fit into that category, but it's a relatively small subset, and we are very grateful to have her talents with us. She came to us through DigiChina, which is our other significant China-related project. We're making a big announcement about that on Thursday, so please look out for that.
For now we have this project, which is really based on two premises: first, that US-China is going to be the defining relationship of the 21st century, and that Europe's and others' engagement with that relationship is going to be key to defining where we go; and second, that data is going to be fundamental to how that relationship plays out. This conversation, I think, is going to set us up for a really fantastic project that's going to run over at least a couple of years. And finally, I just want to introduce my boss, Anne-Marie Slaughter, who is not only the CEO of New America but also a former international law professor and scholar of international relations who has lived and studied in China and Europe, I think Oxford still counts as Europe, and a former director of policy planning at the State Department at a time when the US was pivoting to Asia and thinking about how technology was going to impact diplomacy. So a perfect person to lead this conversation. Anne-Marie.

Ian, thank you. Ian is the founder and director of our cybersecurity initiative. He does many wonderful things, but in the context of today's event I'm going to just compliment him for being a really superb attractor of talent. Samm Sacks is one of the latest examples, but Rob Morgus is sitting there too. He's put together a really interesting team of very different people looking at cybersecurity from a range of perspectives. So my job is just to talk a little bit about the project, and then I'm going to introduce a video, and then we're going to hear from Samm, and we'll have a panel discussion. I love this project. I love this project as somebody who has spent my life studying foreign policy but also, as Ian said, 12 years teaching international law. When I went into the State Department I really was introduced to the world of technology. That's a rare statement, right?
I left Princeton and went into State, and at State, even though the actual technology at the State Department was completely dismal, working for Secretary Clinton I worked with Alec Ross, who was her special advisor for innovation and had worked on all the tech work in the Obama campaign, and I oversaw Jared Cohen, who then became the head of Google Ideas, now Jigsaw, working directly on not just how you can do tech diplomacy, meaning public diplomacy using social media, but much deeper questions of how technology, its opportunities and its problems, should affect our relations with other countries. And this project is squarely there. The other thing I'll say about this project is that in 2017 I published a book called The Chessboard and the Web: Strategies of Connection in a Networked World. My argument was that we have the chessboard world of great power competition, and that world is very much with us if we look at China, at Russia, at Iran, and at a number of regional powers. The diplomacy that we engage in with those countries is still essentially Thomas Schelling's The Strategy of Conflict, fast-forwarded 50 or 60 years. I spent the weekend writing an introduction to Schelling's Arms and Influence, and it's remarkable how much of that work is still relevant. But the other world is the web world, right? The other world is the world of networks: networks of government officials of various different kinds, networks of mayors and governors too, but of course also corporate networks, civic networks, criminal networks, which Secretary Chertoff knows quite a lot about, networks of universities, of churches. And that world we study much, much less, and we don't have strategies for it. The strategies we need, I argued, are strategies of connection. Sometimes you have to disconnect, sometimes things are disconnected, sometimes you need to build those networks.
What I love about this is that it's a project about great power competition, great power competition over data, or in the realm of data, in which, as Samm argues and as we're going to hear from her, we need to pay much more attention to what governments are doing. But it's impossible to think about this world without also taking account of the global actors in the web world. We have to be thinking about our major tech companies. We have to ultimately be thinking about NGOs and universities and others who can hold governments and companies to account. So this is a project, it's a very New America project. We like to think it's ahead of the curve. It's really tackling a big subject, and it brings together a lot of the things that I think are most important to study. And with that, I've been given a tech assignment. Let us see if I can do this. We are now going to hear a video from Marietje Schaake, whom I'm sure all of you know, if only from her Twitter feed. She is very active in the world of tech governance and data governance, and she has just stood down as a member of the European Parliament. We're interested to see what her next steps are. And we have a video from her. Oh, there we go.

Hello, my name is Marietje Schaake and I'm an outgoing member of the European Parliament from the Netherlands with the Liberal Group. I would have definitely preferred to join you live and not via video, but I hope that that will be possible next time. Questions of technology and governance, geopolitics, human rights, and how the relations between countries are impacted by technological developments have really been the core of my work over the past 10 years. I've worked on a digital trade strategy for Europe to streamline the rules on the free flow of data. But I've also worked on digital rights as a part of the EU's foreign policy. And I'm a member of the Global Commission on the Stability of Cyberspace. So that gives you a little bit of an impression of where I come from.
Now for today, let me zoom in on a fundamental challenge that I believe is crucial. The Western world has not used its technological advantage to also advance a rules-based order around it. After trade, human rights, but also war and peace were framed in rules and institutions, and this has served the quality of life of our citizens so well, somehow online the belief has prevailed that rules would be limiting instead of enabling. I am worried about this notion, and I also disagree with it. Until very recently, a more libertarian, hands-off approach was indeed the dominant line in Silicon Valley, also rippling into Washington. Almost any proposed regulation was met with heavy lobbying and pushback: it would stifle innovation, or it was said to inspire China and other repressive states to, quote unquote, do the same. Well, when was the last time that these regimes copy-pasted what we did in a democracy anyway? Exactly. It does not happen. Instead, while Western democracies failed to develop an agenda for the global governance of technologies, China, Russia, and other top-down governed countries rolled out a system for the governance of technology as an extension of the broader governance models of their societies and markets. And by pushing out their technologies and standards in developing countries as well, a system is growing that is increasingly at odds with our system based on openness. I therefore believe it is essential that the EU and the US join efforts, instead of standing with their backs towards each other as we do now, and that we develop a democratic system of governance of technologies for openness, seeing regulation as an enabler and a safeguard. For example, without regulation, the tech platforms would never have been able to grow as fast as they have.
Section 230 of the Communications Decency Act, which is mirrored in the European e-Commerce Directive's intermediary liability exemption, has been among the most meaningful facilitators of the growth of online platforms. Now, I believe that the list of areas that need clearer definition at the intersection of technology and the rule of law is growing: antitrust and its application to the merging and use of data, anti-discrimination in the age of artificial intelligence, the right to privacy in a time of facial recognition technologies, the proliferation of hacking and surveillance systems, data flows and trade agreements, the resilience and democratic integrity of elections, or the interoperability and security of supply chains and the Internet of Things. I'm convinced that these are only a few items on a much longer list of issues that will require minimum safeguards and an articulation of which principles should not be disrupted by technology. Laws not only ensure the values that are the anchors of our open and free societies and that allow them to survive the digital age; they should also allow us to engage with third countries as they develop. I believe governance and laws matter. They always have, and just rolling out the Internet itself has proven not to be the magical self-fulfilling prophecy that would make democracy go viral. So what we need to think about now are checks and balances on government power, but also on corporate and technological power. I hope that this will inspire you this afternoon. We see a world where the majority of people live in non-democratic countries, and so we will need ambition to project the values of open markets, open societies, and the open Internet globally. We can only do that credibly if we have our own house in order. Now, I very well realize that some of the things that I've said might be considered typically European, but the world is bigger than the United States and the First Amendment, for example.
But I do believe that these are the kinds of questions that are worth discussing even if we don't agree. So perhaps some of your thoughts and feedback can be shared with me somehow, and again, I'm really sorry not to be able to be part of the discussions you're about to start. Thanks for listening and for including me in your program.

Good morning. Thank you so much to Ian and Anne-Marie for the very kind introduction. It's really exciting to launch our new project among such a distinguished panel and with friends and colleagues in the room. I think it is clear right now that as we look out at the U.S.-China relationship, technology is front and center in a conflict that could define the years to come. This is a competition, but it is also a relationship of interconnection, and data matters in both this competition and this interconnection that we have with China. But I think what is less understood is the role that data governance will play in driving this dynamic. When we look at the technologies that will define the relationship, not only the internet and the cloud, but the next generation of wireless networks and the exponential increase, the explosion of data, as Secretary Chertoff has discussed in his book, this is only going to become more and more important, looking out to the data that is critical to machine learning and AI. But the rules for how that data is harnessed, who has the ability to access that data, and the terms under which that data is used and shared, this topic of data governance, I think, is far less understood as we think about data as a driver in this dynamic with China. And so that is the topic and the focal point of the study that we are beginning today. So let's talk about what exactly data governance is. I think that the way that the government interacts with the private sector when it comes to data is going to be increasingly important. It's not just laws.
You have laws, regulations, and standards, but also things like international trade agreements and the way that the U.S. works with allies and partners, taken together under this broad umbrella of data governance. This will be a determining factor, I think, in the growing rivalry that the U.S. has with China. But it's going to play out all around the world. So as Europe, India, Japan, as governments really grapple with the new challenge of creating rules for data, I think that this will have important ramifications. And there are different levers that can be pulled. One of the questions that I have concerns the trade-offs among three important pillars: personal liberties; the ability of the private sector to use data for innovation and competition; and access to data by the government, and what that means for national security. To date, there has not really been a serious assessment of what it means to have trade-offs among these factors. Is there a necessary trade-off between things like privacy and innovation? What is the balance among these factors? I think as we look out at different governance regimes around the world, we're beginning to see different answers to that question. But this is not just a static conversation, right? We are writing rules for data that are going to have impact for years to come, for technologies that have not even been developed yet. So right now, while the conversation focuses a lot on things like privacy, we also need to consider the effects of the rise of the Internet of Things, and of machine learning and AI increasingly shaping decision-making and how societies function. The rules for data governance today have to be adaptable for these future waves of technology and their impact on all aspects of society.
So I think we are at an inflection point now, and there will be significant consequences if we don't get this right, either through inaction or by making policies that are based on untested assumptions. So for example, one of the questions I'm going to look at is: does China have a data advantage? What does that really mean? What are the trade-offs between innovation, security, and personal liberty? If we swing too far in the direction of privacy, what does that mean for the economic power that comes from harnessing data? But also, if we go in the opposite direction and use China as a reason not to regulate, not to put checks on companies, do we then create resentment and mistrust of U.S. tech companies around the world, at a moment when we are potentially looking at a more divided Internet where governments and consumers are going to be choosing between Chinese and U.S. companies? And so creating trust is going to be really pivotal to the influence and the global reach of U.S. technology companies. And this is all going on in an environment where none of this is zero-sum. This is a competition defined as much by interconnection as by rivalry: data does not map neatly onto international borders, nor does technological innovation, where you have diffuse teams of engineers and researchers working on similar problem sets. So getting this right is going to be very important from a national security perspective, from a competitiveness standpoint, and also from the perspective of civil liberties. So I think that this is just the beginning of the conversation, and with that I'd like to welcome the panel and ask us to come up and begin the conversation. Thank you.

Yeah, here, I'll come over here. Thank you. So let me introduce the members of the panel whom you have not met. On the far left is Isabelle Buscke, who is the head of the Brussels office of the Consumer Policy Division of the Federation of German Consumer Organizations.
So I just want to make clear that means you are not part of the EU establishment. You are representing a very important German organization in Brussels. I just say that because often people hear Brussels and assume EU. I will just say, though, that in data, and The Economist agrees with me, Europe is also a superpower. Europe is actually way ahead of the United States in regulation, and the pressure from consumers in Europe is much stronger than it has been in the United States. And if you understand competition around standards, the EU beats the US hands down on regulatory standards around the world. Those standards are really important because they are what other companies have to comply with. So we're delighted to have you. And on Isabelle's right is Secretary Michael Chertoff, the former US Secretary of Homeland Security, who before that worked in the Justice Department and really spent his life as a lawyer and as a judge. He's now the co-founder and executive chairman of the Chertoff Group, and, most importantly for today's conversation, he has just written a fascinating book called Exploding Data: Reclaiming Our Cyber Security in the Digital Age. I highly recommend it; the opening is not one you'll forget in a hurry. And he really, I think, addresses a lot of these issues. And then here on my left we have Samm Sacks. So I want to start just by asking you to respond to what Samm presented, and specifically her point that we've been deeply focused on privacy but not as much on the broader trade-offs, or indeed, Secretary Chertoff, you talk about autonomy: the trade-offs between individual liberties, let's call them autonomy, and competition and innovation, and the need for governments to actually have access to a lot of this data to keep us safe. So she frames this debate as a great power debate around three pillars, and I'd love to hear your reactions. Isabelle, I'll start with you.
Right, thanks a lot for the introduction. Maybe just to give you a feeling, because I know that American audiences are not always familiar with the structures in the EU: my organization is the civil society representation of the roughly 80 million German consumers. We get a lot of complaints bottom-up; that's what we use to produce our policy positions. We have certain rights of private action, so unfortunately we have felt compelled to sue Facebook four times over the last years, and we have sued Google a couple of times, and we have been relatively successful with this. That's a bit of my day job, and my day job is obviously also to convince European policymakers to do the right things for European consumers. On the triangle that Samm has laid out: it is an old debate in a new format, probably, because the pull between personal liberties and security, for instance, is as old as democracy. And Europe is probably in a more complicated spot, because we're witnessing from outside this tension between the United States and China and looking introspectively: what does that mean for us? What does that mean for our markets? GDPR has probably shown that in case of doubt, Europe will always go for fundamental rights, for the constitution, for democracy, etc. At the same time, we know that we're importing a lot of software, IoT devices, and other technology from China and from the US, and we're exporting; just to mention car manufacturers, China is a big market. Just today there was an op-ed by our former minister of foreign affairs, and his assumption was that Europe will find itself squeezed like the middle of a sandwich in this war, and that could actually force it to pick sides at some point. That's probably what Europe would not like to do, but we can always go for competition; the EU loves competition. This is why everyone is hesitant on the Huawei question, but staying out of it might not be possible ultimately. So the EU is definitely in that debate.
Absolutely. Secretary Chertoff.

Well, thanks, Anne-Marie, and thanks to Samm for hosting us and launching this. I think that privacy is maybe too narrow a lens through which to look at the civil liberties issue we face nowadays. I tend to think of privacy, and I think most people do, as: how do I keep things secret or hidden? And part of the reason I wrote the book was to explain to general audiences: forget it, that ship has sailed. You can't keep it hidden. Not only are you generating a huge amount of data in ways you're not conscious of, but everybody else is generating data about you. Anybody in this room who's going to tweet about what we're talking about, or later write a blog post, or put something on Facebook: that's all going to be ultimately uploaded to the cloud somewhere, and it quite possibly will be aggregated. And so I, who do no social media, will wind up on a bunch of social media sites through everything that you will do. That's not meant to be critical, but to suggest that we've got to start to think about data in a different way than just how do I keep it hidden. And that's why I've come, I think, to have a greater appreciation for what the European focus has been, which is the right to control data even once it's generated. Because that ultimately determines how data will be used in terms of affecting our lives. It can be used, for example, to give us employment or deny us employment, if a potential employer looks at, say, what we eat, how we drive, how much exercise we get, what books we read, what sites we search, and makes a judgment about whether this is desirable or not in an employee. So thinking about how these things are used becomes, I think, really fundamental to the way we live going forward.
On the other hand, to talk about the tension, that's not to say that the balance should shift entirely in favor of absolute control on the part of the data subject, with no ability of government, for example, to get data under appropriate supervision. Because one of the things that I learned through my experience on September 11th, when I was at the Department of Justice, is that in the modern world, security is no longer just a matter of having radar or the typical things we consider when we imagine attacks in the physical world. It's about the data to understand people whose behavior and activity might pose a threat. And when you live in an age where, unfortunately, every couple of days you read about someone who walks into a school or a house of worship and shoots people up, or gets behind the wheel of a car and runs people over on London Bridge, the question is: how do you determine whether someone is a threat, and how do you stop that? Data can be very useful in that respect, if we wind up having the appropriate safeguards. And the final point is this: we are democracies. And what we're discovering more and more is that data can be both a tool to promote democracy and a tool to undermine it. The promotion was one of the dreams people had for the Internet: this is great, all the dissidents will be able to reach out and express themselves, we're going to have the Arab Spring. Fast forward, and we've learned in the last several years that the Internet can be a very powerful force, using data as Cambridge Analytica did to target and manipulate citizens in order to undermine our electoral processes. So I think all of these various values that come into conflict require us to step back and fundamentally ask: how do we promote the things we care about, protect against the things we worry about, and do it in a way that can be reconciled? Thanks. So Samm, I'm going to ask you to respond and then to talk a little more about China.
But it's interesting listening to you both, and to the way that Marietje framed it as open versus closed and democracies versus non-democracies. I'm very struck, and Isabelle, as you were talking about Europe's choices too, Joshua Cooper Ramo wrote a book called The Seventh Sense a couple of years ago in which he ultimately said the world is going to become a set of electronic gatelands, where you're going to be within an electronically protected virtual world with very high walls. He lives in Beijing, and some of what he was writing, I think, was very influenced by the Chinese view: China would have its gateland, the United States would have its gateland, and Europe would join the United States' gateland. So he was precluding the idea that you could play both roles. But Samm, why don't you respond, and then, since you are representing China here, I'd love to have you talk a little more about how China thinks about data governance, and specifically whether it sees data governance as part of geopolitical competition.

Yeah. We've been doing a lot of work on Chinese data governance. We have a project called Charting Chinese Data Governance, where we've translated and analyzed many different developments in China on this front. And the way that I've been looking at it is that China has what I call a split identity when it comes to data privacy. This open-versus-closed framework sometimes needs to be broken down even further. In the past two years the Chinese government has rolled out many different regulations and standards to put a check on how companies are collecting, using, and sharing data. So I think a lot of people would go, wait a second, isn't China the wild west of data? Isn't it a free-for-all? The reality is that's changing, and companies are being audited on how they are using data. But not using our Ranking Digital Rights, I guess. Right. Exactly.
But at the same time you also have more government tools to access that data. So just the other week China issued a draft data security measure. We translated it; it's up on our website. And what was so interesting to me is that in one regulation you have things that actually look a lot like GDPR: third-party frameworks, consent to collection in very detailed, granular ways. But you also have articles that explicitly say that for national security reasons the government can access that data. So how do you make sense of both of these things at once? And I think it's also telling that today is June 4th, which is a really important day in Chinese history. We can't forget that this is a government that has used authoritarian means, and technology is very much a part of that vision today. Sometimes looking at China requires you to hold two contradictions in your head at once. This is a government that is rolling out expansive measures to, in some ways, put a check on what companies are doing, while there are very troubling uses of that technology going on at the same time. So it's not black and white.

And does China think of its regime in extraterritorial terms, in the way that the GDPR is extraterritorial, in the way that the U.S. invented this with our antitrust laws, right? We have applied them all around the world. But do they think of that explicitly?

So China was very focused on GDPR. The Chinese internet companies, Alibaba, Tencent, and others, hired the best lawyers to prepare for GDPR compliance, in line with this vision of having China be an important global player in high-tech sectors. They want their companies to be out there competing in global markets. But I think we're going to see a lot of friction, particularly from the perspective of: are Chinese companies going to be able to compete in Europe when there are a lot of questions around that government access to data issue? Interesting. Isabelle, you may want to respond.
And let me also ask you: does GDPR work? My experience of GDPR is that whenever I'm in Europe, I get a pop-up every three seconds. And I do exactly what I do with all American pop-ups, which is check "yes" to the cookies, because otherwise I can't get to the website. It doesn't really seem to be enhancing my autonomy in the way that you're talking about it.

I think that's a super important question, and I love to debate this with non-European audiences, because my experience has been that GDPR very often is misconstrued or not very well understood. In a nutshell, it basically says you're not allowed to collect data unless you have one out of six good reasons; they're laid out. And if you have one out of six good reasons, you need to make sure that you do this and this and that, and then you can collect as much data as you want, basically. I'm simplifying, but it basically places obligations on the collecting entity, and there we're talking about government administration as well as private entities. We should not forget that EU privacy laws always come from the question of what my rights are vis-à-vis the state; that is the origin. And the moment the state could ask big companies that hold a lot of data for that data, we started looking at companies too. We should not forget that. But it's basically telling companies and government how they are supposed to do things they should know anyway: where the data comes from, where they store it, what they do with it, so that in case someone wants to have it erased, they can find it. That's in a nutshell what GDPR does. Is it working? We're one year into GDPR. I grant you that ticking boxes about cookies is not the right thing. The problem here is that cookies are actually not regulated by GDPR but by a pending privacy addendum, if you want, called the ePrivacy Regulation, which was supposed to enter into force at the same time as GDPR but didn't. I'll spare you the details; we're still negotiating.
So unfortunately, on this one, I don't have reassuring news that this is going to change anytime soon. What we see, on the other hand, you mentioned that: is it killing innovation? I think the most reassuring official message that we found out there was the Chamber of Commerce in the EU basically saying this was a bit overinflated: we cannot see that any company went bust because of GDPR; it just needs time to adjust. And I think this is also why an assessment at this point in time is a bit complicated, because European data protection authorities also explicitly wanted to give everyone a bit of time to adjust and not crack down immediately on every infringement. What we do see is that a lot of big companies are trying to get away with cosmetic changes, and I think over the next 12 months we will probably see more focus on those practices. Generally speaking, though, satisfaction is relatively high. So that's on the question of whether it is working or not. The other question is who is taking a close look at GDPR and taking it over, for instance in China. If I put on my European hat, even though I'm not part of the establishment, I think the European Commission and European stakeholders are incredibly proud of this piece of legislation. Their biggest selling point was always: we want to lead globally on privacy standards. And if we're honest, it worked a little bit, because everyone was paying attention and is looking at rolling similar things out. So from that point of view, it emboldened the Europeans, I think, to look at other areas where something needs to happen. What I'm cautious about is this: just because you take over the words into standards or other types of law, are we talking about the same concepts? We come from a fundamental rights point of view. Privacy, private life, is a fundamental right in the European Union; then you have to explain how it works, and then you write a regulation in Europe.
But if you don't have fundamental rights necessarily in your political environment because, for instance, you don't have democracy, what does it mean to have consent? What does it mean for me as an individual to have the right to get some data erased? Can I enforce this? And even if I can enforce it, does it have any consequences for me as an individual? I think this is really the chain we need to look at, and not just the pure words that are written somewhere on paper. So that teed up, Michael, a perfect moment for you to talk about how you would imagine what we should do in the US. So now assume the administration comes to you and says, okay, Europe has the GDPR and sees itself as the global regulator, and it comes from this very strong concern about protecting privacy from the government, which of course is rooted in a set of very, very unhappy experiences with government. And China is thinking about it, but you are a former Secretary of Homeland Security and a former Justice Department official. What should the US regime look like? Well, so first, I have the same reaction you do when I get these things about cookies: you often just figure, well, I can't get on the website otherwise. And that brings up the first thing I would do. I mean, I think generally the idea of giving people the right to know what is happening to their data and having a say in whether it is transmitted or used, I agree with it. I think we ought to head that way, and we've already seen some states moving in that direction. But there's gotta be a real choice. And particularly when you have what's effectively a monopoly because of the network effect, there isn't a choice. I mean, if you decide you don't want to accept the terms and conditions, you're shut out of the program. So I think two things are necessary for a choice.
One is some companies, when they reach a certain level of domination of the marketplace, ought to be told you have to give consumers a choice. They can either pay for the service with their data or they can pay with money. Now, I'm not saying that the company needs to give its resources away for free, but you shouldn't only be able to use your personal data as the coin of the realm, because I think that's one element of choice. The second element is data portability. If I'm on a social media platform and I decide I don't like that platform anymore, I should have a very easy way to move to another platform. Without this, consent is an illusion. As far as the government is concerned, oddly, and this may seem counterintuitive, the government already lives under a regime of regulation about acquisition of data that is largely honored in the United States. I mean, it's very rare to have it violated, and the sanctions for violating it are very strong. And occasionally there's a debate and there's an adjustment, but actually, if you're familiar with what is necessary for the government to access that data, in terms of the permission it needs and the reasons it has to give for justification, it's pretty robust. The one thing which I do think we need to consider is we now live in a world where the significance of data is often not evident on the face of the data itself. So on the one hand, you need to have the ability to sometimes have data held so that it's available to be looked at when you suddenly get more information. At the same time, you don't want to necessarily give the government the right, once it holds the data, to willy-nilly look at it and use it for various purposes. So I think there are ways to take a more finely grained approach to how we regulate the government in terms of its ability to look at data for security purposes. That's fascinating. So your two sort of sine qua nons are: you have to have the choice of not paying for services with your data.
And yes, that at least means people can say it's worth this much to me not to share my data, and then be able to be sure that you're not; and the portability question, which, if we all think about our cell phones, what made competition possible, although it's still not easy, is being able to transfer your number. If you couldn't transfer your cell phone number, you'd be locked in forever to whatever your original carrier is. So I do think those two make a lot of sense. So Sam, let's assume the US then adopts something like this. We have the EU, with both the GDPR and the privacy regulations. We have the US pushing a little harder on the previous question. And you're right, China already has requirements for its companies, and I assume that applies to US companies or foreign companies who are in China. Does China think about the rest of Southeast Asia, say, as part of its data governance world, this gate land idea, or just spheres of influence? That is what great power competition has always been: spheres of influence. How much is the Chinese government consciously thinking about data competition? This is a great question, because I think one of the trends that we've seen a lot over the past year or so has been a proliferation of rules for cyberspace and the digital economy that look a lot like China's. So for example, Vietnam has its own sort of mini cybersecurity law. India has a data localization requirement. So one of the... Which means what? Which means that certain kinds of data would be required to be stored on servers in India. So if you're a company, that means setting up a duplicate data center, right? So one of the questions is, is this a deliberate effort to export a cybersecurity model, or are there reasons intrinsic to other governments that make that approach attractive? And I think here we get at an important point, which is that there are even fault lines within the Chinese system.
When we... Let's go back to this point about data localization. You have Chinese internet companies that are going global and need to be able to send data across borders. Southeast Asia is a really important market for those companies. So I don't see data localization being something that would be advantageous; it actually would be at odds with the Chinese government and commercial ambitions to be global players in that space. So I don't necessarily see an attempt to say, let's have other countries also localize data in the way that we have, right? I just don't see that as part of the aspiration. But we do have to take into consideration that with investments in technologies comes an approach to governing those technologies. And this is part of that great power competition: what rules will prevail as we think about setting norms and standards for new technologies? So you can see... Go ahead. Let me just say one thing, at the risk of maybe being a little bit the skunk at the picnic. I mean, we're not... Welcome, skunk. When I look at the issue of China, there are two things that I can't overlook. One is the Uyghurs, and the application of surveillance technology to basically take a huge chunk of the population and put them under a level of surveillance and scrutiny that would not have been possible years ago. The second is Freedom House. I will disclose I am Chairman of the Board of Freedom House. They had a report on freedom on the net at the end of last year with an essay where they talked about how China has been exporting its technology, as Sam said, to other countries, and they even bring government leaders back to school them in how this can be used as a surveillance tool in their own countries. So in many ways, this is an approach to using the internet as a tool for advancing a kind of like-minded approach to governance, getting other countries to buy into it. And I think that is a critical element of great power competition. Absolutely.
We've been exporting democracy for a long time, and they can export however they characterize their system. Yes, Isabel? Yeah, thank you. I'd like to add another battlefield here, actually: setting the rules. Who's setting the rules, and how are we doing this? Sam mentioned digital trade earlier, I think in the introduction to your project. I don't know who's familiar with WTO proceedings and what is happening in the trade world at the moment other than the trade wars, but there is a debate at the margins of the WTO among 76 states, I think including China, including the EU, including the United States, about digital trade. I mean, this has been around for a long time. How do we ensure data flows, free data flows, and so on? One of the objectives of those particular talks, though, is particularly interesting, because they are looking at banning the mandatory disclosure of algorithms. So how can we interpret this? This is just a bullet point, but companies like German car manufacturers, for instance, have long complained about the fact that when they want to invest in China, they have to hand over their technology. In quotes, obviously, but they need to give the government access to their technology, including algorithms, if this is what they do. And we interpret this as a push to basically say this should not be; we should agree as those nations to not make this mandatory anymore. So why do I mention this? Because this has much wider implications, right? Everything we agree at a trade level, at an international level, basically trumps our domestic laws, because it's international law. Not in the United States, but in most places. In the EU it does. So our focus on this is: all three regions that we're representing here on this panel are trying to figure out what a data governance regime could be. What do we need to do in terms of IoT? Is there anything we need to do on AI?
These are very preliminary debates, but you can see that there's discussion on all sides. If you now say in a blanket way, nobody's ever allowed to look into any algorithm, that might create problems in the future, because we might want to enable our supervisory authorities, be it for high-frequency trading in financial markets, or be it for autonomous vehicles, to provide an independent check. We had a discussion yesterday where we were talking about Boeing. Maybe we want someone, and maybe more than one person or one entity, to look into those things. If you foreclose it at a trade level, that might have a lot of unintended consequences. So with this focus on, we don't like what the Chinese government does and we try to get it out of the way, we always need to have the full picture. I'd love to hear your reactions, because you just anticipated the next question I was going to ask, which is that until you just said it, there had been no mention of global organizations. You just raised the WTO, but even 10 years ago, certainly 20 years ago, we would have talked about ICANN, we would have talked about various UN panels. We would have assumed there was some global locus of governance, and yet now we're mostly talking about different regional loci of governance. So I'd love to hear your reactions. Well, first of all, I do think obviously ICANN still plays a critical role. The UN has a process in place to try to develop new norms. It's ultimately been bifurcated now: there's the Group of Governmental Experts, and then the Russians and the Chinese wanted a broader group called the Open-Ended Working Group. I'm on the same commission Maricha is on, the Global Commission on the Stability of Cyberspace. We're trying to come up with norms. That's not a UN commission; that's a multi-stakeholder commission with government support and private sector and civil society support, and it tries to bring the multi-stakeholder perspective to norms. Here's the most interesting development in the area of global norms for cyber.
The private sector, the tech companies, have now really stepped up to drive their own conception of what global norms ought to be. As you know, Microsoft spoke about kind of a digital or cyber Geneva Convention. There are similar groups being set up, and that's because much of the actual capability is in private hands, as well as the understanding that the private sector has a strong interest in making sure you have an acceptable global arrangement. So along the lines of what you said earlier, I think we're going to see global governance with a small g, but not with a big organization dominated by the traditional kind of UN model where everybody gets a vote; rather through a net, a web of private and public parties who agree to adopt certain norms and enforce them, and then that drives at least a significant part of the world toward a common standard of behavior. The last thing I would say, though, is that although I think it's important to avoid a fragmented internet, and I think we can do that, I don't think we're going to have the seamless internet that maybe was envisioned years ago, because I still think there are some fundamental differences in countries that may lead to a bifurcated internet, where one part is more or less seamless (US, Europe, Japan), and then there may be some countries where there's a little bit more of a barrier or a membrane in terms of the back and forth with the first category of countries. On the question of international frameworks: at the end of this month at the G20 in Japan, there's going to be a high-level focus on data governance. Prime Minister Shinzo Abe has said, let's put data governance front and center on the agenda. The Osaka Track is going to, I think, be the beginning of having a conversation on this.
And so the question is, given this different approach... I mean, I think all of the regions we talked about today start from a fundamentally different place on who owns data (does the state own data, do companies own data, or the individual?) and how you prioritize that. So I'm looking to Osaka to see, given this different understanding of data ownership, is it possible to begin to have some kind of framework or global conversations and principles on this? So that's really interesting, and we're going to have one more round of comments here and then we're turning it to you, so start to think about your questions. So there are things happening globally, although Michael, you gave me a wonderful example of what I call webcraft. I talk about statecraft and webcraft, and webcraft is exactly this sort of multi-stakeholder work through different networks: the corporate networks, definitely the civic groups. And actually, if you go back and think about the forerunners of the Geneva Conventions, the Hague Conventions before World War I, there was no League of Nations or UN, so they also kind of emerged out of lawyers and governments, but also business interests and civic interests. And at the same time the G20 is itself a global network. It's a network of 20 leaders that started out as a network of finance ministers, and now it's heads of state and government and others. And even the UN aside, none of these entities are acting in a formal lawmaking capacity. They're acting in a norm-shaping capacity. It's a long way from something like the WTO or a treaty. So my last question: let's look forward. We're talking a lot about autonomy now, competition and innovation now, security now. But we're just at the dawn of the two next big waves of digital technology.
So the internet of things, which a number of you have mentioned, but where, for all the talk about my coffee pot talking to my refrigerator and sending me a shopping list, it's not in evidence, and frankly, that's fine. I'm good with my little list, or at least my app. And of course AI and machine learning, just at the beginning. So talk a little bit about how you think about these issues as we look forward to those technologies. And Isabel, I'll start with you, and we'll come down the row and then turn it to the audience. Right, yeah, that's basically the most fascinating thing I can work on at the moment. Obviously I look at it through my consumer glasses, right? Probably the most fundamental debate we need to have, and that comes as a direct transition from the privacy debate, is this: we're not talking about who owns the data in Europe, because you can produce more and more of the same data. It's not something that, once you've used it, is gone. It stays there; I can produce more data about myself, and just because I know my date of birth doesn't mean that you cannot know it, right? So we're not talking about data ownership, but obviously it's about who has access to that data and can do something with it. There's a big competition issue there. So the question for us in Europe is really, if you wanna drive innovation, and we're talking about AI, if you want to make solid AI that's really reliable, what kind of data do we need to have? Is data that was generated in regions with a completely different population, with different cultural preferences and so on and so forth, really going to deliver reliable devices and services, say, in Europe but also elsewhere? That is a big question. And can we really uphold our market principle, the most fundamental market principle that we have? Competition is driving innovation, competition is driving consumer prices down, competition is the basis of all our markets.
But if we see this concentrated in the hands of a few big players, no matter whether it's American players or Chinese players, we have a problem there. So how do we make sure that smaller players can also gain access to that data, and under what conditions? How do you make sure that it has a certain quality, and so on? So that's the competition angle of it, the access-to-data question, and we're not anywhere near solving this. And then the other question is, AI is going to replace human decisions. It's going to replace a human decision about whether I see a job advertisement or not. It is maybe going to replace whether I get a certain health treatment. Do we want this? We need to have a conversation as a society about where AI can actually step in, and, as I mentioned earlier, how can we check that it actually does what it's supposed to do? Who is allowed to check that? So those are the big debates that we see coming up for Europe. I have to say, it's just striking to listen to a European talk about competition being the absolute Grundnorm of your economy when in the US at the moment we're not actually enforcing our competition policy. That is not the way most Americans have traditionally thought about the US and Europe, but it's certainly true now. Michael. So, you may detect I'm a little skeptical about the magic of technology. I guess with AI there are two things that I think we're going to need to confront. One is I think we need a dose of humility about whether artificial intelligence is really that intelligent. Because ultimately the foundations and the framework of the algorithm are designed by humans, even though ultimately there will be iterations of machines that build on that. But if there's a flaw in the initial design, it's not going to go away. And for those who have a magical admiration for engineers, I have two words for you: 737 MAX.
I mean, the lesson out of those tragedies appears to be that by driving the pilots, the human beings, out of the decision tree, some fundamental errors in the creation of the system resulted in a large number of deaths. And I think there's a moment where you've got to take a deep breath. The second thing is there are some really dangerous things that can be done with AI when you harvest a lot of data. And those who read the paper will know that periodically we hear about millions and millions of accounts hacked and stolen, from Yahoo, from a major hotel company, from the Office of Personnel Management. And these are 20, 50, 70 million files: an amount of data that no human being or criminal gang could make use of, but that would be very useful to a nation state in developing a comprehensive profile of every citizen in another country, for all kinds of intelligence purposes, or in order to try to affect elections or things of that sort. So we need to start considering that the aggregation and the analysis of this data can actually become part of geopolitical conflict. Great, thank you. Sam, you've got two minutes, according to Robin, and then we'll turn it to the audience. I think about the article that Henry Kissinger wrote looking at the so-called competition in AI between the US and China. And he said something which was really striking, which is that the competition may ultimately be about who can harness technology for the good of humanity. And so are there ways that we can think about writing these rules? And this gets back to my question about what is the trade-off. Is there a trade-off? Can we have protection and autonomy of data be a competitive factor in terms of driving demand for that technology? Is there a framework to funnel the technology in a way that's going to be ultimately beneficial to humanity? As Isabel mentioned, what kind of data do you need to innovate?
And so I see rules that really grapple with a more nuanced look at this sort of trade-off as where we need to go. Great, thank you. All right, so raise your hand, and when you get the mic, introduce yourself. Hi, my name is Alex Jurgen. I've really enjoyed this panel so far. My question is on data portability and AI. For, let's say, open societies to develop effective AI engines, to be able to take this to its commercial and other fullest extent, do you think that there needs to be data portability so that people can put their data in one place, so that whatever recommendations or anything you're trying to get from AI can be more accurate? Michael, I think that one's for you. Yeah, I recognize that, as I said in my final remarks a couple of minutes ago, that in terms of growing AI quickly, forcing all the data into an algorithm does feed the beast. And I think that may be part of what's behind the Chinese movement on AI. But I think fundamentally what that does is it subordinates the individual's control of their own life to a social issue. And maybe that's a fundamental divide: I still think in the West we believe that the individual's ability to control their destiny is a fundamental value. It may be that in other cultures, the view is no, the community's interest trumps, if I can use that word, the individual's interest, and therefore it has to be subordinated. So I think ultimately, as with most of what we're talking about, these issues of portability and control of data really boil down to what we see as the kind of essence of the good life. Thanks. Maureen Baxter from Inside Cybersecurity. I wonder if you can speak to the need for privacy and cybersecurity rules insofar as they could legitimize the push to ban Huawei's use in other countries. So if we're saying that Huawei is a national security threat, not disclosing reasons for it at the WTO, what is our due diligence to protect from other threats?
Because there are a lot of arguments that the cybersecurity threats are agnostic to Huawei technology. Does that question make sense? So are you accepting the premise that Huawei is a threat and asking about how we protect against similar ones? Or are you questioning whether Huawei is a threat? I'm being generous in saying, let's assume there's a reasonable threat there. Then what is the onus to take other measures, to make the case for that being a legitimate threat and for others taking similar actions, so that it's not seen as, okay, we're just protecting our economic interests, this is just great power competition? Yeah, great. I think that the question gets at the importance of, as we look to a 5G world, you have exponentially more data flowing around, you have physical systems that are controlled by software: water, energy, transportation, right? The decisions about security in that environment, as Secretary Chertoff mentioned, are going to be more important than ever. Whether it's Huawei or whichever the vendor is, security in a 5G data-intensive environment is going to be critical. And so if it's a question of, are you protecting your critical infrastructure as the core and the edge merge, this just becomes more and more important. So I don't think that it's purely going to be an economic or protectionist argument. It is going to have significant safety and security dimensions. I don't know if you wanna add anything to that. No, I think, look, it's not just a question of data being stolen; it's a question also of, if you control the infrastructure over which 5G data moves, you actually can affect the physical environment. And I think there's a concern, and obviously the US has to make the case to other countries, that whoever is embedded in that infrastructure really holds the commanding heights, not only of national security but of the global economy, over the next decade.
I'll just note that you can already see the debate shaping up in the same way that the debate over détente happened during the Cold War, which is, the British say, yes, we understand Huawei is a potential threat, but we would rather engage them. And if we engage them, A, we learn more about them, and B, we have relationships, and that's actually very valuable in the case of a global crisis. And the US is taking a much harder line that says, nope, we need to actually sever relations and keep them out. When you talk about geopolitical competition, that's been part of every debate I've ever been in: do we engage other countries who are adversarial to us, or do we take a much harder line? And I wanna just add briefly to that: given the entanglement and interconnection, can we have something that came up at our dinner last night, entanglement with conditions? Is that even possible? And what would that look like, right? Yes, you've had your hand up. Thank you, this is a great panel. I wanna push the panel a little bit harder on the great power competition piece. So, a question in two parts. As we look forward to an age of machine learning, artificial intelligence, do we think the growth in data, the amount of data, is going to strengthen or weaken national governments? And the second part to that is, do we think the data governance regimes in the three markets that we've been discussing are gonna make a difference in the relative power between governments and companies, and what that means for the great power competition? Great. Do you wanna go first? I think that the debate about AI is a bit more complex, because first it was all about big data, and we need more data and more granular data and more and more and more. And now the debate is shifting: can synthetic data actually produce better results, because it's more reliable, it's more accurate, et cetera, et cetera?
So the growth in data might not be the sole important indicator. However, the word data is also a bit fluffy, right? What are we talking about? I would just talk about the raw data that is required for training, for instance, an algorithm, and then ultimately making it a stack and having AI. That's not the only thing. If you have a really bad algorithm, you can feed it with tons of excellent data and it's not gonna produce any value anywhere. So it is a bit more encompassing: the power over data, and the power over good engineers, and the power over understanding how certain things work, is certainly going to shape global power more widely. And that goes for security, that goes for economic terms, and that will also go for individual freedom. So that was the first part of the question, right? Yeah. Will the data governance structure make any difference? Of course it will, but data governance is sort of the cherry on top. The difference lies at a much more fundamental level, right? What is the difference between a democracy and an authoritarian regime? And then, as I explained in the beginning, the outcome of that is what your data governance structure looks like, and what will be possible and what will be impossible in that data governance regime. My bet would be that at the moment the most liberal laissez-faire regime is not necessarily going to be the most successful. I would just add that I think the explosion of data is gonna strengthen governments and very large private institutions, because the ability to have the data and then build algorithms around it and use it for various purposes will give the advantage to those that have a concentration of the data. And it can be used for good or for ill. I'll just add briefly to what has been said.
I'm interested in discussions that are happening more in technology circles, which get away from the question of growth and scale of the data to what kind of data. And so, is it possible to have privacy-protective use of data, which would then get at this tension between, as Isabel said, the open model and the closed model? So you can have an open model, but in a way that is going to engender more consumer trust. So I think there are some interesting discussions going on among technologists. Like federated learning, for example, where you can train the algorithms without having as many risks to privacy, and that's just beginning. So I'm interested to see where that's gonna come out. And I also wonder about this assumption that the network effects for large companies necessarily continue. Because we know, for instance, that computer coders are one of the categories of jobs that are going down. There will be less need for coders than there are now. Why? Because code is gonna become like Lego. You're gonna have blocks of code, and anybody, even without an engineering degree, can actually put things together. I wonder similarly if there aren't ways of regulating and pooling data so that, just as it's now much easier to have a startup (you can have an accountant here and an HR person there, and the things that only big companies could do, small companies now can do), something similar happens with data. That's not been true when you talk about massive amounts of data, but if you think about data utilities, I wonder if there isn't some in-between. All right, I need to call on somebody way in the back. Yes, you, and then I'll come back up here. Yeah, I can't see anybody behind you, so. Hi, my name is Elvira Maranan. I'm at American University, and thank you for the wonderful panel.
So we understand that privacy and security are distinct but often very closely associated topics, and we've seen a divergence in ideologies with privacy, but security hasn't really been touched upon. We heard Mr. Chertoff talk about a digital Geneva Convention, and with the rise of 5G, IoT devices, botnets, ransomware, and these other types of things, do you think we need more consensus now, sooner rather than later, with the deployment of 5G and IoT? Consensus on security? Well, I think it's a very good point, and part of what we're doing on this global commission is to try to come up with some norms that would basically define the rules of conflict in a way that would preserve the internet for everybody. So for example, one of the norms we've promoted, which has been adopted or endorsed by at least a few countries and international institutions, is protecting the public core of the internet, which is to say, don't take action that will fundamentally destroy the availability, or the integrity of the availability, of the internet to the general public. That doesn't mean you can't have a targeted attack on something if you wind up in a conflict, but just as we don't bomb hospitals, you don't simply wipe the internet out. And one of the areas we've seen this in is not just the hardware infrastructure but, for example, the domain name system. There was a discovery recently that networks in 13 countries had been hacked because someone had managed to pervert the DNS system as it applied to certain domains and redirect the traffic from the destined location to a location that was actually controlled by the hackers. So again, systems like the protocols that run the internet, the hardware and the software that are the basic infrastructure, are an area where I do think we need to have a global agreement. Otherwise, we will actually destroy the ability of the internet to operate as a global activity. Right? Yeah, maybe to add on that, I agree with you.
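The DNS attack just described works by substituting the attacker's address for the legitimate one in a domain's records. A crude illustration of one defense, pinning the addresses you expect and flagging anything else, is sketched below; the domain and the IP addresses are made-up documentation examples, not data from the incident mentioned:

```python
# Pin the DNS answers we expect for sensitive domains; an answer outside the
# pinned set is a hint the zone may have been redirected by an attacker.
# Domain and addresses below are hypothetical (RFC 5737 documentation IPs).
PINNED = {"mail.example-ministry.gov": {"192.0.2.10", "192.0.2.11"}}

def looks_hijacked(domain, observed_ips):
    """Flag a DNS answer containing addresses outside the pinned set."""
    expected = PINNED.get(domain)
    if expected is None:
        return False  # no pin recorded; nothing to compare against
    return not set(observed_ips) <= expected

print(looks_hijacked("mail.example-ministry.gov", ["192.0.2.10"]))     # False
print(looks_hijacked("mail.example-ministry.gov", ["198.51.100.66"]))  # True
```

In practice, defenses against this class of attack rely on mechanisms like DNSSEC and registrar account hardening rather than hand-maintained pins; the sketch only shows the detection idea.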
I also think that this development might be accelerated at some point, because you mentioned IoT, you mentioned connected devices. The main feature of a connected device is that it's connected to the internet, to every one of us, to every other thing on the internet, unless it is really safely protected. And with the proliferation of those devices, you will see a much higher risk of something blowing up somewhere. I mean, in my organization we're testing connected dolls that spy on your kids, connected fridges and toasters that explode. There's lots of stuff around. Most of it is not secure; it lacks even the most basic protections, with default admin passwords like 000 and things like that. This is the mainstream. This will not be able to last much longer. And the more everyone is threatened by those devices, and the more fundamental infrastructure elements are threatened by those things, I think the more quickly we'll come around the table and actually have a proper conversation about what safety standards we want to agree on. Well, it sounds like working at Consumer Reports is going to get a lot more exciting. It is. It's here in the front. Right here. Hi, Adam Schrupp with the Voice of America. I was wondering if the panel could comment on the sort of disconnect between what the government says and what the government does when it comes to the China model. We've seen over the years the accumulation of cyber laws: we've had the national security law, the national cybersecurity law, and the proposed data privacy law. But on the other hand, China doesn't have the strongest track record on rule of law. And I was just wondering, what does that mean for great power competition? We know that China is not necessarily going to do what it says on paper, but on the other hand, it has a lot of state power to do things that might be seen as intrusions on privacy, et cetera. Thank you. Thank you very much.
The way that technology is being deployed in China is very troubling. Secretary Chertoff talked about Xinjiang. As I mentioned today, we are here on June 4th, and it's a reminder of the use of government power. I think these laws are interesting because the way I think of Chinese law, and I have friends who are Chinese legal scholars here in the audience, is that Chinese law is written deliberately very broadly and ambiguously, with the idea that it is a tool that the Communist Party of China can use as an instrument as it sees fit. So sometimes that can lead to selective enforcement. I think we've also seen a mirroring of that in recent developments here in the United States. The Trump administration recently issued an executive order which, in my mind, goes far beyond Huawei in what it could potentially do. I did a piece where I actually compared the executive order to China's cybersecurity law, because it contains expansive powers that would say the US can block any transaction related to the ICT sector involving a foreign adversary based on national security. If you were to do a blind test and ask, is this China's cybersecurity law or is this coming from the US, I don't think many people could tell the difference. So what I urged in this piece is: let's not go down that direction and make the mistakes that the Chinese government has made in its sweeping definition of national security. They're at the very back, yep. Good morning, Lieutenant Colonel Ty Lewis, AI Task Force. My question has to do with local storage. I think I understood someone to say that they didn't care where the data originated or was stored. And so with AI and machine learning, if we're going to be doing processing of data, maybe locally or globally, what kind of impact does that have?
And how do you account for trust and confidence in your overall results, even if you don't have to show what's within the algorithm itself? Great, thank you. Who wants to tackle that one? I can start by saying that one of the challenges, and I've heard a number of people in government raise this issue, is the question of whether data being held in certain places is going to create an issue in terms of whether it's being used to build artificial intelligence without the permission or the knowledge of the data subject. Generally, I take the view that I'm against data localization. There shouldn't be a rule, of the kind you now see in a number of countries, including in Europe, that you have to keep data within the boundaries of the country, because I do think we need global rules about how you access that data. That being said, as I think Sam just mentioned, there is a law in China that says that for national security purposes, the Chinese government can access any data that's being held there. So if you have a company that winds up taking US data, moving it to China, and holding it there, you have to operate on the assumption that it is going to be fed into an artificial intelligence algorithm. And I do think that raises a concern from a geopolitical standpoint about whether there needs to be a restriction on the movement of that data. So, to some extent, the question of the free movement of data is also going to be a function of the strength of the rule of law, and the commitment to the rule of law, in the countries where the data may be held. Great. It's one minute to eleven, so rather than taking another question, I'm going to give each of my panelists a chance to say a last word, and then we will get you out of here on time. Thank you. And if you don't have one, you don't have to. Yeah, no, exactly. I think a lot has been said. I'm not going to add any new ideas. I just wanted to thank you, and you, for the invitation.
And I'm very curious to see what will come out of your research. Great. I think this is a great project. Thank you for including me in the kickoff. And I would just leave one thought. Maybe I've been somewhat critical of China, or skeptical. I do think, though, as you said earlier, that we still need to engage, but we need to engage with eyes wide open. There are areas where there can be agreement, where there's mutual interest, and we've seen that historically. On the other hand, you don't want to be Pollyannaish. Engage, but with conditions. Right. Yeah. Any last words? No, I think that Secretary Chertoff captured it well, and I look forward to engaging with all of you more as the study unfolds. So thanks very much again. So I will thank all of you for coming and for your questions. This really is, as I said, a very New America project. It'll run for at least two years, and I'm sure we'll have other events. It's part of the Cybersecurity Initiative's effort to combine things that have traditionally been very siloed. Cybersecurity, even in the national security world, was treated as this tiny little area that only super experts were presumed to be able to engage with; that's ridiculous. Cybersecurity is all of our security. These kinds of big tech issues and the governance of tech are part of the world we live in, whether we think about them from a consumer privacy or consumer welfare position, from a homeland security position, from a great power competition position, or finally from a human rights position. As I think about these issues, I do think about open versus closed, but I also think about those of us who want to ensure that technology serves humanity, rather than risking a world in which humanity serves technology. So with that, thank you all, and have a good morning. Thank you. Thank you so much.