Okay, good morning, everybody. Thank you for joining us. Welcome to the Carnegie Endowment for International Peace. My name's Denis McDonough, a visiting senior fellow here at Carnegie. It's my distinct honor and great privilege to introduce to you the Deputy Prime Minister and Interior Minister of our valued ally, the Netherlands. Minister Ollongren has been in this position of Interior Minister and Deputy Prime Minister since October of 2017. She has had two days of intensive meetings here: yesterday with our government colleagues, including DHS, to talk about these matters, and this morning here with us and several representatives of leading tech firms to discuss their view of this enormous challenge. I come to this issue not only with intense personal interest, but with hardened experience, with a front-row view of what has been a significant Russian intervention in our elections, in the 2016 elections. So it was with great enthusiasm that we were asked to host the minister for this part of her visit. It's a great honor to have everybody here. We look very much forward to your remarks, Madam Minister, and we look forward to the discussion afterwards. So without further ado, the Deputy Prime Minister and Interior Minister of the Netherlands. Ma'am, over to you. Thank you. Good morning, everyone. Thank you. Ladies and gentlemen, we are here to lay the foundations upon which the statesmen of Western democracies may stand and to create an atmosphere favorable to the decisions to which they may be led. Well, you might think that would be quite an ambitious task for today, but luckily it isn't our task, because these are not my words, but Churchill's. Churchill said this in 1948 at the Congress of Europe. He was, of course, one of the most important founders of Europe, and his primary objective back then in 1948 was to make it clear that the key issue was cooperation.
Following the catastrophe that was World War II, what was needed more than anything was the willingness to work together. Because in the trenches, in the face of the shell fire of the First World War, and afterwards in the concentration camps and in bombed-out cities, in the gulags and while queuing for food, men and women dreamed of an undivided and free Europe. And thanks to the sacrifice of American young men, boys, 18, 19 years old, on the beaches of Normandy, and thanks to the Marshall aid that followed the war, that free Europe became a reality. And thanks to people like Winston Churchill, who through their strength showed us the signs of hope, an undivided Europe arose around us. However, this European project is never complete. It's always in a state of flux. Cooperation in Europe is remarkable for its ambitions and also for its perseverance. It's a task that requires our constant attention. It requires hard work. It's a form of cooperation that for many people is a constant source of surprise and sometimes even admiration. How could it be that a group of countries once so torn apart by war cooperate so closely? How is it possible that countries are willing to give up their own currencies for a new single currency? How is it possible for a group of countries to voluntarily surrender part of their national sovereignty? Because that's what we've done in Europe. It's been done with remarkable speed. Europe has rebuilt itself, and it became one of the world's largest trading partnerships, a market with 500 million consumers. Nonetheless, the foundations to which Churchill referred are not purely economic. They go deeper than that, and they should. As a consequence of that devastating war and regimes of terror, we understand perhaps better than anyone what it means to be robbed of the most elementary fundamental rights and basic human dignities.
So in response, over the last 70 years, we have joined Churchill in stating that the fundamental rights of every citizen in Europe must be protected. And that's not self-evident. It demands constant effort, certainly in the current era, in which internal and external forces are threatening democracy and its intrinsic values. We're faced with the major challenge of upholding those values in the modern digital domain. So let me give you a couple of examples. Last May, the European General Data Protection Regulation, the GDPR, was introduced. It's the aim of Europe, even in the digital world, to continue to protect the privacy of Europeans. Fundamental rights of which the founders of Europe, in their age of telegrams and fixed telephones, were probably not even aware. We don't only want to take the lead in guaranteeing privacy in the digital domain. We also hope to take the lead in countering cyber threats and disinformation, two phenomena that clearly threaten public confidence in fair and free elections. Recent examples have made that abundantly clear. During the referendum about Brexit in the UK, people went to the polling station with a pen because they were afraid that if they used a pencil, the mark would be rubbed out. And if they did use a pencil, the message was: you should press hard. That is worrying. And last year we saw elections in Italy, where a great deal of fake news was spread via social media. Politicians were accused of nepotism at the taxpayer's expense. And it all turned out to be incorrect. But it increased the distrust of citizens in politics. And during the French presidential elections, the then candidate Macron reported that his party's websites were the targets of cyber attacks. So my visit to Washington, DC so far has strengthened my conviction that the threats are real. On both sides of the Atlantic, we need to step up our game. Protecting elections is a core task of a government: making sure that they're free and that they're fair.
However, this happens at a boundary line, because as a government you wish to protect the freedom of speech, the freedom of the press, the freedom of debate. You don't want to censor, you don't want to restrict that. Government is not an arbiter of truth. So our approach is twofold. Firstly, we step up our efforts to make sure that technology operates in the interests of democracy. Representatives of the big tech companies, Google, Facebook, Twitter and others, have signed the EU code of conduct against disinformation. In our view, self-regulation of this kind is meaningful and an important first step by these companies, and they are taking responsibility. And that's very important. I think we should continue to closely monitor these developments, but we need more transparency and engagement in light of the elections that are coming up in Europe in May this year. Secondly, we step up efforts to strengthen the resilience of our democratic systems, and the Dutch government is planning a campaign to raise awareness of media literacy. The campaign will pass no judgements on the truthfulness of content. It is instead up to citizens to decide for themselves, to be critical. Now, the reason I'm here, as I said, is that these problems are not exclusive to the European Union and to Europe. And that is why I'm so interested in understanding your views on these issues and to hear and to learn from your approach. So we need to reach out across the Atlantic. We need scientific communities, civil societies. We need all of us to help to tackle this issue. We are representatives of a shared history, a history that stretches from Athens to Amsterdam and Washington, DC. We share that civilization. That unity of values and desires presents us with new tasks and with new challenges. And the values of that civilization are universal, and they're unique in the way that they permeate our history and our solidarity with each other.
The central lesson of that past is that if Europe and America are divided, the outcome is generally tragic. But if we stand together, then we can overcome any calamity. And perhaps today we could again succeed in laying foundations upon which, as Churchill said, Western democracies may stand, but this time in the digital age. Thank you. Being here, I'd like to express my gratitude as well to Carnegie for inviting The Cipher Brief to come and moderate. My name is Suzanne Kelly. I'm the CEO and publisher of The Cipher Brief. And we are going to have to make a slight change to the agenda today. The deputy prime minister must leave a little bit early. So we're going to wrap up the conversation around noon, but I would like to encourage a few questions. So to facilitate that, I'm going to go ahead and give everybody I want to introduce an opportunity to answer a couple of questions, and then we'll let the audience get in, if that's all right with everyone. So welcome. Thank you very much. Madam Deputy Prime Minister, is that correct? Absolutely. Okay, well, thank you very much for those opening comments. I think it's important to focus in on what you talked about with anti-fake news, and I want to get to that in just one quick moment. I'd also like to welcome Matthew Travis. He is the first deputy director of DHS's Cybersecurity and Infrastructure Security Agency. And also Mr. Thomas Rid, who's a professor of strategic studies just up the street at Johns Hopkins. It's a pleasure to meet you, and thank you for being here. So let me waste no time in getting right to this. You mentioned how important the anti-fake news effort is. We've heard from our own intelligence community leaders just two weeks ago about how serious this threat is. You launched a campaign of your own in trying to take on fake news. Tell us a little bit about what you learned from that experience and the valuable takeaways. Well, it's too soon to tell, because we're just starting it.
We have elections in March and we have European elections in May. And this campaign, it's about awareness, media literacy. We were teaching our kids that in schools, but this is about our democracy. So we need to reach out to everybody, all the target groups. We know, for instance, that older people are especially vulnerable in this respect, that they're not used to it. So they're easily trapped by fake news and disinformation. So really reaching out to everybody, making them aware that this is happening. The Netherlands is a relatively small country but also a very open country. And while we have not seen the amount that some other countries have seen, in Eastern Europe, for instance, we shouldn't be naive. I mean, it can happen to us. And I want to be there before the elections and not afterwards, when it's too late. It's a difficult conversation as well to think through what the components of that anti-fake news campaign need to include, because you would very much like to say that everyone in society would be as interested as the people in this room in terms of making sure that they're educated, they're asking the right questions, they're getting the right information, they're filtering out the bad. But it's much, much harder, as you say, in actual reality. Tell us a little bit now about DHS's stance on this and how you're trying to attack the issue of not only fake news but really ensuring more resiliency in the election process. Sure, great to be here today. Very small question. Thank you to Carnegie and Denis for the invitation. You know, when we look at this, we see it really as a twofold problem. There are the attempts to undermine the election infrastructure: the technical manipulation of votes, compromising of election machines, voter registration rolls, websites of candidates and campaigns. And that is kind of a physical protection problem set.
But closely related is what I call the information attacks on the electorate, which gets to the misinformation and disinformation. So within Homeland Security, we organize internally with two discrete teams to deal with both of those. And our role really is not, as the deputy prime minister said, and we have the same sensibility, we don't want to be the arbiter of truth. That is not the role for DHS. But we serve, I think, a unique position in government as a convening authority, to bring in those 16 sectors of critical infrastructure, one of which is the election infrastructure industry, but also our relationships with state, local, tribal, territorial jurisdictions. And so we're able to provide, one, information sharing on what we're seeing, where we can downgrade intelligence information from the community and share it; where we can provide capacity-building resources like training; where we can, on the physical side, scan networks, provide technical assistance on how secure their networks are, and even work with them to get a better sense of what's running on their networks. And so I think the election security challenge is not like the campaigns themselves, where they come to an end and everyone exhales. The race continues, and candidates, as you see in the news, have already declared. So we are back, we're still at it. And while with the midterms we're relatively pleased with how everything turned out, I think 2020 for us will be a much more intensive effort, given it's a presidential election. Professor, you've written a number of books as well, and you've done a lot of writing on this particular issue. What do you think is missing from the national or international dialogue on how we're understanding and framing the issue? So one observation, just a common misunderstanding, is that disinformation invents fake news, that fake news is basically making up stuff. That's not really how, historically, disinformation and active measures, the term of art, have worked.
They work by sensing existing cleavages, existing conflicts, existing conspiracy theories, for example, and then pouring fuel into that fire, just making them worse. So, for example, one of the most famous examples is the AIDS theory, the theory that AIDS is an American-designed bio-weapon built in Fort Detrick, Maryland. This is a hoax, obviously. But the hoax existed in the early 1980s already, in gay activist circles. And the Soviets and the Stasi together lifted it out of that corner and gave it more visibility. And then, of course, they stopped at some point in 1987, and the myth continued. So Kanye West, in one of his songs, "Heard 'Em Say", uses that line in 2005. Is that because of the Soviet operation? We don't know. The fact that these operations are using existing prejudices and conspiracy theories makes them extremely hard to measure, even if you have a lot of data. Right, and obviously the influence level is extremely difficult to measure. We talked a little bit about information security and sharing information. Sharing information not only with the public, if you're talking about anti-fake news, but also with each other as you prepare for elections. What were some of the key lessons learned that you feel like, Madam Minister, you can share from the last election and preparations for the upcoming one in May? Yes, we're preparing for the upcoming one. And I think, as you said, it's not a new phenomenon, but what is new is that it's now happening in the digital age, with social media. And that has made it a completely different ball game altogether. And I think we're now at the stage, like you here in the US, where you had an election that proved, at least, that there was a reason for you to step up your efforts. And then you saw the midterms, and you have been able to counter it and take away some of it. And I think we're in the same situation in Europe.
We have had some elections and referendums where we know that it has played a role. We cannot prove that it has actually affected the outcome; that's very difficult to prove. But the awareness is much higher, and that's the reason that the European Commission has made the action plan. That's the reason we have the code of conduct. That's the reason that we're talking to the tech companies and working together with the tech companies and with academia. And that's a big difference from two, three years ago. And for myself also, when I first talked to the tech companies, most of them American, of course, but stationed in Europe, their attitude was a little bit, you know, searching: what can we do, what can you ask of us, what can you not ask of us. But now their attitude is different, because it's also affecting their credibility. So I think that's a big change. It's a big step, but I'm sure we're not there yet. And then again, we don't know what is developing also in the field of technology. We're seeing more cooperation, which you think is a good thing, traveling in the right direction. Let's talk about the upcoming election in 2020, because we heard Dan Coats just recently testify before the Senate that the adversaries are still out there. They're still looking to influence the election. What are you seeing? And what lessons might you be able to share from what you're seeing now? Let me start with the lessons that we drew from 2018. And I think for us, a couple of things stood out. One, we really doubled down on the investment in the partnerships with the platforms. So the Silicon Valley companies that we engaged last spring, we went out there and met with them. And to their credit, they made the investments. They made the decisions to treat this as the high priority that it is.
And that made a real difference, because we had willing partners who were not only with us throughout the campaign, with whom we could share information, with whom we got a better sense of how they were attacking the problem from their end, but up until election night, when we had representatives of Twitter and Google and Facebook and Microsoft with us, either in the room or on the screen, so that when there were some anomalies that we saw in terms of misinformation and disinformation, we could work with them. And again, it's their role to decide what to do with that information. So, investing in those partnerships. And also, I think we spent a lot of time getting the media familiar with the problem set of how elections work and where the security risks are. We held a tabletop exercise, a nationwide tabletop exercise, and we brought the media in. So I think helping them to understand, again, there are attacks on the election infrastructure, and then there are the information attacks. Both obviously can undermine democracy, but we treat each one differently. And I think those partnerships are what have us positioned well going into 2020. But again, as I said, just given the intensity and attention that presidential elections get, I think that will be the much more pronounced effort on all fronts. And then also, an interesting question, and Professor, perhaps you could speak to this: this is an evolving process, and as you mentioned, social media is such a fast-moving way to get information out there, and it's very difficult to ever bring it back once it's out. What do you see as potential solutions to this long term? I mean, I've probably given you the hardest question up here today, but it's for someone who studies it and who's seen the patterns and who's written books about this. What do you see as a way forward to help?
So we've seen a number of companies, as well as the US government, take a number of steps that are very positive and will probably make a difference, even if the difference is hard to spot, because nothing happens. Success means nothing happens. It's a bit of a difficult one to measure as well. But Twitter, for example, has published a lot of data from the St. Petersburg troll farm, around 10 million tweets. And that's an extremely rich data set to work with. So let me just tease out one finding. We saw in 2016 that the Internet Research Agency amplified existing fissures and conflicts within the US. I would absolutely expect something similar to happen in 2020, and I'll say what probably a lot of people in the room are thinking but may not want to touch. One of the biggest vulnerabilities going into 2020 is President Trump's repeatedly calling into question the legitimacy of the election itself. Now that is fodder for any disinformation campaign, because they will certainly try to amplify that message calling into question whether the outcome of the vote is legitimate or not. That is ultimately also a Russian goal. And here I think we are facing an important problem: we as pundits, journalists, experts, government officials, we're facing a very difficult evidential problem, because any disinformation campaign that taps into what will happen in 2020 will most likely be minuscule in effect in comparison to domestic political currents and developments. So we risk exaggerating the effect of disinformation. We basically risk building up Russia into an adversary that looks stronger and more powerful than they really are.
So it seems very simple to say something like this, but do we need to be looking to candidates as well for the responsibility on them not to be generating information that may not be 100% true, that may not be something that you can prove? Do candidates need to take more responsibility in making sure the information that they're putting out is accurate? That's a very difficult question, because now we're talking politics, right? Yes, sure. I'd like all of you to respond to this, actually. We like candidates to make sure that they're practicing strong cyber hygiene measures, that their password regimes are in place and their networks are scanned. And at DHS we did, throughout the 2018 cycle, work with the DNC, the RNC, and individual campaigns to scan their networks and to do vulnerability assessments of their electronic infrastructure. So that's where we take the security angle. I don't have any opinion on it. I think that's a very, very civil answer from DHS. 100%. I would just say this, given the history of statements, and I'm sorry, this is a political thing to say, but it has to be said. Given the history of statements from Donald Trump on the legitimacy of the outcomes of elections, the system is rigged, in 2016, and he continued along those lines, I think it is an important conversation to have among elected members of Congress, especially in the Senate, possibly among former presidents of the United States, people who have the weight in a public discussion to step into a very fragile moment on election night, on November 9th, when possibly the question comes up: are we dealing with a legitimate election here, and what happened? In a situation like this, there are only very few people who can de-escalate and create the perception of stability. And I think it is crucial to have this conversation ahead of time. I'm sure it's happening already, but I think it's important.
You're looking at academics and you're looking at media institutions to be able to weigh in on the impact of something, whether or not it's intentional, in terms of the message. There's an impact there. But I think we're really looking at senators and elected officials. And how do you look at this from outside the U.S.? What lessons do you think we can take away? Well, there are many lessons. First, perhaps, let me go into this issue also, not, of course, in the American situation, but in the Netherlands. I think the principle of a democracy is that you have freedom of speech and the exchange of ideas. And so politicians, any candidate, should feel free to say whatever he or she wants to say and to present his or her truth, even if it's not true, because that's the political debate. But if the system is working, then you have independent journalism that takes care of that, that can ask questions. You have the other politicians, the opponents, that will also take care of that issue. So I think that's the kind of democracy that I feel very strongly about, that you can have this political debate. What you don't want is influence that you don't really see, that's coming from elsewhere, that has a different purpose, that's coming from other countries with a purpose to destabilize. Because I agree completely with what was said just now. It's not just about being true or not true, or fake or not fake. It's about destabilizing our society. The adversaries have that opportunity because we are open societies. We present them with that opportunity because we want to be open societies. I would just add, Suzanne, that I think there is a basic level of responsibility for the citizenry, to be a citizen in the United States. Read whatever you want, but know what you're reading. So we had some phrases: think before you link.
And if people are just scrolling down with no regard to the source of the information that they're reading, then that's not being as responsible a citizen as I think we all need to be. And so this is a collective effort to make sure that democracy can still thrive. It's not just the government, it's not just the social media companies, not just the candidates; it's the electorate as well. It's a massive collective effort, which is why I think everybody realizes how incredibly difficult, but important, it is. Do we have questions from the audience? I'd like to share some of the time here. Yes, please, and then we'll come to you after. And do we have a microphone? Yeah, we do, it's coming. If I could just have you kindly tell us who you are and where you're from, that'd be great. I have a question for Professor Rid. Rid, did I say that right? It's about the idea, which I've not heard before, of having trusted voices speak on election night about the legitimacy of the results of the campaign. On one hand, that's a very appealing idea to me. On the other hand, I think about poor General Powell, who went up before the United Nations and testified that there were weapons of mass destruction after having been briefed by the intelligence community that there were. So I wonder what system could possibly reassure a Jimmy Carter and a George W. Bush so that they were comfortable saying those things? I think you may be in a better position to answer that question than me. I'll just say briefly, some of the work that has been done at DHS and elsewhere is creating visibility, if I understand you and some of your colleagues correctly, into potential attacks, cyber attacks, against election infrastructure. And of course, attacks against the debates, the public debates, are visible by default. Sometimes not openly visible, but you can see them. So it's not a problem that will appear in a flash on election night.
It is a problem that will slowly arise, and we will see it. Yeah, I would just say that we see ourselves as trying to improve people's ability to make risk-informed decisions. And so everything we do is geared towards promoting and sharing information so that, whether it's the candidates, whether it's the parties, whether it's social media platforms or the public, they can understand the threats and vulnerabilities in what they're reading. So I don't think I have an answer to your central question, but that's all I'll say on that. Understanding the threat in a timely way, I would think, is equally important; it kind of loses its effect if it comes weeks later. Thank you very much. Voice of America, Russian Service, thank you very much for being here today. I think the Netherlands was probably one of the first international victims of fake news, with all these dozens of fake versions of the shooting down of MH17 five years ago. And my question is related to MH17. Did you discuss it here in the United States, maybe requesting some help or something else? What is the nature of the talks of the government of the Netherlands with the Russians? Because now we have statements of the Russian foreign ministry confirming these talks, while still denying any responsibility. Do you think that Russia is responsible for shooting down MH17? And what's the roadmap for bringing those who did it to justice? Thank you. Those are lots of questions, but very important questions. It's true that we have been affected by disinformation concerning MH17; that was a vulnerability in our society, this terrible disaster that happened. And it has affected our relationship with Russia ever since, and it still does. We have two tracks that are still ongoing. So for the criminal law track, there is an investigation team that has researched what has happened and that has found proof that it was a Russian Buk missile that shot it down. So the proof is there, but the court hearings still have to happen.
And there is the international track. We're holding Russia responsible together with Australia, another affected country. And in the context of that process, there will be talks with Russia. And that is what you've been reading about. They are being prepared today. And your question, whether it's been part of the discussions that I've had here: yes, because MH17, for us, is extremely important. We have always depended on the cooperation of the US, and still do; it's very important to us. There was one American citizen on board. But aside from that, the US has always proven itself a very valuable partner. And it's important for us to keep them up to date and to keep them involved in... Hi there, my name is Nicholas Terpstra. I'm a student at Calvin College, Grand Rapids, Michigan, studying here, and I'm interning at the Cato Institute. I had a quick question for perhaps Mr. Rid and Mr. Travis. By and large, would you say that the things that we've been seeing are more about undermining the elections on social media, or more directed at candidate support, especially in, say, 2016 and 2018? Which of the two types of quote-unquote fake news is more popular? I think with what we've seen in 2016, in terms of social media disinformation campaigns, and I'll make a sort of bold statement here, we tend to overestimate the effects of these campaigns. The most effective component of the 2016 election interference was the hack and leak of, ultimately, John Podesta's email via WikiLeaks, not anything that happened on Twitter or Facebook. So while it's important to work with the data that we have, we really have to be very cautious in assessing effects. Keep in mind the Internet Research Agency in St. Petersburg was exposed by Russian domestic investigative journalists four weeks after it was created, and it continued to be an ongoing story. It's a very unprofessionally run operation.
I would just say that I think it's a mistake to think that the influence campaigns are only centered around elections. In fact, what we see from bad actors is an attempt, whether it's sowing discord or rabble-rousing, on any issue that is going to try to divide and polarize society. So you see it throughout the year, unrelated to elections, in terms of trying to make socially divisive issues ever more so. And that's, I think, the greater threat: society has become more and more polarized, and adding that element to try to accelerate that polarization is what I think we all need to be on the lookout for. That's a great point. Yes, sir, back to you. My name is Roger Cochetti. I work in private equity in the technology sector, and my question is a little bit more philosophical. 30 years ago, when the law of the Internet was first coming together, the basic principle was that a website operator, the Internet industry, was liable and responsible for any content that they posted. And the comparison was that they were like a publisher. A newspaper is responsible for the letters to the editor that it publishes, even though it didn't write them. And a newspaper is responsible for the op-ed articles that it publishes, even though it didn't write them. So there was a chain of liability and responsibility. There was enormous pushback against that. And by the mid-90s, the basic legal structure was that Internet providers were not responsible for the content. They were more like a telephone company, which has no responsibility for the words that people speak over the phone, or the post office, which has no responsibility for the content of the letters that are mailed. In the last 10 years, we've seen a significant reversal of that, but it's very topical, very specific. Well, Internet companies are not responsible for the content, but they are responsible for making sure that child predation content is not posted.
They are responsible for making sure that human trafficking content is not posted. But if we go down the path of saying that Internet companies are responsible for making sure that news is not posted by somebody who has nefarious intentions and lies, from a country that I don't like, where does this stop? I mean, why can't the people of Saudi Arabia say you can't have content that's blasphemous, or the people of Mississippi say you can't have content with nudity? You know, where do we draw the line on the responsibility to surveil and filter out stuff you don't want? Right, it's a great point. So sorry to cut you off, just being a little bit mindful of time, but I think your question is: whose responsibility, ultimately, is it to monitor? We've struggled with this for quite a while. What are your thoughts on that? And then I'm going to have each of you give us some closing thoughts before we have to wrap up. Maybe the professor first on this issue, because I think that's... Right up your alley. Yes, well, I'm not sure I have something really intelligent to say in response to this big social media content management question. And instead I'll use the opportunity to just briefly say something to the minister. I just wanted to make sure that you hear that the work, especially what the AIVD has been doing, really over decades, in the field of exposing hostile operations, the neutron bomb operation in 1979 comes to mind, where your country was at the forefront of exposing a major interference operation. And of course, again, the work in the context of MH17, but also exposing some of the GRU hacking operations more recently, with pictures even, is very commendable and impressive work. So, thank you, well done. All right, shall we... I think that's a great question. I think it's a big one to grasp. Would either of you have a comment on that question, or... 
I think, for one, the Department of Homeland Security absolutely supports the First Amendment, so we have absolutely no interest in... And I appreciate the question itself. I think if you look at the Mueller indictments on the Russian troll factory, those are there because it's all foreign influence, right? So we're talking about foreign speech trying to influence U.S. elections, and there are laws that prohibit that. That's the issue there. The rest of it, I think, raises great issues to take up in academia and other public fora. Very well. Any final thoughts? Oh, lots of them. But I think we have to wrap up. Yes, what should we walk away with from this? I think, to step up our efforts to work together. I was very inspired by... I visited DHS yesterday. It was very inspiring. Some of it we're doing already in the Netherlands, but I think we have to do more of it. And I also want to leave you with the conviction that we have, that free speech is the essence of everything. So we always have to respect that. And if you have that as your fundamental position, then all the other things you can do in a coordinated and cooperative manner. And we need each other. So not only are the tech companies and the platforms mostly American companies, but as I've been talking to them, I understand also that they're trying to understand our culture in Europe and our way of dealing with these issues. And we also understand that Europe is not one country; Europe is 28 countries, 27 in a couple of weeks' time. And they're all different: different cultures and different traditions and different histories with Russia, for instance. So we have to keep all of that in mind, but still find ways to work together. There was a phrase when I was in the Navy, when you're driving ships, that vigilance is the eternal price of safety. I think vigilance is the eternal price of democracy. 
I think it's incumbent upon all of us to understand what we're reading and where it's coming from, and this notion of a collective effort, from the social media platforms, from government, from the citizenry. There are places to go to find out where this is coming from. The Atlantic Council and the Alliance for Securing Democracy do great, publicly available work, not from the government, on showing where some of these global ne'er-do-wells are and what they're putting out there. And I think the message that we have from DHS is that we want to continue to work with our partners from the states, from the localities, from the technology companies, but also, even more vitally, the citizens. And that's one of DHS's biggest challenges, the education component, too, right? Professor, any final thoughts? No, I think we've had enough final thoughts. Wonderful. Well, I'd like to thank each of you again. I know that the minister has to move on to appointments this afternoon, but thank you so much for sharing some of this. I think we could have dug into any of these issues and talked extensively about any one of them. So thank you, and thank you again to the Carnegie Endowment for allowing us to be here, and to you for coming and listening and asking questions. Thank you. Thank you very much.