Hello, I'm Peter Singer, Senior Fellow and Strategist at New America, and I am delighted that you've joined us for this discussion on the Future of Information Warfare, part of the larger conference, the Future Security Forum. We have a fantastic group of experts for this discussion today.

We're joined by Duane Lee. He is CEO of Vast Ossent, a new firm working to find, fix, and finish information pollution. He has previously worked with organizations that range from the Naval Postgraduate School and the University of San Francisco to Zignal. We're also joined by John Spikerman, who has had a long career representing the United States around the world as part of the State Department. He is presently Chief of the Russia Directorate at the State Department's Global Engagement Center. Next, we're joined by Candace Rondeaux. She has worked for organizations that range from the Washington Post to the International Crisis Group, and she is now a colleague who is Director of the Future Frontlines project at New America, as well as a Professor of Practice at Arizona State University. And finally, we're joined by Mike McConnell. He is a former Vice Admiral in the United States Navy, and during his distinguished career he served in roles that range from Director of the National Security Agency to U.S. Director of National Intelligence. Presently, he is Executive Director of Cyber Florida, which, as he'll describe, is a program that provides cybersecurity education in support of the Florida Department of Education and university system down there. So a really great group.

I'd like to begin with a question that both takes us back and takes us forward in this topic. One of the notable aspects of 9-11, 20 years back, was the shared sense of not just the facts, what happened, but also unity on the problem set, unity in terms of the need to act on terrorism.
Will there be any more such 9-11 moments in a world of social media polarization and conspiracy theory online? That is, will we be able to come together again around a shared sense of facts and a shared sense of threat? So, Duane, why don't you take this question first?

Thank you, Peter. I am exceptionally thrilled to be on this panel. I'd like to start with a very simple observation. With 9-11, we saw the growth of what we call transnational terrorism, and we used to use the phrase "the democratization of violence" to explain those kinds of patterns. In a way, what we are facing right now is, ironically, the democratization of propaganda and information pollution. The reason being, it has become so cheap to create inauthentic content, and there is also a growing industry that monetizes such content and traffic. To me, this is perhaps the most pernicious threat we are facing at this point.

And I'm going to add three additional observations. I just talked about the democratization of propaganda. But now we also have the normalization of information pollution. That is, whether it is between groups or between states, what is the most routine aspect of conflict and competition? I would offer that it is really the information environment plus the cyber domain. Number three is the concentration of what I call data weaponization: the more resources you have, the better you can weaponize these kinds of patterns. And this power is so concentrated right now in a few private entities and autocratic regimes. This is why a lot of liberal democracies are essentially struggling to compete more effectively in this space.

So would we see a 9-11-like moment in the information environment? I don't want to be a doomsday guy, but that is what keeps me awake at night. And I feel like there is an evolution of information weaponization. Twenty years ago, we had 9-11. Ten years ago, we had the Arab Spring.
And this is where information sharing was incredibly empowering and enabling certain movements. I think this made a lot of autocratic regimes really afraid of this potentiality. And what happened was: hey, we can use the same techniques, the same data, to preserve our political monopoly. Unwittingly, this created a highly diversified and also highly distributed set of echo chambers, right? Now, think about the first wave. Think about the second wave, what we call the backsliding of democracy. And imagine those two waves essentially leading to a 9-11-like moment in the information environment, and essentially transitioning to the physical world, the way we've seen in 2010 and '11. I think that's what I'm afraid of the most at this point. And with that, I'll pass it back to you, Peter.

Thank you. How about you, John? What are your thoughts on this?

Sure. I think if we're going to have that kind of unity moment, where, you know, there's shared truth among a large population and so on, we need to do a better job of explaining and educating our publics about the nature of disinformation operations, as a kind of fundamental resiliency building or media literacy when we talk about counter-disinformation work. Right now, we have a lot of disinformation actors, including Russia and others, who are very good at crafting individual narratives, creating these emotional connections between audiences and those narratives. And we know from observation and psychological studies that when audiences get that emotional attachment to a narrative, even when it's false, even when it's proven false and debunked and fact-checked, they are very reluctant to let that go.
So even though our fact-checkers have gotten very good, and we have established ways for audiences to educate themselves on the truth of individual narratives, it doesn't always work, because disinformation actors are so effective at dividing people. A lot of their operations are designed and intended to sow confusion and sow distrust, and to emotionally trigger societies on specific issues that spark those emotional reactions. But we also know from other psychological studies that people are more likely to reject false information if they understand the nature of a disinformation operation; if they understand, for example, that a source of news might be directed by a foreign intelligence agency that is dedicated to the mission of fooling foreign publics, or making fools of them. If people can better understand that, they're more likely to reject bad information. They're less likely, I should say, to absorb and connect with disinformation. And then, through that process, you have a better chance of having a shared truth or a shared set of facts, if people understand that some of those alleged facts are part of disinformation operations to fool them. But so far, we're only beginning in this area. And if we look to the future, we have a lot of work to do there, to expose and explain the nature of disinformation operations, not just the falseness or truth of individual narratives or facts.

Candace, one of the things that I find interesting about your work is that you've touched on topics that range from far-right extremism to, as John was referencing, Russian information operations. So with that kind of insight, what are your thoughts on whether we might see a similar moment of coming together? And maybe an add-on is: how would you project that groups like that would operate against such an event?

Yeah, really good question.
I mean, at Future Frontlines, a lot of our work does focus on trying to understand the mechanics of information warfare and trying to expose those mechanics, so that people can understand how influence works, right? How different actors, whether they're states or individuals who work for states or have some sort of tenuous connection to them, influence the way people think about a lot of the complex problems that we have today. To answer the question very quickly: will we have another moment like 9-11 where we have a shared sense of facts and truth? I think the answer is no. I think we already have the answer, in the form of the January 6 breach of the Capitol and the kind of reinterpretation of facts by various members of our political class, as well as those who were there. There are kind of these alternate universes and alternate narratives being promoted now on other platforms, right, that are less moderated. So that's Gab, Rumble, you know, the new Parler. That model of doing business is growing unchecked.

So the answer to the question somewhat depends, I think, on sort of three factors. Some are on screen and some are off screen. In the off-screen category, we just have these larger structural dynamics that we're going to be coping with for the next 20 years: demographic change, climate change, and technological disruption. All three of those factors are going to incentivize certain states, especially those that are extremely dependent on fossil fuels, to protect, you know, their interests, protect their wealth, protect their capital, protect their elite positions, and to use information as a weapon to do that. And we've already started to see that with Russia, for sure. That is definitely a motivation here, obviously, as a response to the sanctions imposed as a result of its incursion into Ukraine.
So that's just one, you know, structural factor that we deal with and have to contend with, something that's completely off screen, that, you know, governments and institutions and organizations are all struggling with. On screen, as I mentioned, we have this fundamental challenge of an industry model, you know, that is predicated on the idea that privacy doesn't matter. That, in fact, you know, individuals are a surface to be exploited unto themselves; communities are a surface to be exploited unto themselves. What's problematic about that is we have two different models now of sort of online understanding of truth and information. One is the authoritarian China-Russia model of complete state surveillance. The other is the sort of capital-driven model of, you know, data-driven surveillance through capital concentration in the big tech industry. Right now, that's the predominant American model. There is a middle way that Europe, I think, is trying to explore with its protections for privacy, but it's not there yet. It hasn't matured. So we have these three kind of competing models, and, unfortunately, no one government has really gotten to grips with the scale of what's needed there in terms of regulation and pushback, and kind of containing the overreach, either of governments or technologists. And, you know, more importantly, while we talk a lot about media literacy and sort of a general understanding of how disinformation works, which is very important, the reality is the model is such that if you're a YouTube influencer, whether you're a micro-influencer or a big influencer, you're making money. There's real, you know, monetary value in spreading misinformation and disinformation. And there's also social value; there's social cash for people. Until we change those incentives, I think it's going to be really difficult to have a sort of shared sense of reality.

Thank you. Admiral, what are your thoughts on this?
Well, I think if you look back in our history, usually the event that brought us together was physical: the attack on the Lusitania in World War One, the Pearl Harbor attack, 9-11. I think it's an entirely new dimension in the information age. I agree with much of what the other panelists have said. I believe, from my perspective, sitting in an educational institution focused on cybersecurity education and research and so on, that education is the answer. I'm an older guy. There was no mention of communications or technology when I was in secondary school. I came to learn about it through 30 years in the Navy, when I had to deliver sensitive, time-sensitive information to ships at sea. So I gained an understanding of the process. It wasn't until I went to be the Director of the National Security Agency that I was faced with the issue of how it looks in the future and how you would develop a signals intelligence capability to protect the nation and so on. And what I've observed, particularly in my current role, is that the American public just doesn't understand. There is no focus on digital literacy or cyber citizenship. Now, we quickly start down a path referred to as civics education, and then you get into the political debate, depending on your political persuasion, of what you should teach or not teach. But in my view, we need a much better, fundamental understanding of what it is. How do communications work? What's the flow path? What are the timelines? When I was a youngster, to put money in a bank, I'd go down to my little local bank, give them my three dollars, and put an entry in my passbook. Today, you can move $10 billion from Tokyo to New York and complete the transaction in 30 milliseconds. It is so fundamentally different, and I don't think the American public fully understands the dimension. And here's the way I think about it.
We have put everything that's important online, to include our critical infrastructure for energy or electric power or banking or whatever it is. And we've offered that up to nation-states who wish us harm, to provide them remote control of those infrastructures. I think we're going to see that more and more. Are we going to get to a point that is so catastrophic? Do we have a 9-11 come-together moment? I don't know. But I know, from my perspective, that education of youngsters in secondary schools and colleges, and of the citizens at large, is a huge shortfall that we must address.

Great points on all those different perspectives. I'd like to follow up with a question: what personal lesson have you taken from the last 20 years, and how does it apply in particular to your work related to information threats? So why don't we go back around the horn to start us off?

Yeah, so I was in grad school when 9-11 happened, and my training was in quantitative modeling and political movements. I was very fortunate to get my first teaching job at a government institution, and I was incredibly lucky to work with some of the people who were trying to fix our system after 9-11. And I remember two large conversations. One: I'm pretty sure everyone is familiar with this concept of fusion cells. After the 9-11 Commission, we had to essentially decentralize information sharing and intelligence sharing, and we set up a large number of fusion cells that were coordinating the federal government with state authorities and local authorities, between the IC and law enforcement, DOD, and whatnot. And McChrystal, whom I worked for on numerous occasions, was essentially one of the pioneers of actualizing this model downrange. And to me, we are facing a more scalable threat in the information environment, where we don't have this connective tissue, both laterally and vertically. For example, think about disinformation. I think the federal government has made a lot of progress.
But in the last election, I was trying to support different state governments and local election monitoring groups and whatnot. They are poorly resourced and poorly organized to combat disinformation that was undermining our election integrity. So I feel like we are approaching the same inflection point, and that is: unless we build this connective tissue, both laterally and vertically, I don't think we'll be able to fight effectively against this highly scalable and pernicious threat vector. And let me give you some data points to appreciate why it is so critical for us to reorganize how we respond in the information environment. I was speaking at another JSL event last week, and somebody asked me: what do you recommend we do to compete more effectively in the information environment? Because it appears that that's where we are yielding the most at this point, especially against the Chinese Communist Party and the Kremlin. And I completely concur with that assessment. So my recommendation to them was: look, in the past 20 years, we've become so good at moving, shooting, and communicating against transnational terrorist organizations, and also severing their support ties from certain autocratic regimes. I think the same principles apply to information warfare, except that we're not so good at moving, shooting, and communicating in the digital environment. So we have certain strong muscle memory already. We just have to transfer some of those lessons to do the same in the information environment. To me, that is really connective tissue, vertical and lateral, and also our most forward-deployed assets and sensors, to be able to move, shoot, and communicate more effectively in the information environment.

Great. John, what lesson have you taken personally from the events of the last 20 years?

I think one thing that strikes me, as I look back specifically at Russia, because that's my current focus.
Looking at their successes and failures with disinformation operations, the lesson is that the capabilities and the talents of a disinformation actor, or in our case a disinformation adversary, are not static. They grow, and they develop, and they evolve over time. Certainly, when you just look at the strategies, or even sometimes, in a sense, the tools of disinformation, you do see a lot of similarities going back over time. Certainly with Russia, you can go back to the time of the Cold War and the Soviet Union. Look at the active measures, as we called them, in the 1980s: Soviet disinformation operations that attempted to spread rumors about the AIDS epidemic, or crack cocaine on the streets of the US, tied to notorious conspiracy theories about the US intelligence community, things like that. You can see echoes of that general strategy in the way Russia operates today with its disinformation, using proxy sites to launder information and allow more credible sourcing. Of course, in the 1980s, that was done with traditional media, newspapers; today, it's done on the web. That overall strategy and approach has stayed the same, but in the capability and the specific tactics, we've seen a pretty significant evolution in just the last, I would say, 10 to 15 years, looking specifically at Russia. If you look at the narratives that they've tried to inject into US politics or US civic discourse, I would say there's been a degree of development toward a better understanding of those emotional issues that trigger American audiences.
You can also look at Russian interference in other countries' democratic institutions or electoral processes, where, say, 10 to 15 years ago, you might more often describe Russia's first initial steps to influence conversations on social media as ham-fisted, whereas today they're using a much greater sense of what issues resonate and what lines of argument resonate with certain publics, to get at that strategy that they've had for a while: to sow distrust, weaken social cohesion, and create and foster distrust in government and democracy. But now they're using technology, they're using specific ways of building narratives, in a much more skilled way. And I think as social media evolves, we can expect actors like Russia to continue on that trend. Candace raised a very excellent point about increasing numbers of social media platforms, moving away from those big platforms like Facebook and Twitter, that have millions, tens of millions, hundreds of millions of users, into other applications and platforms that have a smaller number. And that's harder to reach, harder to understand for those who are countering disinformation, but easier for adversaries to embed their false narratives in. So, certainly, that lesson of the ability of an adversary to evolve and improve over time, that's the lesson that we draw, and that's what we need to think about when we think about the challenges of the next 10 or 20 years.

Candace?

So, all of us have kind of been making the same point, which is: the United States is vulnerable, right? Europe is vulnerable. Many countries around the world are just vulnerable. They're vulnerable for a number of reasons. Chief among them is just the regulatory environment around information, and then also the emerging dynamics with very rapidly evolving technologies.
But again, I think neither governments, nor institutions, nor even society or industry really has a full understanding of the kind of long-tail impact at scale, which you've written about, of course, in Burn-In, which I think is a beautiful book that tells a great story about the unintended consequences of rapidly evolving information technologies and how they fit into our lives. For me, I was a cub reporter with the Daily News in New York on 9-11. I did not have a cell phone, okay? That's how long ago this was. It was very common not to have a cell phone. Most of the reporting you did was on a pay phone. And I remember that day as one in which I felt, and I'm sure other reporters in New York that day felt, disconnected from an enormously challenging information event that we just couldn't get our arms around. Today, obviously, we're operating in a different environment. And I think one of my key lessons is you have to expect that, despite some of these technology changes, even though people are sort of downgrading the role of legacy media, legacy media is always gonna be there. Legacy media is always gonna be part of the information sphere. And so the question is: does legacy media catch up with the story of reporting on information warfare? I think that's one of the reasons why we began our programs, to help people understand that this is literally a war that has no front. And it has to be reported on in that way. It has to be tackled in that way. So the big lesson is, I think, that we also don't know what we don't know. That is to say, there are growing synergies between the way states like China and Russia, and then other very powerful political elites, are beginning to leverage information to their advantage.
And we've seen that in the last year, certainly over this 2020 election period. You can almost say that those who were in junior high in 2016 have now graduated high school, and now we're gonna get onto a university level in terms of leveraging that attack surface. And I think that's really dangerous. That synergy is something that could be controlled with greater regulation, or reporting, or more transparency, but we're just not there yet.

Admiral, what is the lesson that you've taken personally from the last 20 years?

One of the biggest lessons that I've learned in the past 20 years actually started when I was in war college years and years ago. And that is: a large organization that's established with a mission and a focus and a set of positions and authorities and whatever will choose failure over change. I've witnessed it over and over again. My beloved Navy refused to build carriers in the '30s; it was battleships forever, until Pearl Harbor. The US Air Force refused to embrace drones, even though directed by the president, until the Chief of Staff of the Air Force and the Secretary of the Air Force were fired and new leadership was put in place. So my point is, unless you force change, large organizations are gonna choose failure. 9-11 lessons learned: I was a product of the aftermath of 9-11 as Director of National Intelligence. And what they cited was a lack of willingness to share information, which was true. There was no incentive. The community was taught need-to-know. That came out of breaking Nazi Germany's codes in World War II and the Japanese codes on the other side of the world. Need to know, protect sources and methods. That's the whole ethos and culture of my community. My belief is we have to change that to a responsibility to share.
If there's information that's gonna protect US banking or a bridge in Seattle or whatever it is, and you are aware of that, you need to figure out a way to get the information to the people that need it, so they can actually do something about it. So the lesson learned for me is: don't wait for us to magically find how to deal with this. This is a leadership issue. We have to rethink authorities and roles and missions and partnerships. If we're gonna address this issue over time, it has to be a collective defense. We know how to do physical defense. We have authorities and resources and rules and so on, but this is a borderless problem. And if you're going to solve a problem where someone in another part of the world can touch critical infrastructure in the United States operated by a private-sector entity, it's gonna take a collective effort between that private-sector entity, others in that sector, and the US government, to share information at network speed. So I refer to this as collective defense or collective security. It must move at network speed. And the only way we're gonna get there is leadership. I go back to Goldwater-Nichols, which changed the Department of Defense. The Department of Defense had been studied every year for its history. And when Goldwater-Nichols, intended to force jointness, came up, the secretaries of the Army, Navy, and Air Force, as well as the service chiefs of all four services, testified against it. It passed anyway, the president signed it, and then we had our first dust-up in the Gulf War. I was the fly on the wall, as the intelligence officer for the Joint Chiefs, going to the old offices of the Congress and listening to the service chiefs say Goldwater-Nichols was the greatest thing that ever happened. So the department chose failure over change until it had some forcing function to cause it to embrace the necessary change. I think that's where we are on this issue.
Whether it's education, or outreach, or changing authorities, or collective defense among the private sector and the public sector, or new authorities for the intelligence community, all of that has to be addressed if we're gonna get ahead of this problem. I'm hoping that we do, but I think it will boil down to statesmanship, or maybe a better way to say it is leadership. Where we've had crises in our past, some leaders stepped up to make the case. I heard a speech last night; I went to a Department of Homeland Security conference, and one of the speakers was a former congressman who chaired the House Permanent Select Committee on Intelligence. He told a story about the post-revolutionary period, when the officers weren't getting paid and they were speaking of mutiny. They had been successful, with the French, in driving the British out, but they weren't getting paid, and they were actually meeting to plan a mutiny. And George Washington went to the meeting and gave a speech, and it turned the tide. Leadership. So I think this is a critical problem for this country, but we're gonna have to do some fundamental things that go against the established organizational structures and authorities that we have to change. So that's how I think about 9-11 and the lessons that came out of it. And I'm really concerned about information warfare. Can we do this before it's critical? Now, I'm the eternal optimist. I think we can. But if you look at our history, we tend to be reactive, not proactive. So I think it's time for the right leadership to step up.

Can I just jump in here, Peter? I just wanna sort of tail onto what the Admiral was saying just now. I think the networked nature of this problem is one that the United States government in particular is still having trouble getting to grips with. There's a great temptation to create agencies and new authorities, and that's good. That's important. That is part of the leadership I think the Admiral was talking about here.
At the same time, what has changed about the information environment is that anybody can penetrate it, and anybody can kind of use digital tools and techniques to get under the hood of information campaigns and deception campaigns. And that's where I feel like the investment isn't quite meeting the moment. We actually had a 9-11-like moment in 2016, when Russia attacked essentially our information systems. And that has continued since 2018. And then in 2020, we saw it kind of morph into this synergized threat between local political elites and then China and Russia, working weirdly at cross purposes with each other in terms of what their outcomes or expected outcomes were. But in that little period of four or five years, there's also this growing group, a social movement of citizens who are trying to expose campaigns, right? Researchers, academics, journalists. They're much more empowered now because they have access to satellite technology. They're much more empowered now because they have access to different ways to look digitally under the hood. I think that's where some of the investment needs to go: not just through your typical paradigm of leadership, where institutions of leaders and decision-makers are kind of sitting in a silo, but investing heavily in this area of sort of citizen activism and citizen journalism.

It's a great point. Another area of underinvestment is something that both John and the Admiral brought up, in terms of preparing the population, building up resilience through understanding and education. There have been over 450 different think tank, university, and task force projects on this topic of information disorder. And almost all of them have focused on the actions of the adversary: changing the rules and regulations on the government side to limit it, and/or the companies doing more to police their own networks. And yet, 450 looking that way versus a literal handful looking at how we help the target of this.
How do we build up their own capabilities to be resilient? In my mind, it's a massive imbalance. So, as Candace very kindly mentioned, I'm someone who works on the future and builds stories and scenarios of the future. So I'd like to ask your help on that. We've looked back 20 years; let's look forward 20 years. What does this space look like in 2041? What is the same, and what is different, when it comes to information warfare and information threats? So let's go back around the horn again. Duane?

Yeah, so as a recovering DOD academic, I'm gonna answer your question with something that is completely unrelated first, and then I'll address your question. That's what DOD academics do, right? There is a notion about digital education, cyber citizenship. Those are really important inoculation strategies that we need to build over time. There is no doubt this is something that we need to pursue. And there are a lot of civil society organizations working on the problem set, and perhaps we can build more connective tissue amongst these civil society organizations, NGOs, and, essentially, social movements. Absolutely. But to me, those are like public health strategies, and typically they take time to become effective. We also have to think about the immediate symptoms that are hurting our body politic at this point. The reason I'm kind of struggling to make this point is very simple. Digital literacy works when we are dealing with a finite set of observations, facts, stories, topics, and whatnot, right? Let me give you a quick data point. Every 60 seconds, Facebook has what, about 200 new posts, right? And the same number of pictures and whatnot. Same goes for Twitter, Instagram, YouTube. The volume of computational propaganda is so massive, I don't think critical thinking will be very effective unless we come up with some kind of solution to buy some time to achieve that.
Well, this is not the right phrase, but herd immunity against disinformation, right? Now, having said that, let me come back to your question. So I like using a lot of abstract nouns, and I'm going to use another one here, and that is: what we're seeing is essentially the balkanization of the information environment, and it's taking place globally, regionally, and locally, right? So echo chambers, everybody's familiar with this concept. That's happening locally. But if you look at how China is trying to govern the internet, how Iran is trying to govern the internet, the DPRK and the Kremlin, we also see this splinternet taking place globally at the same time. Now, if you think about these two things, this creates really wicked problems, so to speak, because now all these autocratic regimes can exploit our level of fragmentation to create highly entrenched echo chambers that are maintained by rapidly produced and affordable inauthentic content, right? I think this is what Candace talked about earlier: smaller fringe platforms, private chat groups, encrypted chat apps, and whatnot. So now we have this massive level of fragmentation, and then we have this convergence of threats. What I'm doing with my work is essentially trying to expose the infrastructure of what I call information pollution. It's not just disinformation; it's phishing attacks, it's malware, right? And also fraud, like websites that essentially automatically download certain software onto your browser and your computer and whatnot. I understand disinformation is all the rage right now, but it's much more comprehensive than just misleading narratives, stories, et cetera, right? So we tend to focus on the tip of the iceberg, right? But unless we tackle this entire industry of information pollution and attack the basic incentive structure of that industry, right? Because traffic is monetization. So not all malign actors are actually malign.
They're just trying to make a buck by getting more traffic and more monetization. So to me, unless we really tackle the submerged portion of this industry, I don't think we'll fare well in this fight. And let me give you just a very concrete example, and then I'll pass it back to you, Peter. Two months ago, the cybersecurity company Barracuda released a new report, right? And they said all these disinformation sites, URLs, and domains also function to mount phishing attacks. So within the span of 12 months, 12 million spear phishing attempts across 17,000 organizations in our country alone, right? That's what I call the convergence of threat vectors: disinformation, cyber attacks, phishing, social engineering, and fraud, right? That's what keeps me awake at night. With that very uplifting thought, back to you, Peter. Thank you. So John, I'm gonna tweak the question slightly to this: what is the future Foreign Service Officer who's sitting in your role, and maybe it's at the Global Engagement Center, or maybe it's something else 20 years from now, what are they doing that's different? What are they facing that's different from what you're doing and facing right now? How does it evolve or change for them in the role that you have right now? Sure. So I think first, I'm unfortunately going to continue the negative or pessimistic forecasting that we seem to be stuck on in this discussion, but hopefully be able to turn it around with some sort of optimism when I think about what we could be doing, or those in my place 20 years from now. Looking at what those challenges might be 20 years from now, we really need to address the issue of overall democratic institution building and maintenance. We are seeing disturbing trends in many of our democracies: backsliding on democratic norms, the rise of authoritarianism.
We're seeing, in authoritarian or near-authoritarian states, democratic opposition groups being crushed or facing an immense amount of pressure. And then we look at how disinformation is being used in those environments today. And I'll point to Belarus as one example that we've been looking at very closely for pretty much the last 12 or 13 months. We've noticed the Lukashenko regime has increased the pressure and the threats of violence against independent media in that country, has cleaned out the state media, replaced it on occasion with so-called journalists from RT, and so on. And the result is an information environment, or a media environment, in which very few of the local populace believe anything that's coming from a state mouthpiece. And yet it continues. And I've asked people living and working inside Belarus why that happens. If nobody believes this, why is the state using its resources to continue to fund these efforts? And the answer I get is pretty striking for the future. It's that they're not doing it to inform or misinform or disinform a public. They're doing it to express, in a very audacious way, their total control over the information environment. They can lie to you. They can use all of the organs of state communication to put out very evident half-truths or lies, and there is nothing that you, the citizen of that country or the opposition figure in that country, can do about it. And that is very demoralizing to individuals, to publics, to democratic forces. If the rest of the world, those states that are engaging in democratic backsliding or tipping towards authoritarianism, look at examples like this, they're going to be emboldened. Twenty years from now, we're going to find that use of disinformation, not just as a tactic of information warfare, but as an anti-democratic weapon. We're going to see that that's much more common.
And I think 20 years from now, the people in my position are going to have to take a more whole-of-society, whole-of-government approach to the malign influence or anti-democratic actions of adversarial states, not just in a strictly counter-disinformation or strictly information-operations way. I've asked myself how we might do that, and I think we've touched on it a little bit in some of the earlier questions. And that's better understanding how to get into those authoritarian environments, where a state has total control over the majority of the media or communications apparatus and outlets; how can you still get fact-based narratives to the public? It's probably not through the means that we're using now, which largely consist of working with the bigger social media platforms, working with larger media outlets, supporting websites or news services that attempt to reach a whole country or a whole region at the same time. Those are proving, and I think will prove over time, to be too easy a target for an authoritarian regime to shut down or to block in some way. So I think 20 years from now, we'll be much more focused on the individual journalists, the freelancers, the stringers, the people who own social media accounts, the micro-influencers or nano-influencers who are informing just a couple hundred or a couple thousand listeners or readers, because they're much harder to be detected and understood and then ultimately blocked by a hostile organization. So that's the direction I think we'll go: fighting disinformation on a very, very local, maybe even hyper-local level in order to protect democracy. Candace, what does 2041 look like? Well, we kind of have two parallel tracks that could diverge from each other, but I think at the center of the movement are three important pieces of the puzzle.
One is the rapid proliferation of synthetic media, so that is sort of cheap fakes and more sophisticated fakes, synthetic voices, synthetic video. That is gonna be, I think, a crisis moment for some government, most likely the United States, which would be one of the most vulnerable countries in the world in terms of how synthetics might affect our information environment. I think the second thing is the virtual worlds that are starting to emerge and the virtualization of information and cultural exchange, even monetary exchange, something where it's hard for us to understand what that might look like in 20 years' time, but it certainly will affect the ability of individuals, in particular, to influence the more traditional media and information environment simply by virtue of being able to collect more virtual currency and more cachet. And then last but not least is the integration of artificial intelligence with more sensors, more in terms of the internet of things, which means we're also gonna be looking at a situation where not just states, but tech companies, large organizations, people who have the ability to concentrate capital, are going to be able to influence whole populations and literally change the response to events simply by dint of their ability to surveil the information environment and then leverage that. That's only gonna increase over time. So with those being the center dynamics, track one could be: we just continue down this lane and keep crashing into these clouds of information pollution until we asphyxiate as a nation. And for many democracies, I think they're facing a comparable outcome.
Track two is: through a combination of civil society and government and other institutional pressures, we force technologists to get real about these threats, to open up their algorithms, to become more transparent about their content moderation policies, to become more responsive to the need for a cleaner information environment. And here, I think we're starting to see the emergence of some very rudimentary tools that you wouldn't expect, which is defamation and libel law. That's an area that we're seeing being tested, for instance, with the Big Lie and the 2020 campaign, with the Dominion Voting Systems and Smartmatic lawsuits against the perpetrators of the Big Lie who brought litigation on behalf of the Trump campaign. I mean, that's a good example of quickly using the tools at hand to stamp out bad actors and bad behavior. We don't have to be always so sophisticated. We don't have to just rely on citizens to educate themselves. We've got tools right at hand, right now, that can put us on a better track toward a cleaner, more stable information environment in the future. Thank you. Admiral, paint us a scene of 2041. Well, Peter, I'd start by looking at my optimistic side. Education works. I'm somewhat of an amateur historian, and looking back at the industrial revolution, when we finally adopted universal education, we became the most powerful nation on the globe, with the most income, the highest standard of living, and so on. So I go back to some of the things that were said earlier about digital literacy and education and understanding. You have to understand this environment. I like to refer to the idea of America as being incredibly powerful. It attracts people from all over the world to come to a place where they have opportunity and freedom, and they can take an idea and build it and be successful. Democracy's hard.
If you think about a fascist society or a communist society, with great promise of whatever it is they're gonna do, they slip into an almost totally autocratic, controlled society. Now, I'm apolitical. I'm a military guy. I served Republican administrations and Democratic administrations, and I did it with an apolitical approach. I voted for Republicans and Democrats, and I always tried to maintain that. But now I am a proponent of cyber citizenship in the little organization I'm associated with. And we want to teach youngsters about digital literacy. How important is it? How do you sort out truth from untruth or misinformation? How do you protect yourself? How do you operate in that environment? And I think it's a challenge for our citizens to understand that, to be able to function. So that's the part that I worry about. Are we gonna be able to prove that democracy still works? The Chinese have a model, the Russians have a model, and we've seen some countries in Europe slip into more autocratic forms of government. Are we gonna maintain our leadership to demonstrate to the world that the idea of America is still there, that democracy still works, and that we can resist these many attacks in the information domain that are gonna tear at the fabric of that idea? So I go back to how I think about this. It's fundamental education, starting in grade school. And that means we have to teach the teachers, we have to apply the curriculum for the students, we have to expand that thinking over time, and we can't let it turn into a political debate about teaching civic subjects that are anathema to one political persuasion or another. This is basic survival for the information age. So that's how I think about it. On the optimistic side, I think it's clear. I said earlier, it's leadership and it's statesmanship. I truly believe that.
But it's kind of all up to people like us to be willing to engage and provide the wherewithal, whether leadership or a source of funding or whatever it is, to cause the youth of America to fully understand the environment in which they live. Well said. We're running short on time, so I'd like to ask a question and a request of each of you: give roughly a 30-second response to it. And the question is this: what is one policy change that is doable? A doable, not pie-in-the-sky, doable policy change that ought to be implemented to steer towards these more positive futures. The policy change might be by government, it might be by a corporation, you name it. So let's go back around the horn. Duane, what is one doable policy change we ought to implement? The right to audit data and algorithms. To me, there is nothing social about social media. These are traditional media companies. They do audience segmentation. They drive traffic. They sell ads. There's no reason why we shouldn't have the right to audit data and algorithms. Great. John. I would like to see more coordinated action by states at a multilateral level to call out disinformation. Individually, we all have growing and immense capabilities to recognize and understand disinformation. But I think we should treat it like we occasionally do with cyber attacks and speak with one voice in the international community and call out foreign disinformation when we see it. Fantastic. Candace. I agree with audits as a requirement, for sure. In addition to that, I would just add that platforms release quarterly or regular reports on violations of their terms of service and include with that geocoded information, anonymized geocoded information, about where those violations are most concentrated around the world. Great. Admiral. Leadership to guide collective security between the public and the private sector to significantly improve our cyber resilience posture.
This is a great way to end with these fantastic ideas that hopefully can and will be implemented. I wanna thank all of you for joining us and participating in this very important conversation. Take care. Thank you. Thank you. Thank you.