Good afternoon. Thanks. So, I'd now like to introduce Dr. Travis Morris, who's the mastermind behind the event we're doing today. He is the director of the Peace and War Center here on campus and the executive director of the Military Writers Symposium. So, welcome, everybody. It's great to see all of you, and welcome to the 25th season of the Norwich University Military Writers Symposium in Northfield, Vermont. It's a pleasure to have each of you here today, and also those who are joining us online. The symposium is a signature event of Norwich University's Peace and War Center. As many of you know, this is the only program of its kind at an American university. Over the years, it's brought some of the most prominent military, intelligence, and international affairs scholars and thinkers to central Vermont. The symposium is designed to educate, to inform, and to be relevant to topics that impact all of us in the room. It has never avoided the hard issues central to our public's understanding, whether we may disagree with them or not. As I mentioned, this is the 25th year, so some backstory: in 1996, the symposium was conceived by co-founders W. E. B. Griffin and Carlo D'Este with the support of former Norwich President Russell Todd. The symposium was designed as a way to bring writers to Norwich University to engage all of you and to engage our community. No matter what your major or your source of commission, whether you're going into the military or not, it's meant to engage our entire community. The subjects we've covered have been vast, ranging from PTSD to cyber warfare, to the war in Iraq, to Afghanistan, and, today, the future of warfare. If you've looked at the theme in the program you have in your hands, we're going to spend the next hour and a half looking at warfare in the 21st century and future battlefields. That applies to all of us here in the room. 
It doesn't matter, as I just said, whether you're going to enter law enforcement or not; each one of us as citizens will be impacted by the discussion we will have here today. To my left, you have some of the globe's leading authorities on the subject sitting right before you. They've taken their time to come to Vermont to share some of their thoughts with you, and we're extremely grateful; we appreciate the things you've done already and the things you'll do over the course of the panel. So before I introduce the moderator, I'd like to recognize and thank several people here in this room who have made all of this possible. When I call your name, please stand: Norwich's president, Richard Schneider; Medal of Honor recipient David Bellavia; the president and CEO of the Pritzker Military Museum and Library, Dr. Rob Havers; and Carlo and Shirley D'Este. And I'd like anyone present who has supported the symposium, whether with your finances, your support, or your love, to please stand. That should be many of you. That's right. Let's give them all a round of applause. You can have a seat. As I mentioned briefly, I'd also like to welcome those of you joining us via streaming; whether you're watching us deployed on the other side of the globe or somewhere else in Vermont, we welcome you and are glad you could be here with us today. Our moderator today is Colonel Andy Herd, United States Air Force, retired. Colonel Herd is a special assistant to the Norwich University Provost. He integrates university departments to expand expeditionary opportunities for students so that you may learn what it's like to travel abroad and to lead abroad. He's done a tremendous number of other things here at Norwich University; part of that has to do with his being stationed here for ROTC, but also with his hope of helping each one of you become better leaders. And he is also leading here today as the moderator of this panel. His bio is extensive. 
I could spend a lot of time reading about some of the incredible things that he's done, but I'll keep it short. As I mentioned, he's retired United States Air Force. He went to the Air Force Academy and the University of North Carolina. His military career in logistics and airlift includes three commands and, for those of you who are going to be aviators, 6,000 flying hours and combat missions in four conflicts. Let's welcome Colonel Herd. Thanks, Travis. Thank you for attending the Norwich University Military Writers Symposium panel on Warfare in the 21st Century: Future Battlegrounds. My name is Andy Herd. You and I are very privileged today that Norwich University and the Peace and War Center have organized this panel to engage with experts on 21st-century conflict. The experts on the panel are global thought leaders who are intimate with battle, both in planning and in direct action. They understand the evolution of conflict and how that conflict shapes policy. Their contemplation of the future influences national powers' planning and decision making through their careers of research, writing, and debate. Today we are fortunate to join them for 90 minutes of their professional experience. Your Norwich experience is to develop you to lead. Whether you lead in business, the community, government, or military service, preparing you to lead in the 21st century is central to this university's mission. This panel is part of that mission, and these writers have spent years thinking about the evolution of warfare and how 21st-century battlegrounds will impact society. From our conversation with them today, you will learn unexpected insights about your future challenges. Some of you may feel very comfortable conversing about cyber or artificial intelligence, robotics, or data. But today's conversation is not just about products you can purchase, and you should already know that your personal data can be a threat. 
Your DNA, your search preferences, your social posts: they can be used for great purposes, but they can also be used to manipulate or threaten you. So today's conversation is about the future that you will live and work in, and we are here to explore the future of warfare. Your phone is a powerful communication tool. It's also a potential method of tracking and exploiting you. Artificial intelligence is changing our lives. It impacts future jobs, transportation, and politics. Robotics has already revolutionized industry and war. Combined with AI, big data, and instant communications, robotics is a 21st-century change agent, perhaps like none other in history. You cannot be on the technological sidelines as leaders. Whether you're a schoolteacher or a platoon leader, you must continue to reflect on what's expressed today by this panel. Your job as leaders is to be open to new ways of thinking and to be proactive in confronting challenge. That is what today is about: the imperative to study the technological environment within which you must lead. Some of you must contemplate the very direct threats from which you must defend us in battle. Today, most of us are connected in real time to the internet. Immediate notification of events is beamed into your own pockets. Those students who registered their cell phones in the Norwich Emergency Notification System received an exercise notification early this morning. Anyone notice that? I'm sorry if it woke you up. The Rave system can direct you to act, right? Hunker down, run, report in, alert somebody. This is an amazing capability for our safety office to trigger action throughout Norwich. So how would you respond to a directive to evacuate your building late at night and report in at the UP due to some threat? As you leave your dorm at midnight, you see this drone. You should ask yourself, why is it there? Is it security, intending to search the building for a suspicious package? 
Perhaps it's a local reporter getting media content for the TV news. Or maybe it's law enforcement monitoring the safe evacuation, or using facial recognition software to search for a suspect. Is it programmed to kill a target? What if there were thousands of these on the battlefield, automatically seeking targets wearing your country's uniform? How are you going to lead men and women in that environment? Finally, what if this drone's programming incompletely analyzes the situation, or has a coding error? What will you do, or what will it do, if you're standing in front of its target? These are the sorts of questions you should ask each time you get a suspicious email or a directive text message, or you feel your car automatically bump-steer away from the side of the highway, or when a drone buzzes overhead. What do these technological advances mean beyond their prima facie purpose? How far has this tech already gone in government-funded laboratories, and what capabilities are already fielded? How are you going to be ready to wield those capabilities? And how will you lead people against threats that are faster, fearless, or devoid of empathy? These are game-changing technologies, and yours is a leadership future that cannot rely on studying the past. This drone is a fraction of the capability that exists today. Tomorrow's tech will be exponentially more powerful. And for 21st-century leadership, you've got to be immersed in the future. You all have to become futurists, in a way. Fortunately, we're joined by three futurists today to get you started. You should have already read their biographies. If not, you can scan them now while I'm finishing these words, and plan to engage each of them after the panel at the book signing. Dr. Benedetta Berti is the head of policy planning in the office of the secretary general at NATO. She is a policy advocate for human security, stabilization, and peace building. 
She has written extensively on the future of terrorism and non-state actors. Her contributions earned her the order of knighthood from her birth country, Italy. Dr. Peter Warren Singer is a strategist and senior fellow at New America. He's a leading expert on 21st-century warfare, advising the Defense Department, industry, and entertainment, including the video game Call of Duty. He's written nonfiction and fiction on future conflict and the impact of cyber and robotics. He is listed by Foreign Policy as one of the world's top 100 innovators. Paul Scharre is a senior fellow and director of the technology and national security program at the Center for a New American Security. Previously, he advised within the office of the Secretary of Defense on unmanned and autonomous systems. He served in the Army's 3rd Ranger Battalion, leading special operations reconnaissance teams in Iraq and Afghanistan. Bill Gates named his book one of the top five books to read in 2018, and he is also this year's Colby Award winner. So, panel members: life is busy. We fill our days with work, our studies, and our relationships, and having time to contemplate the future is rare, especially when trying to think about battlefields. What is happening right now that may have significant impact on 21st-century warfare? Dr. Berti, would you please lead off on this topic? Sure, and thank you very much for the kind introduction, and thanks, everybody, for being here today. I will start by saying that part of my job today very much has to do with looking at future trends. I sit in the office of the Secretary General of NATO and I lead a team, and one of our main jobs is really to look at the future: to look five years, 10 years, 20 years down the line and ask how the trends that we see in the world today, in the realm of politics, geopolitics, economics, and security, of course, will affect our ability as an alliance to deal with the wars of the future. 
So that's certainly a question that takes up quite a bit of our time and our thinking back in Brussels, where I'm based. I will start us off with a couple of points. I know that I'm here with Paul and Peter, who are very much top experts on how emerging and new technologies are going to affect the way we fight wars, so I'm not going to go there. I'm going to talk about the non-technology bit, which to me is also very important when we think of the future of conflict. First, as NATO, as an alliance, we are very much thinking about how to address the conflicts of the future. Our first assumption is that those conflicts will be unlike the conflicts of the past in at least a few dimensions. One of them is that we expect these conflicts to be multi-domain: fought not just in the air, on land, and at sea, but also in other operational domains like cyber, space, and the information environment. We are taking a number of decisions, and implementing those decisions, so that we are ready to fight conflicts across these operational domains. Now, as a researcher, I would add another point that to me is perhaps sometimes forgotten when we think about the future of warfare and the future of conflict, and that is that more and more we are called into a position where we have to question even where conflict begins, when it ends, and what conflict really is, in the sense that we're going to face more and more activities below the threshold, in the so-called gray zone, mixing and matching kinetic and non-kinetic means to achieve maximum military effect. So more and more we are faced with a form of war in which all the tools in the toolbox, from geopolitics to humanitarian assistance to foreign policy, a number of tools that we traditionally thought were separate from the pursuit of military and security policy, are going to be mixed and matched together. 
And I think we still have a number of political and military adaptations to undertake to really be able to deal with conflict in the gray zone. The last point I would make is that conflict also looks increasingly unclear: where does conflict begin, where does it end? We live more and more in a world where we are witnessing a number of no-war, no-peace scenarios, none of which give any indication of going away. To the contrary, if we look at the map of political violence today and track where the main civil and humanitarian crises are occurring, from South Sudan to Iraq to Syria to Afghanistan to Yemen to Somalia, and I could go on, one of the characteristics is that we are in protracted political conflicts in which the beginning and the end look increasingly blurred. This poses a number of really important dilemmas in terms of how we do peace building, how we do humanitarian assistance, how we do development, and when we use the military most effectively. And I think that's a trend that is only going to increase. We're going to have a world where frozen conflicts, unresolved conflicts, and protracted political crises are not going to go away. If anything, they're going to become more entrenched. And that, I think, places upon us a number of serious dilemmas in terms of how we intervene and how we act to mitigate the impact of conflict. There are a lot of other trends, but I'll start us off by making this point: the battlefield is not one, but many. The kinetic and the non-kinetic are looking increasingly blurred. And what is war and what is peace is increasingly undefined, which I think is a big dilemma going forward. Thank you, Dr. Berti. Dr. Singer, in Ghost Fleet there are a lot of complexities in that story. So what of that story, or what trends happening right now, are already real for these future leaders? 
Sure. So I want to begin by thanking you and the organizers for having me back here. It's just a real honor to join you, and everybody has shown such fantastic hospitality. To ping off of what was brought up there, I think one of the other key drivers of the future of warfare is the emergence of a series of technologies that go by different buzzwords. Sometimes they're called revolutionary technologies; sometimes they're called disruptive technologies. Basically, what we're talking about is technologies that change the game: technologies that a generation ago we would have thought of as science fiction. They're now real, and they're poised to change the world. And when I say change the world, I mean everything from society and business to what plays out on the battlefield. Think of it this way: I was at the museum here earlier today, and you have distinguished graduates 150 years back who led the United States Navy in adapting to the new steam engine and the idea of an armored vessel. You then have a wing that shows the first graduates of the school wrestling with the flying machine. And I visited a cybersecurity course here; circa 1980, no one thought of the computer as something to be weaponized. So what's akin to that moving forward? We can see areas of it. You can break it down: in the hardware space, something both Paul and I have worked on, and you saw an illustration over in the corner, it's robotics, but most importantly, increasingly autonomous robotics of various sizes, shapes, and forms. In the software space, it's artificial intelligence: lots of different definitions of it, but essentially machine intelligence that is either simulating human decisions or doing them better in some way, shape, or form, quicker, taking in more data, et cetera. You've got the change of the internet into the Internet of Things. So you've got hardware and software. 
You have wave-ware, which is basically new energy sources, but also energy becoming a weapon itself; the ray gun is no longer just something in science fiction. And then you have human performance modification: using technology to change what we can do. It might be tech that we carry on the body, an exoskeleton, a Fitbit, you name it, or it might be technology in the body. I met a student here who is doing their research project on brain-machine interface technology, basically using your brain to connect up to a computer. And this was not a science fiction class they were in; it was in your engineering department. So these are the kinds of technologies that are happening out there. Real quickly, the first point: what makes them revolutionary is that they give us new questions of what's possible that wasn't possible before, but also, as you brought up in your introduction, they give us questions of what is proper, that is, debates of right and wrong that we weren't having before. It might be a legal or ethical right-and-wrong question, or it might be a how-do-I-best-organize-my-military-unit right-and-wrong question. The second thing, to ping off of what you brought up, is that it's not just that battle becomes multi-domain; these technologies, unlike the ironclad or the aircraft carrier, have really low barriers to entry. So multiple other actors will have them. A non-state actor, an insurgent group, now has a little miniature air force. Saudi Arabia just experienced this: Saudi Arabia has the third-largest defense budget in the world, and yet it got hit by a combined drone and cruise missile attack. And the other part I would ping off of what you brought up: it's not just the idea of the gray space of conflict and knowing when it begins or ends; the speed of conflict has changed. With AI, part of the goal is that it moves decision making quicker than humans can. There's so much going on. 
We may not be able to weigh in in the ways we used to, but the flip side is that it means conflict may be continual. To use the example of Ukraine, and we played with this in the Ghost Fleet book, the cyber war was effectively lost by Ukraine before the first armed troops crossed into their territory. They lost the war before the actual war began because of what was happening in their networks months before the fighting started. And so, an interesting thing: there are people in this room who may deploy into battle years from now, and yet the outcome of that battle may be shaped right now by what's happening inside a computer network, or even inside a microchip manufacturer. You mentioned speed. One thing that has not traditionally been fast is government policy, and there's a lot of that in your writing, Mr. Scharre. So could you tell us a little bit about some of the real-time policy successes, or things we need to be thinking about right now, that impact 21st-century warfare? Yeah, I mean, the real challenge we face from a bureaucratic or policy standpoint is that we're just much slower than the pace of change out there in the world. And as we think about future conflicts, it's worth really thinking hard about what we need to know. Now, in the last 30 years, since the end of the Cold War, we've seen US military forces deployed to Iraq, to Somalia, to Haiti, to Bosnia, to Kosovo, to Afghanistan, to Iraq again, to Syria, to Libya, to Mali, and elsewhere. We don't actually need to know where we're gonna fight in the future. That's unknowable; we're not gonna know that. It's heavily dependent on a lot of political uncertainties and specific events that may happen. That's okay, that's fine. What we need to know is what war might look like. We're not gonna get that perfect, but we need to get it close enough that the forces we train and equip are not grossly unprepared. 
And we have felt the pain and the cost to soldiers and service members when we send forces overseas that are not ready. We've felt this certainly in prior wars, in the early phases of World War II and Korea, but we certainly felt it too in the early phases of Iraq and Afghanistan, where we fought a type of conflict that was very different from what the Army, and the military as a whole, had been focusing on in the 1990s. So when we think about the military as a toolkit for national security decision makers, we wanna have the right tools in our belt, ready to address whatever conflict we're in. You know, as Peter talked about, we're seeing this explosion of digital technologies that are fundamentally changing the ways in which we fight. But one of the things that's interesting is that the pace of this is so incredibly rapid. It's not just that it feels that way to people of a certain age; when we actually look at data on innovations and how rapidly technology proliferates, it is changing and proliferating faster than it used to. And we continue to see exponential growth in many of these digital systems. So our policies have really struggled to adapt. For 20 years now, for example, the US Defense Department has been talking about the challenge of adversary innovations in precision-guided weapons, sensors, and battle networks that will allow them to target, with a great deal of precision and lethality, the military forces we use to project power abroad, things that basically render our aircraft carriers, our short-range fighter aircraft, and our air bases significantly less useful in future conflicts. We've done very little to adapt. And that's because there's a lot of lock-in in our system, in Congress, within the bureaucracy, and also culturally, where there are things that might have to change in how we fight. Peter mentioned the shift from sail to steam. 
That was a challenge inside the Navy, as shifting from horses to tanks was in the Army. There are lots of historical examples where adopting technology requires changing how we fight. And that's a real issue. So I often see culture get in the way: we have a lot of esprit de corps, but sometimes it's attached to how we carry out a task rather than the mission we're trying to perform, and that can actually hinder military effectiveness. So, with the pace of change so rapid, you're gonna be adapting in just the next few years as you start leading across all fields. And in just a little more than a decade, each of you is gonna be a deputy, a field-grade officer, perhaps even a business partner. So to our panel: could you please address what these men and women will face 10 years out, when they're advising? Each of you has been an advisor in many sorts of ways. Mr. Scharre, if you would first tell our future advisors of senior leaders, what should they be preparing for? Yeah, I think one of the real fundamental challenges, which Benedetta mentioned, is this blending of what we traditionally think of as war and not-war: not only gray-zone conflicts, but also the increasing emphasis on non-kinetic means of warfare, information attacks, cyber attacks. There's a degree of transparency in military operations that I don't think we're actually prepared for. We saw, for example, the U.S. Navy SEAL raid to get bin Laden reported in real time on Twitter. So now we're operating in a world where there's much greater transparency about what our military forces are doing; it could be reported, it could go viral. And all of this basically means that there are so many factors in success in war that are not about what we traditionally think of as war; it's not just the kinetic aspect of it. I don't know that that's actually a change so much as it is that our concept of war has become overly narrow. 
And maybe it's that we've watched too many World War II movies; I'm not sure where it comes from. But I think from the broad arc of history we see that war is actually unbounded, and there are many methods of fighting other than simply a direct clash of arms. Many of these are quite effective; guerrilla tactics and information warfare are quite effective. We need to be able to adapt to these realities, to use some of these tools when they make sense for the United States, but certainly to be prepared for them. And I think we probably need to widen our concept of what war is. In the defense community in the U.S., we have all these terms: irregular warfare, unconventional warfare, unrestricted warfare. Maybe we just need to broaden our horizons about what war actually is, because it's not gonna fit into the neat and tidy boxes that we might like. Thank you. Dr. Berti, what would you have this group focus on when they're advisors in a future conflict? So I would like to continue with Paul's point about what war is and what it is not, because I think that's a key question. And I think there are two challenges there. One is we really need to understand these multi-domain, gray-zone, hybrid types of activities. At the same time, we also need to be mindful that, [adjusting microphone] okay, yes, much better, great, thank you. So, what I was saying, for those who couldn't hear me, is that I completely agree with the point made about redefining and understanding what conflict may look like in the future. But with a big caveat, and my caveat is this: the problem when we describe something as conflict is that our go-to solution becomes, if this is a conflict, then we need to use the military as the tool. 
And I think a lesson of the past couple of decades is that military force has a role, a purpose, an important place in a country's broader global power projection, foreign policy, and security policy, but not everything can be addressed with military force. The challenge of stabilization, and I think this is something we need to reflect on very keenly, very carefully, is fundamentally a multi-sectoral challenge that requires economic development, reconstruction, political representation, a series of measures that cannot be delivered primarily by the military, or even by the military at all. So the challenge is to know when force is useful, when force is the right policy tool, and when our military needs to be given the chance to do what it does best while other parts of government step in. And I think the challenge of stabilization is one of those. So that's just to react to the point that we need to be more flexible, but we need to be careful not to militarize all the problems we have, because many of them have no military solution; they require political engagement. The related point, in terms of what we need to pay attention to looking forward, builds again on the point that conflict is looking different. There are a number of international legal frameworks that have served us very well for many decades in trying, as much as possible, to limit the damage of war, especially to civilians. Some of those conventions and frameworks are quickly becoming outdated, or quickly becoming less useful, in a context in which wars look different from what they looked like in the past. We see many more states pitted against non-state armed groups who follow a different framework and often do not respect those international legal principles. And that, I think, is something we need to reflect on very carefully going forward. 
How do we adapt, reform, and make sure that our principles of international humanitarian law are both solid and grounded in reality, and help us fight the wars we need to fight while staying true to our values? To me, that is a key challenge of the future. Dr. Singer. I was thinking through it in terms of the question you're really asking: what will be different for someone in the role of a staff officer in the military, or a young executive in a business, when they're advising, say, their boss on what to do? And I think two things stand out, and one plays off of what Paul brought up. It's not just the real-time element of the information. If we're thinking not just about now but in particular 10 years out, the task of that staff officer, that executive, is not gonna be "go out and get me the information on X." Rather, it's going to be "help me figure out which of this information is important, and which of it is actually true." So we'll move from a space where you have to find the data, which is where a lot of people are now, to knowing there's so much of it, what's the relevant part, what really matters; and then, 10 years out, we move into a world of everything, and I'll be talking about this later tonight, from deliberate disinformation to AI-generated deepfakes. So, what's true or not? Again, whether you're talking about a battlefield operation, or what your customers think, or did this crime happen or not, and who did it. The second sort of challenge, which again comes with opportunity, is that in a world with more artificial intelligence, decisions will be more and more guided by AI, which will be the one sifting through all of that data and essentially either taking the decision itself or providing recommendations. 
And we already see this in everything from which way you should go, the Waze map to get to a destination, to advising who's eligible for a home mortgage loan. To give you the military version of the Waze map: I was at a Marine landing exercise where they were testing a sort of military route-recommendation system, but unlike Waze, it didn't recommend the route based on time savings; it recommended the route based on the expected casualties you would take. So again, you can see all of these different recommendations going on around you: who gets promoted or not, et cetera. And the question you'll be advising on is when do you actually listen to the recommendation and when do you not? When your gut tells you, you know what, Waze is saying take a left turn, but I know that's wrong, I ought to go right. Or the system is saying this person should not get a loan, but there's something about them, they seem trustworthy, or maybe there's an issue of algorithmic bias and the wrong data was fed into it. One thing that hits on what Paul brought up about our recent experiences of war: in the military, a challenge is that we're training our AI off of data from Iraq and Afghanistan, which is great, except, is that actually gonna be suitable for, say, a major state conflict against China? Will the ways we processed decisions in Iraq and Afghanistan always be appropriate? So those two elements, one, helping people sift through information and determine what's real or not, and two, knowing when to listen to the AI and when not to, are the kinds of decisions I think you'll be part of that staff officers 25 years ago weren't. Now, the good news is that you're like Bane from the Batman movie: you grew up in the dark. This is what you know. 
So you will be in some ways better suited to help advise than, say, the current generation, who are being flummoxed by all of these issues of fake news, disinformation, et cetera. That's a great point. It's like a derivative of: it's never as good or as bad as first reported. Now it could be terribly inaccurate, and we need a confidence level against every fact that the commander or the decision maker sees. So to the panel: in 25 years, these people will be our community leaders and our elected politicians, our business owners, and our military commanders. That future context, when this audience will have their greatest influence on the world, is the context in which I now ask each of you an individual question, and certainly we'll have follow-up from the other two as desired. First to Dr. Berti. Ma'am, you have championed human rights and written extensively on mitigating conflict's impact on civilians. Today, human rights violations conjure visions for me like homeless refugees, non-combatant casualties, and resource deprivation. With rapidly changing technology, what will characterize human rights issues in this century, in the future, that are new considerations our leaders are gonna have to struggle with to manage the impact on civilians? Thank you for that question. It's certainly something I spent quite a bit of time thinking about, with no great answers admittedly, but I'll give you my two cents anyways. I think we have at least two main challenges to deal with, and we need to deal with them today to make sure that the situation in 25 years is not as bad as it could get. Number one, I would say there has been over the past few years, led also by the harrowing example of the Syrian Civil War, where indiscriminate violence against civilians has been used as one of the key tactics by the regime and its supporters, being Russia or Iran, to gain a military upper hand over the opposition. 
And what that has done, and what that conflict and other conflicts fought along similar lines are doing around the world, I think, is eroding some of the basic principles of international humanitarian law that we have fought so hard to establish over the previous decades. So a key example here is the prohibition against the use of chemical weapons, which we should have thought that after World War One we had pretty much established as one of the key principles in the conduct of warfare. And yet in Syria, we see chemical weapons used with relative impunity, and that is weakening that particular norm. So I would say, with a great deal of urgency looking forward, one of the challenges we have for human rights is to establish, re-establish, and reaffirm the key principles of international humanitarian law. This will be harder to do as more and more we see the rise of geopolitical competition. We see a world in which a rising China represents a different political model from, I would say, the one represented by the United States and Europe, and wants to shape the international order and the international legal system according to its own values, which may not necessarily coincide with ours. So I think that's a great challenge. The second great challenge is domestic. And I think we all have to grapple with it, especially as democratic countries, and that is: what does a democratic, open, human-rights-respectful digital ecosystem look like? And that involves a number of ethical and legal dilemmas, from the regulation of AI, to what do we do with autonomous weapons, to how do we deal with fake news and disinformation. There is a number of momentous challenges that all amount, or potentially could amount, to threats to some of the principles of our democracies. So I think these are key challenges that we need to get right today, if we wanna make sure that in 25 years our job is not to try to rebuild the international legal system after it has been destroyed. The time is now. Dr. 
Singer, do you have any follow-up on that? I think one of the other interesting aspects is the new challenges that the institutions that protect human rights face from new types of threats. So, to play off of that example of chemical weapons use in Syria, the resistance to trying to create accountability for it took place everywhere from within the United Nations, sort of the classic way, but it also entailed a massive disinformation campaign pushed over social media, trying to reach into the different body politics of the nations deciding whether to intervene. Basically, you know, saying no, the chemical weapons attack didn't happen, or it was fake news, it was planted by the insurgents themselves. And you saw near perfect alignment, and actually when I say near perfect, literally the same players that were pushing disinformation targeting the United States 2016 election were the same ones pushing that. And it's a means of information warfare that made it harder to build respect for human rights, because if we can't even agree on whether the atrocity happened or not, how do we then get to what do we do about it? And so this is one of these interesting things: how do we build up resilience to those kinds of attacks, not just in our politics, but also, we've seen human rights groups attacked. Another great example of this: a human rights group in Sudan had its emails hacked and then false information planted in them, and then it was spread viral through a mix of bots and sock puppet accounts. Sound familiar? And it was all part of a way of damaging that human rights campaign, to prevent it from operating effectively. So until we get a kind of handle on this new type of digital threat, it's gonna be sort of poisoning different parts of everything from domestic politics to global respect for human rights. Mr. Scharre, do you have any thoughts on the human rights impacts? 
Yeah, I mean, you know, there's a long-running contest, of course, between democracy and authoritarianism, or liberalism and illiberalism, and we got to the end of the 20th century and democratic values, liberal values, won that contest in the 20th century. Free nations defeated totalitarianism and Nazism and communism, but that wasn't the end of history. That wasn't the end of the story, and we're seeing new forms evolve now in this century that are quite dangerous. And if you don't stand up for these values, they erode over time, right? So certainly in Syria, the gross abuse of human rights by the Bashar al-Assad regime, and the inability of the international community to effectively do anything about that, is degrading norms in military conflicts. We're also seeing this in peacetime. China is engaging in horrific human rights abuses, detaining over a million of its citizens in Xinjiang, and the world is basically silent on this. And it's really deeply troubling. We know this is happening; there's ample information about it. China has basically bought off other countries. Europe is largely silent on this because, in large part, countries are getting money from China. And so there was a sort of a view in the late 1990s and early 2000s that there was this broad arc of history that was trending towards progress. President Obama used to talk about that. I think that's optimistic, but I don't think that's supported by reality. The US engaged closely with China on the assumption that over time engagement would lead to China becoming more liberal, politically and economically; that's not borne out. We're now facing a very serious competitor in China that has a very different view of the world than the United States. And I think it's a real challenge when we think about how to adapt to this, because there are many countries that would love a world where human rights doesn't matter. And I don't think that's a world that we wanna live in. Thank you. Dr. 
Singer, Ghost Fleet contains characters who consistently express honor, but they seem to struggle with the technological change that reduces or perhaps eliminates it from the battlefield. Staff Sergeant Bellavia this morning professed that empathy is what makes America's military so much better than everybody else. Will honor and empathy remain essential in conflict, or is it going to be a liability, or perhaps both? Often you hear people, particularly in the military, lament and even blame the laws of war for some kind of outcome that they didn't like. They'll say it was like fighting with one hand behind my back, or you had General Tommy Franks, after he let bin Laden go, blame his lawyers for not allowing him to strike. This was in the early part of the war in 2001, right after 9/11. There's a convoy that bin Laden is in, and Franks chose not to airstrike it, and he blamed his lawyer. And as the son of an Army JAG officer, and as anyone who knows this space knows, the JAGs don't tell the four-star general what to do. It's blaming the code of honor, because honor is about following a code. It's about following a set of rules of right and wrong, either normative rules or actual written-down ones, whether they're written down in the little book that you all get or they're written down in the Geneva Conventions. And so essentially there's two things to note here. The first is we often blame that for outcomes that have nothing to do with it. And the second is what history shows about professional forces, and professional is defined by those that operate by a code, as opposed to barbarian forces, which are warriors. It's a fascinating thing to see how we've amplified the word warrior, but actually it's the professional forces. History shows professionals consistently beat warriors. Professionals consistently beat barbarians. Those that follow a code are the ones that win. And that's if you literally are sort of taking a listing of wars won and lost. 
And the reason is because of what you see here: those that follow a code can organize, train, and equip in a way that those that don't follow a code can't. Those that follow a code can win hearts and minds and the trust of the local civilian populace in a way that the barbarians cannot. So we often kind of blame-cast it. And I think we'll continue to see this moving forward with whatever technology we're talking about, whether it's unmanned aerial systems or cyber: if you are using it in a manner that is barbarian-like, it will create blowback upon you. And that's actually one of the interesting things we've been wrestling with, be it in Afghanistan, in Iraq, to what the Saudis are wrestling with in Yemen. Yeah, you could have all the great unmanned systems you want, but if it's causing greater civilian casualties, it's not going to deliver you the victory that you want, because it ends up producing more people volunteering to join that adversary group, and it means people are not delivering you the targeting intelligence that you want to go after the bad guys. So to go back to it again: when we blame the code for our losses, we're usually blame-casting our own bad decisions. And second, history shows professionals win. So be a professional. Thank you. Mr. Scharre, in your book, you touch a lot on honor, empathy, and how we integrate that into AI and robotics, but I don't recall getting the answer. So perhaps you could give us your answer to this: is honor and empathy gonna survive? Yeah, I think a world where we don't have empathy is actually probably a scary one to be in. Peter had kind of mentioned this question about, do you trust the map app telling you where to go? And I think a central question that the military is going to face going forward is, where do we use automation? Where do we use people? How do we use artificial intelligence and automation to take over various tasks when they're gonna be more effective? 
There've been some studies by consultants and analysts that say that roughly half of all tasks currently being done in the US economy could be automated today using existing technology. Now, that's not all jobs. It's actually maybe only 5% of jobs that could be totally eliminated. But a lot of things that people do could be automated to some degree. Now, that's gonna be true in the military as well. If you go forward 30 years and we have people manually landing planes and taking them off, or manually driving vehicles, or manually aiming rifles, we're probably doing it wrong. Those are all things that we could automate with higher degrees of precision and reliability. But there are lots of things where there isn't a right answer, where the right answer depends upon context or judgment. So could we train a machine to know better than a human whether someone is holding a rifle in their hands or a rake in their hands? Yeah, probably. Over time, using enough data and machine learning, we could figure that out. But that doesn't tell you whether that person is a combatant. They could be holding a rake and be a combatant, or they could be holding a rifle and not be. They could have surrendered. They could be a friendly force. They could just be a civilian that's armed to protect themselves. It also doesn't tell you what is the smart thing to do. Could be that they're a valid enemy combatant, but if you light them up you're giving away your position and you've compromised yourself. And maybe that's not the right tactical thing to do in that instance. And those are things that we want people involved in. But there's of course also this ethical component that you mentioned, that I think is particularly salient when we think about the use of force in war. It's a tricky thing, because there's a cost to having people involved in these decisions. There's a cost that people have to bear, that moral burden. 
We've increased a lot of awareness in the force about not only PTSD but moral injury, the fact that those who have to make very difficult choices in war, sometimes choosing between two different wrongs, have to live with that. And I think it's a really interesting moral dilemma to say, you know, look, we'd rather have somebody do that. It is not fair as a society that we make a decision as a nation to go to war and we send off young men and women to do that and to carry that burden for the country and make those choices, and a very small segment of the population does that. But it's also worth asking, what would it look like if no one cared? If no one weighed those choices, if no one weighed the value of human lives? And I'm not sure that that's a world we wanna live in either. Dr. Berti. Well, yeah, I agree with everything that has been said so far. I think there's so many ways to go about it. There is a utilitarian argument, if you wish. I mean, we must not forget that the main point, the main principle of these rules, of the laws of war, is reciprocity. They were designed so that armies would have a code of conduct that would be reciprocal. So the first rule we were able to agree to as an international community was about prisoners of war. And there is a clear interest in all sides keeping that convention. That means that if your men and women in uniform are captured, they will be treated with dignity and not tortured and not subjected to degrading treatment. The compromise is you will have to do the same if you capture your enemy's forces. So there's a reciprocity; there's a utilitarian value to upholding a lot of these principles. It tries to do something very difficult, which is to mitigate the impact of war while still allowing states to use force if the situation demands it. So there's a utilitarian argument. 
There's an effectiveness argument, as has been mentioned already, meaning if you're involved in a counter-insurgency type of operation, part of your mission is to try to pacify that area. It's much easier to do so if the population is not fighting against you. It's much easier to do so if they see you as someone that they can trust or rely upon, as opposed to if you give them reason to side with the insurgents. It's also more effective. And finally, I would say there is a political cost. Our armies, our armed forces, are a very important part of our societies. The way they fight tells us a lot about us, about who we are, about what we stand for, and what type of societies we wanna live in. So this is not just about what we do abroad; what we do abroad reflects what our values are at home, right? So in that sense, I think it's absolutely vital for democracies to continue to maintain honor and empathy and respect international law, because this is about us: how do we wanna live with ourselves in our societies? What values define us? And I certainly hope that the values that define us tomorrow will be the same that have defined us so far. I'm very hopeful after those three answers. Honor will survive. Mr. Scharre, your exploration of the future employment of AI and robotics in Army of None was fascinating. Bill Gates, in his very positive review, provides a measure of comfort in that he doesn't lose sleep. But when I imagine 2040 battlefields, it worries me, because I think of the tireless, fearless robot, the all-knowing AI, combined with great power capabilities, that ends in some sort of global Armageddon. What's the worst case in your future? Yeah, so I think the good news is, the kind of things that we see in science fiction, of terminators turning on us, I don't think are real risks. There's yet another Terminator movie coming out, and I'm sure it'll be very entertaining, but it's not a real problem we face. 
You know, I think the things that worry me the most are a slow movement over time towards more automation that slowly peels away human control from warfare, and that crosses some threshold, one we may not even realize we're crossing at the time, where humans have much less control over what's happening. This has been expressed in different ways. Chinese scholars have talked about a battlefield singularity, where the pace of automation and machine-driven warfare on the battlefield eclipses the speed of human decision-making. Now, we want to use automation to get inside the OODA loop of our adversaries, to react, to observe the environment faster than them. But automation raises the prospect that your own military forces are operating inside your own OODA loop. And that can be okay when you have people making these decisions, right? We want to tell subordinate units, hey, be flexible, be adaptable on the ground. We don't want our subordinate units calling up to higher and asking permission for everything. We have ideas like commander's intent, but the machine's not gonna understand commander's intent. And so there's a risk if you get these accidents at machine speed. And I think that's really quite dangerous. I think it is actually most significant in cyberspace, where there is the potential for harm rapidly and at scale in ways I actually think we're not really prepared for. And we lack the resiliency in the current structure of the internet against some sort of intelligent adaptive malware that could be quite disruptive. Thank you. Dr. Berti, do you have a worst case when you think of the future? Do you think of HAL or RoboCop? I certainly have a number of worst-case scenarios, but I am going to answer your question in a roundabout way, because I think that we have mentioned so many important factors that can shape our future. And rightly so, we've focused a lot on technology. And I think we ought to. 
But if we're talking about the future and potential worst-case scenarios that will affect our armies, I think before the time is up I have to mention climate change, because otherwise we are kind of ignoring a huge elephant in the room. And that is not to say, I completely agree with everything Paul said, but I would say, if the robots don't get us, there is also a chance that something else will, and that something will be climate change, right? And that's something that certainly is a threat multiplier, that certainly makes some of our existing challenges more complex. We see a world in the future where ongoing desertification and lack of drinkable water will create more humanitarian emergencies, more millions of refugees escaping what is essentially a climate crisis. We will have our militaries think very carefully about how they respond to ever more severe and frequent extreme weather events, which is something we're already doing substantial work to respond to. But if you project yourself into the future, I think that will become a much more important mission, because the need will be ever greater. And so just to say that, of course, there's a number of potential worst-case scenarios involving technologies, and we should definitely prepare ourselves for that. At the same time, we also should prepare ourselves and adapt for the fact that our planet is giving some signs of distress, and that's gonna affect every sector of society, including our military. And just to close on that, I find it very interesting that just last week the chief of defense of the British Army announced a new initiative aiming to make the British Army independent from fossil fuels in the next few decades and to really work on what we call in NATO green defense. And I think that's very important as well. 
So that's not to say that everything else that has been said is not important, but we should also keep an eye on those issues, because they're going to affect our future very significantly. Of course, thank you. Dr. Singer, you had a worst-case scenario in your story when we got our butts handed to us in Ghost Fleet. What do you think the worst case is gonna be? Yeah, you're asking someone who wrote a book about World War III what the nightmare scenario is. The dark thing to say is we're living it. Whether it's the climate change issue, and in all the craziness of yesterday in the news you may have missed that a new report came out, a UN report, over 100 scientists from 36 different countries, that essentially concluded: remember how bad you thought it was going to be? It's actually turning out to be much worse. It's going quicker than we thought, and we essentially only have until the year 2030 to take action that could appreciably turn things around. But that links to the nightmare scenario for me, which, when I say we're living it, is that the essence of Russian information warfare is not to make you love Putin or the like. That's how Americans think of propaganda: to make you love us, apple pie, blue jeans, rock and roll. The essence of Russian disinformation warfare, going back to its origin in the 1920s, is to make you distrust everything. And trust is something that's interesting: it takes a long, long time to build, and once lost, it's almost impossible to bring back. And we have seen a systematic attack on the institutions that we trust and that are crucial to a thriving American democracy. And that is true whether we are talking about healthy civil-military relations, to trust in an independent court and judiciary, to trust in freedom of speech. We are seeing each of these core institutions that democracies need come under threat right now, and, as the pledge that many of you will take says, from foes both abroad and domestic. 
And I worry we are sort of shrugging that off. You can see it in the polling data and the like on trust in institutions. That loss is almost impossible to bring back. And, you know, the challenge of these periods, whether you're talking about warfare: what happens if you have a highly politicized military, a highly partisan military? We know that is a military that is less likely to win wars. What happens? How do you address climate change? Well, it's very hard to do if you are simultaneously describing it as a conspiracy theory put out by China. So I worry about these things. That's my nightmare scenario: our institutions that are so core to our democracy are under threat right now in a way that did not happen when I was your age. Thank you, sir. So we're just past about the hour point, which is often a time to think about stretching. So why don't we all stand up for a minute and stretch, and when you sit back down, those of you that are gonna have questions, go ahead and remain standing while the authors take this last question. So again, you can keep standing while I'm addressing this. But you can't talk. Okay, so your purpose as Norwich graduates is to be leaders ready for the future. So how should you prepare? That's one of the key takeaways today. And after our panel addresses this final topic, we're gonna open it up to your questions, so you can begin to stand, make yourself identified as someone that would like to ask your question. So to our three panel members: I suspect that each of you, when you envisioned your future roles in life from that side of this stage, didn't predict the developmental path that you would take with much precision. And the unexpected often makes us more prepared to lead. So what in your three experiences unexpectedly prepared you for the influential roles which you have today? And our audience can consider that when they're thinking about planning out leadership development. Dr. 
Singer, would you tell us a story? This is gonna sound like I'm kissing up to the concept of the conference itself, but the advice that I would give, that then links back to experience, is that all of the issues that we're talking about, whether it's robotics, climate change, human rights in conflict, cybersecurity, are inherently multidisciplinary. That is, whether you are an engineer working on a robot, we're now learning, Paul can attest to this, we're pulling from the fields of biology and design; or, I was meeting with the cybersecurity class earlier and we were talking about how the keys to network defense involve not just the coding of zeros and ones, but an understanding of economics and what incentivizes best behavior. So it's that multidisciplinary nature that I think is gonna be so key to whatever role you take on. And for me, in doing the work that I do on writing and advising DOD and other entities on the future of war, I keep coming back to history. I grew up with a love of history, of military history. And so when I'm speaking to military audiences on what Paul was bringing up, of how do you adapt to change, I'm referencing the new UAS, but I'm also talking about how the Navy adjusted to aircraft carriers and their need to do the fleet problem exercises. Or, we've got a Marine officer sitting in the front row here, some of you heard from before, and they're looking at examples for the Marines in the 21st century, but they're looking at how the Marines explored amphibious warfare as a concept back in the 1930s. So for me personally, that love of history has been something that I'm constantly applying in futurology. Thank you, sir. Mr. Scharre, your thoughts on your unexpected developmental path? Yeah, one of my drill sergeants in basic training, his last words to us as we marched off to Airborne school was him screaming, and that was: never quit. 
And you can actually get a lot done in life just through dumb perseverance. And I think there's a lot to be said for living a fulfilling life through being entrepreneurial, being bold, just simply doing things that you wanna do. In the military setting, it's a little bit different, because there are more constrained opportunities for that. There, I think great advice for people starting in the military is: volunteer for everything, take opportunities at everything you can, and take advantage of what's out there. And then, you know, out in the civilian world, if it's something you wanna do, just do it. Find excuses to do it rather than not do it. That requires focus. That requires very much deciding these things are not important. And that is so hard. We were all chatting over lunch about how hard it is to decide what you spend your time on, and the things that you agree to do and the things that you turn down. But I think it's important to let yourself set your own priorities, because the world will tell you the things they want you to do, society, your parents, your boss, the things they want you to do. But ultimately it's your life. And so deciding the things where you say, you know, this is what matters to me, and carving out the time for that, you can do really great stuff. Dr. Berti, you have bounced around many different countries in your career. I don't expect that was completely expected. What in your career has been driving you toward where you are today? Sure, so it definitely was not expected. I grew up in a small town in the north of Italy, and I didn't really travel much growing up, sir. The fact that now I spend my week in an average of two or three different countries is certainly something I never expected would happen. But I would bring up two points. 
One, I would very much echo what Paul said, and that is to be able to find what gives you a sense of purpose. That will drive, I think, a lot of career choices, and it can help you shift from one field to another and take up a lot of very different experiences. I have worked in academia, in the policy world, in the field with the UN, and now at NATO, and it may sound a little bit disjointed, but to me it's so coherent, because it all follows the same purpose, which is: I'm interested in how do we mitigate the impact of conflict on civilians in the context of civil wars. For me, that's what does it. That's my purpose. That's what drives my choices. And it's quite a good thing to fall back on when I'm undecided whether to take one path or another, to ask: is this furthering what I think is my contribution, my two cents to the world? So I think it goes very much in line with the idea of focusing. And the additional point, which goes in line with what you mentioned about international experience, is to really be open to having your ideas challenged. I think it's not something that we welcome enough in our societies. We tend to seek reassurance, stay in our own cocoon, listen to people who already agree with us. And that's easy, but I don't know if it brings us to make the best type of decisions, and I don't know if it really allows us to relate to each other as human beings at the best possible level. So in my experience, one of the best ways to challenge my preconceived notions, my ideas, and my biases has been to travel, and the more different the culture and the more different the place, the better. Before being back in Europe, I spent 10 years in the Middle East, and I can say that was a crash course in challenging so many notions I had been brought up with, some of which I got reaffirmed, some of which I revised. And I think it's a very useful exercise. 
And so I heard that as part of this program students also have international experience, and I think that's incredibly valuable, especially in this globalized world that we live in today. Thank you, ma'am. So, fortunately, we have time for questions. I'd like to start over here with the students, and Colonel Kabay, you'll be next. Hope you guys are doing well today. This is a question for anybody. So in the world today we see a lot of warfare. Also, narco drug trafficking is a large part of what we see impacting civilians and causing civil wars. So, to any of you who can answer it: what do you think about where that sort of fighting is going today, and how can it be prevented in developing, third-world countries? Suggest Dr. Singer start with that. I was gonna suggest someone else. Hey, great question. I think that it's very, very important to tackle the issue of organized crime, because that's something we have underestimated for very long, but indeed cartels are today much more sophisticated, much more able to project power and to have an impact on political dynamics as well. So I think it's a key challenge for us to tackle. In terms of how do you do that, I think what we need, and I think most of the strategies have this component, is a multi-pronged approach. So on the one hand, really try to undermine their funding model and find ways to make it harder for them to profit. So there is an economic and financial aspect that is very important. But then again, there is the root-causes discussion, which I think we should never forget, and that is: what drives people to join these organizations? What drives these dynamics? And often it has to do with poor governance and fragile states. This is one of my key mantras, so I always go back to that. But the point is, we also have to address the political context very carefully. 
And I would just close, because I notice there are a lot of questions, by saying this is especially important in regions of the world that are of strategic interest to the United States, like the Middle East. For example, in North Africa we see more and more synergies between criminal organizations and terrorist groups. So it transcends public security and becomes a national security issue as well. So I think it's a very good question. Do you think that future battlegrounds will be impacted by narco trafficking? Well, yes, they are. We must not forget that, for example, in the case of ISIS, which for a while was the terrorist group with the biggest control over territory and population, and also the wealthiest terrorist organization, this was before our military operations put an end to that. But the point is a substantial portion of the revenues they made was out of smuggling antiquities and oil, out of being involved with criminal organizations. So the link between terrorism and crime makes what happens in the criminal world very relevant to stabilization operations all around the world. So yes, I would say it's linked. Thank you. Dr. Kabe. As my students know, I am very old. I started programming in 1965, but I have followed developments in computer science all of my career. I am up here to give you a warning. There are two factors I did not hear mentioned. Number one, neural networks develop algorithms that are incomprehensible to the human beings who are depending on those algorithms. And there's evidence of that. For example, one of the earliest neural networks, back in around 1988, was tasked with developing an integrated circuit for a particular behavior, and it worked, and none of the electrical engineers could figure out how it worked until they traced it in detail for a couple of weeks. That's number one. Number two, back in the 1930s, Gödel articulated a principle.
Gödel's incompleteness theorem includes a prediction that all self-referential systems are inherently chaotic, where chaos in mathematical terms means having disproportionate responses to changes in input. Those two factors should worry the hell out of us. I'll leave it to you to comment. Anyone worried? I'll jump on that one. Great. I think on the first one, let's take it even further; it actually connects back to what Paul brought in, relevant to the Chinese approach and their hoped-for battlefield breakthrough. In discussions of artificial intelligence in the US, it always comes back to games, and there are two moments in our discussion around artificial intelligence: when the computer first beat the top human chess master, and then the next stage, when the computer beat the top human at trivia on Jeopardy, IBM Watson. That's what we talk about. For China, and in particular your counterparts in the PLA, it was when the machine beat someone at the game of Go. And if you're not familiar with it, Go is a game of strategy that's thousands of years old, and it's relevant to the comment that you brought up. It was not merely that the breakthrough happened in this game, which a lot of people thought wouldn't happen for about another 10 years, but it was the way that the machine won. It came up with moves that humans who had been playing this game for over 2,000 years never thought of on their own. And that's the potential of it being a battlefield breakthrough, kind of the equivalent of a blitzkrieg, but come up with by a machine, something no one has thought of before. That's what's excited the PLA. But as you lay out, it's not just this possibility. You can see that we have medicines being discovered, so to speak, by AI, by bringing together information in a way that human doctors and researchers would not have thought of. So wonderful, positive.
But as you lay out, there's also the negative of it. Now, that's what I was kind of referencing about the advice that the machine is giving you. Part of why that advice is so good is that you never would have come up with it on your own, and you don't understand it. But that is also the challenge: when do you go contrary to the machine, when you can't understand why it recommended this? Why did it recommend that this person doesn't get a loan? Was it because it sifted through all of the information and has such perfect information that I never could have matched on my own? Or is it because the data that was plugged into it was inherently biased against African-Americans? I didn't make up an example there; that is a true story. Or who gets selected for promotion or admission to a college? We're going to take more data than ever before, not just your SAT scores and the like. It's going to sift it together in ways that a human never could, because of a neural net. And guess what? It found that, I think it was, young white men who played lacrosse were the best fit for college. They actually weren't, right? It was drawing on biased data. So this thing that you're bringing in about neural nets is not just about our inability to understand them. That's what's good about them; that's also the problem with them. And then, by the way, the other thing: it hits what Paul brought up in acquisitions. How does the military buy something that, on one hand, holds the prospect of advising you better, but on the other hand, no one can tell you how it works? Good. Good afternoon. So history shows us that mankind has often considered some weapons too violent or too dangerous or unfair for warfare. Do you think AI that thinks and then acts on its own could someday be outlawed in the future? Or are we too stuck in our mindset of progress to even consider that? That lands squarely with you, Mr. Scharre. Yeah. I mean, I think that's kind of the question.
You know, the track record of trying to regulate weapons dates back to antiquity, to ancient India and attempts to ban poison-tipped and fire-tipped arrows, and it's a real mixed bag. There have been some successes, or things that were largely successful, like efforts to move away from chemical and biological weapons. There have been other miserable failures throughout history. And so I think it's a real challenge. It's a hard problem because, of course, if you violate these rules, what's the consequence? Right? If something's effective and you win the war, well, okay. The real issue is reciprocity. When we see effective restraint, it largely has to do with militaries agreeing, either explicitly or tacitly, not to use certain weapons out of fear that the other will use them against you, so there's no advantage then. And that might make war more horrible or unrestrained in some other way. But in order to even achieve that kind of restraint, you need to have clarity on what is the thing that you're agreeing not to do. There were attempts early in World War II to restrict aerial bombing only to military targets and not to cities. Well, it turns out where do you locate factories? You locate them in your cities. And so over time that line blurred, and you got the mass devastation, of course, of the aerial bombing of cities in World War II. I think it's a real problem for AI because where do you draw that line? Where do you say this application of AI is okay and this one is not? There are lots of discussions underway among states and scholars working to try to figure that out, but it looks like a really tough problem. Thank you. Good question. And we'll all be sharing your mic here shortly. Dr. Singer, you're up. Good question, and I think, particularly with AI, there's first the aspect that it's a technology that is neither inherently civilian nor military.
So sometimes you will see people say, well, you know, there's a movement to ban increasingly autonomous use within the military. But simultaneous to that, our civilian world is using that technology. So I always use the example of, let's move forward 10 years. Is our proposition that a pilot will, and maybe there's someone in this room, I met a student earlier today who's interested in going into the Air Force and operating unmanned aerial systems. So they're going to wake up in the suburbs of Las Vegas and drive, or rather be driven, to Creech Air Force Base by their increasingly autonomous car. Not just a Tesla; that's where the Chevrolets and the like are headed. But then once they enter Creech Air Force Base, we say no, in war we still operate like it's 2008, and you have to hand-control everything that the drone does? I don't think that happens. So the first point is that nature of the technology. What I do think is possible is that we will see certain restrictions not on the technology, but potentially on where you use it and how you use it. So for instance, to your concerns, there are very different civilian casualty concerns with autonomous weapon systems in the land domain, in an urban environment in a city, versus undersea warfare. In an urban environment, if you get it wrong as to whether it's a tank or a bus, scores can die. The undersea environment is actually already mostly automated. It's not Jonesy with his really good ear listening, saying that sounds like an enemy submarine; it's actually a computer algorithm doing the recognizing. The torpedoes that are fired are actually fire-and-forget; they're mostly automated. And by the way, if you get it wrong, there's no such thing right now as an underwater cruise ship. So I think we might say we're okay allowing autonomy in undersea warfare, but maybe not within land warfare, as an example. Great. All right, over here. How's it going? Thank you for coming.
I'm representing the men's lacrosse team, funny enough. You mentioned earlier finding the important information and the true information. How do you see that going in the future? Is that your simple Google search? Is that going to be through VR or something like that? And how does AI decipher what is true and what is fake, with so much data coming in and out, especially as that's only increasing? Thank you. I knew it. I was the one who brought that up, so I'll jump onto that. The first thing is that there is AI being used to try to distinguish that. So for example, we see the creation of what are known as deep fakes, which is AI-generated, hyper-realistic, but false imagery. As it becomes more and more sophisticated, we will rely on artificial intelligence to sort of sift underneath it for the tells that the human eye might not see. And what's fascinating, again going back to what you were talking about, the blurring line between war and conflict, is that the groups researching that type of technology to identify what's fake are both the Facebooks of the world and DARPA. Facebook, because they think they need it for their platform; DARPA, because they think this technology might be used to target the U.S. military and U.S. democracy. Now, that technology is for the good, but all the data also shows that it is insufficient. And what we really have missing in the United States is digital literacy. If you had grown up in Finland or Estonia, then from elementary school through junior high and high school, besides being taught your regular studies, which all of you were taught, besides being taught hygiene, wash your hands and the like, which you were all taught, you would have also been taught how to defend yourself online. You would have been taught digital literacy: how to distinguish between what's real and fake, how people manipulate you, what the emotional tells are that show they're going after you.
That is why the Estonias and Finlands of the world are more resilient against these threats than the United States, which is just open territory for them. So it's a great example of how you align national security issues with education. And by the way, if we had digital literacy inside the United States, it would not just help you be better citizens. It would help public health: you are all dealing with the return of diseases that we shouldn't have to, because of anti-vaxxer conspiracy theories. It would help you all be better consumers. So it's a multiple good thing, and yet it's strangely missing from our system. Another layer of complexity is that we can get better technology that can help us detect deep fakes, sure, but that actually accounts for a relatively small amount of the most effective information operations. As we look more and more at what Russia, for example, which is one of the main players at the moment, is doing, a lot of it is not necessarily deep fakes. It's much more manipulation of the truth, which is much harder to detect for humans, let alone for artificial intelligence. So as our technology gets better, those who perpetrate information operations also get more sophisticated. And even if we had perfect technology to detect deep fakes, it still would not help us with a lot of these campaigns. So that's just to throw in another layer of complexity. Thank you. So clearly we need a lot more time for your questions, and there will be a chance for that over at the book signing after this. So I'm sorry to send you back to your seats for the end of this. But thank you, audience and panel; you were all fantastic.