between you and cocktails is the future of war, which I think you could also think of as the future of peace, because that's really our central point here, which Rosa Brooks will start off with: essentially that when the line between war and peace blurs completely, we don't know when we're at war, but that also means we don't know when we're at peace. And if we can't draw that line, all sorts of very, very bad things happen. And arguably we are currently in a situation where we're not drawing that line, and when we try to draw it, it gets very difficult. So we have a wonderful panel. I will just say I love all the things that New America is doing, and occasionally I even get to participate, and this is one where I'm actually allowed to be part of the team. So I'm going to moderate, and then we're just gonna have a conversation. This is a conversation that we have among ourselves, and that we have started having more broadly as well. And you've met a couple of other members of the team today: Sharon Burke, who was up earlier, will be part of this broader project. We're also doing this project with Arizona State University; Jim O'Brien's in the audience. So let me start going down the row with introductions very briefly, and then we'll start the conversation. So immediately to my right is Peter Bergen, CNN National Security Correspondent, and much more importantly, Director of the International Security Program at New America. To his right is Tim Maurer, who is a member of our Open Technology Institute and a cybersecurity expert. To his right is Tom Ricks, a senior advisor to the International Security Program at New America. Many of you might know him for other things, like his book The Generals, or his book Fiasco, or his many years covering the Pentagon for the Washington Post and the Wall Street Journal. To his right is Sasha Meinrath.
So Sasha Meinrath is the founder of the Open Technology Institute and now the director of the XLAB, which is on the cutting edge of where technology and policy will go next. And finally, Rosa Brooks, who was a Bernard Schwartz fellow and is now back at her home base of Georgetown Law School and a very active member of the future of war team. So with that, I get to move around a little bit and not stand here. Let's see, how am I gonna do this? Oh, we have a seat. Oh, we do. Oh, good. I didn't see it. Are we on? Yes. I was wondering if I was gonna have to stand up there. I couldn't see the seat. So we're going to move from broad to more specific. We are going to start, and I'm gonna ask Rosa to talk just very generally about this idea that the line between war and peace has blurred. And then we're going to bring that into what that means for drones, for special operations, for cybersecurity, or rather what it looks like from the vantage points of those fields. And we're gonna ask about how that affects our defense establishment and ask the experts here to comment on all of those things, and then we will open it to all of you. So Rosa, over to you. You have written, war has become everything and everything has become war. Please elaborate. Let me start, I suppose, by telling you something I discovered only recently in my obsessive, crazed reading for this book project I'm working on, whose working title is How Everything Became War and the Military Became Everything. And one thing I've gotten somewhat fascinated by is the ways in which different societies at different moments in time have tried to understand and draw lines between war and not war, war and peace. And recently I read the most fascinating thing about the Native American Navajo. Navajo warriors apparently, when they went on raids and left their own territory, would, as they left, start speaking a different dialect of their own language.
And the anthropologist who was writing about this translated what they had told him as "twisted language." They'd go off on their raid, then the war party would return, hopefully successful. They would literally draw a line in the desert sand, face the enemy's territory, then turn around, step across that line and resume the common language, the normal language. And I found this sort of fascinating. It's not unique, actually. Almost every society historically around the world has struggled to define what's war, what's not war. And you could think of this as a subset of what humans do more generally, right? We are creators of categories. That's what we do. Reality is messy. We don't like its being messy. It's hard to process. It's hard to organize a mess. And so we create a variety of artificial constructs and we organize ourselves around them. We have states, the building blocks of the post-Westphalian international system. We have war. We have not war. We have one set of rules and laws that apply when we're at war, but not when we're not at war. We have the military, which is an institution that we more or less define as the institution designated to deal with war. And we have a category called civilians, which is the people who don't deal with war. And so forth. The trouble with this, needless to say, is that we are always trying to put the messy world into boxes, and the world is always trying to get out of its boxes. And I think what ties together what all of us sitting here have become quite interested in is the various ways in which all of those boxes aren't working very well anymore, in which reality is kind of biting back at the lines that we have drawn between war and not war, for instance, between states and not states, between private and public, domestic and international, military and civilian.
It's making those lines blurrier and blurrier, with consequences, which we'll talk more about as we go forward: consequences for law, consequences for how we organize and control the resources of our society and our state, and consequences across the board. The reasons for that blurriness, and I don't need to tell you, you all know this stuff: globalization, changing technologies, cyber, the rise of non-state actors, all sorts of things. And what that has increasingly meant is that as we face these changing threats and changing opportunities, we can't figure out what box to put them in anymore, because they don't fit. And we're going to go through this in the panel. Peter, for instance, will talk about drone strikes, which he's done a tremendous amount of research on. If we are in a war, and I would note, by the way, that you might want to think of the Geneva Conventions, the Hague Conventions, as our modern version of the Navajo line in the sand and different language; you might think of the law of war, the law of armed conflict, as part of that, just another attempt to draw these lines. But depending on which box you want to put the non-state actor affiliate of al-Qaeda in, if you put that person in the civilian box or the military combatant box, if you put US efforts to combat al-Qaeda in the war box or the law enforcement box or some other box, well, that actually turns out to make a huge difference, because if it's in the war box and it's a combatant, no problem: drone strikes are the lawful wartime targeting of enemy combatants. If we don't think that war box fits very well, then they're murders. Similarly, we look at cyber, and Tim and Sasha spend a lot of their time thinking about the ways in which changing cyber technologies really blow apart all these categories. How do we conceptualize someone messing with various, I don't even know, I'm embarrassing myself here, because I don't even have a vocabulary to talk about this.
When someone hacks into computer systems, we do talk about them as internet attacks. But if we conceptualize that as war, well, one set of laws that is very permissive about the government's use of power and coercion applies. If we think it's not, a whole different set of rules applies. If we think it's war, then we need the military's Cyber Command to do something. If we don't think it's war, that's completely inappropriate. And needless to say, how we think about all this also affects how we think about the role of our military: what policies should constrain it, how we should organize it, how we should recruit, what kind of personnel system it should have, what kind of educational system it should have. And Tom's interests really are focused on thinking about what this blurriness means for the military as an institution, as it struggles to adapt to this world in which all the old assumptions, that we knew what it meant to be a military and we knew what it meant to have a war, really don't apply anymore. So how do we make sense of this? I'll stop there for now. I think we'll come back and talk about some of the broader implications before we close. And hopefully some of you will have comments as well. But we're going to run through some examples of this and then invite you all to weigh in as well. Great. Thank you. So I'm going to start closest to me, you all arranged yourselves quite nicely this way, and talk to Peter. So Peter runs our international security program. And one of the things we have done under his leadership is build a database of every single drone strike in Afghanistan and Pakistan, and now we're building one for Yemen. So we actually have the data, and some of the work that we've done... We know we're doing good work because we're disliked by both sides of the drone debate. Our data shows that we are killing more civilians than the military would probably like us to document.
On the other hand, we actually show also that we are killing many fewer people than we used to, and often fewer than human rights groups think we're killing. So we've done a lot of the more granular data on drone strikes. But Peter, the question I want to ask you is, first of all, just talk a little bit about whether, or how, a drone is any different than a cruise missile. Because I can launch a cruise missile from here and it will take something out very far away. Why is a drone any different? Well, I think there are probably four basic differences between a drone strike and a cruise missile strike. I mean, the question is, is a drone a piece of artillery that flies, or is it something slightly different? First of all, a drone can linger over a target for many hours, and it can really assess who the target is. And the fact is that the civilian casualty rate in drone strikes launched by the Obama administration is markedly different. There are all sorts of reasons for that. It's gone down close to zero, but not infallibly so. And the reason for that is that drones can linger over the target for many hours, smaller payloads, better intelligence. So obviously there's a moral case for drones, right? They can kill fewer civilians. And related to that, the second point about how drones are different is that there's a kind of moral case against drones, because, like cyber, they essentially allow you to engage in conflict with basically no cost to yourself. You can't kill a drone pilot who's sitting at Nellis Air Force Base thousands of miles away if you're in the tribal regions of Pakistan. There's no way to respond. And so the costs of entry into a conflict are very low. And that's one of the reasons, by the way, of course, Americans are very happy about drones.
I mean, if you poll around the world about the American drone program, you have like zero favorability, particularly in countries where it's happening. But here it's a popular program, no boots on the ground. And this is why it's been embraced by the Obama administration. And I think the third point, and Daniel Rothenberg has written a great deal about this, is that a drone is operating in a big data environment. A drone is not simply flying; it is using a great deal of information. For instance, if the NSA can basically listen to every call that is made in a particular country, which we know for a fact now we can do, and then you can relate that to human intelligence, the drone is operating in a very information-rich environment. And so I think for all of these reasons, drones are different. But I think one of the things that we want to explore is that our monopoly on this is evaporating. The Israelis and the British have used drones in combat, but we've already seen that the Chinese, the Russians and the Iranians all have armed drones. In fact, on Sunday, the Iranians showed a new drone which they claim was a copy of the RQ-170, a very sophisticated American drone that crashed in Iran. So our monopoly is evaporating. The big point here is that whatever precedent we create, we have to be comfortable with as an international set of precedents that the Iranians then use at some point, too, or the Chinese use at some point, too. Because you could easily imagine the Chinese saying, hey, Uyghurs in Afghanistan who are plotting against us are basically fair game based on the anti-terrorism legislation we have, essentially the same argument that we make. And I think that right now I would suspect that we wouldn't be very comfortable with the Chinese or the Russians or the Iranians adopting wholesale the precedents we're creating. So you all are not normally so reticent.
So if anybody else wants to jump in on the implications of drones: one of the things this also really brings into question is the whole idea of where is the battlefield. Right, the notion, as Rosa said, we thought we knew what it meant to be in the military, and we also thought we knew what a battle was, as something defined in time and space. But this gives a very different understanding. I mean, if you're thinking about how to use drones, is the battle space the whole country? Is it everybody who's in al-Qaeda? How do we think about the battle space? Well, one factoid, which I think is a useful one. If you take the most conservative estimates, the number of people that President Obama has authorized to be killed in drone strikes is 2,400. And all of those, by the way, outside the conventional battle space, in Pakistan and Yemen. Well, that's three times the number of people that President Bush ordered held in Guantanamo. And yet there's been very little discussion of the general principle, of whether this is appropriate. Now, let's not pretend that all of these people are innocent. But, you know, our numbers suggest that at least 300 of them are civilians. And I think that speaks for itself. So we're in a very different kind of ballgame, and that gets to, I think, the final point about drones, which is different from other forms: we're targeting people, particular persons. You're not being killed because of your status, because you're wearing a uniform and anybody over that side of the hill is likely to be the enemy. You're targeting particular individuals. But you're not always getting it right. And, you know, we put people on death row who turn out to be innocent. If you think about the ways that we decide whether we should kill somebody with a drone, this is not a very elaborate or certainly not a transparent process.
You know, Harold Koh spent five hours looking at Anwar al-Awlaki's file and said, hey, you know, the guy deserves to die. And I think, on the merits, it was right. But is that a process that we think is the right process? And the one thing I might add to that is about the way international law works on this. The law of armed conflict is what we call lex specialis, which translates from Latin as "special law." And the special law applies in special circumstances, and only in that special circumstance, and the special circumstance in which it applies is when there's a war. So: a different legal system. In ordinary life, you all know this, right? You walk out and you kill someone walking down the street towards you. The police are gonna come, you're gonna be arrested. You're probably going to go to trial for murder. And if you say, that person was my enemy, that's not gonna help you. In wartime, it's obviously very different: legitimate combatants have what's known as combatant immunity. You get to kill the enemy. You maybe even get a medal if you go kill someone. So the normal rules in peacetime go more or less out the window in wartime, and all kinds of things that we would regard as immoral and illegal in peacetime are acceptable, maybe even praiseworthy, in wartime. When we start defining an armed conflict, the legal term, as something that could be anywhere and that sort of travels based on where the purported enemy travels, then it's hard to know how you tell the difference between when that set of very permissive rules about using force and killing people applies versus when the more rights-respecting set of rules applies. So it's very hard to know how to constrain government power. Even when we say things like drones kill very few civilians: well, it all depends on how you categorize. Who's a civilian?
Once you get even into the law of war, well, it's al-Qaeda and al-Qaeda's associated forces, but what does it mean to be an associated force of al-Qaeda? We're not quite sure. We don't have a clear definition of that. What does it mean to participate in hostilities in the armed conflict against al-Qaeda? We're not quite sure we know about that either. So even with those statistics, it gets very hard to pin down what we mean by any of that. So, one of the people Peter mentioned, Dan Rothenberg, who's our lead partner at Arizona State University on this project, has written a very interesting piece that essentially says that one of the key things that makes a drone different from a cruise missile, as Peter elaborated, is that it's a flying computer. It's a flying data gatherer and processor over a long period of time, and in many ways, the information that the drone gathers can determine whether or not it's gonna strike. In other words, there's something called a signature strike: what you're doing is striking based on a pattern of data that the drone itself has collected. So we could think of that as, okay, that's a case in which a computer attacks an individual. And obviously there is a drone operator on the other end, and there is still human agency, although it's not at all clear that human agency is necessary. It's required now, but it might not be. So I wanna turn this around. If that's a computer attacking a person, we can say that's an attack, as the person dies. We're pretty clear on that. But to my cyber experts, to raise the question that Rosa asked: if we attack a computer, is that an attack? Is that an attack in a military sense? I'll start with Sasha. Ooh, maybe, that's the answer. It's interesting. We're entering into all of these different areas now across society where nothing has caught up to technological reality. So for my birthday two weeks ago, I ended up talking with a guy that does smart grids.
I thought you were gonna say you got a drone, knowing you. I'm not admitting I sound surprised. Well, wait for it. Okay, we started talking about drones, and at some point we were talking about this work. And he was like, oh, I have a drone. I was like, well, I have a drone too. So what did we end up doing? We were flying drones around for my birthday. And anyone with $300 can buy a drone. But of course we're geeks, so then we're thinking, well, how can we make this drone better? So we start talking about battery power, and then we start talking about the lift and how much payload a $300 drone can carry. And it turns out less than a grenade's worth. But there's a $500 drone that anyone in this room can go online right now and buy that could carry something of that size. And to me, that's a reality that's coming now. That's a reality that is here, in fact, now. And it changes dramatically how we think about war space, non-war space. What is a drone? What isn't a drone? What's an attack vector? What is not? And cyberspace is like that, but to the infinite power. Because there's no physicality. It's all bits, not atoms. And that changes things remarkably. So we're right in the midst of a massive data obfuscation arms race. The NSA and... A data obfuscation arms race, okay. Yeah. Right, so when the NSA is trying to discover information, people are trying to hide information. And that may be for completely legitimate purposes, like you don't want your health records out there. You wanna make a purchase from Amazon and you don't want your credit card numbers spewing out over the internet. And some of it's obviously for not-legitimate purposes. And in the 90s, we fought a huge battle: would we be allowed to encrypt our communications? And the NSA came down on the side of we should allow that, we should facilitate that. And now we're facing what we call the going-dark problem, where more and more information is becoming harder and harder to decode.
Because it's much easier to encode something than to decode it. There's an asymmetry there. And so we're headed into an era where, in essence, we're facing a second crypto war. If you can't surveil in the network, because more and more stuff is obfuscated, is encrypted, then you have to move that surveillance to the edges of the network. And so this may be everything from your cell phones being locked down in ways you can't control, but it also means that, because we are a technologically advanced country with lots of devices, I don't know how many devices are in this room right now, a lot, probably hundreds, and because of that, I have hundreds of different attack vectors if I want to surveil this room. I only need to compromise one of those devices to be able to hear what's happening in this space right now. People are clicking them off all over the room. It actually makes me glad we have two lawyers on this panel, because I keep thinking we're gonna get in trouble. Yeah. But this again is this asymmetry: the most technologically advanced spaces on the planet are most susceptible to cyber attack. What are you going to attack in Sudan or Pakistan, where there are no cell phones or no cell phone coverage? It's gonna be very different. Whereas our lifestyles are full of holes, right? If you want to compromise 50 million credit card numbers at Target, you go in through the HVAC system, which is exactly what happened. Your HVAC becomes the lowest common denominator for your security. And in a space where the internet of things is increasing dramatically, where smart devices are everywhere in our lives now, what you end up with is all of these different attack vectors for a variety of different kinds of malfeasance, and no ability to really meaningfully counter that.
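Sasha's point that it's much easier to encode than to decode can be sketched in a few lines. This is a purely illustrative toy (the secret and alphabet are invented for the example): locking a value behind a one-way hash takes a single call, while recovering it without the secret means exhaustively searching the keyspace.

```python
import hashlib
import itertools
import string

# Encoding is cheap: one hash call "locks" the secret away.
secret = b"zq7"
digest = hashlib.sha256(secret).hexdigest()

# Decoding without the secret means brute force: try candidates until
# one matches. Even for a tiny 3-character secret drawn from 36 symbols,
# that is up to 36 + 36**2 + 36**3 = 47,988 hash attempts.
def brute_force(target_hex, alphabet=string.ascii_lowercase + string.digits,
                max_len=3):
    for length in range(1, max_len + 1):
        for combo in itertools.product(alphabet, repeat=length):
            candidate = "".join(combo).encode()
            if hashlib.sha256(candidate).hexdigest() == target_hex:
                return candidate
    return None

print(brute_force(digest))  # b'zq7'
```

Each extra character of secret multiplies the attacker's work by the alphabet size while adding essentially nothing to the encoder's cost, which is the asymmetry driving the "going dark" dynamic described above.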
So aside from the fact that you've just given me a completely hideous view of what used to be kids sailing model sailboats in Central Park, and you're now imagining them with their little model airplanes carrying ordnance. I mean, part of what you're describing is our vulnerability to attack, our exponentially growing vulnerability to attack as we carry around small computers that are all connected to each other. Then the battlefield is everywhere. Then it's not even a question of a sort of non-geographically-defined battlefield; there's simply no difference between life and the battlefield. But before we get all the way there, I wanna come back to when we might call something an attack in the military sense, versus the way, Sasha, you use the term when you say, well, yeah, of course, there's an attack going on all the time; people are probing our cyber defenses, we are probing other people's cyber defenses. But when do we think about that as an attack in the war sense? Tim, I'll come to you. You've thought quite a bit about Stuxnet, right? So, I think Stuxnet from my point of view was an attack, right? In the sense that Iran's our adversary, and we did not want Iran, I'm just quoting what was in the New York Times, I think from Peter Baker, but maybe it was Steve, we didn't want Iran to develop its nuclear weapons, and we did something about it. So that sounds like an attack. But how do we think about, more broadly, whatever China may be doing, whatever other countries may be doing? When is that an attack in the war sense? If I knew the answer, I think I'd have my career made. That's why I get to ask the question. So with regard to Stuxnet: Stuxnet was the malware that was used against the nuclear facility in Iran. And the nuclear facility uses centrifuges, which Senator McCain alluded to earlier today. And those centrifuges run at a certain speed.
And what this malware did was actually change that speed, which left the technicians in the facility quite confused: they didn't know for quite a while whether this was a malfunction or an intentional attack. So first of all, it's important to conceptualize Stuxnet not as an ephemeral attack that occurred on a single day, but as an operation that took months. And the advantage of it was that it confused the technicians: they didn't know whether they had been attacked, and they didn't know how to react to it. So you don't want the attack to seem like an attack? Exactly. And that's one of the unique or new features of conducting military operations through cyberspace. And your question about whether this was an attack or not is currently hotly debated among international lawyers. You have something called the Tallinn Manual, which was developed by 15 legal experts under the auspices of a NATO Centre of Excellence, and it looked into this question. And Stuxnet is interesting because, if you look at the 2011 cyber strategy that the White House put out, the White House actually says that it will treat a cyber attack just like a kinetic attack if it has the same effect. Meaning if it kills people? They're ambiguous: kinetic effect. And so, to give another example, Stuxnet clearly had a physical impact in terms of the centrifuges. It gets even more complicated if we start talking about the financial sector, where we are not talking about the manipulation of data that causes a physical effect you can see in the world, but about the manipulation of financial data, which won't have a physical impact but could have devastating consequences for our national economy. So would that then rise to the level of the use of force? And to go back to Rosa's initial analogy of the line: once you cross the line, your enemy then has the right to respond to whatever action you take.
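The trick Tim describes, pushing the centrifuges out of spec while the operators' screens keep reading normal, can be sketched as a toy control-loop simulation. Everything here is invented for illustration (the nominal frequency, the drift rate, the function names); it is not Stuxnet's actual code, just the shape of a record-then-replay sensor spoof.

```python
import random

random.seed(0)
NOMINAL = 1064.0  # nominal rotor frequency in Hz (illustrative number)

# Phase 1: quietly record a window of normal sensor readings.
recorded = [NOMINAL + random.uniform(-0.5, 0.5) for _ in range(10)]

def plant_step(actual, t, attack_active):
    """One control-loop tick: returns (new actual speed, speed shown to operators)."""
    if attack_active:
        actual *= 1.02                       # push the rotors further out of spec
        shown = recorded[t % len(recorded)]  # replay the recorded "normal" data
    else:
        shown = actual
    return actual, shown

# Phase 2: run the attack for 20 ticks. The real speed drifts far above
# nominal while the operator display keeps showing values near 1064 Hz,
# which is why the technicians couldn't tell malfunction from attack.
actual = NOMINAL
for t in range(20):
    actual, shown = plant_step(actual, t, attack_active=True)

print(round(actual, 1), round(shown, 1))
```

The confusion Tim points to falls out of the gap between `actual` and `shown`: the monitoring channel and the physical process have been decoupled, so the evidence the operators rely on is exactly the thing the attacker controls.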
What we face in cyberspace, most of the activity we see today, actually doesn't cross that line yet. You stay just short of the line and you're already throwing stones, not bombs but stones, and you're having this effect, and you remain below that threshold of the use of force that would trigger all sorts of legal ramifications and also political consequences. And the question is, what do we do about that? And the debate that we've had over the last four years about whether cyber war will take place or not has somewhat diverted attention from what we are actually currently seeing, which is that most of the activity that we are seeing, including the Chinese espionage activity, remains below that threshold that triggers the use of the law of armed conflict. And we don't really have rules and norms yet that govern this space. And we haven't really thought about what norms should apply to this activity that remains below this threshold. And how is cyber applied together with kinetic attacks, the cross-domain type of conflict? So, putting that together with what Sasha said about encrypting and decrypting, does it make sense to think that we're in a kind of cyber cold war? In other words, something that is short of ever breaking into a hot war. The older people in this room are very familiar with living in that state. It was a war; I mean, we were at war with the Soviet Union in the sense that they were our enemy, and we fought hot wars in various places. But what we were mostly doing was simply pushing at each other in various ways. When you talk about this idea that there's an arms race going on in the obfuscation of data, do you think of it as a war in some way? Yeah, I mean, I've used the term cyber siege before.
And I think, you know, things like Heartbleed, this accident, possibly, probably an accident, where many of the servers and computers that do the encryption, and by many I mean millions upon millions of the computers that you trust to secure your communications and secure your credit card and secure your medical records, had a compromise. It was just unknown. That's a problem. And anyone who knew about that could get any data that was on those servers. Millions upon millions upon millions of servers were compromised by this. And we have information that suggests that the NSA was exploiting this for at least the last two years. The question that we don't have an answer to is: were other countries exploiting this? I would argue probably yes. And I would argue that it's not just Heartbleed. There are a lot of things out there where somebody just made a mistake; we're human, we make mistakes. Now, as anyone in this room who's a parent or knows kids knows, if they're in the back seat and they're all like doing this but not touching each other, that can go south very fast. And cyber can do that, but infinitely faster, right? So when we see selloffs on Wall Street and multi-hundred-point plunges in a place that has so many checks and balances, then in the dark recesses of the back spaces of server racks, it's unknown how fast that can escalate. And unlike a Cold War mentality where you've got to move tanks or planes or troops, this has no atoms that need to move from point A to point B. And that makes it meaningfully different. So I just have to say, this conversation actually scares us silly on a routine basis. So this is why we're having drinks afterwards. So Tom, let me turn to you. And I know you're gonna tell us the military is completely ready for this, completely on top of this. We got it, we're set. I'm listening and thinking two questions. Number one, what if we have an industrial-era military in an information age?
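The Heartbleed bug Sasha brings up boils down to a server trusting an attacker-supplied length field. A toy sketch of that over-read (simulated memory and invented names, not OpenSSL's actual code): the handler echoes back however many bytes the client claims to have sent, so a short payload with an inflated length leaks whatever sits next to it in the buffer.

```python
# Toy model of the Heartbleed-style flaw: the handler trusts the
# client's claimed length instead of the payload's real length.
# The "server memory" here is a simulated buffer holding secrets
# adjacent to where the request payload is stored.
server_memory = bytearray(b"PING" + b"|secret-session-key|" + b"user:alice")

def heartbeat(payload: bytes, claimed_len: int) -> bytes:
    buf = bytearray(server_memory)
    buf[:len(payload)] = payload      # store the request payload
    # BUG: no check that claimed_len <= len(payload), so the reply
    # can include bytes the client never sent.
    return bytes(buf[:claimed_len])

# Honest client: asks for exactly what it sent.
print(heartbeat(b"PING", 4))   # b'PING'

# Attacker: sends 4 bytes but claims 30, reading past the payload
# into the adjacent secret data.
print(heartbeat(b"PING", 30))  # leaks the session key
```

The one-line fix is the bounds check the comment flags: refuse (or truncate to) `len(payload)`. That a missing comparison in one library exposed "millions upon millions" of servers is exactly the we're-human-we-make-mistakes point above.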
And second, Rosa and I have talked about this, talked about her experience in the Pentagon on this. What if the U.S. military says, oh, we don't do that stuff? We only do things that look like Patton charging across France in 1944. That other stuff, that's not us. Well, it means the stuff is spilling out of the box. It's being picked up by the CIA, by contractors, by paramilitary organizations. And I think that's one of the fascinating things about this group: trying to sit around the table and say, what happened to the lines? What happened to the rules? For me, when I look at it, I kind of feel like the guy cleaning up at the end of the circus parade, cleaning up after the elephants. And I'm... Meaning your fellow panelists? Exactly. Thanks, Tom. The implications for the U.S. military are across the board: how you recruit, how you train, how you think about equipping your force, what sort of people you even want to lead your force. So for example, do we have generals who are trained to think critically about ambiguous, uncertain, blurred-line situations, who know how to parse through and recognize when somebody is dancing on the edge of war, knowing that there's a certain line they don't want to cross because you will respond, but that you are incapable of responding below that level? I think that's kind of what you're seeing with Putin and Ukraine right now. So if you were to design the U.S. military right now from scratch, for example, you might have an information warfare unit based in Silicon Valley. We don't. You probably would not be spending $15 billion on aircraft carriers, which in today's world can be seen by satellites and hit by long-range missiles. Yeah, we just did spend 15 billion bucks on the Gerald Ford. You would think entirely differently. And I think that's really what this panel, for me, is about: okay, given all these things, how can you force the U.S.
military to start thinking rationally about the world it's living in and not simply building a bunch of today's battleships and being the Royal Navy, which built battleships throughout the 1930s and was almost entirely irrelevant to World War II. So how low could we go in terms of conventional capability and still protect ourselves in the coming decades? So let's imagine we ramp up on information, on data integration of all kinds, all data, mass data, cyber capabilities. We keep lots of special forces. There's no question that we need the ability to go in and out of countries, but to what extent does this say, look, really, any planning for large-scale land warfare is obsolete? Oh, I don't think it's, you always want to be able to plan for it, understand it, to quickly regenerate forces. But when you look at the way the U.S. military is equipping itself these days, there's no way the U.S. Air Force is gonna be able to compete in drones with Amazon and Google. And if you wanna know how to command and control a fleet of five to ten thousand drones, it's gonna be Amazon and Google you look to, not the U.S. Air Force, which is still culturally married to manned aircraft. Now, you're gonna have to militarize these things. But this is why I was saying recently that I don't think Boeing and Lockheed will be relevant in the future. You don't want defense industries. What you want is companies that make civilian goods that you then convert to military goods. You're commandeering the very old... But you let Amazon and Google figure out these things, which they are doing right now. I noticed in an article I read the other day that the F-22, the newest cutting-edge U.S. Air Force aircraft, has a processing capability one one-hundredth of what Google now sells commercially.
So I'm not gonna ask you the political implications of what you just said, sitting here next to Capitol Hill, where, remember, the B-52 bomber has some part of it built in every single congressional district in the country for a very good reason. I'm thinking about undoing all that. But I mean, are you, and I'll put this to the panel as a whole, are you really looking at sort of an informa... We have a military-industrial complex. Are we instead sort of shifting that to a military-information complex? I would say we should be, but we're not. And if you believe in deterring war, you want a military that is relevant and prepared. And increasingly, we have a military that is not, that looks like an industrial-age military in an information-age world. You want people who think of drones as computers strapped to airframes, not as my new airplane. Anne-Marie, if I can jump in here, I think one of the things that we're trying to do in this project is really not just assume that we have to remain forever stuck with the law and the institutions that we've inherited. And for the most part, that's what 99.9% of Washington discussion does, right? And legal discussion. I mean, Tim referred a little while ago to the efforts, the Tallinn, whatever they call it, memorandum, or whatever the standards, what is it? The Tallinn manual. Tallinn manual, thank you. Which was an effort by a lot of very smart international law folks to come up with what is the law affecting cyber attacks. And they did what smart lawyers do, right? And many of you are lawyers, you know what this is.
You reason by analogy, you kind of say, gosh, gee whiz, the law of armed conflict doesn't say anything about cyber, but it's kind of like this, and it's kind of like that, and it looks kind of like this, and so maybe we can kind of pull something together, and so, through a series of analogies, we come up with this set of rules, which, oh, as Sasha then immediately starts telling us, doesn't really help us that much because it's gonna get right back out of the box again, instantly. And similarly, we've got this military, which is organized in various ways that don't necessarily make any sense whatsoever, given the world we now inhabit, the kinds of threats we now have. So I think part of what we hope to do over the next few years with this project, with help from a lot of other people, is really do a bit of a blank slate exercise. Not because we imagine that we can sort of reimpose our own wishes on the world ourselves, but rather at least to think through, if you were starting from scratch, how would you want to organize American society with all of the resources that we have, human and otherwise? How would you organize to address the really blurry kinds of threats that are coming up? Would you do it by saying, oh, here's the military, and they do these things, and here's CIA, and they do these things? Or would you do it completely differently? And what kind of legal constructs would you want that both permit governments the flexibility to respond rapidly to changing threats, but also protect the values that we care about, the rule-of-law norms and human rights values that this country cares about? And I think that it's a fun exercise to do, and it's valuable because even though we don't ever get to have a blank slate, there's no way to evaluate all the little incremental changes that we do get to make and decide which ones we should do and which ones we shouldn't do
if we don't know what, in an ideal world, we'd like the world to look like. And I think that people shy away from that kind of blank slate exercise because they go, oh, don't be silly, you'll never get there. But failing to have that conversation means that, well, how would you decide whether you should do this or that with the military, whether you should categorize this kind of cyber action as a military attack or not? Because we don't know what's at stake until we actually start figuring that stuff out. This kind of goes to something that Sharon Burke has been working on. Who do you think is developing a drone that can fly at 65,000 feet for five years? The US Air Force? Or Google? It's Google. A drone that can sit at 65,000 feet for five years without coming down, pulling in solar power on its wings. But who's going to get there first, Sharon? I'm sorry, it's pretty well advanced. It's already... I'm buying stock in Google. To go back to one of the points that Rosa made, one of the questions we're looking at is the relationship between war and the state, how war shapes the state and how the state shapes war. And in cyber, it's actually a really interesting moment in time right now, where you have the establishment of Cyber Command in 2009. And one of the agencies that is at the center of the debate about cybersecurity, that you've probably heard of since June 5th of last year, is the NSA. The director of the NSA is at the same time the head of Cyber Command. And one of the key debates that was taking place, at least last year, was about whether this dual-headedness should remain in place or not. General Keith Alexander has now stepped down, but he used to have those two hats. And one of the key questions was, does that make sense? Does it make sense to have an agency that was set up to do intelligence and to look at content, but also do the cryptography piece?
Should they also be leading with regard to cybersecurity? And then you can broaden this picture and not just look at the military, but also the civilian part of that, and the debate about DHS. And the debate here is, well, DHS has the authority but not the capacity, and NSA has the capacity but not the authority. And as the problem recognition has increased, more and more responsibilities have shifted to the military side. Is that something that we as a society want? And if not, what should alternative mechanisms look like? And one thing we talk about a lot, and Tom has written about this quite a lot, and I've written about it a little bit, is civilian-military gaps in this country, for all sorts of quite complicated reasons. The average American civilian today has far less direct contact with members of the military than the average American 40, 50 years ago, much less knowledge about the military. As more and more things in the world enter what we are currently conceptualizing as the military domain, as war becomes everything and the military is therefore asked to also become everything and take on everything, cyber, you name it, public health threats, climate change, everything under the sun, I think the costs of that kind of cultural gap get higher and higher. I mean, how many people in this room have any current or prior military experience? There we go. A tiny fraction, and this of course is not unusual, but as we conceptualize more and more as war and ask the military to do more and more, the fact that this group of incredibly smart, interesting people has so little contact with that world, the price of that gets higher in terms of our ability as citizens to effectively weigh in on things that matter enormously. Do you want to? Yeah, just a comment.
During the NSA debate, a question that almost nobody asked was, why is a military intelligence agency basically arrogating to itself the right to listen to all American civilian telephone conversations? It didn't even really come up as an issue, which is strange, and I think maybe 9/11 changed the way we think about these things, but the fact is, as Tim mentioned, the person in charge of this organization is a military officer, and I think we're in a strange place when a military intelligence agency is arrogating to itself the right to listen to civilian, that is, domestic phone calls. Hopefully that is about to change. So I'm going to turn it over to the audience. I'm going to ask one last question, really more about how we should think about this differently. And I'm glad, as you mentioned, we are going to have lots of other help. We have an academic advisory committee. We have members of the military. Our merry band here is not proposing to do all of this ourselves, and it'll be a multi-year project. But I was thinking, when Russia actually attacked, or engineered the annexation of Crimea, one of the things that it did was very classic, right? It cut telephone cables. That's a time-honored part of the military playbook. And it used cyber warfare to actually disconnect the Ukrainian forces in Crimea from their usual command and control. And again, that's one of the first things you go after, command and control. You disconnect your enemy so that they can't communicate. Now put that together for a minute with what Sasha said about the least connected places on earth actually being the safest in some ways, which is a radical shift from the way we think about it. As Sasha looks at it, if we're all carrying these little devices, we're vulnerable. So where do you wanna be? You wanna be in those parts of the earth that are not connected at all. They will be the places that are safest from cyber attacks.
They may also be the places that are gonna be least attractive to live in, because they are disconnected. But point being, it's a very different way of thinking about vulnerability versus security if you're thinking at a population level. I guess I'm just throwing out a thought for reactions: should we think about the disconnecting of an entity as itself an attack? That it is an act of aggression to disconnect an entity from its normal networks, that that could be a kind of attack in itself, or to distort or to use the normal connections that you have for malign purposes. In other words, is there some way that we can use the idea of being connected as now our natural state, because that allows us to have access to the flow of information in an information age, so that the effort to disconnect is itself some kind of signal of military intent, at least done at a certain scale? I mean, obviously if I just disconnect all of New America's entities, that would not qualify. But yeah, I mean, the solution here is a really unpopular thing to say, which is you cannot win the cyber war. We're heading into an era whereby the most technologically advanced societies are the most vulnerable. And we have faced this conundrum before, right? We demilitarized, for the most part, outer space. We decided biological and chemical weapons have no place in civil society. But we are run, the military is run by, a group of people that do not understand the implications of the direction, the trajectory, that we have placed ourselves on. That when we pioneer all these new useful things and we're a few years ahead, right? There's no doubt about it. We have a momentary advantage. We were a few years ahead on nuclear weapons, too. And look how long that lasted. Exactly, exactly. But it's when our monopoly evaporates that we're willing to actually say, well, hey, we need some ground rules here, guys, right?
And this is where we are with both cyber and, I think, drones, which is a monopoly. Somebody else in the world other than Israel and the United States can do a Stuxnet, probably, right? Stuxnet is now a publicly available piece of code. So once that is out of the box, and cyber and drones, or armed drones, are proliferating, it is in our interest to create rules. And this is one of the aims of this project: what are those rules? And I mean, Rosa talks about it; it's in a sense the use of force across state borders, so that's true both of cyber and of drones. That we need to come up with some kind of international norms, or a Geneva Convention, or whatever, that would regularize this. Are you through? Yeah, no, just one. Not forgetting that, as Tom said, this brave new world doesn't make the old nasty world go away either, right? That we will also still have the kinds of threats and conflicts that we have been more familiar with for the past several hundred years. People will still blow other people up, people will still take territory, states will still fight states. So adding to the complexity, we can't just simply say, oh, no problem, we can disband the military as we know it and instead give ranks to everybody who works at Google; that's not gonna work either. So we have to figure out some better way to both create norms that can help us sort this stuff out, but also to organize to respond to these very hybrid kinds of threats that we're facing. In fact, if the first step in future war is to blind the enemy, to destroy his links, then what's gonna be extremely useful is a soldier walking along carrying a weapon, wearing boots, because no links are required for that soldier to operate. But related to that, I mean, the idea that it's parts of the world that are disconnected, I think Afghanistan didn't have a phone system before 9/11 at all. Now a third of Afghans have cell phones. So it's really one of the poorest countries in the world.
It's gonna be very hard to disconnect, even in a country as poor as Afghanistan. You mentioned Pakistan; in Pakistan everybody has a cell phone. So there are vectors everywhere, including places where we wouldn't necessarily assume so. And as I mentioned the other day, I know a guy who commanded a unit in remotest Afghanistan, and when he was leaving the country he got an email from the local Taliban commander bidding him farewell at the end of his tour of duty. The Taliban spokesman subscribes to the South Asia Daily Brief that we put out every day. I mean, so these people are pretty wired in. And they have very effective propaganda. We should never assume that these guys, just because they're poor, are not technically adept. I was really struck at the Anaconda battlefield that the Al-Qaeda command and control bunker at the top of a ridge, this is spring 2002 in Afghanistan, was run by solar power. The US military had no solar power. Al-Qaeda did. Great. Okay, we're gonna open it up. Mike Lind over there. Well, my question is for Anne-Marie and Rosa, but anybody else who wants to respond, because you're the lawyers: don't we already have a system of legal regimes and norms that governs drone strikes? And that's the combination of the concepts of extradition and of piracy. For hundreds of years the law of piracy said if an area was not actively in the control of a particular state, then any navy whose nationals were affected could bombard it. So Galveston under Jean Lafitte could be bombarded by the US Navy. Now that's part of the United States. It's governed by a state. Then you have extradition. And following up on that, isn't there a thing? Say a little more about how extradition operates. Well, so for example, if there's a suspected Al-Qaeda member working out of a hotel in Piccadilly Square in London, we do not send a drone strike. We ask the British to arrest them and interrogate them and extradite them.
And if a suspected Al-Qaeda terrorist is in Moscow and they refuse to extradite them, nevertheless there's a functioning state on that territory. So isn't there a danger that we're extrapolating a bit too much of the future of war from what actually is a fairly temporary experience of jihadist terrorists in a few lawless areas like Afghanistan and Yemen? I'm gonna ask you. No, I don't think so at all. I think that the legal regimes you're talking about are not in fact what the US has been relying on for targeted strikes across borders. You're absolutely right that there is also an international law right to use force in self-defense. It doesn't matter whether you're in an armed conflict, doesn't matter who might be threatening you; if we think that somebody somewhere else is about to launch an attack on the United States that's gonna kill a lot of Americans, we obviously have the right to use force in self-defense. And that's always been true, although historically the standards for using force in self-defense have been pretty tight in terms of what constitutes an imminent threat and so on and so forth. I think that what we've seen change is the shift to the use of an armed conflict legal framework, which, as I said, has a much looser set of rules concerning how, when, and towards whom force may be used. And specifically, and I don't wanna get too lawyerly and in the weeds, the law of armed conflict, the law of war, permits, quote, status-based targeting. You get to target somebody because of who they are and not because of what they're doing. So, e.g., in World War II, you can target somebody because they're a member of the German military. You don't have to wait for them to be running towards you, shooting at you. Whereas in the self-defense world that you're referring to, you do, actually. You do have to have some sort of notion that there is actually an imminent threat and that you've exhausted all other means.
You don't have to do that when you're in an armed conflict. You can use force as your first resort. You can bomb the enemy while they're sleeping. So when the battlefield potentially is anywhere, and when the legal standards governing whom we define as an enemy, whom we define as a civilian, are secret, as they still are, and we may get a little bit more information due to a recent court order, sometime in the next few years after all the redacting is done, that's kind of scary, right? Because that suggests that anybody anywhere, in a process we won't know about, based on legal criteria we won't know about, could be targeted. And it's also, I would add, not an approach to understanding the law of armed conflict that is shared by almost any other state in the world. We're really fairly isolated on this vis-a-vis our major European allies, as well as other states around the world. So we're also seeing kind of a fragmentation of consensus internationally on what the right set of legal norms is, which in itself, I think, creates a whole set of problems for the U.S. Tim, did you wanna say something too? Yeah, there's an interesting geopolitical dimension that comes into this. And I'm speaking now from the cyber perspective, where there has been a big debate in the last two to three years over whether the Geneva Conventions and international humanitarian law apply to cyberspace or not. And you actually have had China resisting the notion that these existing norms apply to cyberspace. So you have a debate where some of the rising powers are starting to challenge, or have challenged, the existing normative framework. And last year, thankfully, they agreed that IHL actually does apply to cyberspace. So you have that on the one hand, the challenge of whether the existing norms apply.
And then with regard to cyber, the additional challenge of not only how do you translate those, but what do you do about the things that are really new, that haven't been covered by existing norms, the sub-threshold activity? Yep, great, yes. Hi, Judson French with the next group. Tom raised the issue about potentially wanting to invest in Google rather than DARPA, as an example. And that raised a thought for me, having been in the defense industry years ago: milspec meant something very important. One of the things that it meant was lack of vulnerability to things that can kill commercial off-the-shelf hardware. I didn't hear anybody mention. Military specification. Military specification. Correct, sorry. Military specification, sorry for the jargon. But I didn't hear anybody mention vulnerability to things like, for example, electromagnetic pulse. Now, where would an electromagnetic pulse weapon fit? It's not really a kinetic weapon. It's not sending out particles, it's really sending out a wave. But it's sending out a wave that can kill every commercially specified computer within some range, which knocks out our capability, back to the case where if you're in a cave with hand signals, you're safer. So I'd kind of like to hear about that. There is a real reason for military development, and I don't think Lockheed and Boeing are going away, because they need to militarize capabilities that Amazon or Google create with great cyber vulnerabilities right now, really, compared to what the military would want to see in our networks and in our cybersecurity. Can you address how that plays out? Yeah, actually Sasha and I have talked a lot about this, about the most vulnerable point in drone networks being the links and how much effort's gonna be required. And that will actually be the space, I think, that the pure defense contractors fill: hardening commercially developed things and also making them stealthier and faster.
And, going to Sharon's point, capable of traveling great distances, 6,000 miles on a small fuel supply. That said, I just think the difference is these things will not originate with defense contractors; instead you're gonna have military suppliers. Amazon and Google are gonna have no interest, a little bit like Microsoft a few years ago, in being purely military suppliers, but they will say, sure, we'll sell you the knowledge of how to operate this network and you can go ahead and mil-spec it. So I think you're gonna see a lot of that stuff, and I just think metal bending, generally, is gonna be much less significant. This is not a 19th-century steel-based military we're talking about. Oh, and I think EMP is the first shot in a real big war. This is electromagnetic pulse, which just zaps out stuff. Well, you know, CENTCOM, I remember years ago, required a conventional capability as one of its requirements for fighting in the Middle East, and I'm quite confident we have an EMP cruise missile. We'll EMP people wherever we have to, and they'll never know. What people? EMP them, electromagnetic pulse, to destroy their computer systems. EMP them, not amp them. Okay. Kati Marton. Hold, wait. An observation: having just recently been in Pakistan, I can tell you that our use of drones is costing us massively on the ground there in terms of popular support, and this, to me, sounded like an extremely clinical conversation, and yet I read that people who are on the other end of drone deployment suffer as much from post-traumatic stress as people who launch any other weapon. So it's not at all a clinical matter. And then finally, a question: are drones of use in penetrating triple-canopy jungles, such as where the Nigerian girls have been abducted, and if so, are we deploying them to find them? I think the answer is yes. We are deploying drones to find them, and I think there will be some utility, but not perfect.
I mean, Bin Laden, the documents we recovered in the Bin Laden compound demonstrated that he wanted to move his group from Pakistan to Kunar, which is a very heavily forested mountainous area that would make it harder for American drones to pick his people off, so it's not a perfect solution. Certainly, with our drone program in Pakistan, we used to be at 20% favorable in Pakistan; we're now at 9% and close to zero in terms of favorable views, and the drones have been a big part of that. That said, the president on May 23rd gave a very important speech at National Defense University saying that he was gonna really start limiting the drone program, and in Pakistan we haven't had a drone strike since the beginning of the year. It has basically stopped, which I think is very smart. I mean, we've run out of targets, let's start there. The Pakistanis are serious about going into North Waziristan, which may be where many of these drone targets are located. And if the price of the successful drone program is that we pissed off 180 million people, that's a pretty high strategic price to be paid. And so there's been a big debate; the State Department pushed back on this, as you know, Kati, and they lost some of the battles, but they won the overall war. But in Yemen, we're seeing civilians still being killed, and that is becoming a political problem in Yemen in the same way that it was in Pakistan. And so just because we can do something doesn't mean we should do it. Yeah, Daniel Rothenberg and I are editing a book together, and we interviewed a drone pilot, and it's much more up close and personal being a drone pilot, following somebody maybe for months and then pulling the trigger and killing him and then going home to your family. So the whole idea that the drone is very autonomous and remote... and Tom, correct me if I'm wrong here, I think in the kind of control chain of a drone, you've got like a hundred-plus people.
There are a lot more people who are part of a drone attack than, let's say, an F-16 just dropping a bomb. So there are many more people. And Mike Waltz is somebody who has used drones in the field. And he's got the next question. But the point is, you're actually right, Kati, it does affect the people who are deploying these. But this is why I find Rosa's point interesting: what if it turns out that autonomous drones are better at killing and making these decisions than we are, and they don't get PTSD? I think they will be better, but I think the US position right now is very, very clear that we are not going down that route for the moment. I think eventually we will, because the slow point in the machine will be the human in the loop. And there's no way that... And the latency of communications. Yeah, and the human being will slow things down. So if it's facing an autonomously run machine, it'll always lose. You know, just the human tendency: you're young and you look around and you think, I don't think I know what I'm doing, but I bet the grown-ups somewhere know what they're doing, and then as you get older and older you realize that... There are no grown-ups. There are no grown-ups. There are no grown-ups. I think that the purported grown-ups who, you know, run the city, run this country, we have let ourselves fall victim to what we all sort of on some level know we shouldn't, which is, because our lawyers tell us that we legally could do something based on our interpretations, because our technologies enable us to do something, we start doing it without giving a whole heck of a lot of thought to, wait a second, five years down the road, what's this gonna mean? What's the human cost? What's the strategic cost?
And I think that one very good thing that has happened in the last few years is that we are not the only voices saying, whoa, you know, we need to really think about this. Is this good? Is this consistent with our values? Is this consistent with our long-term security? Strategically speaking, a largely covert war that has killed several thousand people based on legal criteria we don't know about, is that where we wanna go? This blurring of military and civilian, what do we do about that? So the issues you raise are enormous and very, very troubling. And I think only in the last few years has this really begun to collectively sink in for people: like, wait a second, we need to think about this a little harder. Yes? Mike Waltz with New America. Just a quick comment and then a question. You know, the comment, which Peter you just alluded to, would be to disabuse this notion that there are autonomous drones out there making kill decisions, because I can assure you there are not. And to emphasize the point Peter just made, there are literally squads of lawyers on the other end. There are humans on the other ends of these machines. I'm delighted to see that. That is actually an overlapping set. Yeah, taking those decisions with commanders, who are humans, who are held responsible for those decisions. And it's not the pilot, it's a series of folks. The question is, we've talked today about drones in particular and in isolation. So what I'd like to ask the panel: if not drones, then what? When you're looking from a policy standpoint at your policy options, we've talked a lot about the lives that drones take, but I'd like you to address the argument that drones save lives as well, and have saved lives. And are we willing to truly explore boots on the ground in Pakistan or in Yemen or in other places?
And if we want to address the opportunity cost of PTSD in a pilot at Nellis versus the infantry officer on the ground, and the strategic consequences, as we worked through that in the Bush administration and then the Obama administration once they were faced with it. And before you all answer, I do think it's important to remember, you know, we're roughly 70 years away from the firebombing of Tokyo or Dresden, weapons that had no discrimination whatsoever. And I do think that's very important. In part, Peter, you were saying that drones are popular with the American people. They are; they're cheaper, they're not boots on the ground. But I think part of what is going on also is an understanding that the closer you can get to the individualization of warfare, in the sense that you're trying to get only those people who are actually responsible rather than the blunt-trauma, blunt-force weapons, look at Syria right now and think about if you could go after the individuals rather than destroying the country. That is a very important part of the moral calculus here. Yeah, and I mean, the civilian casualty rate in our drone strikes is close to zero, and that is just a fact. And so there is a moral case for drones, and that kind of sometimes gets lost in the politicized discussion of them. Yeah, and I don't think any of us here would say drones are bad. They're a way of delivering ordnance. There are lots of ways to do that. You're just as dead if you have a bomb dropped on you from a manned aircraft or you're shot by infantry, just as dead. To me, there's a different question that's the right question to ask. And it's a policy question. It's not a question about one technology versus another technology. You know, the policy and strategy question is, does it make sense to conceptualize this geographically diffuse network of not very pleasant groups with some link to al-Qaeda as the enemy in a war? Does that make sense?
What set of actions does that lead us to think makes sense to undertake to combat them? Because obviously there are a lot of different ways you can address threats from organizations that may use terrorism. Many of them are non-kinetic; some of them are kinetic. And there's a broader set of questions about how we get the right balance. I don't actually think there is a single right balance, needless to say. How you deal with Boko Haram, how you deal with al-Shabaab, and how you deal with al-Qaeda in the Arabian Peninsula: these are totally different organizations that present totally different kinds of challenges and threaten different groups. And the right response is going to vary. So indeed, I think that part of the real contribution Peter's work has made has been to try to really disaggregate, and I know this is the focus of the book that Peter and Daniel have coming out: what is different about drones and what's not? What is really a broader issue about how we're approaching certain kinds of threats?

On drones versus not drones, I quote one of my favorite generals, William Sherman, in the letter he wrote to the people of Atlanta: war is cruelty, and you cannot refine it. It's just what sort of cruelty you choose to use. What I worry about with drones is when they begin to be used against the American people. I think about the deep, profound, and indeed irrational anger that the American people will feel when some group of bad guys gets one of Sasha's $500 drones, hangs a few sticks of dynamite on it, and starts using Google Maps to fly into the houses of US officials, which I think is inevitable. And I think the American people are gonna say, why didn't anybody tell us this was gonna happen? And I think the second we started doing drone strikes in Pakistan, it became inevitable. Tim.
Your argument actually has an analogy to cyber, and I think it's one of the most fascinating questions. You had Human Rights Watch come out last year with a campaign against drones, and in many ways it mirrors the debate between pacifists and those, like the Red Cross people, who are trying to make conflict less violent. And there's a similar argument for cyber: you could make the argument that it costs fewer lives and therefore it should actually be used, and legally, is there actually a duty to use drones or cyber, if it costs fewer lives, if you have that humanitarian impetus? But I think that's where it gets to a broader question: are you only focusing on bodily harm, or is the fear you cause in the people who are worried about being targeted by the drones something else to worry about, or are the side effects of a cyber attack something else to be taken into account? That's something we encountered in some of the first articles we published, in terms of Thomas Rid's argument about cyber war not taking place, because he focuses only on bodily harms. And one of the questions I think we as a project have to look into is how much we want to broaden this to go beyond just the bodily.

So I wanna take a crack at this question too, though. I see drones a different way, and again we keep referring to Dan and Peter's book coming out on drones, but Dan's argument about drones convinced me that what is different about drones is that they are the first sign that the edge in warfare going forward will be the information edge. So information has obviously always been critical to warfare. We have military intelligence; without intelligence, you don't know what's going on in the battlefield. That's always been important, but it hasn't been more important than having 100 tanks to somebody else's 10, right? So the balance between your intelligence edge and your ordnance edge, or how big your army is or how advanced your weapons are, that was one factor.
What I think drones are announcing is that, effectively, the country that is the best at gathering information, processing it, integrating it, and using it instantly is the country that is going to have the military edge. But then, as Sasha said, that's just the next arms race, right? That's where the arms race is. So with drones, as you said to me, it's not about the actual delivery of the ordnance. It's the fact that the computer on board is the primary mechanism, I think, of warfare going forward. That is one way to think about it.

That's all true, with one important caveat. Where we're using the drones, they aren't contested; Pakistani F-16s could shoot down our drones if they decided they wanted to go to war with us. So I mean, drones work pretty well in an environment where people don't have effective anti-aircraft defenses. And that's where we're using them; Yemen has no air force to really speak of. So I think the general principle is true, but there's an important caveat. If we start fighting the Chinese, it's kind of a different matter.

But I don't want Sasha's point to get lost, though, right? Part of what we're talking about, whether we talk about cyber or we talk about drones, is the democratization of the ability to kill, not necessarily 10,000 people, but the ability to kill five people or 10 people or 50 people, which now any teenager, right? I mean, don't tell your kids this, right? Any teenager can spend a few hundred dollars to get a drone, a few hundred more to create a basic weapon. And our air defenses aren't designed to find a kid flying a drone over the house of his least favorite public official, or the girl who wouldn't go to the prom with him, or whatever it may be.
And yet we have seen that that kind of relatively low-tech threat, whether it's IEDs in Afghanistan or the Boston Marathon bombers, can impose enormous costs on us as a nation: politically, culturally, from a rights perspective, in terms of the money we then spend on a somewhat fruitless effort, in some sense what Sasha tells us is a doomed effort, to protect ourselves from that kind of threat. And that, I think, is part of what really requires us to rethink: how do we think about these threats? How do we respond in a world where we cannot fully protect ourselves from a range of attacks?

Yes, Suzanne. Then Jim. Thank you all. Fascinating discussion and a fascinating project. My question is: given all that you've said, what are the implications for the future of good old-fashioned diplomacy? If Richard Holbrooke were here today, how would he use this? And Anne-Marie, it's gotten me thinking maybe we should launch a parallel future of diplomacy project at New America.

Stay tuned for next year. That's your question. Yeah, I know, I'm thinking. Well, I guess part of what I think, and you are all hearing an early phase of this project, we've been at this for four or five months, just thinking, setting it up, but one of the things you're already hearing, and Sasha put it on the table immediately, is that there are some real no-go areas here, that actually we don't wanna go down these roads. And we are capable of saying no, generally when we realize something's gonna be used against us that we've been using against other people, and that moment is rapidly approaching. So this is the right moment to think about the kinds of laws that say no, we do not choose to have everybody develop these weapons, as we have not with chemical and biological weapons.
And that really has worked, in the sense that chemical weapons, as we saw, really are a taboo, as is the nuclear taboo itself: the idea that there are things too terrible, where we can see that we are going to destroy ourselves. But for right now, this doesn't look that bad; it looks like it has all sorts of advantages. And then sorting out which of these weapons you would wanna use and which not, which is a way of saying there are two uses for diplomacy. One is, let's say we developed the perfect code, the perfect sequel to the Geneva Conventions we have on the books now. Getting those negotiated, even thinking about how to start getting them negotiated: when we've had these debates before about amending or stretching the Geneva Conventions in any way, the ICRC, the International Committee of the Red Cross, which oversees the Geneva Conventions, wants no part of it, because they're afraid that if we open up those negotiations, we will backslide. So even just thinking about how you get countries, and of course now it's 190 countries, or say even 100 that would be affected, to think about this in a world in which the last time we had a major treaty was almost 15 years ago, that is a huge diplomatic issue. But I think the other import of your question is that in many of these cases we are gonna decide that the costs of war are just too high, which means then we have to develop our other tools.

Jim Fallows, I'm gonna give you what may be the last question. We'll see how long the answer takes. I'll try to make the question quick; I just got the five-minute signal. This is mainly for Tom Ricks and Rosa Brooks. Tom has written over the decades about the increasing separation between the military and civilian society. Rosa was talking about it now. I've written about this for a long time. The question is, if we accept this as a phenomenon, therefore what? Therefore the military should do something differently? Therefore civilian society should do something differently?
Therefore nothing can be done differently and there'll be X consequences? What's the next stage, from your tools perspective?

My short answer is: how much do you want your generals to look like George Patton, and how much should they really be like Steve Jobs? And that tells me that we have a huge problem with our military, which thinks that's an absurd question. Of course you want George Patton, is their answer. And you have a military increasingly located on bases in the remote South; when the Air Force opened up an information warfare unit, it put it at Shaw Air Force Base, South Carolina, as far as you could possibly be from any computer company in America. And you really do want to rethink this approach. For example, West Point in the 19th century was the nation's premier civil engineering school. And railroad companies, the high-tech companies of the time, snapped up West Point graduates because they understood the technology and how to run large organizations. These guys all went off, like McClellan, to be railroad executives. Nobody from Silicon Valley is looking to snap up West Point graduates these days, because they're still civil engineers: they're great at building bridges and railroads. If you were to design West Point today, it would almost certainly be a computer science school. And these basic, primary questions about the US military are not being addressed. And it actually really scares me, because you wind up with a big, vulnerable military that is rather useless. And that is what leads other countries to think they can take us on militarily.

Jim, I think that there are two ways to look at your question. There's a sort of legal and rights and values piece of it, and then there's an institutional consequences piece of it. Tom quoted General Sherman; I'll quote the Lieber Code, the norms governing the Union Army during the Civil War.
The Lieber Code said, among other things, that war is the exception, peace is the norm; the goal of all war is to return to a state of peace. And I think what all of us are saying is that the idea that we're ever again going to be able to distinguish that sharply between war and peace is probably misplaced. But if we can't distinguish that sharply between war and peace, the consequences for law right now are bad, because then we enter into a world in which there are far fewer constraints on state power and far fewer protections for individual rights, until we can come up with some new set of norms to govern this probably permanent state of not-quite-war, not-quite-peace, a gray area.

The institutional consequences, I think, favor blurring. You know, I've always loved those World War II movies where you see some guy sitting in New York, and he's a stockbroker or he's a bridge builder or he's a lawyer, and along comes the military and says, we're going to give you a rank and send you to Europe for this special mission. We don't really do that anymore, right? We have a military recruitment and personnel system and basing system, as Tom mentioned, which creates these silos between the military and everybody else. The military doesn't have much ability to say, gosh, we want to send some of our brightest young majors out to work at Google for three years and then have them come back; nor can somebody from Google say, gosh, I'd like to go be in the military for four years and then come out. Institutionally, we're going to have to find ways to blur a little bit more if we want to effectively combat this threat, even as, legally, we need to figure out some better way to protect rights in that world.

I'm going to let Sasha have what will probably be the last word.
I realize, Rosa, as you were talking, that I forgot to tell the audience that, in addition to the great distinction of having been a New America fellow and being at Georgetown, Rosa also worked in the policy planning office in the Pentagon for the first couple of years of the Obama administration. Sasha.

Yeah, so for most people, tech is magic. Right, so if I tell the joke, there are 10 kinds of people in this world, those who understand binary and those who don't, there are a few people who get that, and almost none of them are in the military. People look at it as magic, right? So the best firewall in the world isn't going to stop a tank, but you could hack the fuel injector, except that may make every internet-connected car in your country vulnerable. And so understanding the implications of cyber in particular, but more generally the balancing of the Faustian bargains that technology has made available and probable and likely, that's the core challenge we're facing now in this future of war.

That is a great note on which to end. I hope you'll agree that we've got a fabulous team engaged in what is a hugely important conversation, and join me in thanking the panel.