So, a couple years ago, I wrote a book about a killer application being introduced into warfare, and killer application in both meanings of the term. It's amazing how far we've gone with what was once a science fiction technology of robotics, unmanned systems, remotely piloted aircraft, drones, whatever you want to call them. We went into Afghanistan with only a handful of them in the air, none of them armed, and zero on the ground. The U.S. military inventory has now gone up to over 7,000 in the air and over 12,000 on the ground. And in the Air Force, we are now training more remotely piloted systems operators than we are bomber pilots or fighter pilots. But perhaps the, and I should note, Secretary Carter has said he doesn't like PowerPoint, so this for the record is a Prezi. The biggest ripple effect of the robot, I would argue, is how it's reshaping the narrative of this most important realm, war. It's reordering how we conceptualize it, how we talk about it, how we report it. Think about the discussion over drone wars, which is the topic of the book that Peter and Dan recently released. Part of the challenge of talking about drone wars is that we're often talking about two very different things. One is the overt use of the technology, like here, on battlefields in places like Iraq and Afghanistan. Last year, for example, the Pentagon flew some 700,000 flight hours of unmanned systems. The other is what I call the not-so-covert use of the technology in counterterrorism. The New America databases show that we've carried out at least 396 drone strikes in Pakistan and another 105 in Yemen. That by some measures is a war or not, however you want to discuss it. But that's where we are right now. What looms with the technology itself? Another way to think about it is that the Predators, the Reapers, are the Model T Fords. They are the Wright Brothers' Flyers. Even the way we describe them, "unmanned," echoes "horseless carriage." What comes next? 
And as you can see from these visuals here, first is the shift in the forms of them, the sizes of them, until we get truly creative, and maybe creative by copycatting from nature. That's a drone. The real shift that matters is in their intelligence and their autonomy. This is an X-47 UCAS taking on perhaps the toughest human pilot challenge of all, as Missy can attest, taking off and landing from an aircraft carrier. It's now carrying out tests on doing air-to-air refueling. The British are testing a similar one that is going to do target selection on its own. We're not in the world of the Terminator, but we are seeing a move from planes that were completely remotely operated, to ones that are managed with the human being "in the loop" of decision, to, as an Air Force report described it, "on the loop" of decision, to, as the head of the British Royal Air Force predicted, being out of the loop of decision at some point in his lifetime. And he's not that young a guy. But one of the things that I would end with is going back to those discussions about design shift, and thinking about not just big systems like these jet aircraft, but how you combine the design shift with the autonomy side. Smaller systems able to do more and more. And more and more doesn't just mean tricks like this. That's the real impact of autonomy and intelligence. It's what it does to the user base and the uses of the technology itself. Think about the parallel with computers. You once had to learn a language just to communicate with your computer. Remember DOS, Fortran, those sorts of things. My two-year-old was upset yesterday because his favorite TV show that he had downloaded onto his iPad had stopped because he'd moved out of wireless range. Okay, that's where we're at right now. Same thing with the technology of drones. You once had to be a rated pilot to be able to fly something like the Predator; now we are seeing iPhone apps being designed for use with drones. 
And in ending, I'll use my own experience in the military-entertainment complex to illustrate this. I was a consultant on a video game called Call of Duty. And we wanted to conceptualize what would be the kinds of technologies that would be used on a battlefield in the late 2020s. And we asked ourselves, what would soldiers want? We went out and interviewed them, and they wanted something small, easy to operate, easy to delegate to, able to operate in urban canyons. And so we combined two technologies that exist right now: a quadcopter drone that you can operate with a tablet strapped to your wrist, that's fairly autonomous, and an Uzi. You can play it in the game. Now, what's interesting is we filmed a commercial for the game and made a real-world working version of it. This is Charlene. What's fascinating about Charlene is it's not just a working version that they were able to build for a couple thousand dollars. But then amongst the viewers of this commercial were people in the Pentagon, who got kind of upset that this, what they thought was a Russian, it's really an actor, had a better small-unit tactical robot than the US military has right now. And so we are now seeing defense contractors try and build Charlene. So welcome to the future of war. Just as H.G. Wells' story "The Land Ironclads" inspired the tank and his story "The World Set Free" inspired the atomic bomb, it's the same for our science fiction today, just via video games. Thank you, Peter. So, to start with you, Admiral Rondeau. So the question is, are we going to be relying upon autonomous weapons? Of course, the answer is yes. The issue that I'm looking at, and that I think is important if you look at Peter's work and the other work that is being done, is that we talk about artificial intelligence, and it is really important work, and that is the work of the future. But there is a hand-in-glove piece here about the human intelligence part. 
It's part of the strategic envisioning of what autonomous weapons really are all about. There was a Japanese samurai warrior by the name of Miyamoto Musashi who said you should not have a favorite weapon. That was in the 17th century. And the point that I would make about autonomous weapons is that it is not about the weaponry nearly as much as the intelligence of what it does for the human being inside that loop. And so there are a lot of things that go on here. But from the point of view of what this means for the individual warfighter: what you had there, Peter, with the actor, is the warfighter of the future truly part of the military? Or is there something that happens in societies where we have more of a warrior society, that is intelligent, that does future wars, as Anne-Marie had talked about and as Dr. Crow had talked about? So what it means for us in terms of governance, in terms of education and how you develop the human mind, shows in the fact that the midshipmen here and the cadets here are thinking in a more nonlinear way than the linear planners of Congress or the Pentagon. And that's how you get to different kinds of solutions. So very quickly I would submit that while autonomous weapons is the question of the day, autonomous intelligence, and what human beings do in the governance of societies and by individuals, is equally as important a part of this. And if we do not think strategically about that, then we will be behind the power curve and we will be victims of terminators in the future. And so the darkness or the lightness of the future in this respect is about how we control ourselves and impose protocols to make it a better kind of solution rather than a dystopic future. A dystopic future. Werner? Well, let me try to separate the question of the technology from how we employ it, because those are two completely different things. 
What you saw in the videos, the kind of technology to enable small, simple, but very sophisticated autonomous systems, that technology has been around for a long time. And there's nothing that prevents us today from making a fully autonomous system that you could send out and that would, to some degree of ability, be able to select its own targets and strike those. It's a completely different question how you would leverage that technology in a real warfighting operation. And here I think it's important to understand that the single biggest advantage that the US military has developed in terms of a capability is a highly integrated warfighting operations capability. We don't send individual platforms out to go strike things. We have, and this is something that distinguishes the US warfighting capability from that of really any other country as a matter of degree, we very tightly integrate everything from, on the very front end, information collection, the intelligence, surveillance, and reconnaissance, through the space layer, the aerial layer, the cyber domain. We integrate that through a whole decision chain into the idea of mission packages, which then go out and conduct an operation. They don't work individually. They work with all sorts of electronic support, electronic warfare, and other supporting assets. So the idea that a single platform, no matter how much smarts are built into that platform, is going to be kind of our new paradigm for how we conduct strikes, I don't think that's going to happen, because it throws away one of the largest advantages that we've developed over time, and that is that very tightly integrated ability to conduct warfighting operations. I'll put forward just two other real quick points. 
One being that if you think about the integrated warfighting capability as being distinctive as a matter of degree in the American military, many of our potential adversaries don't have such capabilities, and I think they're much more likely to turn to the use of relatively inexpensive autonomous systems long before the US military will. And we could well find ourselves in a situation where there is, in effect, pressure from the public on the military, a public that is becoming increasingly comfortable with autonomous systems that they're finding throughout their daily lives, in effect demanding that their military actually match the kind of capability that what we'll call, loosely, low-end adversaries might be bringing to the fight. So I think it's a much murkier picture of exactly how these autonomous systems are going to fold into our own warfighting in the near and midterm, and, in the long term, where the pressures on us will come from. Missy? So despite my girl status here, I'm actually the nerd in the room, the nerd on the panel. I work on robotic systems every day, and we are just simply not in a place where the technology has gotten to the point where we're gonna have truly killer weapons. Let's, for example, take a humanoid robot. Humanoid robots are not gonna become a reality on the battlefield for the foreseeable future because of battery problems, energy problems. We simply cannot build one without a battery the size of a tank to go along with it. There are some other problems with proprioception and of course the whole perception issue: how do robots see? That's a very similar problem to the one we have in drones, where you have an operator looking from 4,000 miles away. We have to have that operator make a decision about whether or not to launch a weapon, because of the limits of artificial intelligence, which is actually kind of a bad phrase to say in academia right now. Everybody has given that term up because it's got such negative connotations. 
Now the new buzzword is machine learning. It means the same thing. And all that actually means is pattern recognition, and in the state of the world right now, it's still in its infancy. We definitely do not have the capability to have a robot be able to reason in all kinds of areas of uncertainty, which is really what you're talking about in warfare: to be able to clearly select a target, know that that target is bad, and then launch at that target. And I think that that lack of capability has helped inform the DoD 3000.09 policy. That being said, we're working towards it, we being the broader research community. So I'm not ruling it out that that will become a possibility one day, but we aren't even close. And in a technology more near and dear to your heart, these same problems show up for driverless cars. And so we're gonna have lots of issues fielding everyday technologies as well as military technologies. But I wanted to kind of close that loop about the everyday robotic technologies, because both Peter and Werner said something I think is really critical. And what they're alluding to is the capability of the US when it comes to robotic technology. I've been whining about this for years now to the US government. We have, I believe, basically lost the cutting edge in terms of technological development in this area. I work with roboticists around the world. We are not the leaders in development of drone technology. The Israelis are, and they have been for a long time. We're not the leaders in robotic technology. The Swiss are putting out some amazing capabilities with drones and also robots. And so one of the things that I see on the horizon is that there's such a low barrier to entry into robotics technology, and even small drone technology, that we're starting to see groups of people worldwide gain capabilities. Peter's not the only one with a drone that is more capable than the US military's. 
My students can, over a weekend, build a drone that's more capable than many US military personnel have access to, or than is even in the development pipeline. I'm not gonna name the company, but there was a company that was looking at purchasing a patent that I developed. And in the end, they decided not to, because it was a patent for small drone control technology, and they decided that they would prefer to kill it, to bury the technology, because they couldn't make enough profit per unit selling it back to the US government. They couldn't sell it for enough money. And with that kind of mentality in the industry, they're ignoring small drone technology, and thus you are starting to see the upsurge of these technologies around the world. And it's true that not everybody is developing a weapon, but on the artificial intelligence, the machine learning, I can promise you that Amazon and Google are far outpacing the US government in terms of capability, in terms of artificial intelligence and the smarts that are gonna be on these vehicles. And so that actually brings up a whole new thought: what is the commercial marketplace gonna look like for robotic technologies in the future? We're not gonna be number one. Thank you. A question for all of you. Why does it matter? I mean, it's the position of the US Air Force, as I understand it, that there will always be a human in the kill chain. That's the position right now, right? Of the US government, the US government in general. Why does it matter that we would have a human in the kill chain? Well, I think Missy's comment points this out correctly. The technology, I think what she's saying, is the technology exists, but not to the level of certainty at which the public is willing to accept the likelihood of a false strike, for example, or a false target identification. So this gets into this asymmetry where we culturally in the US have a very strong resistance to incurring collateral damage or making a false strike. 
But machines are more capable of discriminating than human beings, right? I would not say they are, and Missy correctly points out they're not able to reason as confidently as we believe humans are. But they're less prone to mistakes, I guess. I would not agree with that either. I definitely don't agree with that. This is where context matters, right? So, less prone to mistakes. One of the reasons why Watson was created, the supercomputer, or the AI machine, however we want to frame it, was not to make money off of Jeopardy. It was to be applied to the field of medicine, because doctors actually make mistakes in diagnosis, actually more than half the time. And so if you can just beat that, you're fine. Same thing when we talk about the kill chain role. Context matters. We keep talking about this in the context of either, as Missy laid out, Terminator-style ground warfare, or, as you laid out, a drone strike, i.e., a kind of airstrike where there's time. Well, let's be clear here. You have contexts where it's already largely autonomous: cyber conflict would be one, air defense would be another. Actually, I'll set that aside. Another is future contexts. Undersea warfare is something where I think we'll see more autonomy, more autonomous warfare, than, say, urban combat. And one reason, as you laid out, is because it's basically a game of first identifying and then doing pattern recognition based off of the sonar waves. And you set aside the real challenge here, which is kind of a legal, ethical challenge: in undersea warfare, you're less likely to have civilian casualties, because there are no submarine cruise ships. Finally, context matters in terms of where you are in the war. So for essentially several decades, the United States government said unrestricted submarine warfare is a horrible, illegal thing to do. And in fact, that's why we entered World War I. December 8th, 1941, the message goes out to the U.S. 
submarine fleet: start unrestricted submarine warfare. And no one says, my God, that's illegal. We did it because we were on the losing end. So context matters. There's another piece here, too, which is that in the history of weapons, distance has always been part of the physics of war, and then it gets equated to the moral and ethical code of the warrior. So you go back to the Spartans, who would not use long-distance weapons because they thought it was immoral. Even in autonomous weapons, it's about the distance from the delivery system, and not so much the distance from the decision to go. So the decision cycle is still within human relevance. The ability to analyze, let's take Watson. The issue with Watson is that you have to insert billions of data points, and the more you insert, the more accurate it becomes. The less you enter, the less accurate it gets. What it is doing is analyzing the volume of data in front of you. So what it is doing is a mathematical phenomenon of analysis that completely escapes the human mind. It is not making a decision. It is saying, here's what I have for you, and here are the options that you have in front of you; now you decide. And so it's about the context, to get back to Peter's point. It's also about the distance to the decision, and the amount of data in volume you are gonna have at that point of decision. So you still have the human in the loop. To Peter's point, though, I would submit that there is a context where you can say that autonomous systems will make you more likely to have better information. Arguably, you could say that in 1988, when the United States Navy, from the USS Vincennes, shot down the Iranian airliner, it was because a human being was reading the data and was trying to understand what the data was giving him. And the human being made a decision based upon the data in front of him. 
And as Peter and I were talking about, this is like following your GPS even when you know it's taking you on a different route than the best route. You read the data and you say, this is what I have in front of me. Autonomous systems, to get to Werner's point, that are a little bit smart can help you to understand what's in front of you, but the decision is still gonna be the human at the point of delivery of the weapon. As these, I mean, I think we all agree then that the future is robotics, they're here, et cetera. What does that do to the warrior ethos in the US military? I mean, there was a discussion about awarding drone pilots combat badges, which was much criticized. How does that change being a warrior? Missy? Well, it's kind of not fair to ask me, because I already defected to the drone technology. But I do see it quite a bit. I work a lot with military pilots, and UAV drone pilots are considered to be at the bottom of the barrel, but that's shifted over the years. I've been doing this for 10 years, and they were a leper colony 10 years ago. They're just mildly annoying now. So I think that there's definitely been an improvement, and it speaks to the generational change. For a long time, back in the 60s, the Air Force was run by the bomber mafia. That's what they'll tell you. And then it shifted to the fighter pilot mafia, and now it's interesting, because those people who are actually getting more time in combat are the drone pilots. But it's still relatively new, and those people are still coming up through the ranks, and there is still a very significant morale problem. I work a lot at the Air Force Academy, and I think that's probably your best place to take a pulse check for the future, and they still look at drones as the worst possible assignment, the 10th circle of hell. And until that shifts, we're gonna have a harder time getting the rest of the services to shift. 
So for the Marines here and the ground combat soldiers here and anybody who's been in ground combat, how they view the warrior ethos is gonna be, again, about the intimacy and the distance from the decision point. They're gonna have a view of this that none of us up here is gonna have. So, respecting that, warrior ethos is a lot about, and there are lots of different groups that can adapt it and adopt it, it's about looking out for your shipmate or your combat buddy. It's about group above self, team above self, mission above self. It's about identity and those kinds of things. That's possible, and you do it at a higher level, but you gotta work at it, and it's not gonna be automatic. And so if we tie the warrior ethos to the weapon system, you're not gonna get there. If you tie the warrior ethos to the war in front of you, the combat in front of you, and to what's happening to everything around you, there is a tremendous psychological piece to this that, again, is about the human being. So I think it's less about the weapon system than it is about how you develop people and their character and the ethos, and how they think about self versus other. And that's certainly part of the phenomenon. Peter? There's three quick points I would make. One is that technology has always changed our sense of this warrior ethos. So we can illustrate it with Mel Gibson movies. At one point in history, the ultimate warrior was the one who was most ferocious leading the charge, a la Braveheart. Then we get a weapon that allows you to kill at a distance, the musket. And then the definition of the ultimate warrior, so to speak, in the 1600s, 1700s, 1800s, is the person that can stand there in a line, expose themselves to danger, and not run away, the Patriot movie. Then you get the machine gun, and it changes both those old definitions of warrior ethos into insanity, Gallipoli. And we see this playing out again today. 
We shouldn't think that the warrior ethos is something that's enduring and has always been the same. Technology shifts it. Second real quick point is there's definitely this challenge within the military now of two different cultures coming together. It's particularly playing out in the Air Force. And it's not just an issue of receiving potential medals; it's affecting everything that matters, starting with pay. They get paid less. They also, because of the way we've worked the personnel system, we've actually overworked them. There's a greater demand than there is personnel. And so we're burning people out and actually not giving them the professional opportunities to advance. So their promotion rates are not as high. And so you have this strange phenomenon where, for example, at least last year, you were more likely to be promoted if you were anything from a fighter pilot to a meteorologist in the Air Force than if you were a UAS remote pilot and drone operator, even though you were the one most in demand and the like. And I would end on the idea of, it's at the academy. The real question for the future of war is, who are the equivalents of the young George Patton of 100 years ago, one of America's greatest horsemen, who represented us at the 1912 Olympics, but still decided to join the armor corps? Who are the equivalents of that in the academies today? And I think we'll touch on that a lot in the discussions over the next several days here. But I would say that historical review points out that the warrior ethos has to be separated from the tools that the warrior at that time has available to them. A warfighter of the future is going to increasingly have available and make use of systems that have some level of autonomy in them. That doesn't in any way remove the warfighter from the war. This is simply a tool they're using. So I think the warrior ethos survives. 
It is, as Ann points out, something that is philosophical and separate from the particular tools that they have available. Speaking of the tools that are available today, New America is launching a new website about, basically, drone proliferation. And we found that there are 85 countries with drones of various capacities. And of course, three countries have used them in combat. China certainly has a capability, Russia, and Europe. You mean military drones. Yeah, yeah. And so we just heard the announcement that the United States is about to sell armed drones to countries that play by the rules. And that seems to be freighted with a lot of possible unintended consequences. For instance, allies like Saudi Arabia might play by the rules in some people's minds, but maybe they wouldn't, or maybe these weapons will fall into enemy hands, as happened with weapons in Iraq that are now being used by ISIS. So is this a good idea or a bad idea? Or has the train left the station and other countries are doing it? I think it's a reflection of the fact that an MQ-9 today is a very capable system, but it's yesterday's system. And so we're willing to offer that up for sale. It is a valuable tool, but it is not one that our future warfighting is gonna be defined around. The autonomous systems that we will leverage 10 and 15 years from now are gonna look much different and be much more capable than what an MQ-9 looks like. So you're saying it's a good thing that we're about to start? No, I'm saying it's a natural progression in the evolution of that technology and the way we've made use of it. We're now moving on to looking at another generation of systems, and this existing generation is being offered for military sales. But I wanna jump in there and say, look, the barrier to entry for building a drone is incredibly low. And that's why I wanna make the distinction between military and civilian; I would say all countries in some capacity have civilian drones at this point in time. 
And the line between what is military and what is civilian, what is a toy and what is not a toy, is very blurry. You can go on the internet right now and buy any number of drones, or go to DIY Drones, do-it-yourself drones, and get plans and build your own drones and weaponize them. Certainly terrorists could do it. And so I think we need to not say, oh, this is a terrible thing. First of all, we're putting technology out there that we know the capabilities of. It's older technology. On a scale of, if we were to say, where's the Predator in terms of kindergarten to college, it's like first grade. It's basically a teleoperated machine in a lot of cases, and we're not gonna sell our best autonomous technologies that go with it. But we also need to realize we are not the only game in town. The Israelis have been selling drones for a long time to other countries. And their drones are likely more capable than what we're gonna be selling. And so we need as a country to realize that that barrier to entry is low. Drones are easy to build. A little bit harder to weaponize, but not that hard. I would ping off of Missy's points in two ways. One, the 85 number doesn't include non-state actors. And again, my point on this is the future of war is robotic because the present of war is robotic. In the Libyan Civil War, we had a private military company provide drone services to the rebel coalition using off-the-shelf civilian ones. We've got, in the, what do we wanna call it, Iraq War 3.0 slash Syrian Civil War, all the actors have utilized unmanned systems, ranging from us, to Hezbollah, to ISIS, to the Iraqi government, which flew an Iranian drone with an Iraqi flag sticker attached to it. Everybody's using them. And some are military grade, some are civilian grade, but the reality is that the technology is sort of first grade now, but 10 years ago it was PhD level, right? 
On the question of what that means, though, for the arms trade: the shift in policy, with us now being willing to sell armed drones, has changed it from being a discussion about the technology to the exact same discussions we have whether we're selling F-16s, billy clubs, or Predator drones. It's all about a discussion, on one hand, of we want to aid an ally, slash, if we don't sell it to them, someone else will, so this is good for business. And then the other side of the discussion is, but what will they do with this technology once we give it to them? And that's the same kind of discussion you have with every other technology; it just now extends to drones. And the answer of how we feel about it will lie on where you are in that concentric circle of friends, allies, and frenemies. So it doesn't change the policy towards the UK, because guess what? We already sold armed drones to the UK before the policy shift. Now it's, where in NATO are you, and do you speak Turkish or not? That sort of affects the policy shift. Then, outside of NATO, do you speak with an Australian accent, or, as you raised, do you come from the Middle East? And that will change the discourse on how we feel about selling you the weapon or not. So Peter, based upon what Missy has said, and on your Wired for War book, and based upon what you have just said, this is about the future of war. And if we get stuck on, again, our favorite weapon of today, the autonomous weapon, I think we lose a point that you've made often, which is that we may get so smart in our weapons systems that we are not counting on the other side of this thing, and that is the resilience of a primitive response. Let's take ISIS. Forget that they may have some stuff that we had in Iraq. They are using 12th-century methods to instill fear across the globe, and they have become very resilient. How do you address that for future war and autonomous weapons? 
I would say we need to recognize what is shifting in the forces that shape war and what's not. And for all the discussion of autonomous drones, and later on we'll hear about bio and the like, the reality is that war at its essence is the same. It's something that involves violence and politics, and the causes of it are still us, our human failings. And the more direct answer, to put it plainly, is whatever technology we have, until we get the politics side of the Iraq operation right, we're not going to succeed. And the other point is, ISIS, yes, they have a seventh- or 12th-century ideology, however we wanna frame it, but the reason for their rise is an incredibly skillful use of 21st-century-level marketing and social media. They have a social media strategy that would do Taylor Swift proud. They've got, as an example, Twitter feeds in 23 different languages. Their videos don't just naturally go viral; it's a combination of shock value, the content of it, but also a link to the cyber side. They've got bots that are set to shoot out and retweet them 60,000 times. They had preset hashtags before their operation to take Mosul. They've figured out how to combine traditional ground operations and social media in an amazing manner. And it's because they are a millennial organization as compared to the old fogies of Al Qaeda. So for the military officers here, the young ones, it's kind of interesting; I read last week that ISIS's use of social media is power projection. How does that compare to autonomous weapons? Okay, we're gonna open it up to questions. Can you identify yourself, ask a question, and wait for the mic? And we're gonna start with Tom Ricks of New America, who's in the front here. Where's the other mic? I have a question for Missy Cummings. You seem to indicate, in this instance of patent squashing, that at least in some instances, the U.S. defense industry is not interested in a lot of this stuff. 
They would prefer to make bigger, more expensive things. How much of an obstacle is the U.S. defense industry to the future effectiveness of the U.S. military?

I think they're a big obstacle, but I really don't wanna point the finger at them, because I would rather point the finger at the Department of Defense, because its acquisition processes and its program management skills are, I think, so inherently lacking in this area of autonomous technologies. This is something that I've testified about in front of the Senate Commerce Committee: this country is basically losing its top talent to Amazon, Google, Oracle, these kinds of Silicon Valley companies. And I don't believe that we have enough competent people inside the government to understand how to set up acquisition programs for autonomous weapons or anything even remotely robotic. Google just bought the United States' best robotics company, Boston Dynamics. And when that happened, I thought, oh my God, what are we gonna do? This country, we need to rally around it, figure out what we're gonna do. I spoke to some high-level people at the Office of the Secretary of Defense, and they looked at me and they're like, we're thinking about having a meeting about that. You just lost your best technology arm. And we see it in the university setting too: kids today do not wanna work for the Boeings and Lockheed Martins and Northrop Grummans. They wanna work for the Amazons, the Googles, the cool companies. So we're having a brain drain into the commercial sector. And I can't decide if that's a bad thing or a good thing; it's just a shift in technology. We've changed the pinnacle of technological development away from the defense industry and into the corporate environment.
So unless we actually start to address the gaps that we have in the Defense Department, we can't really point the finger at Boeing and Lockheed Martin and Northrop Grumman, because we in fact are the ones setting the requirements and setting the contracts, or the lack thereof.

Major General Dunlap. Hi, Charlie Dunlap from Duke University School of Law. This is for my colleague Missy, and then another question for the panel at large. Isn't the next big thing going to be anti-drone technology, probably laser-based, which will force much more sophistication in drones, which will then push drone development back to the nation states, reducing this less sophisticated drone threat? And then the other part of it is, and this is for the panel at large, isn't the problem with fully autonomous weapons that the world has rejected them? I look to the landmine treaty. Landmines are your basic autonomous weapon. And even though self-neutralizing high-tech mines don't present the threat that the Ottawa Convention was really trying to address, nevertheless, here we are in a world where they're banned. How are we going to address those two aspects of the future of autonomous weapons?

So I'll address the issue about the anti-drone technology. I think that's a great aspect of what's happening, but we see it with all weapon development. There's the weapon, the anti-weapon, the anti-anti-weapon, and that does help push technological developments. Yes, there is active research going on into laser defense against drones, but the kinds of systems that we're developing are very big and very expensive, and they're the weapons platforms that the defense industries want to build, because they can put the price tag on them that they want.
When you can put a million drones in the air for, let's just say, a couple of hundred thousand dollars, and China could easily do that, you have to start thinking about a whole new way of fighting a war, when it's not just the big monolithic platforms but these small drones that are very, very cheap and can just keep coming. This country has had a program in place for a long time called the Black Dart program, where they've been looking at anti-drone technologies, so it's not that we're just sitting idly by. But at the same time, we're not putting the investment into, I would say, better defense platforms. Instead we're spending a lot of money on very expensive laser development, and we've had laser weapons development in this country for a long time; what you're hearing about drone laser defense is a new spin on an old technology to keep giving our buddies in the defense industry more money to keep developing these technologies, as opposed to developing something new and innovative.

There are multiple ways to go after a drone. One is the conventional way. We act like it's a highly sophisticated technology, and the software is, but Snoopy in his World War I biplane could shoot down a Predator drone. It flies slower than that; it uses a converted snowmobile engine. And the same thing applies to the little tiny ones. The bigger challenge for us is, again, context matters. There are things you can do in an open battlefield to shoot down a drone that are quite easy; it's different when you're trying to defend the White House. So, for example, an air defense system that's a machine gun: easy and automated. Actually, we already have that technology, the C-RAM system. If anyone deployed to Afghanistan or Iraq, it's the one that automatically shot down rockets and mortars coming in. It could easily shoot down a drone. It operates in Kabul, Afghanistan, and the like.
Place that in Washington, DC, though, and shooting off thousands of machine gun rounds at a little tiny drone is a different kind of problem. So then you have the idea of, well, let's go high tech against it: lasers, or cross it with cyber warfare. Hack the drone. Make it do something other than what the operator wants. That's another way. But again, context matters. We're just focused on drones here; there will be other scenarios where you'll be using robotics, and thinking adversaries will go after it. I remember doing some work with the Marine Corps on this. They have a test ground robot that mounts a machine gun, and yes, you could hack that system, but probably the most dangerous foe it would face is a six-year-old with a can of spray paint. And by that I mean it presents an incredible 21st-century dilemma for you. It could defeat that foe, but it would cause an amazing international controversy. And if it doesn't operate, they defeat that multimillion-dollar system. That's, again, I think the future of war, if we want to put it that way: this cross between what we would describe as high tech and low tech, but it's basically just people thinking up new ways to use tools.

I would argue, though, that many of the questions we're discussing here are difficult because we have an insufficiently differentiated terminology for these systems. We lump them all into drones. The responses can range from small arms fire to high-power lasers and microwave weapons to integrated air defense systems. I think until we begin to develop a terminology that is more appropriate for the different categories of systems, many of these issues are gonna seem opaque. It also points to a seam that an adversary might take advantage of. The US Air Force has delivered air superiority such that US forces really haven't been bombed; there was one small episode in Vietnam, and then previously in Korea, and then really seriously you have to go back to World War II.
And that's resulted in the Army stripping out most of its air defense capabilities, just at the time when we have all of these low-level actors fielding small-scale drones that of course the Air Force is not gonna be interested in knocking down. The F-35 was not designed to shoot down a little quadcopter that flies 50 feet above the ground, and it won't be able to. But that points to a seam that a thinking adversary would take advantage of.

Lady in the back here. Good morning, Diane Divas with Inside Unmanned Systems magazine. At the AUVSI conference program review in November, the military program managers spoke quite enthusiastically about modularity, the ability to upgrade equipment and software over the lifetime of a program. And we've also seen an awful lot of cyber attacks where stealthy hackers were able to dig into a system and remain undiscovered for quite a long time. Could you please speak to cyber as an element in this move to autonomy, given that we may have additional avenues toward accessing software, and the risks that are involved with that? And I'm separating that from electronic warfare, where you're interfering with the communication systems, though if you'd like to address that, I'd certainly love to hear your perspective.

So, ladies, gentlemen? I'll take it. What the cross between cyber and unmanned systems offers is something new that we haven't seen before in war, which is battles of persuasion. Previously, when there was an enemy target, be it a tank or whatnot, your goal was to destroy that tank; if you could destroy it, you would destroy it. With electronic warfare, you try to jam the communications. Cyber offers you the ability to persuade the target to do something that it wouldn't do otherwise.
That is, if I gain access to its software, if I get the proper permissions, and cyber warfare is about gaining those permissions, hacking into it, I can then make it do things other than what the operator wants it to do. It might be something as simple as tricking it by changing the GPS location; that has definitively been done: in a Department of Homeland Security test, University of Texas students were able to hack the GPS of a drone. You make the drone think it's right here when it's really over there. There is the back-and-forth over what happened to the American system that visited Iran on the ground, for whatever reason. So you might do that, or the ultimate co-option: recode all American systems as Chinese systems and recode all Chinese systems as American systems. A human pilot would say, that makes no sense, I'm questioning that order. A computer, if you have the right access, will follow that instruction. So it's a whole new realm of war. You've never been able to convince an arrow or a bullet to change direction in mid-flight. You can with this kind of system. And again, that points to both possibilities and perils.

Let me address the modularity piece of that question, because I think it's very important. Autonomous systems technologies are advancing very, very rapidly, and yet military systems, once we field them, tend to be in use for decades; that's typically the time scale. So in these kinds of systems, the shift to modularity is absolutely crucial. If you don't make them modular, you have a system that's out of date in a very, very short time, and the whole acquisition process doesn't close correctly. So we are in fact moving toward systems being modular, where you could remove the smarts very easily. You could also remove the weaponization capabilities and change those over time. I think that's critical, not just in autonomous systems, though that may be one of the most extreme cases where it is needed.
But I just want to say, this goes back to the whole issue of what makes an autonomous system autonomous. It's the software, not the hardware. So by definition, all autonomous systems are modular, because you're just going to improve the software. And this is something the DOD has been woefully inadequate at doing forever: funding software development, putting investment into the software at the same level that it funds hardware. We love to build guns, the rail gun, and many Navy laser weapons. We sink money into these projects, and I'm not saying we shouldn't, but we should at least be matching the financial investment that we make in hardware with investment in software if we are going to actually regain the lead in the autonomous software world.

Though I would add: yes, you can switch out the software, within limits, on an existing system, but unless you have the ability over a decades-long life cycle to update the hardware to go along with it, you're still gonna have a fairly dumb system five or eight years out.

I would agree with that, except that now we have 3D printers, and I have students who, over a weekend, can build a drone, program a drone, and make it highly modular in both software and hardware. Additive manufacturing is certainly where we're going, and other countries around the world have figured out the value of this kind of technology. So it again just points to the fact that there's a lot of innovation happening in the commercial civilian world that the military is not keeping pace with, and I can promise you that at some point in the near future we will see a 3D-printed drone used in an attack on US forces.

Well, on that note, thank you very much, everybody. That was really brilliant.