So, welcome. I invited Professor Kaag after first meeting him at a workshop I put together called Security and Human Behavior, which looks at security and people and how the two interact. He gave a great talk there about drones, and it touched on something that really interests me, which I always think of as societal phase changes: when more of something changes what the thing is. He's written a book on drone warfare with a political scientist, a book that goes into the law, politics, and ethics of drones. So I suspect we're going to get more of the philosophy and ethics in this part of the talk. Please.

Great. Thanks, Bruce. I appreciate it. Today we're going to talk about the moral hazard of drones. I became interested in this topic in 2006, when I was doing my MPhil in international relations at Cambridge and was asked in an exam what would be the most important phase change in military technology in the next 100 years. I picked drones, and that choice ended up shaping the way I thought about research for the better part of a decade. So Sarah and I just put out this book, and it began with a New York Times article entitled "The Moral Hazard of Drones." Moral hazard is a term that originated in the insurance industry at the turn of the twentieth century. A moral hazard is a situation in which an individual or party is willing to take part in risky or immoral behavior because they don't have to face the consequences associated with those actions. Philosophers and economists now call that moral hazard. The insurance industry discovered that people who were insured were actually much worse, or much less careful, drivers, and so it adjusted premiums, incentives, and liabilities in order to adjust the behavior of drivers. But we have yet to figure out how to adjust the behaviors associated with drone operation.
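The insurance mechanism described above can be sketched as a toy expected-cost calculation. All numbers here are hypothetical, purely for illustration; the point is only that shifting the cost of a bad outcome onto someone else makes risky behavior look cheaper to the actor.

```python
# Toy illustration of a moral hazard (hypothetical numbers, not from the talk).
# An actor compares the expected cost of a risky action with and without
# "insurance" that shifts most of the consequences onto someone else.

def expected_cost(p_bad: float, cost_if_bad: float, share_borne: float) -> float:
    """Expected cost to the actor: probability of a bad outcome, times the
    cost of that outcome, times the fraction of it the actor actually bears."""
    return p_bad * cost_if_bad * share_borne

p_bad, cost = 0.10, 100.0  # 10% chance of a 100-unit loss

uninsured = expected_cost(p_bad, cost, share_borne=1.0)  # actor bears it all
insured   = expected_cost(p_bad, cost, share_borne=0.1)  # 90% externalized

print(uninsured)  # 10.0
print(insured)    # 1.0 -- the risky action now looks ten times cheaper
```

This is why insurers respond by repricing premiums and liabilities: raising `share_borne` back toward 1.0 restores the incentive to be careful.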
When I first started to think through the moral hazard of drones, I was really worried about U.S. leadership when it came to the drone program. That's where I started, and it generated the first article, "The Moral Hazard of Drones." There I argue that the seemingly costless nature of drone warfare, and I realize this is not qualitatively different, as Bruce was saying at the beginning, from, say, a sniper or a long-range precision-guided munition, made leaders more inclined to take up these types of tactics and strategies. In other words, to make military solutions the first option rather than the last resort. That was my concern initially. In the second article, "Drones, Ethics, and the Armchair Soldier," I began to look more carefully at the way soldiers might face this moral hazard as well. Clem Ryan, who's at Oxford, has done a study on dissociation and the way that distance creates a situation in which soldiers are more inclined, and more ready, to take violent or lethal action. Underneath all of these specific points about drone warfare is a deep and long-standing question about ethics and the relationship between expediency and morality. So let's ask Sarah: the problem I'm pushing on here is, is convenience a virtue? The answer, of course, is no. But in our society we generally treat convenience not only as a virtue but as an extremely important one. So underlying my concern with drones is the thought that what is easier is not always better, and, additionally, that what is easier should actually be more morally suspect than what is hard to accomplish. The reason might go something like this: the things that are hard have prudential or instrumental reasons going against them.
In other words, if we think about long-range ballistic missiles, about carpet bombing, about any traditional ballistics, there are good prudential reasons to be suspicious of them. Traditional warfare creates very obvious devastation: during the 1970s, for example, there was widespread concern about ballistic missiles and the delivery systems associated with nuclear armaments. But the dangers of these were obvious. There was no rhetoric about their precision, no rhetoric about their cleanliness. And so prudential and instrumental concerns actually covered some of our bases. As Truman says, our morality needs to catch up with our technology. But fifty years ago, our morality didn't need to catch up with our technology; the cost-benefit analysis associated with those technologies did that work for us. Now, this has changed when it comes to, for example, precision-guided munitions and drone capabilities. Instrumental and prudential reasoning is no longer going to cover these technologies, because the rhetoric surrounding force projection is that it is clean, costless, and efficient, at least for the populations projecting that force, clean and costless on the side of the U.S. public. Now, one of the concerns I had back in 2011 is that clandestine activities have always been at the disposal of political leaders, but never in the history of warfare have the costs associated with those clandestine activities been so low.
After 9/11, with the War Powers Resolution and the Authorization for Use of Military Force, there was a concern that leaders had been given carte blanche to move unilaterally into clandestine activities outside of traditional battle zones or war zones. That's a concern I had. Micah Zenko, writing in a Council on Foreign Relations report, notes that despite nearly ten years of non-battlefield targeted killings, no congressional committee has conducted a hearing on any aspect of them. Now, the question is why. There are many ways to answer that question, and I'd be interested to talk those out in the question-and-answer period. But one thing I'd like us to think about is that as our weaponry becomes more precise, the way we distinguish targets seems to become more vague. That's an inverse correlation that I was pretty upset about as the targeted killing program picked up pace in 2010 and 2011. In other words, as our technical sophistication improved, we seemed to be allowing it to stand in for our normative, legal, and ethical judgment about the justification of targeting. In terms of operators, and to drill down a little into the distance and dissociation issue I mentioned at the beginning of this talk, Clem Ryan, who's been doing a lot of work on this, cites a 2004 Red Cross report: conflicts in which recourse is had to advanced technologies which permit killing at a distance or on the computer screen prevent the activation of neuropsychological mechanisms which render the act of killing difficult. Now, if you've thought a little about drone warfare, you think, oh, this is just like a video game, and this is what Clem Ryan's research is pushing on. But it's more complex than that.
In a DOD study, the Pentagon actually says there is a toll on drone operators, which the DOD describes, in a sort of strange way, as an existential crisis. For the DOD, that's a pretty interesting way to describe a problem: not PTSD, but an existential crisis. And I think this is interesting to the extent that the study showed there is still some sense of moral responsibility even as the technological loop becomes more automated. So the question I'd like to ask, or have begun to ask in the book, is how do we guard against this moral hazard, and how do we face the existential crisis these individuals might face? As I moved out of the first two articles in the Times and started thinking about the book, Scott, this fellow here from the University of Oregon, invited me to talk, and one of the professors there, Bonnie Mann, said something that stuck in my craw a little. She said: you are looking at the wrong things. You are looking at the leaders and the soldiers, and you shouldn't be. You should be looking at the public and the way that the public talks about technology generally. And I think that's an interesting point she made. When asked in 2011 where the majority of drone strikes occurred, 67% of people couldn't say. This was pretty alarming.

Oh, good point. No, no, it's great; I'll follow up with you about that. I believe the breakdown is that 47% of people gave the wrong answer and the rest said they didn't know. One of the upshots of force protection, which is what drone warfare is at least in part about, is that the public executing this force, in other words the U.S. public, does not have any flesh in the fight. And that is a serious concern.
Back in the late eighteenth century, Immanuel Kant said something about democracies. He said democracies will tend toward pacifism, because their citizens want to stay out of the fight, and they will put pressure on leaders to stay out of the fight except for very good reasons. But Kant was just wrong about that. He's right to the extent that human beings are self-interested insofar as they don't want their brother or sister going off to war. But that doesn't mean they're going to stay out of war. It means they're going to come up with clean and very efficient ways to wage it. And so what we see with drone warfare is democratic peace theory, which has been a long-standing assumption in Western society, coming undone in certain ways. Clandestine activities are now fairly routine. Kant says clandestine activities should never be routine. Why? Because democracies should push back against their leaders. What certain technologies like precision-guided munitions and drones do is decouple the relationship between a democratic public and the leaders who are supposed to fight wars in its name. Okay, let's go to the next slide. When we published Drone Warfare, there was some pretty serious pushback against it, of two basic kinds. First, hawks said that we were undermining our military forces. Pushback also came from the left. And one thing I'd like to talk about, because you folks are in the public and writing for the public, is the following statement I got from Marco Roth, who writes for n+1 magazine. Directed at the book and the writing, he says: Kaag forgets that the asymmetry of the conflict, our safety, their vulnerability, debases even the most well-intentioned American writing about the war on terror.
When it comes to actually committing thoughts to paper and attempting to make an existentially responsible job of it, my sense is that no matter what register I choose, polemical, realist, satirical, exoticizing, it all comes out wrong in the end. With so much real suffering occurring for so many stupid reasons, my very civilian efforts to picture the war as it now enters its 12th year become obscene by their very nature as imaginative acts. So maybe in the question-and-answer period, I've talked for about twenty minutes now, we, as the generally civilian audience here, could talk about what sorts of things we can say about this war and how we might shape the public imagination associated with drones. On that topic, there are several things we should think about. You might have noticed that media coverage of drones over the last year or two has not centered on the targeted killing program. What the American public seems much more worried about is, what? Amazon packages being delivered to their door. This makes me crazy as an ethicist. And it gets to what Hannah Arendt, after attending the Eichmann trial in Jerusalem and writing Eichmann in Jerusalem, says about evil. She says that evil is not committed by extraordinarily evil people. It's committed by very ordinary, thoughtless populations. One thing you could help me think through in the next half hour or hour is how you get through the banality of evil. In other words, how do you speak to a population in such a way that they care more about the ethical arguments than about their Amazon packages being delivered on time? That's a question I'd love to figure out. Another issue I think we're up against is what Herbert Marcuse calls technological rationality.
Marcuse, writing in the fifties, says that modernity is unique to the extent that it has become so mechanized, its machinations so seamless, that they seem to happen inevitably, rationally, and also ethically. In other words, there is a conflation between what is easy and what is good. And I'm wondering how we get over that issue. Third, and this gets close to technological rationality, the conflation we're talking about between virtue and expediency is a long-standing conflation in the history of philosophy between the is and the ought. David Hume, and later Immanuel Kant, argued that you cannot derive an ought, a normative statement, from an is, a technical or factual statement. We can talk about that; I think technologists actually need to think about this quite a bit. What do public polling and media coverage take as given about drone strikes? These three points are fairly broad, and they give jumping-off points for very specific research about polling data. That research has been started by my colleague Sarah Kreps, and this is what she's found. Most of the polling on drones over the last seven years has shown the following, and I'll talk you through it. The upshot of these graphs is that most polls show Americans are generally supportive of the drone program. They don't know very much about it; sometimes they even say it's illegal; but in fact they're in favor of it. What Sarah does in a recent article in Research & Politics, which came out last month, is show that these surveys, this public opinion data, take as given two of what she considers the most controversial aspects of the drone strikes: first, international humanitarian law and the legal authorization for the use of force; and second, the very standard just war tenets of distinction and proportionality.
Distinction asks how we measure who is and who is not a combatant; proportionality asks whether our force projection is appropriate to the threat. What she's finding is that most general polls about drones take it as given that the strikes satisfy both of these norms. But in an experiment Sarah started about six months ago, published last month, she adjusted the questions to focus on distinction and proportionality, and respondents tuned in more closely. When the question hones in on distinction, saying there is a large likelihood that civilians will be killed, approval drops to 28%. American citizens say, whoa, we don't like that idea. Now the question is how you get that into the public rhetoric surrounding drones. In that sense the drones are not popular; they're not simply portable Amazon-package devices. When she adjusts legal authorization, you see similar numbers: only 39% approved in one condition, only 32% in another. One interesting difference between the IHL violations, distinction and proportionality, and legal authorization, domestic and international, is that when you actually paint Americans a picture of civilians dying, they care. When you appeal to international or domestic authorization for the use of force, they care less. In other words, the formal normative structures don't matter as much as "people are going to die; here, let me paint you a picture." When you paint them a picture, they care a bit more. So what Sarah is suggesting is that we need to take these sorts of studies into account as we develop public opinion data about drone strikes.
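The framing effect described here can be sketched as a standard two-proportion comparison. The 28% approval figure under the civilian-casualty framing comes from the talk; the neutral-wording baseline and the sample sizes below are assumptions for illustration only, not Kreps's actual data.

```python
# Sketch of the kind of framing-effect comparison a survey experiment runs.
# Group A sees neutral wording; group B sees wording that highlights the
# likelihood of civilian casualties. Numbers are illustrative assumptions,
# except the 28% approval figure, which is cited in the talk.
from math import sqrt

def two_proportion_z(successes_a: int, n_a: int,
                     successes_b: int, n_b: int) -> float:
    """z statistic for H0: the two groups approve at the same rate."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Assumed: 60% approval (300/500) under neutral wording vs. 28% (140/500)
# when the question highlights likely civilian deaths.
z = two_proportion_z(successes_a=300, n_a=500, successes_b=140, n_b=500)
print(round(z, 2))  # a large positive z: the gap is very unlikely to be noise
```

A gap this large would be statistically unambiguous at these sample sizes, which is the methodological point: the wording of the question, not the underlying policy, drives much of the reported approval.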
So I think I'll stop there. I've laid out a lot of thoughts, and maybe we'll have time for questions and comments. Thanks for having me. I appreciate it. Okay, great. Yeah.

What do you think the impact would be if people really focus on this distinction question and then the government says, oh, no problem, we have smaller drones that are more effective? Do you think focusing on the civilian casualties risks delegitimizing the arguments about proportionality or legal authorization?

Good, that's great. I think there would be that risk. But part of the argument that Sarah and I have made is this: say we get more precise drones, as you're suggesting, and we really get the right person. We're going after Bruce, and we get Bruce. The issue is that targeting practices in the targeted killing campaign are interestingly vague. For example, males over sixteen in a combat zone count as combatants, as covered persons. So while I would suggest it would be a good thing if drones were even more accurate, what I don't want to say is that they can do the job of figuring out who is or is not the guilty party. In other words, we actually have to get our targeting practices as precise as our technologies. To put this more carefully, there's a difference between signature strikes on the one hand and targeted killings on the other. Signature strikes look only at behavior profiles: are you in a rural region in this area of the FATA? And that's all it takes to be targeted. I'm deeply worried about signature strikes because they make it this broad: lethal action is warranted because of a broad behavior profile. I think that's a danger.
I think that when you go to an official and say, this is really dangerous, just being in a particular region warrants this type of lethal action, and they come back and say there wasn't much collateral damage, that is not a good explanation. But that is the explanation they often give you: there's no danger to troops, and the collateral damage was very low. Not the point.

Have you taken the argument of efficiency to its logical end, which is assassination? With a single named individual, there's a maximum of efficiency and a minimum of collateral damage. How does the larger and separate debate about the assassination of individuals or terrorists fit into this logical context?

No, it's good; that's a great question. Several lines of thought get at this. One is that there has been some talk of making drone operator training similar to sniper training: the same psychological profile, the same moral training, because it's a similar type of situation. The other issue I think you're getting at is to what extent assassination, or what is generally known as decapitation strategy, is actually an efficient way of changing terrorist, or quote-unquote terrorist, behaviors. There's been lots of debate surrounding that. I personally don't think decapitation strategy is that efficient. Is that getting at the question you had?

Well, there's a couple of things. I started with the general question, but taking us in that direction actually removes some of the technology questions from the debate you've set up, because a single bullet and a single person is old technology, and yet we still have the efficiency argument without the whole moral hazard around the technological rationality argument that you presented.
So, in part, the hawks who pushed back said: what, you want us to put men and women in the field to get the same job done? And there's a way in which, no, I don't want to put soldiers in harm's way, but there is something good about putting soldiers in harm's way in the sense that you then have a very obvious person in the loop, and there's democratic pushback against particular violations. It's a hard question to answer, though. All the way in the back, and then here.

I'm curious: when you asked about the banality of evil and shifting public interest toward targeted killing and away from civilian commercial drone use, do you mean to say, and I'm not necessarily disagreeing with you, that Amazon package delivery and civilian commercial drone use are complicit in the perpetration of targeted killing?

That's a very good question. I am going to go on the record, but it's going to take me fifteen seconds. They are complicit, I think, in two ways. First, the technology used in military devices gets moved into private-sector devices, and then there is a push through consumerism and capitalism to keep those devices in circulation and to promote them. Second, the drone industry generally has a vested interest in both military and private uses, and I don't think you can separate the two. For example, take the Global Hawk, a high-altitude surveillance drone. The DOD didn't even want it; they said, don't give it to us, the U-2 is much better. But the lobbyists kicked in and the Global Hawk went through. Given that kind of elision between commercial and military interests, I don't think you can say Amazon is not complicit.

A comment and a question. Back in the days of ICBMs, which I guess we still have, so I shouldn't put it all in the past.
There was a school of thought that the nuclear launch codes should be surgically implanted in a volunteer soldier, and the president given a dagger, because he should literally have blood on his hands if he's going to kill hundreds of millions. So not all the rhetoric about nuclear missiles was clean, and I think that personalizing of the blood on the hands always informs things. One thing you haven't mentioned is government secrecy, which shapes this in a large way. Let me give one local example. I was at a workshop down the block a few years ago about counterterrorism strategy, before drones had been acknowledged by the United States government. There were panels full of government officials, and Harvard Law faculty who were former government officials, and a handful of people who were asking about drones. And you had this bizarre phenomenon where the Harvard Law faculty went mute, because they knew, but they were prohibited from talking. They actually had to import Stephen Carter from Yale to talk plainly about drones. So there's this whole problem: civilian harm, collateral damage from drones, happens far away, it's dangerous to report on, and the media never really does. The people who actually know the targeting aren't allowed to say. The people who were involved in the discussions about how targeting should work can't talk either. So we're left with this mystery, where all we're getting is the gleaming targeted drones aimed right at those dangerous people. How do you get around that? How do you address that?

There was a recent study done by NYU and Stanford called Living Under Drones, a qualitative analysis based on interviews with people in the FATA. I think one way of addressing it is making that report mainstream.
And the question is how you do that. In other words, how do you get people to look at things they really don't want to see? There's a type of self-imposed ignorance about the situation. We could be much more proactive as a public, and as academics and as media the question is how you get people to look at things they just don't want to look at. So Ron, and then, yeah? Is it Ron? Yeah.

It seems like during the Vietnam War, a lot of the way the U.S. public found out what was going on was from information flowing into the U.S. from the, quote, enemy side. There were anti-war groups, and they were fairly effective in channeling some of that information from the other side into U.S. consciousness. We don't seem to have that at the moment.

Which is surprising, given that the means of transmitting that information are so much more sophisticated. Given the real-time nature of news today, we don't see it, and I'm actually quite surprised that we don't see that type of flow. Yeah?

Do we have data showing whether or not drones actually lead to military action as a first response? It makes sense in theory; I'm curious whether we know much about how that works in practice.

Sure. When Obama defended the AUMF, the Authorization for Use of Military Force, one of his defenses, a strange defense to my way of thinking, was that these operations did not constitute traditional warfare because they did not put soldiers in harm's way. That would be anecdotal evidence, but it would be a form of evidence. But you're right to push on this: to my knowledge, there hasn't been any systematic, empirical study. I'll think about that. It's a good question. Yeah?
To your question of how you get Americans to understand the consequences of the civilian casualty piece: I heard a report on public radio about a Hellfire missile strike, and the point of the story was that Hellfire missiles are basically weapons of terror. You hear a drone buzzing around over a large area, and then a Hellfire missile is launched. It comes in at supersonic speed, so you don't even hear anything; there's just this big explosion. In particular, the story had translated interviews with people who were obviously in tears telling how it hit their home, their hut, and a little boy's body was destroyed and his head rolled across into his sister's lap. Those sorts of images, particularly as I imagined them, made me understand what the civilian casualty piece was.

I need to say one thing. This is a really interesting point, and we say this in the book as well: we are not peaceniks. We're not saying war is always bad. We have to be sober about what asymmetric warfare means in an age when technology has allowed individual actors to acquire devastating capabilities. That is what we're pushing against in asymmetric warfare. Those are real dangers, but so too are the real stories you just identified. Go ahead, sorry.

I wanted to ask about this idea of asymmetry. Part of the drone example is that there's an asymmetry: we have drones and other countries don't, and we're using drones in countries that don't have the technology and can't use the same technologies against us. But clearly that's only temporary; it's only a matter of time before other people have drones. So that's the first part of my question.
The second part is about making drones concrete. It's unlikely that we're going to go back to a point when a larger percentage of the population actually goes off to war; that seems unlikely in my view of what's going to happen. So perhaps the way we could make drones more concrete is through cases like Amazon using drones. And I wonder whether the concern about Amazon having drones isn't so much about delivery as about a larger concern among people in the United States about who has tools of surveillance.

That's great. To address both questions: on the second point, you see Americans become much more interested when there's a concern about police use of drones or drones being used to monitor the border within the United States, and this has to do with very provincial, self-interested worries. So I think you're right that we might use those worries to draw the analogy to foreign populations. On the first point, the proliferation issue, Sarah has actually done a lot of work on this. She just published a report for the Council on Foreign Relations on drone proliferation. And I think that when other countries have these types of technologies, that will be the point at which Americans begin to consider this more seriously, unfortunately. That being said, one of Sarah's points is that if we sensationalize the proliferation of drones, that's one way we could fail to regulate them. Yes, and then Sarah.

I've been on a drone simulator, and my angle on this is that it's not very dissimilar from other sophisticated engineering equipment. In fact it's designed so that operating it is so simple that something has to be put in place to make sure the operator doesn't fall asleep. Right, and I mean, it's like other such systems: those big construction cranes, or nuclear power plants.
In a nuclear power plant you have to switch the valves and the pumps and pay attention to the indicator lights to get done what needs to be done. But from what I've seen, the drone interface leaves the operator totally distanced psychologically. There's nothing telling you what's happening when you're operating a drone simulator. In fact, you could actually be operating a real drone while you're in there, and it wouldn't make a difference. So I don't know how that would change.

Yeah. This gets to the issue of what the implications are as we move to more autonomous forms of weaponry. Where does responsibility lie if collateral damage occurs in these cases? With the operator? Well, yeah. Sarah, I'm sorry, yeah.

So, back on the point of public consciousness and the media giving us this imaginative sense of these questions: I wanted to know your thoughts on Hollywood and Captain America. In the latest Captain America, their main goal is to stop the launch of algorithmically targeted aerial weapons.

Was it really?

Yeah. And there's a point at which the German from the previous film says, I created an algorithm. So there are ways in which this non-commercial application is entering the public consciousness as a risk and a fear. What are your thoughts?

Right. I don't think these Hollywood versions get at it. I mean, I guess they make us aware of it on one level, but no more than Saw III makes us aware that there are mass murderers out there. I don't think it's critical; I don't think it's critical enough to get it on our radar.
So that would be my first point. Yeah. Go ahead. But James Bond always kind of touches on some popular context, right? Oh, okay. Yeah. So I don't disagree with you there. I'm wondering what sort of normative work it could do. As an ethicist, I'm worried about how you actually get people to change their policies, or help leaders change their policies. Yeah. Well, that's exactly what I wanted to ask. I mean, you're putting out some really interesting research and you've written this book, but is there an advocacy strategy here? You could give me one. That would be great. No, seriously, I'm curious. Yeah, absolutely. So, one problem with advocacy as I've seen it: I was at NYU doing a talk at a conference called DARC, which is the most horrible acronym for a drone conference, D-A-R-C. There were protesters outside, and Code Pink, for example, was there. I don't necessarily think these are the appropriate forms of advocacy, but I am actively trying to think through how academics and members of the media actually make a normative push. I've not come to any good thoughts about it yet. It seems like calling it a dual-use technology kind of oversimplifies it, but there is maybe something to be learned from other domains. There is, a very big one: the big fight over FAA regulations and civilian drones, whether you use them for weather, for looking at mines, at environmental degradation, and so on. It's a big fight that's somewhat in the public consciousness. So to answer your question, I don't have any quick answers, but I'd be very interested. Yeah, you are next. So I think one element that's useful in that conversation is that the definition and envisioning of drones is actually an area of contention in American public life. With the U.S.
military and the DOD and others actively pushing the rhetoric you're describing, of cleanliness, as a strategic communications process. It's not happening in some kind of organic way; it's happening in a very specific and intentional way. Right. And so, in a way, the data you present on the American public's opinion on this topic is directly related to the effectiveness of that propaganda effort on the part of the U.S. government, which is trying to make it seem as if this war is indeed clean. And one way to find an alternative vision is to look at the press and the media and the conversations in the countries where those drones are landing. So if you look at the way the Pakistani public talks about drones, you find the diametrically opposed position. First of all, they know that their own government is lying to them. Right. But secondly, they see something very clear: that the value of an individual Pakistani life that's not part of an officially declared war is less than the value of a life in an American context. And so it creates both a visceral sense of shame and a sense of being treated as less than human, but it also doesn't stop the general flow of information coming out of the FATA and the NWFP about the effects of those drone attacks on civilian populations. So that data is very clear, and it is public. Yeah, that seems right. And you see that drone warfare is one of the things that most infuriates the Pakistani public, to the point of influencing national politics. Yeah, right. So it's not true that there isn't good data on what people who are attacked feel about the issue, even though they also get a distorted impression from their own government. Right. Thanks. That's a really helpful comment.
With respect to that, the government, I think this is the case, and I'm speaking in hand-wavy terms, has always known that people don't like to be faced with the carnage associated with warfare. And at least since World War II it has actively controlled the information that reaches the public, telling newspapers not to print pictures of wounded soldiers, and so on. And it has controlled public opinion using things like racism, which is what we're talking about, right? Pakistani lives being worth less than American lives in the general sense of the American public: that's Islamophobia and racism. And it's the foundation of what I guess we could call the pro-war argument, of swaying public opinion in that direction. And I'm wondering... I don't see any reason to believe that anyone in a position of power is any less aware than before that people dislike seeing images of dead children. So why do we think it might be any easier now? As you said, there are new technologies for getting information, but they're spying on all of it. So why do we think it would be any easier for us to get that information in an unmediated way, without also having to combat things like racism and Islamophobia? Because, as you just pointed out, rhetoric that is anti-drone in Pakistan is also kind of anti-America, because America is the one with the drone. And it's very easy for a government voice to discredit that information, because it's all part of this generalized war. I mean, I think the bigger problems here might not really have anything to do with drones. In other words, the problems are with systematic political... not corruption, but... no, with warfare. I think the idea that we can make this argument without saying that warfare is the problem is somewhat naive.
I think what you say is a valid point, but the modern era of technological delivery of information about warfare actually goes back to the American Civil War, with the birth of photography during warfare. It was manipulated then. Yes. I'll think about that. Yeah. Okay. Yeah. So, given the efficiency of drones, I don't see them going away. And I can imagine it would have been nice to have had drones rather than doing the carpet bombing we were doing in Vietnam. We're not going to fight that war again. So what sort of regulatory environment would you like to see for drone warfare? Yeah. So, the standard non-proliferation agreements, like the Wassenaar Arrangement, the regulations monitoring the international exchange of weaponry, I think should be applied to drones sooner rather than later. And that forum has actually talked about limiting proliferation in that way. But that's not the type of regime that I as a philosopher, or as an ethicist, think is sufficient. So, for example, on Mia's point: just say that war is wrong. Just say it. I think that's good. And I've made this point before: I think that more attention to ethics is warranted in an era of technological sophistication, not less. We started this entire discussion with expediency and virtue. When expediency is taken care of, all there is as a backstop against transgression are normative frameworks, and you need to take them more seriously rather than less seriously. So I actually think training operators matters. In the past, just war theory has been taught to the leaders of militaries, to officers, and this should continue to be the case. But as technological sophistication increases, I think we need to spread the ethical training down the ranks.
But now you're asking what I would like. Well, that's what I would like. And I would also like, as Mia says, simply to have an open discussion about what the problem is with military action, or with political violence, or with the social contract. All of these things I would love. But it's not going to happen overnight; that's why I went into philosophy, at least, and why I'm writing not just about philosophy but about this. Many philosophers say to me, why are you writing about this? You should be writing about analytic epistemology. But I think speaking publicly about this is actually one thing you can do. Yeah. I think it's really interesting that ISIS's strategy has been to focus on scenes of carnage, whereas the U.S. strategy is to completely erase them. Building on that, I'm very curious to get your take on the role that Twitter and Facebook have played. I know Twitter has taken down terrorist Twitter accounts, and it's very interesting that anything showing the results of drone strikes will probably be anti-American. What do you think the ethics are of Twitter taking accounts down? I think, for the most part, Twitter should not take accounts down. Period. Maybe that makes me un-American, but I think there needs to be some backstop where we say no to the surveillance, no, you cannot interfere with public forms of communication. You just can't. And that means shouldering and facing risks, but I think our risk tolerance as Americans needs to be a little higher.
So, for example, that might mean we allow certain Twitter accounts to remain up. So be it. This aspect of drones is going on right now. I can't help but think of the establishment of the right of overflight when they were first doing spy satellites. It wasn't framed that way at first; it was a science satellite, and we were just trying to establish that right. So I wonder if you see a connection between that and all of the commercial efforts, all the money and lobbying going into domestic drones. Yeah, that's a really good question. I think all the discussions we have about the way outer space is being used need to be seen within the complex of the drone debate. Because sophisticated drone technologies depend on it: the infrastructure required for our drone program far exceeds just a drone operator and a target. There's this whole network which, from what I understand, is supported not just by government organizations but also by commercial interests. So I think that's one of the concerns we should have. The discussions about rights in space should be brought back to the question of what military uses this outer-space infrastructure might have. What you just said is, for me, the most compelling way that the U.S. public imagination could be adjusted: this other kind of spatiality, a kind of mapping project, is one that I think is absolutely crucial for understanding the new form of visibility. Oh, I see, yeah.
And although I really appreciate your story about casualties, I think there's also a way in which we tend to focus on a kind of visibility that actually covers over a narrative that is much, much larger and that we need to comprehend in order to have the capacity for morality that you're suggesting. Interesting. So I just wanted to mark that. Oh, that's great. No, thanks. I appreciate it. You talked a little bit about the culpability of commercial drones in the military-industrial complex, essentially. I'm curious what you think about the culpability or non-culpability of the maker movement and its interest in drones, in a civilian but not commercial sense, within that whole complex. There are no pitchforks in the room, I don't know. But this is telecast. So, what do you think? No, I'm serious. But I think it's an interesting question: if we're holding the commercial interest in drones culpable, why aren't we holding culpable the production and advancement of that technology that happens in the private, non-commercial sector? Right, the hobbyist sector. Thank you, yes. You're talking about the hobbyists, correct? Yeah, but the hobbyists in this tend to be... it's not a nine-year-old kid with a thing you bought from a box anymore. There are really serious developments in technology happening in the hobbyist sector. Right. So I would answer that there need to be distinctions between a Reaper or Predator drone and a hobbyist drone, but there are dangers associated with each. They are usually different dangers, but there are moral and legal questions surrounding the hobbyist as well. I don't think the hobbyist should push drones full stop, meaning if the hobbyist goes out and says, well, I'm going to lobby for drones.
There's a difference between a little white drone this big that takes really cool pictures, but that could deliver some sort of weapon, and a Predator drone that has different capabilities. So I think that distinction is useful. Just to keep on this question, and perhaps to take it a little bit further: doesn't that sound a little bit like the discussion about 3D printers, where people are starting to say, okay, we should actually not allow 3D printers because people can use them to create guns? It seems to me that it's kind of the reverse. With the 3D printer, the thought is that it has a lot of normal usage, and then the evil or bad usage appears; whereas with drones it's the other way around: we all look at them as if they are actually bad, but more and more now you have Amazon and so on. So perhaps this is actually just a technology that can be used for really bad things, and we should prevent that usage rather than the technology of the drone itself, in the same way that we should prevent people from using 3D printers to print guns rather than limit the production of 3D printers. Yeah. It is.
I was afraid I'd be forced to talk about hobbyists and agriculture. For example, the agriculture lobby has been emailing me for many years, asking me these questions and saying, what are we supposed to do? It would make our lives much easier; I don't see any problem with this. So I take your point. I think it's a sticky issue; I don't have any fast answers for it, though. Yeah. I think one thing you can say about the hobbyist drone movement is that there's something about drones that captures the public imagination. And, I don't know, flying robots, duh, right? I think in some ways you can overcomplicate it. People like flying robots, and fighting that doesn't seem to be a particularly useful strategy. There are good flying robots and there are bad flying robots. The ones with the guns are more likely to be the bad flying robots, at least some of the time. The ones with the cameras have the potential to be the bad ones, whether it's a news organization or whatever. The one that a kid is launching in his backyard might break somebody's window. So if it captures the public imagination, trying to fight that seems to be a losing battle. As a philosopher, however, I have to fight that losing battle. It happens to people, okay? It's people waging war; it's people dying. That is really the upshot of what I'm trying to say: in all of this, there is a person in the loop, they are responsible, and somebody is going to die. Maybe that's the issue. And when Mia says, come on, talk about war: fine, I'm willing to talk about war, because that's what it is. If war were less bloody... Yeah. I think we end up talking about these things in very binary terms, either good or bad and not somewhere on a spectrum, especially when we're talking in the public consciousness. So how do
we insert more of this nuance into the framing, in a way that helps the public consciousness? And I don't think that occurs by just talking about drones. I think it occurs by talking about ethics and people, by asking people to answer a call to responsibility, and by making it more than just me talking to thirty people. I think it takes that being important in our society. So, to answer your question about Amazon: I think that's part of the responsibility. Amazon should be more interested in the soft side of technology and make it public. That would be good. Okay. And, if you can hear me, that's what I'd like. I'm serious; that'd be awesome. And I think you should not necessarily ask technologists to do it, but humanists... no, philosophers, or artists, or sociologists, or psychologists. Yeah, Mia, you have a thought. I just wanted to point out that I was actually one of the people who organized the drone conference. DARC? Yes, DARC, which was largely on civilian and commercial use. And it was for exactly this reason: we wanted to talk about commercial and civilian drone use without avoiding the subject of warfare and military drone use. And just to speak to, I don't remember your name, but you're wearing a gray shirt and you have red hair. Maggie!
We invited a woman named Christina Dunbar-Hester to talk on a panel called The Right to Drones. She's done work on the hobbyist and maker communities, not specifically on drones but on radio and tinkering and that kind of thing, and her work is specifically on the ethical responsibilities of technology hobbyists. So we had that conversation, which isn't to say we can't keep having it; we should. And also, I think what we're getting at is that, yeah, Amazon does have a responsibility, because we think that flying robots in general are a category that raises substantive issues that are ethical and psychological and philosophical and maybe moral. And one reason why I was so interested in organizing that conference is because drones aren't really the question. People were talking a lot about the drone question, but the drone question is actually: when you say warfare, what do you mean? Because we had a definition of war; it accrued over hundreds and hundreds of years, and it relies on very specific definitions that are no longer part of what we're talking about, which is why the president can say, you know, this isn't warfare technically, it's something else. Right. And so I guess what I'm trying to say is that if you can see that it's war, then we're talking about war, and what that is, and what that means, and why we do it, and whether we're doing it for a good reason or a bad reason, and what those are. That's one of the many reasons why I think drones are so interesting: they allow us to have these conversations with people. So I would humbly suggest that we not be afraid to actually talk about those things, especially in a space that's as safe as this, which was also kind of the point of organizing that conference. It's hard enough to get people who have different sets of interests in the same room together; it's even harder to get anyone, even among friends, to talk about these incredibly personal
things, like: what do you think is wrong about killing somebody? Do you think it's more wrong to kill somebody who looks like you? But if we can foster even a little bit of that comfort level, then people can actually start to open up to each other, and I think that's where we have the important conversations you're talking about, which I agree definitely need to happen. And that woman's name is Christina Dunbar-Hester. Mia, I agree with you. What are your thoughts? Can you tell us what the country is where drones are most used? I'm guessing Pakistan right now. Well, at the time it was either Pakistan or Afghanistan, depending on the numbers; right now it's again hard to determine. But the respondents were answering Iraq, Yemen, Somalia, and at the time those were significantly wrong answers. Other thoughts? Good. Thanks a lot for