Please remember to use the mic: hold down the button for a couple of seconds before you start talking. I believe Admiral Carter is inbound, so before we formally start: this is the Marine Corps birthday today, so I've asked Colonel Gannon to say a couple of words about that. Colonel?

Hey, good morning, everybody. Yes, today we're going to have a celebration for the 238th birthday of the United States Marine Corps, and that's going to go right after the Ethics Conference today at noon, right out here in the Spruance Lobby. We'll cut the cake and have a toast and say some words, and all I would ask is that after the conference we keep the noise and kind of the mingling to a minimum. You're all invited to come out, and we're going to cut the cake and celebrate before the weekend Marine Corps balls tomorrow night, for those of you going, and just wish your fellow Marine classmates here happy birthday. So thank you very much.

All right, I think we'll wait just another minute at least to see whether Admiral Carter is arriving, so please, one last reminder: if you didn't badge in, please do, and turn off your electronics. All right, Admiral Kelly says go ahead. Admiral Carter may be inbound. Is he inbound, sir? We don't know. Well, welcome back. This morning's dialogue is going to be about something that those of you who were here in May may remember I talked about a little bit. It's an area of research that's new to me, but is old as dirt to George, and it's something I'm beginning to think about a lot in terms of military ethics. You know, if you listen to military people talk about ethics, the words we like are integrity, character, professionalism, and those suggest that the underlying assumptions, as a philosopher would put it, are broadly Aristotelian. That is, that you've developed a series of habits through repetition. These habits are reliable; they will get you through in almost all circumstances.
You can count on them to get you through, and you say, well, as long as you maintain your integrity, you'll be fine. Well, a few years ago I started reading some moral psychology literature, which is not something philosophers often read. And it turns out there's a ton of empirical evidence that character is not nearly as stable as that way of talking about it suggests, that there are all kinds of situational variables that affect things a lot. When I talked about it in May, I talked about some of that evidence, but George is a real expert on it. So we've asked him up here to speak and also to dialogue with us to explore this area.

Dr. George Mastroianni was an undergraduate at Georgetown and did his graduate work at the University of New Hampshire, in psychology at both places. He was commissioned in the Army in 1977 through ROTC and served as a research psychologist on active duty from 1981 to 1992, retiring from the Army Reserve as a Lieutenant Colonel in 2006. Originally trained in sensation and perception with an emphasis in the history and philosophy of psychology, more recently he has focused on military issues such as suicide, determinants of ethical conduct, and so forth. He initiated and co-edited A Warrior's Guide to Psychology and Performance, modeled on similar books first written in World War II, which is used in introductory behavioral science classes at the Air Force Academy and at West Point. He taught at the Air Force Academy as an Army officer from 1990 to 1992 and has taught there as a civilian since 1997. So it's our pleasure to welcome Dr. George Mastroianni.

Thank you, Martin. Good morning, everyone. I'm very honored to be here today. As Martin mentioned, I'm a psychologist, and I spent many years in the Army working as a human factors, human engineering specialist on interdisciplinary system development and testing teams composed of engineers, physicians, and operators of various sorts.
We human factors specialists were not always appreciated by our more practical and hard-headed teammates, and their views were actually quite succinctly expressed by Admiral Hyman Rickover, who once said, and I quote, human factors is about as useful as teaching your grandmother how to suck an egg. Having had only limited success in persuading my colleagues in the engineering world that consideration of the human factor can be worthwhile, and being, in addition to an egg-sucker, a slow learner, I will nevertheless attempt a similar message in a different domain today, that of ethical conduct. So just as it was the province of engineers and operators to design the particular systems on which I worked as a human factors specialist, it is the province of philosophers, ethicists, and others to articulate and clarify the nature of our moral obligations and the particulars of our ethical commitments. Just as I tried to apply my understanding of the sensory, cognitive, and physiological characteristics, capabilities, and limitations of humans to system design and performance, my purpose here is to suggest that many of those same characteristics, capabilities, and limitations can meaningfully inform our expectations as to how humans will implement and enact the more abstract moral and ethical principles arrived at by philosophers. So my goals today are three. First, I'd like to briefly address the issue of how consistent our actual behavior is with the beliefs and principles to which we claim to be committed. Second, I'd like to point out a couple of ways we as individuals can try to ensure that our own behavior is more consistent with our beliefs. And finally, I'll offer a few suggestions as to how leaders can create conditions that promote good conduct in units and subordinates. If you'll indulge me just one more moment, I did want to relate to you that the intellectual and emotional journey that has led me here has its roots in my interest in the Holocaust.
This photograph is the one that did it. I was reading Daniel Goldhagen's book, Hitler's Willing Executioners, on an airplane many years ago, and this photo was in the book. My children were about the age of the child being shielded by, presumably, his or her mother, and I sat there with tears streaming down my face on the airplane, thinking about a parent trying to protect a child, the most fundamental and visceral commitment that we have as humans, and the helplessness that that woman must have felt. I'm a retired Army officer, though I hasten to add that I was never a troop leader or commander or operator; I worked mainly in laboratories and such, and my friends in the real Army always referred to me as a research puke, which I think is Army-speak for egg-sucker. But I don't think you need to be an infantryman to look at that picture and wonder about what was going through the minds of the soldiers killing these innocent civilians with their Mausers. How can we explain the behavior of the soldiers in that photo, and especially the behavior of the soldier who proudly sent it home? Were they under duress? Were they under some malign influence? Were they simply evil? Or were they really soldiers at all? So much of the psychological work that underpins my discussion today had its origins in our attempt to understand the Holocaust and the behavior of its perpetrators. I mention this mainly as a matter of context. The scope and scale of the Holocaust may make it difficult sometimes to see its relevance to other events, but I think there are psychological mechanisms at work in the Holocaust that can inform us about bad behavior in a broad range of circumstances. So in the early aftermath of World War II, the first attempts to explain the monstrous evil that had occurred became attempts to identify the monsters who had initiated it.
The Nuremberg trials offered an opportunity to study the major architects of at least the European war closely, and the I-was-only-following-orders narrative that emerged from the trials and took hold in the popular mind supported the perhaps comforting idea that evil arises from within a few bad people, and that if we can identify, control, and/or eliminate such people, then we will be safe from such evil. This is the dispositional view: some of us are disposed to do wrong. This approach turned out to be a kind of intellectual blind alley, insofar as psychological tests performed on the major Nazi war criminals showed little evidence of psychopathology, and later attempts to identify an authoritarian personality disposed to cruelty were similarly barren. Recent scholarship has shown how broad and deep societal participation in the Holocaust really was, and that too has made it abundantly clear that bad apples cannot be the whole story. In fact, this comforting illusion was shattered in the 1960s by a series of now iconic studies in social psychology, and I suspect that many of you have heard of Stanley Milgram's obedience studies, of which this year happens to be the 50th anniversary, and also Philip Zimbardo's Stanford prison study, which was conducted about a decade later. You'll recall that Milgram brought average people into a psychology laboratory in New Haven, Connecticut, and constructed an elaborate ruse in which he asked them to apply what they thought were electric shocks to a person they thought was an innocent stranger. In the most famous version of the study (there were actually 24 variations with wildly varying outcomes, but that's a story for another day), about two-thirds of the subjects administered what they thought was a 450-volt shock to a stranger.
Before conducting the study, Stanley Milgram had asked a group of psychiatrists and other academics what percentage of subjects they thought would administer the highest shock level of 450 volts. Their answer was less than 1%. Now, as a bit of a Milgram skeptic, I'll say that those making the prediction couldn't know exactly what the subjects would experience, and that there are indeed conditions in which compliance is that low, but the central point remains: we are not nearly as good at predicting what we'll do in certain circumstances as we think we are. In the Zimbardo prison study, college students were randomly assigned to be either prisoners or guards in a simulated prison in the basement of the psychology building at Stanford University. Intended to last two weeks, the study was cut short after only six days because of the abusive behavior that soon developed among the guards. These two studies, perhaps more than any others, directly challenged the dispositional view and offered instead the idea that anyone could be induced to commit acts contrary to their internal moral beliefs when exposed to certain situations. This view became known as situationism and has dominated psychological thinking about such matters since. Without getting into too much inside-baseball psychology history, these two competing and alternative conceptions have more recently merged into what is, in my opinion, a more sensible and realistic interactionist approach. I think most of the lay public embraces a kind of naive dispositionalism in explaining some behavior. For example, when we are successful or when others fail, we tend to see these as consequences of internal attributes: I'm smart, he's dumb. We become situationists, however, when we fail or others succeed. Well, then it was clearly Lady Luck who frowned or smiled upon us to produce such outcomes. Psychologists call this the self-serving bias, and it is a characteristic attributional pattern.
My broker demonstrates this every time I talk to him. If my modest investments are doing well, he regales me with the complex and shrewd details of the strategies he has cleverly employed on my undeserving behalf. But when there is red ink to be explained, suddenly the discussion is all about dismal market conditions, the so-and-so's in Washington, the Fed, the president, and pretty much everybody who is not him. So most of us intuitively know that behavior can be determined both internally and externally, and we use that intuition to make our world more agreeable. We may just need to think more clearly about when behavior is more likely to spring from the one or from the other. So in the interactionist view, both internal dispositions and beliefs and external situational factors play important causal roles in our behavior. While empirical research in this area is difficult to do and not always straightforward to interpret, I think it is fair to say that there is good evidence that behavior is not always consistent with our internal beliefs, and also that situations do not affect each of us the same way. There is a personality construct, called locus of control, meant to capture differences in our beliefs about the forces that control our lives. Some of us are externals, people whose attributional style is weighted toward environmental and social factors, and others are internals, whose tendency is to attribute events in our lives to our own actions. These belief patterns can be reflected in varying susceptibility to external influences and in behavior. So I said at the outset that I was going to leave the philosophizing to the philosophers, but I do want to point out something that we psychologists have rediscovered, I think, that is underappreciated by those untutored in our field. Most of us have the intuition that there is something going on in our heads that causes us to do what we do, to behave in certain ways.
Insofar as we think about the origin of what is going on inside our heads, we most likely attribute our behavior to a set of ideas and values. But our behavior, as I have been at pains to point out and will address in more detail shortly, can also be affected by our external environment. One component of our external environment is behavior that we ourselves produce. Once we do something, the something we have done reenters our heads through the senses and becomes something else that we have observed. There is thus a sense in which the causal pathway is reciprocal: our ideas and values produce behavior, but can also be influenced and affected by that behavior. This was famously demonstrated by Leon Festinger and James Merrill Carlsmith many years ago. They asked research subjects to perform a very boring task: rotating and aligning thread spools on a matrix of dowels mounted on a board. After performing this tedious, boring, and annoying task, the subjects were randomly divided into two groups. One of the groups was paid $1 to tell people in a waiting room, who were of course in reality accomplices of the experimenter, that the task they were about to perform was enjoyable and fun. The other half of the subjects was paid $20 to tell the same lie. At the end, all the subjects were asked to fill out some forms and questionnaires, among which were questions asking them to rate the enjoyability of the task. Which of the two groups, the $1 group or the $20 group, rated the task more favorably? Well, most people, when first exposed to this story, think that the $20 group would rate the spool-turning task more favorably. But of course the opposite is true. Why? Festinger and Carlsmith coined the term cognitive dissonance to explain the results. We are rational and rationalizing creatures and seek to create a consistent vision of our world, including consistency between thought and deed. The deed to be explained in this study is lying about the enjoyability of the task.
The $20 group has an easy explanation as to why they lied. In fact, they have 20 easy explanations as to why they lied. But the $1 group has a harder task. A dollar, even in the 1960s, does not seem like enough to justify lying about a task so patently tedious and boring to hapless strangers in a waiting room. So the $1 group shifts their perceptions of the task in a more favorable direction, which reduces the dissonance between thought and deed, between belief and act. So if virtuous behavior is an end, then it can also be a means. Which brings us to the broader question of how we can promote consistency between our internal convictions and our overt behaviors. If you're still struggling with the idea that we don't always do things for the reasons that we think we do them, I'll simply ask you to suspend disbelief a bit longer. I'm not one of those psychologists who seems to take a perverse delight in detailing how stupid and incompetent we are as a species. I'm actually pretty impressed with us. But we do have a few peculiarities built into us, and so a careful reading of the owner's manual, I think, can be a very useful exercise. So one way to promote consistency is to ensure that our conduct is never inconsistent with our convictions. Not even a little, not even once. As a teacher, this means to me that when I hear cadets talking before class and they say things that are intolerant or inappropriate, I correct them, even if it was in a half-joking context, and even if it makes me seem like kind of a humorless jerk when I do it. This was brought home to me by a colleague who has a daughter with Down syndrome, a developmentally disabled child, and you still hear cadets sometimes use the word retard, for example, in their conversation. I was with my colleague one time when this occurred, and he immediately explained in a very nice way how disrespectful he felt that was and how hurtful it was to him and to his daughter to hear that word used.
And the idea of maintaining a respectful attitude toward other people is important to us as teachers, and so that is yet another example of the kind of behavior that we need to be vigilant about correcting each time it occurs. Soon-to-be Major General H.R. McMaster has explained how important it is for soldiers to treat the population with respect in an insurgency situation, not only because of the direct effect it has on the goals of counterinsurgency, but because of the indirect effect it has in protecting our soldiers from the morally corrosive consequences of their own actions. He specifically forbade the use of disparaging terms to refer to Iraqis, for example, when he took the 3rd ACR to Tal Afar, recognizing that, like Festinger and Carlsmith's subjects, soldiers who engaged in disrespectful verbal behavior would eventually shift their cognitions in that direction, thereby taking the first steps on a slippery slope that would feed back eventually into more abusive behaviors. Dehumanizing language can easily lead to physically abusive actions, which in the end will be detrimental to the overall mission. Now, I may be dating myself here, but I think the original Star Trek series is a source of many valuable leadership insights. A particularly important one is the tension between cold logic and rationality on the one hand, and raw emotion and passion on the other, that is acted out by Mr. Spock and Captain Kirk. In the military we tend to think of emotion as a bad thing, as something to be suppressed or ignored. But emotions are part of the fabric that connects us as humans. Holocaust perpetrators frequently reported that they didn't think they were doing anything wrong, based on their education, upbringing, and so on, but that they felt revulsion when they committed atrocities and had to steel themselves against these emotions.
When emotions are deadened by fatigue and exhaustion, by ideology, or by overexposure to traumatic stimulation, we are at risk when we make decisions that have a moral tone, because emotion does and should play a role when we make decisions that affect others. Not necessarily the determining role, but a role. The events that introduced the world to Abu Ghraib took place in late October and early November 2003, though they did not reach public awareness until the following spring. Abu Ghraib has in some ways outgrown itself as an event and is now routinely invoked in discussions of torture and abuse, though the invocations are often more symbolic than substantive. I have argued elsewhere that the Abu Ghraib cases were not an especially useful framework for discussions of torture and interrogation policy, that in fact the cases that were most visibly prosecuted, those associated with the infamous photos, were chosen specifically because they did not involve interrogation and instead reflected only the individual sadism of the soldiers involved. The cases may not tell us much about torture policy, therefore, but they are still a useful way to frame discussions about ethical behavior, I think, especially in a military environment. Early on, there were two competing narratives about Abu Ghraib: the bad apple narrative and the bad barrel narrative. The administration favored the bad apple narrative: just a few rogue soldiers enacting Animal House on the night shift. Opponents of the administration saw Abu Ghraib as the inevitable consequence of policies promulgated by Bush, Cheney, Rumsfeld, and company. I think the popular understanding nowadays favors the bad barrel view. A majority of Americans now sees the Iraq war as not worth having been fought, many people mistrust the leaders that got us into it, and people generally venerate soldiers and the military unconditionally.
So the idea that the soldiers were kind of low-level scapegoats, I think, resonates with many people now. But the complex reality is that the events that occurred that fall a decade ago are not amenable to simple, unidimensional explanation. Attempts to locate responsibility solely in the soldiers, or to place the blame exclusively higher in the chain of command, are complicated by the variety of actions and responses of the individuals involved. Attempts to understand Abu Ghraib are also frustrated by matters of definition. What exactly was Abu Ghraib? Was it the cases that were charged, relating mainly to the photos turned in by Sergeant Darby? Or should we consider events that quite probably took place outside the glare of public scrutiny created by the trials, events that might relate more directly to torture and interrogation policy? The trials created a voluminous public record of the events surrounding them, so they are perhaps the best place to start. At one end of the spectrum we find Charles Graner. By all accounts Graner was the dominant figure among those in the infamous photos. While not the ranking soldier present, he was a powerful informal leader. Graner received the longest prison sentence of those charged in the Abu Ghraib cases for the violent abuses he committed, and he was completely unrepentant. He proudly emailed photos of himself abusing detainees home, and his sexual relationships with women in his unit, which existed long before Abu Ghraib, testified to his contempt for the rules and regulations to which he was nominally bound. We need not agonize long over Graner. His behavior is readily understood as nothing more than what it was: just bad behavior. No deep moral conflicts, no subtle pressures from the Secretary of Defense. It was just bad behavior. Graner is an old-fashioned bad apple, but he is not the whole story. At the other end of the moral spectrum we find people like Specialist Matthew Wisdom.
This young soldier observed the goings-on at the hard site and was appalled. He was not confused about whether this was right or wrong; he was in no doubt that these activities were improper. He did not stand idly by, as some did, nor did he join in, as did others. He left and reported what he had seen to his superior, an NCO. Unfortunately, the NCO disbelieved him and sent him back. The deliciously eponymous Specialist Wisdom left a second time and reported to the NCO again. This time he was sent elsewhere, and the rest, as they say, is history. But Specialist Wisdom epitomizes the smooth operation of the concepts of virtue and character: he knew what was wrong, and without encouragement, and indeed despite pressures to the contrary, he did the right thing. The apparent source of his good behavior was as internal as was Graner's bad behavior. But Specialist Wisdom is not the whole story either. The more psychologically interesting cases are those more numerous individuals who found themselves doing things they knew or suspected they should not be doing; they represent the majority of the cases. Why did the behavior of these people deviate from the internal regulatory framework they possessed? One answer lies in the social relationships among the soldiers. The Milgram and Zimbardo studies are but two of the better-known examples of social influence. Social psychologists have described the many ways people are influenced by others. Graner was an experienced prison guard in civilian life and was an intimidating and aggressive personality. Some of us can remember moments in our lives when we were carried along by the group, perhaps because we didn't want to single ourselves out, or because we wanted to ingratiate ourselves with those higher on the social food chain, which in my case would be everybody.
These are not moments of which we're generally proud, and hopefully their rarity increases with our maturity, but they're genuine human moments nevertheless, rooted in the behavioral tendencies that have evolved with us as social creatures. In addition to these social considerations, there are also vertical ones: aspects of the soldiers' relationships with their leaders and the institutions of which they were a part are also to be considered. In the aftermath of the Abu Ghraib cases, a flurry of investigations and reports appeared. These reports are nearly unanimous on one thing: Abu Ghraib was a mess. Living conditions were atrocious. Soldiers were billeted in dark, dirty facilities. Sanitation was poor, security was poor. Soldiers spent long hours tired, hungry, hot, dirty, scared. Harkening back to my days studying human performance as a military psychologist, I'll simply point out what you as commanders and leaders know far better than I: that soldiers can and will step up to the challenges they confront, but that there are limits to the checks that they can write against their reserves of resilience. Fatigue and sleep deprivation and environmental discomfort compromise task performance and diminish our capacity to process information. Working memory is impaired. Moral decisions are among the hardest decisions that we make. We psychologists used to talk about moral judgment primarily in rational terms. You may have heard of Lawrence Kohlberg, a psychologist who developed a stage theory of moral reasoning that has been widely applied. More recently, we've come to realize that moral intuition is just as important as moral reasoning. I mentioned earlier the idea that empathy, genuinely putting ourselves in the place of others, plays an important role in guiding our behavior.
And insofar as the conditions under which these soldiers served obstructed both their capacity to think clearly and the empathic responses they might otherwise have exhibited, the deterioration in their moral performance can be understood as arising from the same stressors, the same stressful conditions, that caused their reaction times to lengthen and their working memory to decline. And some of that must be laid at the feet of their leaders. You may say that those who accept the obligations of military service expect to have to serve under uncomfortable, dangerous, stressful conditions, and that if the actions of these soldiers were partly a result of exposure to stress, then this was nothing more or less than weakness. But it is not merely the absolute level or the duration of stress that matters. Again, you as commanders and leaders have experienced what I as a fuzzy-headed egg-sucker have only observed: that a key task of leaders is to mediate stress. Leaders who mediate stress, who literally get in the middle between the soldier and the stress, also mitigate that stress. Soldiers can deal with a lot if they understand why they have to deal with it, or if they are convinced that those responsible for them are doing everything they can to provide for them. Soldiers who have a relationship of trust and respect with their leaders, who feel a part of something larger than themselves, are able to endure a great deal. But what if soldiers feel abandoned by their leaders? If they feel their leaders have betrayed them, if they feel that no one up the chain gives a damn about them, then the corrosive effects of these conditions may well be amplified. Another common thread in all the investigations and reports about Abu Ghraib was the lack of effective leadership.
There was confusion about who was in charge, there was confusion about what was acceptable, there was poor supervision and accountability, and at least some of the soldiers were left to figure out how to deal with the conditions in which they found themselves pretty much on their own. So there is no simple, one-size-fits-all explanation for what happened at Abu Ghraib. We psychologists have no problem with that state of affairs, because the first thing they teach us in graduate school is how to say "it depends" in response to every question we hear. But others may be less comfortable with that level of uncertainty, with the equivocation and qualification that are my stock in trade. I teach psychology to Air Force cadets and emphasize with them that human behavior is always to be understood according to the biopsychosocial model. Who we are and what we do is jointly determined by the genes we inherit, by our experience, by those around us, and by what we think and choose to believe and to do. In the context of Abu Ghraib, some of the observed behavior seems to be attributable to largely internal factors. There were bad apples at Abu Ghraib who were not very nice people and whose behavior at Abu Ghraib was largely consistent with their behavior before Abu Ghraib. But they were not the whole story. There were also good apples like Specialist Wisdom, whose internal gyroscope was unperturbed by his experiences, and there were also average apples whose behavior was affected both internally and externally. So external factors clearly played an important role at Abu Ghraib. The immediate environment in which these soldiers served was not one of which we should be proud, and it is easy for us to point the finger at unit leaders as responsible for this, but it is worth remembering that we citizens are ultimately responsible for what is done in our name. After Vietnam we adopted an all-volunteer force that arguably was inadequate to the cumulative demands of the last decade.
As a society we have not asked too many questions about what goes on in the military, so long as we are not asked to be part of it. And as a society we have permitted our elected representatives to largely cede the decision of when and how to go to war to the executive branch. I know that I am in imminent danger of sounding as though I am unnecessarily politicizing this discussion, and given my haircut I should probably shut up now, lest I be badly misunderstood. But I simply ask you to consider our collective role in creating the conditions that led to Abu Ghraib. Speaking only for myself, I will say that I try not to be too smug when I think about those whose lives were changed by those events. I don't think that we Americans have fully confronted the issues that gave rise to the policy confusion that arguably contributed to at least some of the abuses at Abu Ghraib. What is torture? Is it ever permissible? If so, when? How is it to be regulated? We now learn that there are perhaps more instances of collateral damage from drone strikes than we had previously thought. What moral example does our acceptance of that set for our soldiers? Do we need to think far more deeply about how to talk about our moral posture and our ethical duties to those who serve in uniform? How can we help those with whom we serve navigate the complex moral issues we confront? For me, the takeaway lesson from Abu Ghraib is that because behavior is multiply determined, we cannot as leaders rely on a single determinant of behavior to carry all the water. Yes, internal beliefs and convictions play a role in determining our behavior, but not the only role, and not always the dominant role. As leaders we are enjoined to help those who serve under us to talk about their conduct. We may indeed help them develop and refine their value structure. It's my understanding that in the United States Marine Corps, so-called dilemma training is used to work through difficult scenarios with young Marines.
This is perfectly consistent with what we as psychologists and scientists know, and it's to be applauded, but our obligations in promoting ethical conduct cannot end there. We must also recognize that we humans are by our natures affected by what goes on around us. Leaders are also obligated to monitor and manage the social and physical environment of those we lead in such a way as to make it easier, not harder, for them to do the right thing. In this sense, all leadership is ultimately moral leadership. So Abu Ghraib as a case study frames our discussion vertically. Only one officer was court-martialed, and he was acquitted, so we tend to look down the chain for explanations. But there is also a stream of cringe-worthy revelations in the news about misconduct among mid- and high-level officers and leaders that never seems to end. What can we say about those cases? One fun thing about being a psychologist is learning how little we know about things most folks assume are well understood. So, for example, why is our central nervous system wired such that one side of our brain controls the other side of our body? Nobody knows. Why do we sleep? Well, we don't really know that either. There are two competing alternative explanations for sleep: the restore and repair hypothesis and the preserve and protect hypothesis. Now, if you'll just bear with me a few more moments, I promise this will get back to the topic. The restore and repair hypothesis says that sleep replenishes resources used up during the day's activities. The preserve and protect hypothesis suggests that sleep is just nature's way of keeping us out of trouble. The idea that nothing good happens late at night is easy to accept if you've been a parent of two girls, as I have. Or if you've had to do dorm patrol at the U.S. Air Force Academy as an Army officer many years ago, as I also have done.
I'll never forget reading the rulebook when I took over the duty one night, which said that opposite-sex cadets were not allowed to share the same horizontal plane in a dorm room with the door closed. That strains my geometric visualization capabilities. But my point is that when it comes to darkness, mother nature may be on to something with this sleep thing. We are very visual animals, and wandering around in the dark, deprived of the benefits of our exquisite visual sense, is probably a bad idea. Bats do the same arithmetic and sleep on the opposite cycle, because their competitive advantage is using sound to see in the dark, when they can't be seen. Bats and hormonal teenagers are relevant to leadership. Bad things happen in the dark for us humans because we are supremely adapted and specialized for survival in the light of day. And I think that this is true in the social and ethical domain just as it's true in the realm of vision and physical safety. We're visual creatures, and we are social creatures. We have evolved a complex set of social mechanisms by which we regulate our own conduct. Hierarchical, authoritarian cultures such as military culture offer the potential to short-circuit or distort those normal mechanisms of reciprocal social influence that guide our behavior. Some leaders struggle to keep the lights on in their relations with others. They try to preserve the benefits of social exchange within the limits and constraints of the requirements of military discipline as best they can. There are others, though, who like it dark, who create organizational climates in which interpersonal relationships are impoverished and stunted from the point of view of normal social discourse. They may use intimidation and fear to insulate themselves from the illuminating insights that we get from one another, or they may simply not put enough effort into ensuring that there is a climate of openness and trust and respect.
I suspect that at least some of the cases of leader misconduct that occur are related to this kind of social darkness. So as commanders and leaders, again, you know far better than I how difficult it is to maintain some semblance of normality in social relations within the very real demands and constraints of military culture and tradition. But just as our visual systems use light to keep us from walking off physical cliffs, our emotional and ethical systems are informed by the thoughts and feelings of those above, below, and around us, and keep us from walking off ethical cliffs. The message I'd like to leave you with is that little things matter. Our ethical nature is revealed in high-stress moments of decision, but it is developed in thousands of mundane, routine, seemingly innocuous interactions and events that happen every day. We create our own ethical natures with our thoughts, with our behaviors, and with our responses to others. We shape the ethical natures of others through the examples we set, the information we share, the rules we break. It's easier to know good than to do good sometimes, but all we can do is to try to make our little corner of the world a corner in which doing the right thing is easier rather than harder. Thank you very much for your attention. I do apologize for the drilling. I guess, John, did you make it stop, maybe? That's been going on for weeks in my office, so I'm used to it, but you're not. What we wanted to do is have a bit of a dialogue between us and then open it up for discussion, because I think this is really an important area for us to think about. It really is a response to the philosopher's original sin, I think, which goes all the way back to Plato, who said nobody knowingly does wrong, and what the psychologists are here to tell you is: sure they do. They do it all the time, and they do it for all kinds of reasons that we don't fully appreciate.
So I think if we're really to understand actual human behavior, both your own and that of your subordinates, the kinds of things that George is talking about are absolutely vital for us to think about, and we don't think about them much, I don't think, in the military. Let me start with a couple of questions, George. You talked about the role of demeaning or derogatory language, even if it's used, or apparently intended, in jest. I remember when I was teaching at the Army War College, I had a colonel, who's now a senior Army leader, who routinely said "you homo" to people, just as a term of joking derogation. And I remember calling that guy into the office and saying, look, you really probably need to knock that off for a variety of reasons, not the least of which is that you're all going to have gay people working for you. If not, and this was long before the Don't Ask Don't Tell repeal, you're going to have civilians working for you. I think he ended up a two-star. He took it hard; I don't think he did it anymore, but he certainly didn't intend to be doing anything malicious or evil by this. Can you say a little more about what the research shows about tolerating just loose use of language of that sort, of joking derogation? I think the issues of respect and human dignity are things that we at the Air Force Academy at the moment are taking very seriously, given some of the things you may have read about in the news at the Air Force Academy. One of the approaches that we've taken is to scrutinize very carefully the way we talk about things, and it goes back really to the Festinger and Carlsmith study: verbal behavior is a form of behavior, and if we hear ourselves using dehumanizing language to talk about a particular group or category of people, then it's to be expected that our thinking about those people will ultimately follow suit, just as the subjects of that study came to form a different judgment of the task they had performed, quite unconsciously.
And one thing that we know about behavior is that it tends to change incrementally and gradually, and so the acceptance of verbal insults about people can easily lead into, say, treating them roughly at a checkpoint rather than treating them more courteously or carefully. And acceptance of that relatively minor form of disrespect can then escalate into still more abusive behaviors. And so it's the slippery slope that you get on with verbal behavior that is concerning, I think, to us. And for that reason I was very impressed with then-Colonel McMaster. He showed me a letter that he had sent out to his troops in the 3rd Armored Cavalry Regiment before they went to Iraq, and he spelled out in great detail what they would and would not say. I mean, I don't know that Colonel McMaster was aware of the Festinger and Carlsmith study, but he certainly was aware of the underlying principle that behavior in little things makes a big difference. Well, I know; I think his General Order 1 was "every time you offend an Iraqi, we're losing." But I think most American forces in Iraq pretty routinely referred to Iraqis as rag heads or, still worse, as Hajis, which is a very honorific religious term in Islam. So you're pretty confident that that has something to do with the brutalization that followed. Then let me flip the question. Karl Marlantes, who spoke here last spring and will be back again this spring, wrote this wonderful book What It Is Like to Go to War, in which he talks about what he considers the necessity of what he calls pseudo-speciation, of differentiating yourself from the enemy in some way to make it psychologically possible to do the things you do. So how do we square, if we accept at least Karl's argument that pseudo-speciation is a psychological necessity, that necessity with the effects of derogatory use of language of the sort you're talking about? Well, I think they're closely related.
I mean, dehumanization is a step toward abuse that's widely recognized in the genocide literature, for example. People often talk about the language that was used in Nazi Germany, for example, to refer to Jews, you know, as insects or vermin or bacilli. And in the case of Rwanda, similarly, there was a specific dehumanization of each side by the other. So dehumanization serves a purpose. One of the things that's true about the Milgram studies, for example, is that the conditions in which more people applied shocks were those in which the victim was further away. So there was actually a condition in the study where the people were sitting as close as Martin and I, and I would have to hold Martin's hand down on the shock pad in order to deliver a shock to him. In that case, very few people administered shocks. There was another condition in which Martin would have been in the other room behind a closed door, and you weren't able to hear anything that he said. And in that case, almost everyone administered shocks. So proximity to the person makes a huge difference in people's willingness to injure them. And dehumanization produces a psychological distance from the person. It puts them in a category in which, you know, their deservingness of certain rights is altered by that category. I've seen a video of a version of the Milgram experiment where the person was in another room, but there was a glass; they could see them. And by the time you're up to the 400 and 450 level shock, as I recall, the actor is screaming, writhing in pain, and eventually passes out. But 60%, I believe, was the number in that version of the experiment who were willing to administer the shock. That's right. That's the classic version. There wasn't actually a window, but there was a wall. Sorry. But the individual would thump on the wall, and you would hear voices come over the intercom.
And in that condition, two-thirds of the people administered the shocks all the way up to 450 volts, which was very surprising to most people. So is it fair to draw the inference from that, that although all of us in this room think we would refuse, if we were just put in a room with an experimenter, told we could leave the experiment if we like, that they have no authority over us, and they're just speaking to us in a calm voice saying the experiment must continue, is it a fair inference to real life that most of us would do that, in the face of authority? I mean, I think it's a fair inference to suggest that that would happen in that situation and in that context. Now, that's the "it depends" argument. I'm asking you about the real-world application. That's no fair. You know, and that's an argument that I've made, actually, about the applicability of the Milgram studies to the Holocaust itself. I'm not sure that the situations are really comparable. What occurs in the Milgram study is, as I think Lanelle mentioned yesterday, oftentimes we're not confronted with a choice between a right and a wrong. We're confronted with a choice between two rights, and we have to balance those two rights, and that's really what happens in the Milgram study, because the subjects have two duties. They have a duty to the experimenter who has brought them there and is telling them what to do, and they also have a duty to the subject, and they, in those particular circumstances, balance those two competing duties in a way that most of us regard as inappropriate. So, I will say this: the Milgram studies have never been fully replicated, because most people consider them to be highly unethical now, and I would agree with that assessment. But they have been partially replicated many, many times, most recently.
I think it was in 2008, by a fellow named Burger, and the results were very similar to what they were in 1961, so it would be inappropriate to look back and say, you know, people in New Haven in 1961 were different somehow than we are today. We're not. Another set of empirical studies you didn't mention, but that I've gotten really interested in, are Dan Ariely's studies on cheating. I mentioned this before, so forgive me if anyone in the room has heard this already, but he did a study in which students came into a room, they were seated at desks, they were given a set of math problems, asked to solve the math problems and then to pay themselves various amounts of money for every successfully solved math problem, and to tear up the sheet. In the default condition, just doing that, most people cheated a little: they said they solved one or two problems more than they probably did. But then the intervention is he puts an actor in the room, and the actor gets up as soon as the experimenter leaves the room and says, I've solved them all, takes all the money, and walks out. And what he's interested in is whether the presence of the actor would increase or decrease cheating in the room. And let's just stop for a moment to test your own intuition about that. I won't ask you, but ask yourself: what do you think, would it increase or decrease cheating?
The answer is: the study was done at Carnegie Mellon in Pittsburgh, and if the actor was wearing a Carnegie Mellon sweatshirt, the cheating went way up, but if the actor was wearing a Pittsburgh shirt, for the university on the other side of town, then it went way down. And the best hypothesis to explain this would be, I think, that when your team, the guy in your sweatshirt, is perceived as tolerating a behavior, then that becomes a more acceptable behavior, whereas if it's the out-group, the guy you want to be better than, it will tend to drive it down. Now that seems to me highly relevant for a military context, because it's all about sweatshirts; it's all about in-group behavior. Any thoughts about that? Yeah, I'm reminded too of other studies. In the Milgram study, for example, I said there were 24 conditions, and some of the manipulations involved bringing other people into the situation, and even bringing one person into a situation like that dramatically changes the results. If you are the lone person confronting the experimenter and trying to resist his entreaties to do a certain thing, and you have one person sitting next to you who supports you in your resistance, then compliance with his requests just plummets. So the presence of another person, especially a person who is perceived as having high status, is very powerful in changing a social situation. And that's an important lesson, I think, for all of us: you can be that one kid on the playground who raises your hand and says, no, that's wrong, you shouldn't be bullying that other person, and it will have a big impact, just being that one person. So what would be the practical military application of that insight, in terms of how a leader would organize a unit or a team? This goes back to some of the conversation yesterday when Dr.
Longenecker was here in the afternoon. I think the idea of openness and transparency in leadership relations is important, because you want a situation in which people feel comfortable voicing dissent when things appear to be going wrong. General Powell in his book talked about how he liked a noisy leadership environment, how he liked it when people were talking and fussing, because then he knew he was hearing the things that he needed to hear. And I think that sometimes we use our positions of power and authority to inhibit the sort of open dialogue and conversation that we normally hear around the dinner table, for example. Again, within the limits of military tradition and culture, I think we help ourselves by permitting those sorts of interactions to occur. I'd like to explore the cognitive dissonance stuff a little further as well. The Festinger study I know is the book When Prophecy Fails, where he studied a sort of space alien cult, this group in California that believed that flying saucers were going to come and whisk them away to salvation on planet Xenon or whatever it was. And they had specific dates when this was supposed to happen, and up to the date they were very secretive and internal and didn't really talk much outside the group. But after the date had come and gone, and come and gone a couple of times after recalculations, the weirdest thing happened, which is that the people who were kind of marginal to the group dropped out, just gave up on it, but the core group were ever more dedicated to the truth of their original belief. This suggests that when faced with objective disconfirmation, we're often inclined to double and triple down on the false belief that got us there. Can you tell us a little more about that? Well, in the particular instance of cults, there's a powerful set of social mechanisms that bind those people together and cause them to engage in seemingly heroic attempts to reconcile their beliefs with an obviously inconsistent reality. And those social mechanisms, I mean,
these are typically people who have been socially marginalized and are brought into the cult and bound to the leader, in the case of the Jim Jones cult through religious beliefs. In the case of the space alien cult, they oftentimes have to surrender all their property to the cult leader. All of their friends and social relations are within the cult, and so all those social mechanisms that we use to regulate each other's behavior in their case become highly concentrated in the hands of one person, the cult leader. In many cases they have sold their houses and all their property and given it all to the cult, so they're committed to it in a way that most of us are not committed to other things. But it is testimony, I think, to the power that we have to shape our cognitive world in ways that are comfortable for us. We tend to see things as we want them to be, rather than as they really are, and it takes effort to force yourself not to do that and to see things as they really are. Now, where I'm going to go next I do with trepidation, but I want to do the military application of what you just said. The military, especially with the all-volunteer force, is pretty isolated from the American people. Admiral Mullen used to say, the American people don't know us, we don't know them, and that's the main thing that worries me. Military people tend to live in environments of people mostly like themselves, who tend to think rather similarly in general, and to be exposed at relatively lower levels to other perspectives. So to what degree is this cognitive dissonance phenomenon a risk for the health of military culture and the health of civil-military relations? Well, I think it is a risk. Again, at the Air Force Academy, it's a fairly homogeneous student population that we have, and three quarters of our faculty are Air Force officers, so it's difficult sometimes, I think, for cadets to appreciate the variety of opinions that other people have. And moreover, I think
there's a dangerous tendency among some in the military to express contempt for civil culture, for civilian society, to view the military as somehow morally superior to civilian society as a whole. I personally think that carries with it risks not only for civil-military relations, but it's also inimical to the development of behaviors in the military that will be consistent with those of civilian society. So if people in the military talk only among themselves, you know, if you walk into any room at the Air Force Academy that has a television in it in a public space, it's on Fox News. That's not necessarily a bad thing; I like to watch Fox News sometimes. But never hearing the other viewpoint, I think, is problematic. I guess one last thing, about Zimbardo. I mean, that's truly an amazing story; if you haven't read that study, I recommend it. It's just Stanford undergraduates, right? Peers, just picked out of the random Stanford population taking a psych class, as I understand it, randomly assigned to be either guards or prisoners, and within three days the guard group is pretty seriously abusing the prisoner group. Now that just seems on the face of it totally implausible in that short amount of time, especially with people who are in many respects like you. I mean, so this pseudo-speciation thing isn't really happening; you haven't had a history of using this derogatory language toward them; three days before, they were your buddies, your dorm mates, whatever. What does that tell us? I mean, what is the takeaway from the fact that that can happen even in that environment? I'll preface what I'm about to say by noting that Zimbardo recently wrote a book called The Lucifer Effect, which relates the Stanford prison study to Abu Ghraib, and I think there's a great deal about the Stanford prison study that is problematic from a methodological viewpoint. There was a lot more experimenter influence on the
subjects' behavior than Zimbardo has admitted in the past. But that notwithstanding, the central point that Martin makes is correct, which is that when you put people in a particular role, and that role carries with it certain expectations, and you don't regulate them very carefully, then they oftentimes will assume a much more exaggerated set of behaviors that they think represents whatever role they're in. These were not trained prison guards; they were college kids, and what did they know about prison? Not much. But we've all been through basic training or experiences like that, and we see with cadets all the time, when they get into the basic training environment at the Air Force Academy, some of them have to be regulated quite carefully, because they assume that role as trainer, and it looks a lot like the Stanford prison study: behaviors can become abusive very rapidly. Others are not so affected by that role behavior. One of the things that people ask about Abu Ghraib is, where the heck were the officers? You know, there were 8,000 to 10,000 detainees at Abu Ghraib, and it was less than a battalion on the facility, and so people were spread thin. They put the two most experienced guards they had on the hard side, assuming that they could be trusted with the least supervision to operate reasonably. It turned out to be a bad judgment. But I mean, I think the lesson of the Stanford prison study is that it's important to be scrupulous about being where your soldiers are as much as you can. I mean, it's hard in this day and age, but you need to supervise carefully. You know, George wrote a very nice piece in Parameters, which is the Army War College's journal, about Zimbardo and Abu Ghraib. Zimbardo ended up testifying at at least one of the court-martial trials, and what you point out in your article is that when it comes to the question of ethical and legal responsibility, Zimbardo's commitment to situationism
basically makes him incapable of making any judgments of responsibility. So how should we think clearly about how to balance accepting the situationist critique while not washing away all questions of accountability, moral and legal? That's a really tough one, and I think squaring that circle is something that has gotten situationists in trouble again and again and again. Zimbardo continues to say that the situation is determinative of behavior, that the transformation that occurs in a situation like Abu Ghraib is, according to Zimbardo, inevitable, that it would affect anyone that way, which sure sounds as though that person would therefore not be responsible for their actions, if it was that inevitable. But then he says, on the other hand, that the person is still responsible for their actions. And an action that is involuntary, it would seem to me, would not have a moral tone to it. So he never, in my opinion, and I don't think any of the situationists ever, articulates a reasonable way to balance the moral tone of actions against the situationist explanation. The way it works out, I think, and I talk a little bit about this in the Parameters article, is that when people perceive that there are situational constraints, in courtrooms for example, mitigation in punishment is often the result; not necessarily acquittal, but mitigation. So in the case of Abu Ghraib there were two dog handlers who were charged with using their dogs inappropriately, and there was significant evidence of command influence on those two particular dog handlers. There was a civilian contractor who had pretty much told them to do illegal things with their dogs. He couldn't be prosecuted, because at the time the Military Extraterritorial Jurisdiction Act, did I get it?
Okay. The act did not permit that. But when they were court-martialed, they received very mild sentences, because, I think legitimately, the courts said, well, you know, these guys were under influences that might reasonably have been expected to affect their behavior, and so their punishment was mitigated as a result. So I don't think there's a logical resolution; I've never heard a coherent discussion of how you can reconcile the two things. But I think, as a matter of practice, our court systems do seem to take those factors into account, at least in some cases. Well, the examples that you gave in your talk: we've got Graner at one end of the scale, clearly a bad apple, clearly a bad guy who should get the full force of the law. You've got Wisdom, clearly a good guy, good to go. Then you had your in-between cases, and Lynndie England. Tell me about her. I mean, what's her level of responsibility? Well, I mean, she received a relatively short sentence. She was involved in some of the more egregious photographs, which, as I think was mentioned on the slides, took place exactly ten years ago today; actually, today is her birthday. She was, by most accounts, a very pliable sort of person, a person who did not have a strong personality, and Graner had a very strong personality. They were sexually involved. And so, if you listen to her talk, and I've read things that she has written, she has said, yes, I knew that what I was doing was wrong, but on the other hand, she will often say, things were happening to American soldiers, and there were emotional complications, and I was under the influence of these other people, and so I did what I did, even though I had the idea at the time that it was probably wrong. So what is her level of responsibility? I mean, I think the fact that she knew it was wrong implies that she ought to be held
responsible at some level. On the other hand, I mean, she was a young kid from West Virginia thrust into a situation which, as I described, was extremely dysfunctional, and to judge her too harshly is perhaps unfair. She at least proved Socrates was wrong when he said no one knowingly does wrong. One last thing, and then we'll turn to the questions. We've seen quite a spate of senior leader failure in all services in recent years. There are only really two logical possibilities here, right? One is that some of these are bad apples who've been at this for a long time and just finally got caught; that's a question about the promotion system and how well we screen and so forth. But what about the other possibility, that there are people who've been perfectly fine in the situations they've been in, and then something about the new environment of the rank that they've risen to is a disorienting situational factor? Any thoughts about, if that were true, what would those factors be, do you think, say at the O-5, O-6, O-7 level? Yeah, I mean, this is pure speculation, but it would seem to me that that transition brings with it a host of changes, some of which may include disruption of previously existing social support systems. So that if you have people around you who you're familiar with, who are willing to tell you when they think you're perhaps going astray, then you can perhaps be more successful in maintaining good behavior. But suddenly, if you're thrust into a situation in which you're kind of alone at the top, the people who have been around you in the past are not there, and you have access to a tremendous amount of power that you haven't had before. I mean, it makes sense to me that it's at least possible that that dislocation of previously existing social support systems, of psychological accountability to other people, could contribute to those kinds of things. I said that was my last question, but I actually have one more. If that's true: Clint yesterday was talking about building
ethical guardrails. I mean, he's not a psychologist, and he's riffing on a Bible story, right? From the point of view of an empirical scientist, how plausible were his guardrail suggestions? I thought they were quite good, actually. I mean, as I was here listening to some of the discussion, and seeing what some of you all came up with in terms of the temptations and the guardrails, I think many of the issues that were raised were right on. I think it's important to maintain openness and awareness and transparency in units and leadership environments as much as possible. And I think that, as much as anything else, the thing that can help us to do the right thing is to listen to other people and to respect our feelings. What I said earlier, I probably shouldn't have said it the way I said it, but I think emotion is important. Not in the sense that if you get really enraged, that's the right time to act; I think you should calm down and think about it a little bit. But on the other hand, certainly when we feel something, that's not necessarily something we should ignore. That feeling is there for a reason; it's something deep within us that is telling us something important, and I think it's valuable to pay attention to it. Okay, great. We have about 20 minutes for questions from the audience, so please don't be shy. Robert. Throughout history, we've seen examples of beliefs or behaviors which society as a whole seems to regard as legal or acceptable in the situation at hand, and then it goes on to change its mind about whether they were even moral or legal. In fact, yesterday during the discussion on constitutional ethics, we talked about how as officers we should make our judgment based on the law