Continuing on, continuing on. Our next presenter is Robert Kurzban. He's the editor-in-chief of Evolution and Human Behavior. And not the magazine, the actual journal. His talk is called Why Everyone (Else) Is a Hypocrite. Here's his limerick. We think we're immune, naturally, to any and all hypocrisy. But we're all guilty of it; no one here's above it. Of course, that doesn't apply to me. Please welcome to the stage Robert Kurzban. Well, thank you for that surreal introduction. I want to say hi, and I also want to say Atariah to any of our Andromedan guests who happen to be with us in the audience today. This is actually my first TAM. I haven't been to a TAM before, although I've read a lot about it, and I have to say everyone here has been really friendly. Although I have to say, not as friendly as I sort of thought people were going to be, given everything I've read about this meeting. If you know what I mean. That guy does. What I'm going to be talking to you about today are some ideas in a book that came out a few years ago. And in many ways these ideas go all the way back to this quotation here by Walt Whitman, where he writes in this stanza: Do I contradict myself? Very well then, I contradict myself. I am large, I contain multitudes. And what I want to do today, in the small amount of time that I have your attention, is try to persuade you that Walt Whitman was deeply right about this. That he anticipated advances in cognitive science and psychology in a way that I think was very deep and very profound. And in certain respects, the opening salvos of some of the ways in which I think Whitman was right, about the ways in which we all contain multitudes, can be found in the revolution that occurred in the 70s and 80s: the cognitive revolution, exemplified here with Marvin Minsky's book The Society of Mind. And he asked this question, which is a really deep question.
It's one of the most basic questions, I think, in psychology, which is that we're all made of kind of stupid stuff, molecules and atoms and proteins and so on, which are not themselves smart, and yet we're able to do really smart things. Although with this conference it's also clear that people can do some really dumb things and think some stuff which isn't too bright. But by and large, people do stuff which is really, really smart. And the crucial part of Minsky's answer that I want to talk about today (and in many ways the work that I'll be talking about really does stand on the shoulders of giants such as Minsky and, in fact, Dan Dennett, who will be talking later this afternoon) is that the answer lies in the very simple notion of parts. That the way that things become intelligent is through the way that parts interact. It's important, by the way, to see how novel this is by challenging your intuitions, because there are certain ways in which we think we know how people get smart, and this is an answer drawn from, again from my experiences here, I'm guessing many of you have probably seen Men in Black. And this is a good audience for Men in Black jokes. So this is a character from Men in Black, and if you remember the film, it turns out this is not a human being at all; this is actually sort of a robot being kind of driven or ridden by the actual smart thing inside of it, which is this soulful creature with the big head and big eyes, right? So that guy is sort of running the show with the levers. And when the human-looking guy is doing something smart, it's really because there's a smaller guy in there who's being really smart and moving him around. And so the big guy's smartness comes from the little guy's smartness, which then just raises the question: well, how does that little guy get smart? You might think maybe there's another little guy in there, right? Which then raises this next question: well, how'd that guy get so smart?
Well, maybe, like Gentle Rosenberg, there's some even littler version in there, right? This can't be right, right? You'd get an infinite regress. So the reason we're smart is not because of some smart center in our brain, some smart bit inside of us that makes the rest of us look smart. The analogy that I use to get across this idea is something else that we call smart, which is a smartphone. So, drawing from Whitman, I sort of think of this as containing multitudes. This is an information processing device, and it has lots of specialized applications. And the reason we call this a smartphone is not, of course, because it's a very good phone. Your phone is like my phone. It's actually not that good as a phone. It's great for all sorts of other stuff, but it's actually not even as good as landlines used to be, which is puzzling and really irritating. And I'm sure a lot of you people are in IT, and I'd like you to get on that, because I find it very irritating. But what makes these things smartphones? Well, we call them smartphones because you can put cool stuff on them, right? You can get the weather on them, and you can play games on them. You can make little memos to yourself. Applications which are specialized to do a particular job. And what I want to argue is that, again, over the last 30 years, one of the things that modern cognitive science has taught us is that you can think of the brain the same way. I am not saying that brains are literally computers or literally digital. I'm saying that there are analogical properties that are shared between brains and smartphones, and particularly this one property, which is that they consist of lots of different applications. Now, I'm an evolutionary psychologist by training, so I don't really talk about applications. Instead, I talk about adaptations. So I think of the brain as this big mechanism that consists of lots of specialized bits inside of it.
So, for example, you're all walking around with brains that have adaptations designed to cause you to care for children, that cause you to build friendships and alliances, that allow you to infer other people's beliefs and desires, that help you look for people who are cheating, that cause you to be able to identify when other people are saying things which are inconsistent. You have mechanisms that guide your mate choice. For example, they cause you to avoid having sex with people who are closely related to you. You'll see I didn't use an example here. I was thinking for this audience of putting the Lannisters in there, but then I thought, eh, eh, right? Right, you have little revenge systems in your head to stop people from doing bad things to you. You have these mate selection systems. You have these systems which are guiding you towards certain kinds of food choices, like those yummy cookies out in the lobby. You have language systems. And one of my favorite systems: you have mechanisms in your head which cause you to evaluate other people's acts as morally reprehensible. There seems to be a lot of that right here. And then another one of my favorite mechanisms is this sort of public relations system. One of the problems that mechanisms with parts have is that you need some bit to speak for the whole thing. And so I think about this as the part of your brain that's sort of involved with persuading other people, talking to other people, producing speech acts; that's this one. So that's sort of the big picture; that's sort of the idea. This is the modern view of the mind that many of us hold. And when I use the word module, I just mean something with a function. So, just like in your cell phone, you couldn't drill a hole through part of your phone and get rid of the Angry Birds application. That would be crazy. That would be nuts. Because who would want to get rid of Angry Birds?
Because it's so fun. The way you would do it is you would uninstall it or whatever. And the reason is that Angry Birds isn't located in one particular spot on your phone, just like a module in your head isn't located in one particular spot; it's distributed through your entire brain, right? So when I talk about modularity, I'm not talking about something spatial. I'm talking about something functional, things with jobs. So here's a photoreceptor. Photoreceptors sit in the retina. All they do is they sit around all day and turn electromagnetic radiation into neural signals. That's all they do. And so they're modular in the sense that they have a narrow job. Okay. So for the rest of my time, I'm going to try to make just one point, which is that all of you people are hypocrites. And I'm going to start here with split-brain studies. This draws on work by Mike Gazzaniga and colleagues. You're looking here at sort of a God's-eye view. Can I say that? A view from above of a patient who has had his corpus callosum severed. So this is the bit that communicates between the two hemispheres of the brain. And when you have such a person, what you can do is show one half of the brain one image and the other half of the brain another image, and you can test various hypotheses about the way that the brain works. It's also really fun, because you can really irritate the split-brain patients, because they get, obviously, two different images in two different hemispheres of the brain. But for my purpose, I just want to make one point about such studies. Consider a patient in such a study and ask yourself this question. When you show this wintry scene to the right hemisphere and the chicken claw to the left hemisphere, what has the patient seen? And what I want to argue here is that this is a bad question. This is a question that assumes something which is contrary to fact, which is that it is sensible to talk about the patient as a unitary whole.
So there's really just no answer to that question, because there's no such thing as the patient qua the entire patient. And this is really important, because if it turns out that human brains are modular even when they're not artificially severed, as split-brain patients' brains are, then it could be that lots of sensible questions you might have asked about us make no sense for the same reason. If you consist of lots of different modules, then it might actually be a problem to talk about what beliefs you hold, because there's different bits of you in your brain. So I want to talk about one way in which you can see this. This is a visual illusion; some of you might have seen it. I don't know if all of you can see the images here, but there's a black box in the back with a letter A, and then there's a lighter shaded box in the middle of that diagram labeled B. I might have those two backwards. I can't see it either. And the first thing I want you to do is evaluate the extent to which you think those two squares are the same shade or different shades. If you have normal vision, then they'll appear very different to you. And what I'm going to do is move those two squares from the left side of the screen into the middle and illustrate that in fact, bizarrely enough, they're the same shade, and you can see that clearly when I remove the surround. One part of your brain, the visual system, has a representation carrying the information that those two squares are different shades. A different part of your brain, the one that sort of puts representations into sentence form, thinks that those two squares are the same shade. So, just like these split-brain patients, you have two mutually inconsistent beliefs, or sets of information structures, in the same brain. And the point of this is just to illustrate that you have the same sort of property that these split-brain patients have, which is that you can have mutually inconsistent beliefs in the same brain. Okay?
This leads to some very interesting consequences. Now we're getting closer to my own field, judgment and decision-making. This is some work by Dick Nisbett and colleagues back in the 70s, where they asked people who were coming into a store to evaluate four different pairs of pantyhose. People were asked which ones they liked the best. Unbeknownst to the subjects, the pantyhose were all exactly identical. And in fact, between subjects, the pantyhose were rearranged on the table. So they were just putting the identical pantyhose in different positions. Now, people will answer this question. Very few subjects will say, I don't know, they all look the same to me. Subjects are very compliant in this respect, so you can get them to pick, and usually what they'll do is just pick the one on the right, for reasons that are not important. They use the position of the pantyhose instead of some property. But then if you ask them why, they don't actually tell you that it had to do with the position; they make up some property: because I liked the color. Now, we know this can't be right, because they have identical colors. So the point is that the modules that make the choices are not able to report the reasons behind the choices. So, just like the split-brain patients, you have some modules which in some sense really don't know what the other modules are up to. And the public relations part of your brain that I mentioned earlier, part of its job is to explain the behavior that the person, the whole thing, produced. So it's got to kind of spin this narrative. So the way I sort of see the world is that there's all this stuff going on in your brain, and then there's this public relations system, and that's sort of the stuff that "you" know about. I put "you" in quotation marks there, because once you start thinking about the brain as this modularized system, the whole notion of there being a central you in there becomes quite problematic.
It becomes very difficult to think of an individual in the same way that you might without the notion that what's really going on in your head is all these different parts doing their particular jobs. So, just like these phones, my argument is that these different modular systems have jobs, and what I do as a psychologist is try to explain the social mind by explaining the functions of those little modules. Like, what are all those different things up to? And one of the, I think, most important parts of understanding human behavior is really drawn from a very old set of ideas, the theory of games, because we're always playing games with each other. So, I haven't tried this here in Las Vegas, your streets are too wide out here, but in Philadelphia, from where this picture is taken (by the way, I know, for those of you who actually live in Philadelphia, this is actually Chicago, but here in Vegas who's going to know), here's the way you can cross the street in Philadelphia. I'm not advocating doing this, by the way, but if you want to cross the street, here's what you do. You could look at your cell phone, or you could just look oblivious and sort of wander across the street. And here's the way that drivers behave in Philadelphia. If a driver sees that you haven't noticed them, they'll actually stop for you. Because they know that you won't be able to get out of their way, because you haven't seen them coming. And they want to avoid hitting you, because that leads to a lot of paperwork, which takes a lot of time and is sort of irritating, right? So as long as you are ignorant of the oncoming car, they will actually slow down for you or even stop at the intersection, which is great. Now, conversely, if you look and see them coming, then they'll just barrel through the intersection, because they know you'll jump out of the way; in this game of chicken between car and pedestrian, the pedestrian who has seen the car always yields, right?
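The street-crossing story is a small game of chicken in which visible ignorance works as a commitment device. Here is a minimal sketch of that logic; the payoff numbers are invented purely for illustration, and only their ordering matters.

```python
# Toy game of chicken between a driver and a pedestrian.
# Payoffs are (driver, pedestrian); the numbers are illustrative, not empirical.
PAYOFF = {
    ("go",   "cross"): (-100, -100),  # collision: both lose badly
    ("go",   "yield"): (1, -1),       # driver keeps moving, pedestrian waits
    ("stop", "cross"): (-1, 1),       # driver loses a moment, pedestrian crosses
    ("stop", "yield"): (0, 0),        # awkward standoff
}

def driver_best_response(pedestrian_move):
    """Pick the driver move that maximizes the driver's payoff."""
    return max(["go", "stop"], key=lambda d: PAYOFF[(d, pedestrian_move)][0])

# An oblivious pedestrian cannot react, so their move is effectively "cross":
print(driver_best_response("cross"))   # prints "stop"

# A watchful pedestrian will yield to an oncoming car, so the driver barrels on:
print(driver_best_response("yield"))   # prints "go"
```

The point of the sketch is that the pedestrian who visibly cannot see the car has committed to crossing, which flips the driver's best response from "go" to "stop"; strategic ignorance wins the interaction.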
So here's a case where ignorance is really useful. This is going to be important for lots of different aspects of human behavior, and, while I'm not sure exactly what the next talk is going to be about precisely, you might be hearing some of this in that presentation. The point here is that because of the way humans interact with one another, there are certain ways in which being ignorant and wrong can be an advantage. I think this is potentially something of interest to the skeptic community. I understand that there are certain people who feel an obligation to correct others on the errors of their ways, but by the same token, I think one of the lessons from psychology is that there are certain kinds of false beliefs which can be advantageous to people in various sorts of ways. And I'm not going to push that too hard. I'm happy to talk about that with people. I'll be around during the rest of the conference, and I'm happy to discuss that point of view. It might not be a particularly popular point here, but I actually think that certain kinds of false beliefs and certain kinds of ignorance can be advantageous under certain circumstances. Now, not always, right? So if you're playing Frogger, which, again, in this audience, I'm guessing a lot of you played as kids, or maybe you still do, here's a case where you definitely don't want to ignore the oncoming cars, right? Because in a video game there's no actual driver in those cars; they're just run by the computer, so you want to know where all the different cars are. So my point here is simply that I'm not saying that being ignorant and stupid is always an advantage. In games like Frogger, where you're playing against the world, you want to have the best possible information that you can. In games in which we're engaged in some kind of social interaction, that's not necessarily true. And people know this, right?
So Will Bailey from the television series The West Wing has this really nice quotation where he says, I do my best work when I'm the least informed person in the room. And the idea here is very simple. You are, in this scenario, the press secretary of the President of the United States of America, and there is some piece of information that would be damaging if it got out. And that's going to be important for the final point that I'm going to be making: there are some pieces of information that would be damaging if they got out. So if Will Bailey had that piece of information and was asked about it, he has two choices, both of which are bad. He can either lie, and it's bad to lie to the press if you're the press secretary, or he can tell the truth, and now that damaging piece of information has gotten out. So what Bailey is saying in this quotation is, look, I don't even want to play this game. I don't want to know this damaging piece of information, because that leaves me in a better strategic position. I can now sincerely and honestly say no to the question about this particular damaging piece of information. My argument is that people are like this too. There are certain kinds of information that, in some sense, the press secretary module of your brain is better off not possessing, in virtue of the strategic advantages of ignorance. So if this is true, then as an empirical matter we might look for various ways in which information that would be damaging if it got out is kept out of this public relations, conscious center of the brain. And that would be interesting: if the mind were actually designed in such a way that the part of us that talks, the part of us that's aware, was being shielded from information that would be damaging if it got out, under certain conditions.
So, again, the way I sort of think about this stuff is that there's stuff going on in your brain, and there's information in this modular system, the public relations system, and once information gets in there, it has the potential to leak out into the world. In the same way that if the press secretary knows a piece of information, he might leak it, by the same token there are certain kinds of information where, once it gets into those modules that are aware, that can talk, the risk of it leaking out into the social world goes up. I think there's lots of ways in which all of us are wrong, and I think the evidence is very strongly in favor of this. So, because there are a gazillion examples to choose from, all of which make people in general look like idiots, I always start with one that makes me look like an idiot, because it refers to a class that I belong to. This is some great work by Patricia Cross done in the 70s, where college professors were asked whether they were above-average, below-average, or average teachers. 94%, as you can see here, rate themselves as above-average teachers, which obviously can't be true, and 68% rate themselves in the top quarter of teaching performance. So this means that many of us absolutely have to be wrong. We don't know which ones are wrong. We don't know who it is that's overestimating their skills. I, for example, am in the top quarter of all college instructors. And if I had more time, I would entertain you with many more examples. There's another one which I absolutely love, maybe closer to your lives, where people are asked to evaluate their driving skills, and they take two groups of people. One of them is a control group, just drawn from the population. The other one is a group of people who were in the hospital because they hit stuff that wasn't moving, like trees and barrels. And then you compare these two.
You ask both groups how good a driver they are, compare the two groups, and there's no difference between them, right? So cold, hard reality is not enough to get people to be accurate about their driving skill evaluations. Anyway, in the psychological literature there are a lot of big explanations for why we're so stupid. And one of the main explanations has to do with happiness. And this might be the one that occurred to you; I think it's a very natural sort of explanation. So one kind of explanation is, look, it's really nice to think that you're in the top quartile of all instructors. It's really nice to think you're a good driver. So maybe we have these false beliefs just to make ourselves feel good. And that might be true. But as an evolutionary psychologist, I ask: under the circumstances that we're talking about, what would happen to a modular system that just made itself happy? I think of these as sort of the head-in-the-sand models, the ostrich models. So if you're an ostrich (and this isn't true, by the way; there's this urban legend that ostriches, if they're being threatened by a lion, stick their head in the sand so that they avoid the existential terror of being about to be eaten by a lion), the way to think about why that's a bad idea is just to envision two ostriches. So a lion comes along, one ostrich sticks its head in the sand, and for 5.3 seconds or whatever it is, it's like, oh, I'm so happy, I'm not going to get eaten by a lion. And then it gets eaten by a lion. And the other ostrich sees the lion and runs away. Which one of those two will have better reproductive success? It's got to be the second ostrich. So ostriches that save themselves from existential terror don't tend to have a lot of descendants who similarly save themselves from existential terror.
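The two-ostrich argument can be put in toy numbers. The following is a deterministic sketch with invented parameters (lions catch 30% of the head-in-the-sand ostriches each generation, and survivors each leave two offspring); it is not a serious population model, just the selection logic made concrete.

```python
# Toy selection model for the two ostrich strategies.
# "hide" buys a moment of bliss but death when a lion shows up; "flee" survives.
# All parameter values are invented for illustration.

def next_generation(pop, lion_rate=0.3, offspring=2):
    """One generation: lions kill a fraction of hiders; all survivors reproduce."""
    survivors = {
        "hide": pop["hide"] * (1 - lion_rate),  # some hiders get eaten
        "flee": pop["flee"],                    # fleers escape every time
    }
    return {strategy: count * offspring for strategy, count in survivors.items()}

# Start with equal numbers of each strategy and run 20 generations.
pop = {"hide": 100.0, "flee": 100.0}
for _ in range(20):
    pop = next_generation(pop)

hide_share = pop["hide"] / (pop["hide"] + pop["flee"])
print(f"hide share after 20 generations: {hide_share:.4f}")
```

Even though hiding is only fatal some of the time, the head-in-the-sand strategy's share of the population collapses toward zero within a few dozen generations, which is the point of the joke: feeling good is not a fitness payoff.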
And, you know, this is important, because when people tell you the explanation for positive beliefs, for the view that your future is rosy and bright, for the idea that you're a good driver, for the idea that you have skills and abilities above average, and so on, the leading explanations, at least in certain parts of psychology, have to do with this notion of happiness. So what I'm trying to suggest here is that there's probably a different reason, having to do with persuasion. So the nice thing about my thinking that I'm a really good instructor is that you guys don't know me very well. So when I made the joke about being in the top quarter of instructors, many of you thought, eh, that's a pretty good talk, he's probably a pretty good instructor, he knows what he's talking about. And all of a sudden, my reputation is better, right? So here's a way in which, even if I'm wrong, I've gotten an advantage, and that advantage is really located, again, in the social games that we play with each other: the fact that you're going to be persuaded by people's beliefs, even false ones, right? One of the best sources of information we have about the people who we encounter on a daily basis is what they believe about themselves. And so what I'm suggesting here is that the reason that we're so wrong is often not located in how it makes us feel, but in the actual, tangible, persuasive effects that we have on the people around us. Okay. So let me close in now on hypocrisy for a few minutes. I want to come back to this little module here, which is the moralistic judgment module. This is the one where we all have this in our head. We sit around all day, and we evaluate other people's behavior, and we evaluate it in the moral dimension: was what that person did morally right, was it morally wrong, and so on. And, of course, this is a very active module, more active in some communities than others.
But the point that's important here is that there's no necessary link between the behaviors that your moralistic system condemns and the behaviors that the other parts of your mind cause you to engage in. So you can very easily have some modular systems which are condemning certain patterns of behavior while you yourself are engaging in that behavior. Right? So hypocrisy in many ways becomes this very natural part of the human mind and the human condition, which is that, for reasons I won't talk about in detail here, part of what it means to be human is to observe other people's behavior and condemn them for doing things which we judge as wrong. But another part of human behavior is that we often take advantage of various kinds of opportunities that are available to us, even when those opportunities entail engaging in those very behaviors that we ourselves condemn. And so this, to my mind, is why hypocrisy is more or less inevitable. It has to do with the fact that the systems that are causing us to condemn other people's behaviors are not the same modules as the ones that are causing us to engage in those behaviors. So the way I sort of think about this is that you've got systems in your head which publicly condemn what other people are up to. For example, you might have the view (although, again, this is kind of a funny place, a funny state, for this particular moral view), as people in most parts of the country do, that prostitution is wrong, and they say this publicly and behave in ways that are consistent with that moral condemnation. And then you also have systems which are interested in taking advantage of various kinds of mating opportunities, and they cause you to engage in certain kinds of behavior. And the result is this guy. This is for you West Coast people: this is Governor Spitzer, former Governor Spitzer, who loudly decried the practice of prostitution and was caught engaging in it.
This is simply because you have one set of systems which are engaged in condemnation and another set of systems which are engaged in the actual behavior. And I should say, before I go ahead, I'm not trying to say this is something about politicians. In fact, I'm trying to make the reverse argument. The thing about politicians is that they're forced to take public positions on lots of issues, and their behavior is carefully scrutinized. What that means is that, unlike yours, politicians' viewpoints are documented on their websites, in interviews, and so on, and their behavior is recorded. So it's very easy to catch them in inconsistencies, because the public eye is on them all the time. But I think that's not because politicians are necessarily more hypocritical than the rest of us. What I'm saying is that it's just easier to catch them at it. And I think we're like this all the time. It's just harder to catch us at it, but occasionally you can. So, you might not be able to see this depending on where you're seated, but there are two bumper stickers there. On the left it says, hang up and drive. Guns don't kill people, drivers with cell phones do. And you also might not be able to see it, but the driver there has got his cell phone to his ear. And this, I think, is the perfect metaphor. So this is basically these moral condemnation systems advertising what I think is bad. You know what I think is bad? People talking on the phone while they're driving. But then you're in a hurry and you don't want to pull off to the side to make a phone call, so you engage in that behavior. This is the natural state of the human mind. So, to go all the way back to where I began, I think this is why we contradict ourselves. And it really is because we contain multitudes. We contain multitudes of specialized systems which are designed for different functions. And sometimes those functions come into conflict.
And this is going to be a necessary property of any incredibly complex system, especially one as complex as the human mind. And that just leaves one little question, which is: why is it that we ourselves don't feel as though we're so inconsistent? And I think that's a really interesting question. And I'm pretty... I don't know that this is the right answer, but here's how I sort of think about it. So remember, I mentioned briefly earlier that people are pretty good at detecting other people's inconsistencies. We're not perfect, but we're pretty good at it. We notice when people do things which are inconsistent with their stated beliefs and values. We notice when people tell lies, if we have all the necessary information, and so on. One of the things that would be good to keep out of your little press secretary system are cases in which you yourself have done something which was inconsistent. And the advantage of that ignorance is that if you yourself don't notice the various ways in which you're inconsistent, you're much less likely to leak that information out into the world. So my guess, and it's just a guess, is that one of the features of the human mind is that there's a specific design in there to keep your inconsistencies away from that system, to keep you ignorant of your own inconsistencies. So while you're really good at detecting when other people are doing things which are inconsistent, you yourself don't notice when you've done it. And that, I think, is why everyone is a hypocrite, and in particular, it's why you think that it's everyone else who is. And with that, I want to thank you for your attention today. Thank you. Robert Kurzban. Thank you, Robert.