Okay, let's get started. Let's get started. This is the fun bit. This is the fun bit. All right: two years ago I did songs for each person. Last year we did haikus. This year it's interpretive dance. So the first topic is... I'm kidding, I'm not doing that this year. I'm kidding. I'm kidding. Calm down, sir. I'm not going to do it, I promise. Maybe later we can talk and we'll figure it out. No, this year it's limericks. So the first talk is "Can Rationality Be Taught?", with Daniel Dennett, Julia Galef, Barbara Drescher, Scott Lilienfeld, and Ginger Campbell. And here we go. Ready?

Dan, Ginger, Julia, Barbara, and Scott
are about to tell you what they've got.
The topic, you see, is rationality,
and if the damn stupid thing can be taught.

Please welcome to the stage Daniel Dennett, Julia Galef, Barbara Drescher, Scott Lilienfeld, and moderator Ginger Campbell.

Trying to see if I can see everybody. Of course I can't. Well, I'm really happy to be able to do a panel at the very beginning, when everyone's fresh. So we're going to be exploring a pretty interesting idea: can rationality be taught? To start out with, I'm going to ask my panelists to answer briefly: do they think yes or no, and if so, why? Just really briefly, just to kick things off. And I think that, since Daniel Dennett is probably an inspiration to all of us, we're going to let him go first.

Thank you. Thank you very much, Ginger. As a philosopher, you'd think I'd say yes, rationality can be taught. But then I look at my philosophical opponents, and these are professionals, and they say the most irrational things. And they can be remarkably impervious to arguments. So much so that I often say, and my colleagues say as well: I'm not trying to convince them of the errors of their ways; I'm just trying to convince their students of the errors of their ways. In fact, if John Searle started agreeing with me at this point, I'd be worried about his health. And I'll save for the next time I speak some interesting research that I think bears on that. So roughly I would say that to some small extent rationality can be taught, but much less than you'd think. And professional philosophers are a pretty good example of the limits of rationality. Do you want to pick the next most inspiring person, Ginger?

I set myself up for that, didn't I? Well, it must be Julia. Go ahead.

Well, the organization I co-founded is called the Center for Applied Rationality, and we were founded on the premise that rationality can be taught, to at least some extent, to at least the crowd of people who want to come to a workshop and learn rationality. Which might sound like a trivial claim, but it actually isn't. Being interested in rationality and wanting to be better at it is a helpful, if maybe not strictly necessary, precondition, but it doesn't actually get you that far in terms of being rational in your day-to-day decisions and judgments. So that alone is a claim that I'm pretty confident in, and one that I think would make a big difference if we're right about it. I think we don't actually have a lot of direct, strong evidence yet in the academic literature about how exactly rationality can be taught. Nevertheless, I am optimistic, because of a whole bunch of other pretty compelling indirect evidence, which maybe I'll save for later in the discussion.

Barb?

I'm not as optimistic as Julia is. I like the way Dan put it, that there are limits to it. So my short answer was yes and no.
And I'm not an optimist at heart, so I lean on the no a lot, but of course I wouldn't be here, and I wouldn't do what I do, if I didn't think we could teach it to some degree; to go into any detail, I would have to go on and on about that. But I think we have to break rationality down into what makes it different from things like critical thinking and intelligence, and tease it apart, which we have; there's lots of literature on that, so we can talk about it. And then we can talk about the parts that can be taught, and the parts that I still try to teach but don't know if I have a lot of hope for.

Last but not least, Scott.

So I'm a psychologist by training, and I actually teach a course on critical thinking, at least I did until a couple of years ago, and somewhat like Barb, I wouldn't be in this business unless I thought it could be taught to some degree. And I do think it can be taught to some degree, but I think the educational psychology literature gives us a little bit of reason for pause. And, Dan may correct me, but I think the first three words in his Intuition Pumps book are "thinking is hard." And it is really hard; it does not come naturally. I can say more about this, but if there's one finding in my reading of the educational psychology literature that's consistent, it's that critical thinking can be taught to a degree, but the problem is that it's very domain-specific. First of all, it's very hard to teach in the abstract. One mistake I made when I first started teaching critical thinking was teaching it as very abstract principles, without giving people an actual sense of the subject matter it has to be applied to. That's one issue. But the literature, I think, suggests that in general it does work to a degree, yet it often doesn't generalize to other disciplines. That's always the key problem. We can all talk about it; Dan gave a couple of examples, but we can probably regale ourselves with tales of all the Nobel Prize winners who are obviously brilliant in one domain yet have remarkably irrational ideas in others, which I think suggests this kind of domain specificity. So the question, which maybe we can address, is: at what level should critical thinking and rationality be taught? That's not something we have a good handle on just yet.

Okay, so let's back up one step to the question of, well, is rationality something you're born with or a skill you have to learn, and is it something different from critical thinking?

It's very different from critical thinking. Critical thinking is a component of rationality, but I think all of us, even those of us who study it, often tend to think of rationality as being intelligence, and it's baffling why really, really smart people do some incredibly stupid things. They are different things. There's a different skill set, and there are different factors that go into how rational people are, and how consistently rational they are, than there are with intelligence. Intelligence is basically cognitive abilities, while rationality includes this huge component about what you will do, not necessarily what you can do, in given situations. You may be able to be rational all the time, but none of us actually is. So clearly there are components that are dispositional.
So if they're dispositional, then we have to figure out where our dispositions come from, and there has to be some mix of genetics and environment and development there that still needs to be teased out, I think.

Do any of you have a working definition of rationality? Not critical thinking: rationality. What's your working definition? Anyone?

No, I don't think...

I do, but I wanted to give somebody else a chance to talk. Without getting too technical: there's descriptive rationality, which is the study of how people actually reason. Then there's normative rationality, which is perfect reasoning: updating your beliefs in response to new evidence in perfect accordance with probability theory, and reasoning perfectly logically, without contradictions or internal inconsistencies. And if we expand the definition of rationality to the sort of instrumental definition they use, say, in behavioral economics, then that would also include maximizing your expected utility, et cetera, et cetera. This is so far off from what the human brain is designed to do that I don't know of anyone seriously arguing it's something we could realistically strive for. And then, third, there's what's called prescriptive rationality, which is a kind of guideline for how to start with the brain we've got and do things that may not look like optimizing our expected utility or doing probability calculations, but that result in judgments and decisions that are, in practice, a little closer to the ideal. That would include various debiasing techniques, and changes to your disposition that make you more inclined to use those techniques.

Dan, it seems like the ideal of being rational came to us from philosophy. So what are philosophers talking about when they talk about rationality?

Well, pretty much what Julia said, but I want to pick up on something Julia said. It used to worry me, it used to bother me, it used to depress me, that philosophers and scientists are so good at defending their own positions, at arguing against other people, and so bad at seeing the flaws in their own arguments. And then a couple of years ago, in 2011, Hugo Mercier and Dan Sperber published a wonderful article in Behavioral and Brain Sciences called something like "Why Do We Reason?", and they argue very convincingly that our cognitive systems are designed to be partisan, that there's a sort of opportunistic partisanship built right into our capacity to reason. This means that we tend to favor positions we can find good arguments for, not necessarily the best positions; we like to be able to support with arguments the positions we publicly avow, and we're much better at persuading others than we are at seeing the difficulties in our own arguments. They argue, I think very convincingly, that we really have to think of reasoning the way we think of romance: it takes two to tango. There has to be communication, or, at its best, a debate of sorts; not like a high school debate, which is sort of stagey, but an interaction between people with opposing views. We're sort of biased; it's as if the gain is set in our critical thinking to expect to be up against a wall of disbelief, and to be very good at persuading, not so good at finding the flaws in our own arguments.
I highly recommend that article. It's in Behavioral and Brain Sciences, and the great thing about an article in Behavioral and Brain Sciences is that it illustrates this very point, and always has: every issue has a target article, or several target articles, together with reactions, comments right there in the same issue from usually several dozen people in the field. And in a field like cognitive science, which is very interdisciplinary, almost everything you read in it is not quite your field, and you want to know what the people in that field make of it; you get an instant read on what they take seriously and what they tend to disagree with. So I think group discussion is not just pleasant; it's actually an important element in correcting the flaws in our own reasoning.

Go ahead.

Just to follow up on what Dan said: I agree, and I like that article too, by the way; I think it's quite provocative. In some ways it goes back to your initial question, which is: are we born with it, or does it have to be taught? That's one of the few questions we can give a clear answer to. The answer, I think, is clearly that it's not something that comes naturally to the human species, and one of the main things I try to inculcate in my students is that scientific thinking, which I believe at least in principle, at its best, can lead to rationality, is not something that's natural. Take confirmation bias. Their article is, in many respects, an article about how confirmation bias evolved, and in essence my reading of it is that confirmation bias, this tendency that we all have (I'm prone to it, you're prone to it, we're all prone to it) to seek out evidence consistent with what we believe and to deny, dismiss, and distort evidence that is inconsistent with what we believe, is a fairly natural byproduct of the way the brain is structured. I see science in many ways as a set of safeguards against confirmation bias. But as Dan points out, and as the late psychologist Mike Mahoney showed, scientists themselves, and I suspect all of us, including me, are no more immune to confirmation bias than the average person; his research shows that. The scientific community is really the best safeguard against confirmation bias. Good scientists should try to compensate for their own propensities toward confirmation bias, but they're not very good at it, so it's up to the scientific community, in the kind of grand argument Dan talks about, to hold their feet to the fire and make sure confirmation bias doesn't leave them merely corroborating their own hypotheses.

I wanted to speak to that question you had about a definition of rationality, because it feels like we haven't really pinned it down, and I work with a simpler definition. We can break it down into the pieces Julia was talking about, instrumental and so forth, but in a nutshell, at least to a lay audience, I usually say that it is a reasoning process, and a set of currently held beliefs, that are consistent with meeting your goals: with maximizing benefits and meeting the goals you actually have. I qualify it with "actually have" because quite often we get to where we are and then decide what our goals were based on the answer we came to, and that's irrationality. Rationality would be a process that leads to the goals you actually have: you know, happiness, or benefiting humanity, or saving as many lives as possible, all of those things.

Is that what you can all agree on?

It gets confusing, because I think that's a great definition of instrumental rationality, but you have to add in that we also hold beliefs that are consistent with the knowledge we currently have, so you have to qualify it with that.
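Since these definitions lean on updating beliefs in accordance with probability theory, a small worked example may help make the normative ideal concrete. This is an editorial sketch, not something from the panel; the hypothesis, the evidence, and all of the numbers are invented purely for illustration.

```python
# A minimal sketch of one Bayesian belief update: compute P(H | E) from the
# prior P(H) and the likelihoods P(E | H) and P(E | not-H).

def bayes_update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Return the posterior P(H | E) via Bayes' rule."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Suppose you start out 80% confident in a hypothesis, then observe evidence
# that is twice as likely to appear if the hypothesis is false.
posterior = bayes_update(prior=0.80, p_e_given_h=0.30, p_e_given_not_h=0.60)
print(f"{posterior:.2f}")  # 0.67: still confident, but measurably less so
```

The normative standard is only that updates follow this rule, whatever the inputs; the prescriptive question the panel keeps returning to is how to get real people to do anything even roughly like it.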
It does. I think one thing that gets lost when we talk about how humans aren't naturally good at doubting their own judgments, or at looking for alternative points of view and taking them seriously (which is true, and I agree with it completely), is that there's a significant amount of variation between people in how good they actually are at this. A researcher named Keith Stanovich, who I assume you all know, and who's on our advisory board, has done some of the best work out there on charting the spread of skill at these various aspects of rationality, and on trying to pick apart what makes the difference. And maybe I can tie these two threads together by saying that Stanovich likes to break the practical skills that go into being rational down into two main categories. One is being able to think in terms of manipulating symbols, which is what you're doing when you reason logically or probabilistically. This is actually pretty correlated with IQ, so in that sense it's kind of innate, and people who are strong on this dimension are good at avoiding the kinds of biases and fallacies that involve probability and logic, like the gambler's fallacy: "this coin came up heads three times, so I'm due for a tails," that sort of thing.
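The fallacy Julia names here is easy to check numerically. Below is a tiny simulation, an editorial illustration rather than anything from the panel: for a fair coin, the flip after a run of three heads is still 50/50.

```python
# Simulate fair coin flips and estimate P(tails | the previous three flips
# were all heads). If the gambler's fallacy were right, this would be > 0.5.
import random

random.seed(0)
flips = [random.choice("HT") for _ in range(1_000_000)]

next_after_hhh = [flips[i + 3] for i in range(len(flips) - 3)
                  if flips[i:i + 3] == ["H", "H", "H"]]

print(f"P(T | HHH) ~ {next_after_hhh.count('T') / len(next_after_hhh):.3f}")
# Prints a value very close to 0.500: each flip is independent of whatever
# run of outcomes happened to precede it.
```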
And then the other category of rationality skills, which you could call reflectiveness or open-mindedness, is more of a disposition than a skill. It's the tendency to ask yourself, "Is that right? Might there be another explanation for what I'm seeing?", and also the tendency to be actually open to changing your mind in response to new evidence. And interestingly, that one's not correlated with IQ.

I was really interested in that.

So both of those dimensions, the abstract symbol manipulation and the openness to changing one's mind, show a lot of difference between individuals. That is not sufficient evidence to show that we can definitely train it, but I think it's more promising than it would be if everyone were uniformly bad at it.

He's trying, by the way. As I understand it, he's been working in the last couple of years to develop a battery of tests to measure RQ, a rationality quotient, and most of the elements are surprisingly independent of IQ. Interestingly, even when you statistically control for IQ, there's still a glue that ties a lot of these tasks together, which I guess does suggest there's at least some domain generality across all of these different rationality tasks; there's at least a weak-to-moderate positive correlation. Which, I agree, Julia, doesn't show we can train it, but it does suggest that maybe if you trained one domain it could generalize to others. At least it leaves open that possibility.

But based on all these definitions you've given us, it sounds to me like a person could make a decision that is rational based on what they know and believe, and another person might not think it was rational because they have different knowledge; they're working from different information.

Then it's perfectly rational to start with two different sets of information, or two different sets of presumptions, and come to completely different conclusions, with both of them being rational. But part of the problem is that we also wrap into rationality what you come in with. If you're coming in with a set of beliefs that's not necessarily consistent with knowledge, that's a premise, right? So there are many aspects to this: you can provide all of the knowledge, and you can provide all of the skills, but putting them together is a big problem when we're talking about teaching people.

On the skills side, part of the problem with teaching skills, and the domain specificity of it, is that there's a whole set of cognitive tasks that people are very good at when they're abstract, and as soon as you add context, performance falls apart (there are other tasks where the opposite is true). A good example: I used to teach intermediate statistics, and I always taught probability theory, because I think it's a foundation and students really need to understand it. I could train students to do really well at calculating the probability of pulling a marble of a specific color out of a bag; they could do, you know, joint probability and all of that. And the second I changed marbles to M&M's, the whole thing fell apart. It's because human beings work with schemas. Schemas are blueprints for how the world works, and you can give people a domain-specific set of schemas and they will make good, rational decisions, but they won't necessarily be able to generalize or transfer that to a novel situation with a different context, and that's where we run into trouble.
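For concreteness, here is the kind of marbles-in-a-bag joint-probability exercise Barbara is describing, with invented counts; the point of her anecdote is that students who master exactly this calculation can lose it when the cover story changes to M&M's.

```python
# Joint probability without replacement: P(two reds) = P(R1) * P(R2 | R1).
# With 3 red and 7 blue marbles, that is (3/10) * (2/9) = 1/15.
from fractions import Fraction
from itertools import combinations

red, blue = 3, 7
p_two_reds = Fraction(red, red + blue) * Fraction(red - 1, red + blue - 1)
print(p_two_reds)  # 1/15

# Cross-check by enumerating every equally likely pair of marbles drawn.
bag = ["R"] * red + ["B"] * blue
pairs = list(combinations(range(len(bag)), 2))
hits = sum(1 for i, j in pairs if bag[i] == bag[j] == "R")
print(Fraction(hits, len(pairs)))  # 1/15 again
```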
But I think that if you do enough of this, if you practice it in enough different domains, it starts to become more automatic. When students are no longer having to think, step by step, "what bias am I going to have a problem with, and what have I got to avoid?", when they no longer have to think that, it becomes a little more natural, and that's when I think it starts to transfer. So I think there's some hope.

Is the ability to transfer across domains dependent on intelligence?

I don't think we know the answer to that. I would guess that it has more to do with experience and practice.

As a number of us have pointed out, our native thinking powers have been enhanced and corrected and refined by a lot of invented tools: statistics, logic, all sorts of thinking tools. And as Barb was just saying, if you have to talk to yourself about how to use the tool, you're not really adept yet; you have to become a skilled user of the tool, and of all of the tools. And then, and this is I think a point Scott was getting at, you have to be somewhat self-conscious about your own weaknesses when you use the tools, and that's really, really hard.

I want to make a little confession. I think, in fact, that with people, mainly academic scientists, the heart plays a bigger role, the emotions play a bigger role, than we usually acknowledge. I mentioned John Searle before; he'll be my example. I've been arguing against John Searle's Chinese Room argument for, what, thirty years, and I think it's pretty demonstrably a fallacy in its various forms, but you'll never convince John of that, nor will you convince many people. The Chinese Room is a very attractive argument (this is the argument that artificial intelligence, "strong AI" as he calls it, is impossible), and I learned to explain to people why it was wrong. I would say, "Look, I'm going to point out the fallacies in this argument," and their eyes would glaze over, and it was very clear they didn't want to hear about the fallacies. They like his conclusion so much, and they like the fact that a famous Berkeley professor says that strong AI is impossible, that they don't want to hear my details. And I used to have contempt, quite frankly, for that attitude. Then I caught myself doing the same thing. I confess to having a deep, visceral dislike of the Bohr interpretation of quantum mechanics, but I'm no expert. Then I read my friend Murray Gell-Mann's book The Quark and the Jaguar, where he laces into the Bohr interpretation (he has a chapter on "quantum flapdoodle") and just beats them up with a stick, and I loved it. I'm reading this, saying, "Sic 'em, Murray! Go, go! This is fantastic!" And all of a sudden I realized: look, I'm impressed that this Nobel Prize winner, the famous Murray Gell-Mann, agrees with me. Am I able to assess his arguments reasonably? No. I'm just happy to have him on my team. Once I recognized that, I became much more understanding of the problem that I face. People get invested in their views, particularly scientists and philosophers, and you actually wouldn't want it any other way: you want the cutting edge of inquiry to be peopled by partisans who are giving their all to show that they're right. There's a lot of ego in it, but that clash is actually playing an important role.

I'm glad you brought that up, because I think we've now touched on what I see as the two keys to actually teaching rationality. This is somewhat speculative, because we don't have decades of academic research on it, but to me, the failure of academia to have found ways of reliably teaching rationality that work, and that people will seamlessly generalize across domains, is not actually that discouraging.
That's because I think there are two important pieces you need in order to have a method of teaching rationality that actually works. First, you need people to be able to recognize, in the field so to speak, that "oh, this is an instance where I might be committing such-and-such bias," or "oh, this is an instance where it would be valuable to do a thought experiment," or something like that. They need to recognize that this is the situation in which to be wary, or to use this tool, or whatever, and that is absolutely hampered by the problem of domain transfer. It's also hampered just by the difficulty of establishing habits: even if you practice something on a bunch of domains in a class, if you leave that class and are never reminded of it again, then chances are you're not going to use it again. The second piece, which Dan brought up, is the problem of wanting to correct for the bias, and wanting to use the technique. There hasn't actually been that much effort to try debiasing interventions in the literature; there have been critical thinking classes and, you know, skeptic magazines and so on, of course, but I don't think any of those efforts so far have really made much of an attempt to address those two key points: getting people to notice when to try to be rational, and getting people to want to do it. Those are the two points that CFAR, my organization, is really trying to focus on in various ways.

Just to give you a very quick picture of what we do at the workshops we run: we spend about two thirds of the time just talking about real-life case studies from people's lives, decisions they're trying to make or problems they're having (and listening to other people's case studies as well), and then talking about how to address those with the techniques, rather than just talking about how to reason probabilistically or how to use reference classes. We also try to give people triggers, such that when they're out in their day-to-day lives and notice the trigger, they think, "oh, maybe I'm rationalizing." So you can learn to notice, "oh, I have a defensive body posture in this argument I'm having"; that's my cue to relax my body posture and try to be more open-minded, or to do some little meditation or mindfulness exercise or whatever has been shown to work for me.

Then, in terms of wanting to use the techniques: I think to some extent people can be convinced that considering that they might be wrong is actually good for them. There's this little meditation I do sometimes, actually quite a lot, where I imagine the world in which I am wrong but don't know it, and what the consequences of that would be. For example, when we were just starting CFAR, I was excitedly telling a friend of mine about it, and he was pretty pessimistic about the business plan and whether we could actually sustain ourselves, and I noticed myself getting defensive, finding reasons why he was wrong, and not actually listening to him. But because I knew this pattern, I stopped, and just for a moment asked myself: if he's right and this is a bad idea, I want to believe that he's right; it would be bad for me if I closed my mind off to this argument. And if you have a trigger-action plan set up (this is actually a thing in cognitive science; they call it implementation intentions, and you're much more likely to establish habits if you have discrete triggers you can recognize in the world), that's something that can, I think, be more motivating for people, and maybe something they haven't even thought about before.

And lastly, I think that being in a social environment where changing your mind in response to evidence is admired, and refusing to change your mind is looked down upon a little bit, can be very motivating for people. I don't know if a single semester-long class can change the way people think about thinking, but that's something we're certainly trying to do at CFAR: to spread this meme of admiring mind-changing. I hope that will go a long way towards making people actually inclined to use these kinds of techniques.

You know, it's interesting; you said two things that struck me. One was being in an environment where admitting that you're wrong is admired, and I think that just telling people, even in intro psych classes, making sure people understand that this holds everywhere, that in general people are more admired when they admit they did something wrong and changed their mind than when they're stubborn about it, is a big thing. But you also said something about holding that idea that there might be a world where you're wrong, and that's something Stanovich actually talks about; it's part of that open-mindedness. I think that is a skill, and I think we may be able to teach it; I certainly tried to in my classes. You need to be able to set your own views aside and hold them in a little escrow while you consider evidence, because otherwise you cannot be objective about the evidence coming in. We need that ability to compartmentalize and say, "okay, this is what I believe, but I'm not going to consider that right now while I'm looking at this evidence," then bring the filter back in and see how it all shakes out. Does that make any sense?
Yeah. I think there's a bit of a tension I'm picking up on here, because on the one hand, what I hear you folks saying is that we need more role models: more examples of people, scientists, maybe politicians (maybe that's asking too much, I don't know), actually changing their minds, admitting they were wrong about something. It's not very likely, but that's what I try to do in my teaching. I try to talk about all the dumb mistakes I've made over the years, and I can go on forever about that. In fact, I often tell my students pretty openly that one of the things that drew me to skepticism, I think, is that when I was a teenager I was into all this stupid stuff; I was into ancient astronauts and ghosts. I was probably just as smart, probably smarter in terms of raw IQ, back then than I am now, but I was a terrible critical thinker; I didn't have those skills. So I think being a good role model is important. On the other hand, from what Dan is saying, because it's the very nature of scientists to push and push and push, to be dogmatic, and often to cling to a very strongly held idea, we're often not going to do that very well; we're often not going to be very good role models. So the question is how to find good role models out there, and I think that's really, really difficult.

You know, a scientist named Brendan Nyhan (or Nehan? Nyhan, thank you) has focused on how to get people to be willing to change their minds in response to evidence. He just finished a three-year-long study of what will get people to change their minds about the vaccine-autism connection, and it was actually pretty depressing; in the article I read summarizing his results, he said "I'm so depressed" like three times, because nothing really works that well. But so far, in his study of this phenomenon, he's found one thing that does seem to help people, you know, accommodate new evidence, and that's what he calls self-affirmation: basically, a little thing you do to remind yourself of the things about yourself that you're proud of, the aspects of your identity. "I'm good enough, I'm smart enough, and doggone it, people like me." I don't know if that was verbatim in the study. This is tied to our need to retain beliefs that are important to our identities, which covers a surprising number of beliefs at the object level: like, does this homeopathic medicine work or not? Getting rid of that belief could actually feel very threatening to your identity. And when I look at what has made me, to the extent that I am, good at changing my mind, I think the main factor is that I have this identity of being someone who changes her mind, so I actually get a little dopamine hit when I notice I might be wrong about something, because I feel all virtuous and proud of myself: look at me, changing my mind!

So you're saying that everybody can win arguments with you, because it makes you feel good.

But I think one of the tricky issues here, which we maybe haven't talked about directly, is the dangers of some of these debiasing techniques. I looked at some of this stuff a couple of years ago, and speaking of being depressed: man, that was depressing. There's a literature on how to debias people against confirmation bias and hindsight bias, and it's probably more absence of evidence than evidence of absence; there's often just not a lot of psychological research on it, as you point out, but a lot of what exists is not very promising. And the one issue you have to be careful about, which I've come to recognize, is that there's an increasingly large literature on backfire effects. One of the effects you're talking about is, I think, relevant here: when you challenge people's identity very strongly, their belief gets stronger, so you get exactly the reverse effect. If you have someone who's deeply religious, for example, and you start challenging their beliefs really, really hard, threatening to pull the rug out from under them, you may actually strengthen their beliefs, because it encourages them to find all kinds of reasons that their beliefs might actually be right; it mobilizes their intellectual defenses, mobilizes their intellectual immune system. So that's, I think, one of the big challenges: how to debias people without inadvertently strengthening their beliefs. I've been arguing for a couple of years that this is actually a really important direction for the skeptical community, how to do this right, how to do this properly, because I think sometimes we just assume that when we debunk people's beliefs, everyone's going to be as rational as we are about those beliefs, and that's not always the case.

Talk of backfire effects reminds me of a chastening experience in my career many years ago. ABC television had, and maybe they still do, an annual retreat in Palm Desert, where they took over a Hyatt hotel and had all their top executives show up with their significant others for a big blowout party. But it was an intellectual party, and I was invited along with the Chicago neuroscientist Jerre Levy and Jonathan Miller, the British polymath, television presenter, opera producer, and so forth, and we were all asked to give talks. We show up, treated like royalty; it was wonderful. Then we got there and found that we were the bad guys: they also had some new-age-type people, the man Norman Cousins, who laughed himself out of cancer, and some other absolute Randi fodder, you know. And when we realized that our job was to do battle with these people, we got down to business and just took them apart, and I was feeling pretty happy about the whole thing. There was a closing lunch on Sunday, and I was expressing my pleasure at what we had done to Jonathan Miller, and he said, "Oh no, Dan, watch this." He got up and said, "Ladies and gentlemen, I want to ask you: before today, what percentage of you believed in ESP, clairvoyance, healing with laughter?" About a third of the hands went up; and remember, these are very smart people, ruthless climbers in the competitive world of network television. About a third of the hands went up. And he said, "After this weekend, how many of you believe it?" Two thirds of the hands went up. It was like being kicked in the stomach; I was so shocked. I went around and asked some of those who had raised their hands, and they all said the same thing: "Well, I don't know, I just figured if you smart people worked this hard to criticize it, there must be something to it."

Just one quick follow-up. Norbert Schwarz is a really clever social psychologist at the University of Michigan who has done some work on these backfire effects, and he gave me one concrete example from the health literature, since we talked about vaccines. You give people a statement: "the side effects of a flu vaccine are typically worse than the flu itself." Can you remember that?
That's wrong; that's not true. Side effects are very rare. And you tell people that, and then you test them a few minutes afterwards, and they get it right (and there's a control condition). You come back a month or two later and ask them again, and what you see is that the false belief has actually strengthened; they now get it backwards. There's a note in your brain that says "no, that's wrong," and that little sticky note sometimes falls off over time: "oh yeah, I remember that thing they were saying about the side effects of flu."

I mean, I discovered the same thing myself in my intro psych class. I hope it works; I hope I'm not actually increasing false beliefs. I do this little demonstration when I talk about memory: I have someone run in and steal something from me, like my umbrella or something like that, and then the person leaves, and I ask the class all kinds of questions: did the person have a mustache, what color shirt was the person wearing? It's amazing how bad people are at that. It really hits them up front how bad eyewitness memory can be, and they all remember that. But on several occasions, and I can think of one case in particular, a student came up to me a few years after the course saying, "Oh yeah, I remember that demonstration, where you showed how accurate eyewitness testimony can be." I was like, oh no. I think what this shows is that in the process of debunking (and we certainly should debunk; we all do it, I do it too), we also have to be sure to focus on the true beliefs. We have to keep our eyes on the ball and really present people with good science, because if we keep repeating the false belief over and over and over again, people often use familiarity as a heuristic for truth. Commercial producers know that really well, of course; that's why they keep repeating the same darn jingle. Repeat it over and over again and people start saying, "Oh yeah, I've heard this a million times; there must be something true to it." So that's something we have to keep in mind when we're debunking false beliefs.

I would also add that I feel concerned sometimes about the effect of giving people examples of very obviously irrational beliefs, which, you know, could include ESP and aliens and maybe homeopathy, et cetera. I worry that if those are the examples they get of people holding irrational beliefs, and it seems so obvious to them that those beliefs are irrational... I have a friend who once described this phenomenon as "the cowpox of doubt," by which he means that getting tons of examples of people being wrong, where it's utterly obvious to you that the person was wrong, almost inoculates you against thinking that you yourself could be wrong, because "my beliefs don't look anything like those beliefs, so I'm fine." Which is not to say that we shouldn't talk about and debunk those things, of course, but it is a danger. And there's kind of a catch-22 when I try to teach about biases: on the one hand, the bias has to be clearly a bias to people, so that they get what you're saying; on the other hand, it can't be a bias that they themselves have and that you can't prove to them on the spot is wrong, or else they'll just think you're wrong about what they're wrong about. There's very little room in between those two.

People will argue about what the right answer to a problem is as if you had made the problem up yourself, you know, like classic stuff from the literature. I'd have students literally argue with me about what the correct answer was, on, like, mathematical questions, stuff straight from the cognitive literature, problems that are meant to show a way of thinking. Because they don't see the problem with thinking that way, rather than accept that it is a problem, they continue to literally argue with it. It's very difficult. What I try to do in that case is force people to face their own biases, but I try really hard to do it in a way that's not threatening. What I mean by "not threatening" is that nobody else has to know that they held that belief, or that they came to the wrong answer. They are the only ones who have to know; they have their own answer in front of them, and they don't have to share it, to admit to everybody else that they were human.

This is a slight change of topic, but I've recently done an experiment of sorts, testing out a method for getting people to see each other's views, and I'm happy to say we had pretty darn good results. I call it the X-on-Y method, and here's how it works. You have a workshop, six, eight, twelve people, with very strongly differing views on some scientific or philosophical issue. Everybody sends in a chapter ("this is my initial contribution to this"), and everybody's supposed to read those. Then I ask them whose work they would like to introduce; those are the X-on-Y assignments, and it goes around the circle, and no two people talk about each other's work. So you end up spending the first half hour of the workshop on this topic introducing and explaining the views of one of the others, and it has a delicious effect on the attitudes and the open-mindedness of the group. I did a week-long workshop in Santa Fe on cultural evolution, and I had Boyd, Richerson, and Henrich on one side, and Sperber, Claidière, and Morin from France, and then some other parties, and there was a real failure of communication there at the outset; that's gone now. It's really wonderful. Then we just did one more recently on philosophy of mind. Those of you who know anything about the field: imagine spending a week together with the Churchlands, Andy Clark, Jesse Prinz, David Chalmers, and me. We got on famously. So I highly recommend this technique, and in a few days there will be a summary of the first of these workshops on Dan Sperber's Cognition and Culture website, and you'll be able to see what happened.
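The structural rule in Dan's description (everybody presents somebody else's chapter, it goes around the circle, and no two people talk about each other's work) can be sketched as a small assignment procedure. This is only an editorial reading of his description, not his actual procedure, and the names are placeholders.

```python
# Assign each participant someone else's chapter to present, such that
# nobody presents their own work and no two people present each other's.
# A one-step rotation of a shuffled circle satisfies both constraints
# whenever there are at least three participants.
import random

def x_on_y_assignments(participants, seed=0):
    order = list(participants)
    random.Random(seed).shuffle(order)
    # order[i] presents the work of order[i + 1] around the circle; a
    # one-step rotation has no fixed points and no two-cycles for n >= 3.
    return {order[i]: order[(i + 1) % len(order)] for i in range(len(order))}

print(x_on_y_assignments(["Ann", "Ben", "Cal", "Dee"]))
# e.g. {'Cal': 'Ben', 'Ben': 'Dee', 'Dee': 'Ann', 'Ann': 'Cal'}
```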
I want to thank all of you. I know I'm supposed to be able to see the time, but I can't see the time.

No, it's almost over; we've got 7:40.

Do we have time for questions, George? Does anybody have anything they just really need to say before we let the audience ask questions?

While we're taking questions: I do a version of Dan's thing, just by myself. It's called the steel-man technique, and it's the opposite of the straw man, where you knock down a weak caricature of your opponent's argument. Instead, you try to come up with the best articulation of their point of view, because people often give worse arguments than they have reason to give; people are just bad at explaining why they believe something. So you look for ways to strengthen their argument before you consider it, or before you try to rebut it. Maybe there was a premise they hadn't stated and were just assuming; but if it is assumed, then they're right, and you can think about whether that premise is right. Or if they exaggerated, if they said, like, "all women do such-and-such," you can ask: well, is it most? Is it more than men? Or whatever. You can find a more conservative claim than the one they made and see if it's actually something you might agree with more than you thought you did.

Is that one of, um, Rapoport's rules, I think? I try to do that too; I wasn't familiar with those rules until I read your book, but I like that. And I think it reflects a general principle of skepticism, which is: always try to give the other side the best chance; try to be charitable. So if they make some claim about UFOs, don't go for the low-hanging fruit and debunk the easiest version; take the strongest evidence and see if it still withstands scrutiny. First of all, it inculcates the principle of charity, but I also think it's the better method of argumentation.

Does anyone have a question? I'll come to you. We have time for, like, one or two.

One or two! Make sure your question's a question, not a story, a yarn, or a tale; it should have a point and be nice and concise, or else I'll throw you in question jail.

For Daniel Dennett: what's, say in the last ten years, the biggest thing you've changed your mind on?

The biggest thing I've changed my mind on? I have a new view of the brain as a computer. I still think the computer analogy is fine, but when you start thinking about some of the differences between neurons and flip-flops, you realize that we don't have much of a handle on what the architecture of such a computer would be. I'm working on it and thinking about it, and one of the things that comes up is that it's going to be deeply competitive and relatively anarchic; there's no routine-and-subroutine hierarchy. And the most important fact is that almost all the computation we do with our devices, in our pockets and our laptops and everything else, relies on the fact that way down in the basement the parts are all exactly alike: the flip-flops, the registers, are all exactly alike. No two neurons are alike. No two neurons are alike, and when you try to make a computational architecture out of neurons, that should loom large in your thinking, and it didn't use to in mine. I used to think of the neuron as basically a McCulloch-Pitts logical neuron, and I knew what we could do with those, but you can't do that with real neurons.

We've got one more quick one, I think, right here. I hope it's quick.

I'd like the speakers to address, if they would, an issue I haven't heard addressed, and that is personality types, for example the Myers-Briggs classifications, or the classic inner-directed versus other-directed person: in terms of people who just really aren't open to rational thinking, who are more concerned with, for example, what other people think.

How do you feel about personality types, in thirty seconds?

I'll go for it. The Myers-Briggs is not a great measure; it's okay, but I think there are some better measures out there. And I think Julia was hitting at this earlier: there's a personality dimension called openness to experience, or openness, that's a little bit related to IQ. It's actually one of the few personality variables that's correlated with IQ, at something like 0.3 or 0.4, so not very strongly, and it's a very potent, well-understood dimension of personality, in my view. I think it does relate to this broader disposition to be open to new ideas, to question one's beliefs. More broadly, Seymour Epstein has done some work on this: there are broad individual differences in terms of, say, intuitive-experiential versus deliberate-rational thinking, which of course maps somewhat onto what Stanovich and West, and Kahneman later, call System 1 versus System 2 thinking. So some people are much more prone to... it used to be called "need for cognition"; there's a closely related construct. Some people just really, really, really want to figure stuff out. Carl Sagan, whom I got to meet once, reminded me of that: he was very intellectually ferocious, and I saw a bit of it when I was with him. He was very nice to me, but he always wanted to know; he was always asking questions; he really, really wanted to figure stuff out. There is that very deep disposition that some people are very strong on and other people are not. What I don't know is how much you can push that around; I suspect there are limits. You might have to give people almost like little intuition pumps, or intellectual prostheses, to overcome some of those dispositions; that would be my hunch, but I'm not sure. I'm hoping the RQ test can tease some of that out once it's developed and tested.

And I would also say that there's a middle ground between not wanting to change your mind at all, because it's unpleasant and, you know, makes you anxious or whatever, and wanting to change your mind because you want to be charitable to other points of view and you like the idea of being the kind of person who changes their mind, et cetera. There's a middle ground where you take advantage of some of your personality traits that maybe aren't as noble as wanting to be a mind-changer or something, and I've totally used that to good effect before. If I'm in an argument and I don't want to change my mind, because I'm feeling sort of in battle mode, competitive, and I don't want to lose the argument, I just remind myself that this isn't exactly like a battle, because in this argument, if I lose, I get a copy of the other person's weapon; in other words, their winning argument. And then I can go around winning arguments with that weapon. So I just re-channel my competitive instinct, and I think there are a number of things like that you can do to take pride, or fear, or competitiveness, and actually make them work for you in this regard.

And with that, I think that's excellent. How about a round of applause for our great panel?

Excellent!