Tom, you've written a book called How We Know What Isn't So. Can you tell us a little bit about that?

Sure. The purpose of the book was to try to understand how it is that we can look at all the evidence we encounter in our everyday life and come to some conclusion: the world's telling me this thing is true, it's obvious that it's true. But if you look at it a little more dispassionately, some of those beliefs are very questionable. Everyone's familiar with the idea that people believe what they want to believe, and those are some important kinds of erroneous beliefs, but the book is not so much about that. It's about thinking that we really have the evidence for things. The world is telling us something, but in fact the world is telling us something a little more complicated. How is it that we can misread the evidence of our everyday experience and be convinced that something's true when it really isn't? That was the thesis of the book.

What do you mean by erroneous beliefs? What are some of the examples that you have in the book?

Well, there are a variety. But take, for example, the pretty common belief that things happen in threes: natural disasters, or homicides, or deaths of famous people. If two of them have happened in close proximity, people will sometimes tell you, oh my God, I wonder what the third one is going to be. If you look at all of those things, they don't tend to cluster in threes at all. So why do people believe those kinds of things?

There's a belief in the sports world in something called the Sports Illustrated jinx: you get your picture on the cover of Sports Illustrated, and oh, that's a terrible thing, whatever success got you there is unlikely to continue. That's been shown to be false as well. Another is the so-called sophomore jinx: you've been exceptional as a first-year performer or rookie, and the thought is that if you've done really well your first year, you're jinxing yourself.

Or more common kinds of things. You're at a grocery store. The line you're in is really bogged down, going nowhere; there's someone in front of you with a million coupons sifting through them, or who can't get their change organized, and the line right next to you is zipping through. You're tempted to go to that line. Why stay in the slow one when there's a faster line over there? And many of us think, oh, wait a minute, I know that if I do that, that line's going to slow down and this one's going to speed up. I don't know what principle of the universe would create that. But it's kind of easy to understand, and that's what we explore in the book: why do we believe that the lines we go to slow down and the lines we leave speed up, if there's no evidence for it? So beliefs of that sort.
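One way to see why the rule of threes feels so compelling is a quick simulation: even purely random timings routinely produce tight runs of three. This is a minimal sketch, not from the book; the thirty events per year and the two-week window are arbitrary assumptions.

```python
import random

# A minimal sketch (all numbers invented): scatter 30 celebrity deaths
# uniformly at random over a year and ask how often at least one "cluster
# of three" (three deaths within two weeks) shows up by pure chance.
random.seed(1)

def has_cluster_of_three(n_events=30, days=365, window=14):
    dates = sorted(random.randrange(days) for _ in range(n_events))
    # Three consecutive event dates spanning <= `window` days count as a cluster.
    return any(dates[i + 2] - dates[i] <= window for i in range(n_events - 2))

trials = 10_000
rate = sum(has_cluster_of_three() for _ in range(trials)) / trials
print(f"simulated years containing a 'rule of three' cluster: {rate:.0%}")
```

Under these assumptions nearly every simulated year contains an apparent cluster, so noticing one tells you nothing about any underlying law.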
And then you can take those kinds of things and apply them to people's beliefs about a variety of alternative medical practices, some of which have been shown to be valid and useful, a lot of them shown not to be. Why do we believe every bit as much in some of the ones that don't work as in the ones that do? And belief in supernatural things, extrasensory perception, et cetera.

So tell me, you said that there's not much evidence for a lot of these folk myths and these alternative health therapies. Why do we believe these strange things in the absence of data, or with evidence to the contrary?

There's a whole bunch of reasons; there's not a single answer to this. If there were a single answer, we could easily teach it to people in school and then it would be gone. A whole bunch of things conspire, and most of them are a side effect of this impressive intellectual machinery that we have in our heads. Part of its job is to identify patterns out there. It's hard to get that job accomplished perfectly, so people look out for patterns and they're often going to see things that really aren't there. If you go on the internet and type in illusions, you'll see all sorts of them: people spotting faces in clouds, or faces in a cinnamon bun, or what have you. If you grab a bag of M&M's, pour it into a jar, and look at it, the different colors are randomly arranged, but they don't look random. You say, oh, there's a bunch of blue ones over there and a bunch of green ones over there. We see order where there isn't any. So we can see things happening in threes; we organize things into clusters that are really just the kind of clusters you'd see by chance.

So, formally, what are the other kinds of cognitive mechanisms that are operating when we hold these beliefs or opinions?

I think one of the most powerful and most interesting ones is what a colleague, a former student here at Cornell, Scott Lilienfeld, calls the mother of all biases: the confirmation bias. That's a term most people are familiar with, and they're familiar with the idea that, in particular, if we want to believe something, we'll go and seek out evidence for it; we won't seek out evidence against it. That is really true. There's a very pronounced tendency to treat information that's consistent with what we want to believe in a pretty friendly way, and to be really hostile to information that's consistent with something we don't want to believe. It's almost as if, for something we want to believe, we ask ourselves, can I believe this, is there evidence for this? And there's evidence for almost anything; even the most outlandish things have some evidence for them. The question is whether there's sufficient evidence, and we don't tend to ask ourselves, must I believe this? Is there enough evidence here?

So all of that's true, and people can relate to it. But it's even more pronounced than that. Even if you don't care about a particular belief, if you have no vested interest in it, you tend to look for evidence consistent with the idea rather than information that's inconsistent with it. And of course, if we want a balanced picture, we've got to look at both.

So suppose I gave you some plants, a bunch of hostas, and said, you're a nice guy, here are some extra hostas from my garden; I think they probably need a lot of water, but you might want to test that. How would you test that? Well, if you're like most people, you'd give them a lot of water and see how they do. What you wouldn't do is give some a lot of water and some hardly any water at all, and see which do better. You'd look for evidence for the belief rather than against it. That's a very natural tendency, and at some level it makes sense, because it reflects a broader belief that, look, if this thing's true, there must be some evidence for it, so let me look for some evidence for it. You're doing a very reasonable thing. However, you're doing an incomplete thing as well. You need to look not only for evidence for something, but evidence against it.
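Here is a sketch of what the complete hosta test would look like, with an invented growth model (the interview gives no numbers): the point is the design, watering half the plants heavily and half lightly and comparing, rather than watering everything and "confirming" the belief.

```python
import random

# A minimal sketch of the two-condition watering test the interview
# describes. The growth model is an assumption for illustration.
random.seed(2)

def growth_cm(heavy_water: bool) -> float:
    # Assumed here: watering level makes no difference to these plants,
    # so a fair two-condition comparison should come out roughly even.
    return random.gauss(10.0, 2.0)

heavy = [growth_cm(True) for _ in range(20)]
light = [growth_cm(False) for _ in range(20)]
print(f"heavy watering, mean growth: {sum(heavy) / len(heavy):.1f} cm")
print(f"light watering, mean growth: {sum(light) / len(light):.1f} cm")
```

Watering every plant heavily can only ever generate the confirming cell; only the comparison condition can reveal that the belief adds nothing.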
So if you believe that cheerful people are more likely to overcome a bout of cancer, you need to look not just at the cheerful people you know who've done very well, but also at the dour people you know who've recovered. That latter step is the one we tend not to take.

It sounds like you're talking formally about a contingency table there, and it sounds a lot like signal detection theory: you have to look at how else it could have turned out. Can you tell me a little bit about that?

Yeah. A lot of our beliefs are ultimately beliefs about relationships. Is there a relationship between your personality and your health? Is there a relationship between your personality and the health of your marriage? Is there a relationship between how basketball players have performed in the past and how they're likely to perform in the future? To evaluate any relationship, you need to look at all the levels of one variable and how they stack up against all the levels of the other. Instead, we tend to look at the presence of one and the presence of the other. Does a positive personality make you happier? We look for all the positive people who are happy.

I start the book with an example a lot of people have heard of. Infertile couples, couples who are trying to conceive a child and are having some difficulty, are often told: adopt a child, you'll have this delightful adopted child, and then you'll be more likely to conceive your own biological child as well. It turns out that's been studied to death, and there's no evidence for it. But it's easy to see why you would think it, because the world is going to give you lots of that information. All the cases that confirm it will come to your attention. When someone adopts and then conceives, you tell that story, I tell that story, we hear that story, and it just seems like it's true. We're not going to tell a story, or it would be a very different story, about people who adopt a child and that's it; they've now got this great family with the adopted child. That's a different story, one that doesn't get tied to the notion of a relationship between adoption and conception.
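To make the contingency-table point concrete, here is the adoption example laid out as a 2x2 table with made-up counts (no real data): only by comparing the conception rate across both rows can you see whether there is any relationship at all.

```python
# A made-up 2x2 contingency table, illustrative numbers only:
# rows are whether a couple adopted, columns whether they later conceived.
table = {
    ("adopted", "conceived"): 8,        # the vivid, story-worthy cell
    ("adopted", "not conceived"): 32,
    ("no adoption", "conceived"): 20,
    ("no adoption", "not conceived"): 80,
}

def conception_rate(row: str) -> float:
    yes = table[(row, "conceived")]
    no = table[(row, "not conceived")]
    return yes / (yes + no)

# Both rates are 20%: no association, despite eight memorable confirmations.
print(f"conceived after adopting: {conception_rate('adopted'):.0%}")
print(f"conceived, no adoption:   {conception_rate('no adoption'):.0%}")
```

The eight confirming couples in the top-left cell are the stories that get retold; the full table shows the rate is identical either way.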
Is it as simple as us just disregarding the evidence that contradicts our beliefs, or ignoring it, or forgetting it?

Well, almost nothing is terribly simple with humanity, which is why it's fun and interesting to be a psychologist. There are times when we willfully put on the blinders and say, I just don't believe it, I don't care what the evidence is. That's true, and that's important. But I think even more interesting are the times when we're really trying to get the story right, and nonetheless our thinking isn't perfect and we end up drawing erroneous conclusions.

Another way of thinking about the different kinds of biases to which we're prone is that there's this big giant world out there, and we have to look at it through a very small peephole, if you will. That's true at any scale you want to think about. First of all, it's literally true: I can only see, a little more or a little less depending on what you're counting, about 180 of the 360 degrees around me. If I'm trying to draw on the information stored in my head, I can only keep seven or so items in mind at any one time. Advertisers talk about things in certain terms: you're going to lose this amount of money if you don't do this, or you'll gain this amount of money if you do this. That channels my thinking; I just accept the terms they've used and think about the problem in those terms, and that's a narrow kind of focus. And it goes all the way up to the kind of ideology you have. If you're a political conservative, you literally see the world one way, and it's a different world than if you're politically liberal. So no matter what scale you're talking about, we're only seeing some portion of the world, and that introduces all sorts of opportunity for bias and misconception.

Are there any formal tools or procedures from science and statistics that we can steal and use in our everyday thinking?

Oh yeah, I think so. Another form of that question: administrators at universities are always saying it doesn't matter what content you learn in college, that's not what a university education is for, because the content will be outdated in ten years anyway; college is the place where you learn how to think. Okay, great. So how do I learn how to think? What would be the best courses to take? Should I take statistics courses? Should I take rhetoric courses? Sure, all of those are really good. However, and you should view this skeptically because I'm a psychologist, so this is a self-serving claim, taking psychology courses is at least as valuable as any of those, because when you look at research in psychology, you're always dealing with, at best, the second-best data set. We're dealing with messy data all the time, we're having to overcome certain kinds of problems, we can't do the ideal experiment we'd like to do. So you get used to understanding the limits on the kinds of inferences you can make. When you study psychology, you really get trained not to mistake correlation for causation. You get trained, and I don't know how much jargon to use here, in selection biases: the sample of people you're looking at in this condition weren't randomly assigned to that condition, so it may not be how they were treated that determined their response; it may be that who they are led them to get that treatment, and that's what's responsible for the outcome. You learn a little about not committing the confirmation bias; you develop the habit of saying, okay, that's the evidence for this, now what's the evidence against it? That's helpful as a scientist, and it's helpful if you're trying to run a corporation or a non-profit or anything in life. I think psychology teaches that very well.
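Here is a small sketch of that selection-bias lesson, with every parameter invented: a treatment that does nothing still "wins" the naive comparison, because the people who chose it were healthier to begin with.

```python
import random

# A minimal selection-bias sketch (all numbers invented): outcome depends
# only on underlying health, yet the self-selected treated group looks
# better. Who they are, not how they were treated, drives the difference.
random.seed(5)

treated, untreated = [], []
for _ in range(10_000):
    health = random.gauss(0.0, 1.0)
    seeks_treatment = random.random() < (0.7 if health > 0 else 0.3)
    outcome = health + random.gauss(0.0, 0.5)   # treatment adds nothing
    (treated if seeks_treatment else untreated).append(outcome)

print(f"mean outcome, treated:   {sum(treated) / len(treated):+.2f}")
print(f"mean outcome, untreated: {sum(untreated) / len(untreated):+.2f}")
# Random assignment, rather than self-selection, would erase this gap.
```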
Are psychologists and scientists then immune to erroneous beliefs?

No, not at all. Sometimes we even knowingly give in to our erroneous beliefs. There's research on the interview illusion: if I'm trying to predict who's going to be a good doctor or a good professor, it's very hard, from the sample of a little conversation like the one we're having, to predict who's going to be successful or not. So you might think we'd say, well, if interviews are a waste of our time, why have them? But we all do them. We interview people applying to be graduate students in our labs, and people ending their graduate careers come and give job talks, and whether we decide to hire someone is enormously influenced by these little half-hour conversations that probably aren't that predictive of anything. We know that, and yet we still do it. So, no, psychologists are not immune. Which, again, is one of the reasons this whole area of research on heuristics and biases, on judgment and decision making, has drawn such interest: people recognize themselves in these kinds of mistakes. When Kahneman talks about how you pay more attention to losing things than to gaining things, you see those examples and you go, yep, you don't even need to show me any data; everybody I know would do exactly that.

Can you give me some specific examples, maybe from belief in extrasensory perception or alternative health practices, where a lot of the mechanisms you've explained are operating?

Well, one of the most pronounced biases is that if something happens right after something else, if they co-occur in time, we tend to think it happened because of that something. So if you're feeling ill, when do you seek treatment? When you're really at your worst. So I give you some treatment; it could be a completely worthless treatment. Chances are your body is designed to make itself better, but if you see me when you're feeling the worst, chances are that right afterwards you're going to be feeling a little better, and because of this tendency, "I got better right after that guy gave me this treatment," you're going to think the treatment is effective. And that's a very powerful conclusion to shake. It's so natural to think, look, I did this in order for that to happen, and it happened; it's very hard to step back and say, no, my body just did it on its own. It's very easy to jump to that conclusion.

It sounds a little bit like regression towards the mean.

Regression to the mean, yes, that would be an example of it. And you see it going back to the Sports Illustrated jinx. You get pictured on the cover of Sports Illustrated when you're at your peak, and unfortunately, by definition, we can't stay at the peak endlessly. If that's when you're pictured, chances are that shortly afterwards you're not going to be doing as well, and that gives us all the data we see out there suggesting, hey, there's bad luck in being pictured on the cover of Sports Illustrated. No, there isn't; as you said, it's just regression to the mean.
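The Sports Illustrated jinx falls out of a few lines of simulation. This is an illustrative sketch, with performance modeled, by assumption, as stable skill plus luck; none of the numbers come from the interview.

```python
import random

# A minimal sketch of regression to the mean (all numbers invented):
# observed performance = stable skill + luck. The athletes who make the
# "cover" are the best season-1 performers; their season-2 numbers drift
# back toward their underlying skill.
random.seed(3)

n = 1_000
skill = [random.gauss(0.0, 1.0) for _ in range(n)]
season1 = [s + random.gauss(0.0, 1.0) for s in skill]
season2 = [s + random.gauss(0.0, 1.0) for s in skill]   # fresh luck

cover = sorted(range(n), key=lambda i: season1[i], reverse=True)[:20]
print(f"cover athletes, season 1 mean: {sum(season1[i] for i in cover) / 20:+.2f}")
print(f"cover athletes, season 2 mean: {sum(season2[i] for i in cover) / 20:+.2f}")
# Season 2 comes out lower on average: no jinx, just the luck not repeating.
```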
Can you tell me about the intuitive scientist?

Sure. That's a metaphor people have used to draw a parallel between what scientists do professionally, trying to understand the world with some formal tools for doing so, and what we all try to do in our everyday lives: figure out the world around us. There are a lot of similarities between what we do as people in our everyday lives and what scientists do. In fact, science developed out of the kinds of mental habits we already had; over time we recognized what the problems are and what allows the most powerful conclusions. So it would be odd if regular thinking were just radically different from scientific thinking. It's different, but there are parallels. Just like the scientist, we're trying to identify the phenomena out there: what are the phenomena, and why are they that way? We ask why all the time. There used to be a beer commercial in the United States, "Why ask why?", and it's a brilliant little tagline, because by raising the question it illustrates the point. We just do that all the time: something happens and we want to know why. That's what scientists do, and that's what we do in our daily lives.

What can we do to pick up on some of the mechanisms that are operating when we're evaluating claims, say psychic claims, or predicting the future, for example?

Well, Danny Kahneman has a take on the world that I think helps us quite a bit. We have this intuitive system that's going to register a bunch of associations: things that go together in space and time, we're going to think of as connected. Training in science, training in psychology, is not going to make those go away. But we also have this other system, a rational, reflective system, that allows you to inspect that and say, all right, is that really true or not? What have I been taught about what kinds of inferences I can draw? Scientific education, post-secondary education, is all about training System 2 to say, no, wait a minute, you're guilty of the confirmation bias here; I've explained it this way, is there another way to explain it? Or, I've looked for this kind of evidence, is there evidence against it?

So looking at psychics in particular, what makes them seem so successful in their predictions?

Well, any time a prediction is confirmed, it just takes over your attention. It's a very dramatic thing, and what you see is that local event: someone said this was going to happen, and now it happened. What you need to do is step back and say, all right, this one little data point, where does it fit in a broader pattern of evidence? It's only if it's repeated a bunch of times that we can really trust it.

Carl Sagan, in one of his books, starts out with an anecdote about a dream that his father had died. I may be exaggerating here, but at least as I remember it, he woke up in a sweat: oh my god, my dad, I've got to check in on him. He checks in on him, and his dad was fine. And he said to himself, look, if it had just so happened that I'd had that dream and my dad had died, there's no way anyone could have disabused him of the idea that he'd had a prophetic dream. But in fact we populate our dreams with familiar people, and familiar people die. If you look at all the people in the world and how often we dream, there are going to be a bunch of times when that happens just by pure chance. But good luck trying to convince the individuals who've had those dreams, or the people close to them, that it was just one thread in this broad fabric of noise. You're just not going to convince them of that.
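Sagan's argument is really a base-rate calculation. Every input below is invented for illustration, but the structure shows how tiny per-night probabilities, multiplied across many dreamers and many nights, yield a steady supply of "prophetic" dreams by chance alone.

```python
# A back-of-envelope sketch of the prophetic-dream base rate.
# All inputs are assumptions, not measured values.
dreamers = 250_000_000        # adults dreaming in a large country, say
p_dream_of_death = 1 / 10_000 # a given night's dream features a loved one dying
p_actual_death = 1 / 100_000  # that person really does die around that time
nights_per_year = 365

expected = dreamers * p_dream_of_death * p_actual_death * nights_per_year
print(f"expected 'prophetic' dreams per year: {expected:,.0f}")   # ~91
```

Even with these deliberately tiny probabilities, the model predicts dozens of stunned dreamers every year, each with a story no one could talk them out of.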
Do we tend to think that other people have the same beliefs and opinions as we do?

Sure, and it's easy to see why. First of all, it's very rare for people to hold a belief and say, yeah, I know this is completely at variance with the evidence out there, it's a crazy belief, but I hold it anyway. Rather, you have beliefs that are grounded in what seems like reality to you. So whatever it is that makes you think this team is better than that team, or this candidate is more likely to win than that candidate, all the causal forces that convince you it's true will, you reason, convince other people too, and therefore other people will believe it as well. What we fail to recognize is that other people may be doing very different kinds of calculations and arriving at a very different result.

So that would lead us to think that other people will believe what we believe. But also, we don't randomly sample the world. If you're trying to predict what Americans in general, or the world in general, think, you only have the sample of people you hang out with, and we tend to hang out with people who are a lot like ourselves. That's going to give us a distorted view of how common certain things are. If you're politically conservative, you probably hang out with conservatives, you listen to Fox News, you get conservative messages all the time, and you're going to think the world is more conservative than it really is. Flip it to the other side and it's exactly the same: liberals are going to think the world is more liberal than it actually is.

So what is the relationship between what we think other people believe and the beliefs that are actually out there?

It varies from issue to issue, but in general there is a correlation between what we believe and what we think other people believe. That is the phenomenon known as the false consensus effect: we see more consensus for our beliefs than is actually the case. Some of it is due to that kind of biased sampling. Some of it is due to how we resolve the ambiguity inherent in these kinds of issues and questions. What percentage of people are conservative? Well, to answer that question I have to decide what I mean by conservative, and other people answering it may be defining conservatism differently than I am. Most people recognize that not everyone agrees with them, so they make allowance for that: okay, I think this, but not everyone's going to think that. What's harder to make allowance for is that while you're making a judgment about this thing, someone else may be making a judgment about a very different thing, even though we call it by the same name. The social psychologist Solomon Asch put it this way: it's not so much that we have different judgments of the same object as that we have different objects of judgment in mind. You're making a judgment about one thing, I'm making a judgment about a very different thing, and unless we realize that, we're going to misunderstand each other.
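A toy model of that biased sampling: the population share and the homophily rate below are invented assumptions, not survey data, but they show how estimating the country from your friends systematically overshoots.

```python
import random

# A minimal sketch of the biased-sampling half of false consensus
# (parameters invented): 40% of the population is conservative, but
# friendships are homophilous, so a conservative judging "how conservative
# is the country?" from friends alone lands far above the truth.
random.seed(4)

TRUE_SHARE = 0.40    # assumed conservative share of the population
HOMOPHILY = 0.80     # assumed chance any given friend shares your politics

def estimate_from_friends(n_friends: int = 150) -> float:
    like_me = sum(random.random() < HOMOPHILY for _ in range(n_friends))
    return like_me / n_friends

print(f"true conservative share:          {TRUE_SHARE:.0%}")
print(f"estimate from one's own friends:  {estimate_from_friends():.0%}")
```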
Does the world look different in hindsight? How does hindsight affect our judgments?

That's a great question, and it gets back to what you raised earlier about the intuitive scientist: we explain things all the time. When you tell me that something occurred, I don't just file it away. Sometimes very explicitly and energetically, sometimes with just a little explanation, I work out why that thing happened. So now suppose you ask me, if you didn't know this had happened, how likely is it that you would have predicted it? Well, I've now thought of the whole web of forces that created it, and even if I try to set the outcome aside, all those causal forces are still there, implying the outcome. So I think, oh, this was really likely to happen all along. Stated differently, the hindsight bias is one version of a broader phenomenon known as the curse of knowledge: once you know something, it's really hard to take it away and see what the world was like before you knew it. That's why teaching is hard. Once you know this stuff, it would be easy to teach it to people who already know it; teaching it to people who don't know it is harder, because you've got to get beyond your own perspective and understand what it was like before you knew it. Before you knew the outcome, how likely is it that you would have predicted it? It's just very difficult to do.

Is data or evidence alone sufficient to change people's minds or opinions?

I think it can be; most people are responsive to evidence. At the same time, we are storytelling creatures. If you look at the sacred texts of the world, they aren't data tables; they're stories, they're narratives. There was an oral tradition, and part of our evolution took place in that oral tradition. So narratives, and the cause and effect embedded in narratives, tend to move people a lot better than data do. Imagine you're at a trial: one attorney has a bunch of data, showing people the law of large numbers and how this pattern really has meaning and favors his client, and the other tells a really compelling story. Who are you going to bet on? I think we all know that humans really relate to a narrative structure, which is why the biblical parables have such staying power. And that, again, is one reason teaching psychology is so gratifying: it lends itself to storytelling. A lot of the classic experiments are really parables. The study of people walking down the street who ignored someone in need literally comes from the parable of the Good Samaritan.

Are there ways, then, to combine good data and evidence with a good story?

Yeah, but you have to be careful there. That's what good science writing does: it takes the message from the data, because really the only way we can find out whether things are true is to subject them to empirical test, but then, to get people to resonate, you take those conclusions and embed them in stories people can relate to. Very good teachers do that, and science writers do that; it just makes the message more compelling. If you want to get people's attention, it's really good to have pictures, worth a thousand words of course, but a story is worth, maybe not a thousand, but a lot.

Gotcha. This course is about the science of everyday thinking. What advice do you have for people out there who want to think better and do better in their everyday lives?
A couple of things. One is to recognize that we know much less than we think we know. If I could wave a magic wand and insert something into people's heads, it would be a certain kind of epistemological caution. People who think they've got everything figured out are almost always wrong. Overconfidence is massive and pervasive, and if you could make people a little less overconfident, if you could get people to think that most things in life are shades of grey when they want to scream that it's black versus white, if you could get rid of that, I think you'd make a lot of human progress.

My name is Tom, and I think about superstition.