It's great to be with you to discuss the science of why we deny science and reality. In other words, what is it about our brains that makes facts so challenging, so odd and threatening? Why do we sometimes double down on false beliefs, and why do some of us do it more than others? That's the next book; I'm not going to talk about it that much. I'm going to try to be somewhat down the middle here, although it's difficult. But I've been writing about political and scientific misinformation for a decade, and I'm going to confess that I got a lot of the big picture wrong initially. The first book was called The Republican War on Science. We did not notice at the time the visual echo with another book that was out. And it was all about people denying the science of global warming, denying evolution, getting it wrong on stem cells. What was wrong about my analysis was that I was wedded to an old Enlightenment view of rationality. What I mean is that I had this vision: if you put good information out there and you use rational arguments, people will come to accept what's true, especially if you educate them in places like this and teach them critical thinking skills. A lot of people believe, or want to believe, that this is true. The problem is that, rather awkwardly, there is a science of why we deny science. There are facts about why we deny facts. There's a science of truthiness. I was actually going to title the book that, but The Republican Brain is better. And the upshot is that, paradoxically, ironically, the Enlightenment view doesn't describe people's thinking processes. So if we're actually enlightened, we have to reject the Enlightenment view. I want to tell you first how I realized that. The key moment came in the year 2008, when I stumbled upon something that I call the smart idiot effect. Okay, what is a smart idiot?
This was data from Pew, in a poll on global warming, and it showed the relationship between political party affiliation, level of education, and belief that global warming is caused by human beings. I don't know if you can see it very well because of the color contrast, but what it shows is that if you're a Republican, the higher your level of education, the less likely you are to believe in scientific reality. So these are the college grads, and these are the non-college grads, and you've got less belief in what's true among the college-grad Republicans than among the non-college-grad Republicans. Whereas for Democrats and independents, the relationship between education and believing in reality is the opposite: they believe it more. So these people, the 81 percent who don't believe it, those are smart idiots. And I'm trying not to be partisan, so I will show you liberal smart idiots. You could call this dumb and dumber. Now, consider the people who deny the science of vaccines, which is that they do not cause autism. It turns out that the New England Journal of Medicine studied who these people are, and they tend to be white and well-to-do, and the mother tends to have a college education. They go online, to what Jenny McCarthy called the University of Google, and inform themselves about this. And so by empowering themselves, they end up more wrong, and they end up believing that vaccines are dangerous. Education and intelligence therefore do not guarantee sound, rational decisions, nor do they ensure that people accept science or facts. I want to give you an even crazier, more fun example of the smart idiot effect. This is from the political scientist John Sides of George Washington. He unpacked the data on why belief that President Obama is a Muslim increased, or in whom it increased, between 2009 and 2010. So here's where the increase is, and again we've got a higher slope for Republicans with some college or a college degree than for those who have less education.
So again, the smart idiot effect. And this is pretty frustrating. You ask: why do they do this? How could this possibly be? It seems that the more capable you are of coming up with arguments that support your beliefs, the more you'll think you're right, and the less likely you will be to change. And if this is true, we have a pretty big problem. It would explain a lot of the polarization in America. The good news is that there is a way of understanding why we do this, and it's quite relatable. The science behind it is pretty easy to understand, because we all know from our personal lives, from our relationships, how hard it can be to get someone we care about to change their mind. And we know it from great works of literature like Great Expectations. I've actually seen the black-and-white Great Expectations. There are so many famous characters who fall for self-delusion, who fall for rationalization, Pip being one of them. He wants to believe that he's destined to marry Estella. He wants to believe that Miss Havisham is not this manipulative old crone, that she actually has his best interests at heart. He believes this so strongly that he essentially ruins his life. That's the story of Great Expectations. What Dickens grasped about people in a literary way, we are now confirming through psychology and even neuroscience, and what this leads to is a theory called motivated reasoning. What it demonstrates is that we don't think very differently about politics than we do about emotional matters in our personal lives, at least when we have a strong investment or commitment.

To understand how motivated reasoning works, you need to understand three core points about how the brain works. The first is that we are not aware of the vast majority of what our brains are doing; the conscious part, the "us," the self, is just a small percentage of what the brain is up to. The second is that among the things it's up to that we're not aware of, the brain is having rapid-fire emotional or affective responses to ideas, images, stimuli, and it's doing this really fast, so fast we don't even know it's happening. And the third point is that these emotional responses guide the process of retrieving into consciousness the memories, thoughts, and associations that we have with those stimuli. Emotion is the way those things are pulled up for us to think about. And so what this means is that we might actually think we're reasoning logically, we might think we're acting like scientists, but in fact we are acting like these guys, and I love being at Harvard Law School and putting up lawyers. By the time we actually become conscious of reasoning, we've already been sent down a path by emotion, retrieving from memory the arguments we've always made before that support what we already think. So we're not really reasoning; we're rationalizing. We are defending our case, and our case is not just a part of who we are, it's a physical part of our brains. All right, and this explains so many things about why reasoning goes awry. It explains goalpost shifting, for instance. For years the birthers wanted the birth certificate, the long-form birth certificate. Last year they got it. So did they stop being birthers? No. Did they change their minds?
No. The new science of why we deny science can predict that they would double down on their wrong belief, and that they would come up with new reasons to distrust the new evidence they had been given. And the same goes for all manner of logical errors, fallacies, hypocrisies. The answer used to be: hey, just learn critical thinking and you'll avoid these. That might be true to an extent, but it seems that reasoning is designed to help us see these problems in others far better than to see them in ourselves. And it's designed this way from the start; it manifests first when we're quite young. I want to tell you about a really funny motivated reasoning study involving the biases of adolescents. You might remember that in the 1980s there was this big battle about labeling rock albums. There was this fear that music corrupted kids, led them to Satanism, what have you. One side in the debate was the PMRC and Tipper Gore, and here she's got an album; the title is Be My Slave. I don't know if you can see the whips and chains, but this was dominatrix metal, so some moms didn't think little Johnny should get this record. On the other side there was Frank Zappa, here pictured in the "Phi Zappa Crappa" photo, which is so awesome. As he testified before Congress, and I love this quote: "The PMRC's demands are the equivalent of treating dandruff by decapitation." Such was the debate.
Okay, so in comes a psychologist named Paul Klaczynski, and he does a motivated reasoning study dovetailing with all of it. He took ninth and twelfth graders who were either fans of country music or fans of heavy metal, and he asked them to read arguments that he had constructed. All of the arguments contained a logical fallacy, and all of the arguments were about the effects of listening to a particular kind of music on your behavior. So here's an example of a fallacious argument they might have gotten: little Johnny listened to Ozzy Osbourne, then he killed himself; heavy metal is dangerous stuff. Everyone see the fallacy there? And you would get the same kind of flawed arguments about country music. So what happens? Adolescent country fans can see logical fallacies when they appear in arguments suggesting that listening to country leads to bad outcomes, bad behaviors, but they do not detect the fallacies in pro-country arguments. The same goes for the heavy metal fans. They're all biased, and it doesn't get better between ninth and twelfth grade. More education doesn't somehow let you see what's wrong with your own point of view. And it is not just problems or fallacies of basic reasoning that motivated reasoning explains. For all the science debates that I write about, it explains the "my expert is better than your expert" problem, which gets straight to the heart of why some people deny science: they think their science is better. So this is from Yale's Dan Kahan, who is also here at the Harvard Law School sometimes, I think. He divides people's moral values up along two axes. What's important about this is that your moral values push you emotionally to believe things, and so you start tagging things emotionally based on what your values are. We're either hierarchical or egalitarian; that's up and down.
We're either individualistic or communitarian; that's left and right. I think people basically know what these words mean, but in case you don't: Republicans live up here, Democrats live down here. Republicans tend to be hierarchical and individualistic, accepting more inequality and supporting keeping government out of life, while Democrats are more communitarian, supporting the group (it takes a village), and egalitarian, supporting equality. The point is that this determines who you think is a scientific expert, because Kahan then shows people a fake scientist who either supports or doesn't support the consensus that global warming is caused by humans. If you're up in the quadrant where the Republicans live, then only 23 percent agree that the scientist is a trustworthy, knowledgeable expert when the scientist says that global warming is caused by humans. But down here, 88 percent agree that this scientist is an expert. So in other words, science is, for many of us, whatever we want it to be, and we've been impelled automatically, emotionally, into having that kind of reaction to who an expert actually is. This is bad enough, but the double whammy always comes when you combine these flawed reasoning processes with selective information channels, of the sort that the Internet makes so possible. But the Internet is only a difference of degree, not a difference of kind, because we already see what happens with the selective information channel that is Fox News. There's lots of research showing that Fox News viewers believe more wrong things: wrong things about global warming, wrong things about health care, wrong things about Iraq, and on and on. And we can document that they believe them more than people who watch other channels. Why does this occur?
Well, they're consuming information that resonates with their values, and then they're thinking it through and arguing, getting fired up (motivated reasoning), and they run out and reinforce their beliefs. But also, and this is where I'll get into some possible left-right differences, there's some evidence to suggest that conservatives might have more of a tendency to select into belief channels that support what they believe to begin with. Right-wing authoritarians are a group that has been much studied, and they are part of the conservative base, and there are some studies showing that they engage in more selective exposure, trying to find information that supports their beliefs. There was a study by Shanto Iyengar at Stanford of Republicans and Democrats consuming media, and what he found was that Democrats definitely didn't like Fox, but they spread their interests across a variety of other sources, whereas Republicans were almost all Fox. As the authors wrote, the probability that a Republican would select a CNN or NPR report was about ten percent in this study. So the Fox effect, believing wrong things, is probably both motivated reasoning and also people selecting into the information stream to begin with. All right, so how do you short-circuit motivated reasoning? That's what Brendan Nyhan is going to talk about, so I'm not going to. But I will say this: you don't argue the facts. There are approaches that the evidence shows to work. For my part, let me just end with some words of eternal wisdom from a truly profound philosopher, Yoda. Yoda said, "You must unlearn what you have learned," and when it comes to the denial of science and facts, that is precisely the predicament we're in. So thanks, and this is the new book; it'll be out soon.