In the best-selling books The Better Angels of Our Nature and Enlightenment Now, Harvard linguist Steven Pinker made the surprisingly controversial case that humanity has been getting richer and less violent over the past two centuries. In Rationality: What It Is, Why It Seems Scarce, Why It Matters, he argues that our ability to reason and think critically is central to human flourishing and undergirds our phenomenal material and moral progress since the Enlightenment. Pinker explains how cognitive defects such as the sunk-cost fallacy and myside bias cloud our thinking and contribute to intensely polarized, tribalistic worldviews that result in the Trumpian right's embrace of QAnon and what he calls the universities' suffocating left-wing monoculture. Never one to shy away from controversy, he insists that public policy should be largely driven by facts, not emotion, even in heart-wrenching cases such as the police killing of George Floyd: if the goal is to save the largest number of Black lives, or the lives of anyone killed by police, then basing policy on a viral video is probably not the right way to go. Reason talked with Pinker about rationality, social progress, and why, despite so much negativity, fear, and anger in the world, he's optimistic about the future. Steven Pinker, thanks for talking to Reason. Thanks, Nick. So, let's start. What's the elevator pitch for Rationality: What It Is, Why It Seems Scarce, and Why It Matters? Rationality is an introduction to some of the major tools of reasoning that don't necessarily come naturally to us: logic, probability theory, Bayesian reasoning, correlation and causation, rational choice theory, game theory. It's an exploration of whether, and to what extent, your typical human is rational. That is, how well do we measure up against these benchmarks of reasoning?
There's an explanation as to why there seems to be such rampant insanity in the world: belief in paranormal phenomena, medical quackery, conspiracy theories. At one point, you refer to it as a pandemic of poppycock. Yes, a pandemic of poppycock. And then I take on the somewhat paradoxical task of making the case for reason. Because unless you already endorse reason as the way that we should come to believe things, there are no tools with which to defend reason. But I do my best. All right, let's start with that. Early on, you define reason, or rationality, rather, as the use of knowledge to attain goals. And you talk about a tribe in Africa that people, certainly maybe 50 or 100 years ago, would say, well, these people are primitive, they're not rational. But you show how they employ rationality. Could you talk a little bit about that and fill in the definition of rationality that you're using? Yeah. I wanted to push back against a common trope, often appealing to evolutionary psychology, a field that I'm quite sympathetic to: well, what do you expect? Of course humans are irrational. We descended from cavemen on the savannah who just chucked spears at animals and had to run up a tree if they saw a lion, so we have nothing but a bundle of quick reflexes. But if you actually look at the lifestyle of people who survive in a hunter-gatherer existence, even in the Kalahari Desert, in the case of the San, whom I feature in the first chapter, they are highly cerebral. They survive by persistence hunting, which is a uniquely human specialty. It takes advantage of the fact that we are naked, we have no fur, we sweat. So one thing that we can do in a desert that other animals can't is run marathons. We dump heat. We don't get heatstroke. Furry animals do. They're faster than us, but if you pursue them long enough, they will keel over. The trick, though, is tracking them, because they're fast.
They outrun a human; they're out of sight. But they leave behind tracks. And the San reason from the tracks to what species the animal was, what its sex is, what its condition is, where it's likely to be. And they do it even though lots of tracks look alike and the tracks get eroded by wind and rain. They engage in argumentation. They doubt their first instincts. They don't go with their gut. A young upstart can challenge an elder, so they don't fall for the fallacy of argument from authority. They distinguish correlation from causation. They use a lot of the tricks of reasoning, which means that we can't just excuse our own irrationality by saying, what do you expect from cavemen? Reading that section reminded me: I remember, probably in an anthropology class as an undergrad, watching a film about the Hadza tribe. And they would throw a spear at an antelope or something and miss. And then they would pick it up and keep walking. And I didn't realize that they were basically just waiting for the antelope to keel over. That's kind of fascinating to think about. So if rationality is the use of knowledge to attain goals, then let's talk about why it seems scarce. You run through a series of issues of why we don't seem to be particularly rational. Is it that we're not goal-oriented in all of our activity, or that our attempt to get to our goals is confounded by any number of basic logical fallacies? Well, there's some of each. But the flagrant examples of irrationality have captivated people, especially in the last few years, like the conspiracy theories and the fake news. There, I think, the goal that people are pursuing is not the goal that we might assume, namely establishing objective reality, but rather to boost the moral energy of one's own tribe or coalition, to show what evil, conniving opportunists the other side consists of.
And to achieve glory within your clique as a formidable warrior for your side. Now, if you accept those as the goals, then the way that people pursue them could be perfectly rational. If you're the spreader of the most titillating fake news that makes the Democrats look bad, well, by some narrow criterion, you're actually being pretty smart in terms of getting kudos from your own side. So, to make it political, you would say somebody like Steve Bannon, and you quote his idea that he wants to "flood the zone with shit," was spreading irrational ideas, oftentimes conspiracies, things that are, let's say, challenged by reality, but was actually pursuing them in a rational way. He was still goal-oriented in doing that. He got his guy elected. And as people who accuse Trump of being stupid should be reminded, he did get to be president. So in terms of the narrow goals that people set themselves, they can be all too rational. Right. What I think we often forget, because it's just so much a part of the background of educated, reasonable people, is in fact an unusual, exotic, weird conviction: that all of our beliefs ought to be aligned with reality. That's the only criterion for believing. Only believe things that are true, or that have some claim to being true, and do your best to find out whether they're true. Now, even saying that, you might ask, well, doesn't everyone believe that? And the answer is no. I kind of think of that as a post-Enlightenment conviction of some educated intellectual elites. But for most of human history, when it came to really interesting cosmic questions, you couldn't find out anyway; there was no way to get at the truth. How could you find out the origin of the universe? How could you find out what's really going on in discussions among kings and dictators and leaders in their private halls of power?
You can't find out. But your beliefs could be entertaining. They could increase solidarity. They could rile up your side. They could get you glory. And so that's a good enough reason to believe things. And all of us have some residue of that, I should add. Yeah, I mean, you talk about that in your discussion of reality versus mythology. So kind of an interesting bracket is that, you know, if we're seeking truth, and at some point you quote Philip K. Dick saying reality is that which, when you stop believing in it, doesn't go away, we're not always interested in that, right? We are interested in some kind of symbolic identification, or a secret world where we get to believe certain things, whether they're true or not. Exactly. And even among intellectual elites, there are elements of that, such as religious belief. I mention that when Christopher Hitchens and Richard Dawkins published their atheist manifestos, there was a furious response, not just from the evangelists but from fellow intellectuals, who said it's just kind of uncouth, just not done, to consider God's existence a matter of truth or falsity. It's not like they came back and said, well, here are 36 reasons to believe that God does exist. It's more like, how dare you even take that proposition and hold it up to scientific scrutiny. And national myths are kind of like that: the stories of our fearless founders, historical fiction. Does anyone really care whether Henry V actually gave that speech that Shakespeare attributed to him, or about the events in The Crown? Well, people do, though, right? But they're arguing over kind of dueling mythologies rather than whether or not this is historical reality. They don't. I mean, you know, there are some people who want to get to the bottom of it, the fact-checkers, the Snopeses, but they're sometimes considered a little pedantic. Like, you know, who really cares whether he said it or not? It's great dialogue.
Can I just kind of dilate on that a little bit? With Henry V's, you know, St. Crispin's Day speech, to the extent that it later gets used by actual British politicians, people like Winston Churchill, to either rally the troops or, maybe in a more sinister way, to attack people with bad motivations, that kind of crosses into where myth starts to create a reality of its own. Indeed. And we are storytelling animals. We live by our beliefs, by our narratives. So, yes, they do become a reality, a different reality: the reality of the national mythology, as opposed to the reality of what actually happened. So let's talk about some of the common mistakes, the things that block our rationality. This book is arguably your most challenging ever, because you present a number of logic problems or rationality tests. And I think everybody's going to lie and say, oh, you know, I got them all right, or I got 90 percent right. One of the ones that you talk about is the Monty Hall problem. He was the host of an old TV show called Let's Make a Deal, where there would be three curtains with a prize behind one of them. Could you explain why the Monty Hall problem haunts, you know, the mansion of reason in the contemporary world? It's a deceptively simple challenge. There are three doors. Behind two of them are goats. Behind one of them is a sleek new car. I believe the goats were also sometimes clunkers, if you picked wrong. Yes, yeah. And the contestant's challenge is to pick one of the doors. So the contestant picks a door. Monty Hall then opens one of the other two doors, always revealing a goat; he knows what's behind each door. Now the question is, should you switch?
That is, should you switch to the unopened door, or should you stick with the one that you have? Now, almost everyone says either that you should stick or that it doesn't matter, it's 50-50. The correct answer is that you should switch: the odds are two out of three that the car will be behind the unopened door that you didn't pick, and one out of three that it's behind the door that you did pick. Almost everyone, including some of the most famous mathematicians of the 20th century, gets it wrong. It achieved a kind of second notoriety when it was picked up by the columnist Marilyn vos Savant, the world's smartest woman. We know that because she published every week in Parade magazine, the most intellectual publication on the planet. That having been said, her column is really good. Yeah, it's phenomenal. And she explained the right answer, and she got thousands of letters explaining to her why she was wrong. Whereas in fact, she was right. The reason that she was right is that since Monty Hall knows where the car is located, the fact that he chose a door that he knew did not conceal the car is information. It narrows down the possibilities. He could have chosen the other door. He did not. That doesn't tell you with certainty, but it does provide you with statistical information that the car might be behind that other door. One way that vos Savant explained it is: imagine that there were, say, a thousand doors. You pick one. Monty opens 998 of the others, leaving one other door unopened. You would probably pick that door, wouldn't you? Because the fact that he left it unopened is telling you something. He could have opened it. He didn't.
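The switching advantage described above is easy to check empirically. Here is a minimal Monte Carlo sketch (the function name and trial count are illustrative choices, not from the interview) that plays the game many times under both strategies:

```python
import random

def play(switch: bool) -> bool:
    """Play one round of the Monty Hall game; return True if the car is won."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # Monty opens a door that is neither the contestant's pick nor the car.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Move to the one remaining unopened door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

trials = 100_000
stay_wins = sum(play(switch=False) for _ in range(trials)) / trials
switch_wins = sum(play(switch=True) for _ in range(trials)) / trials
print(f"stay ~ {stay_wins:.2f}, switch ~ {switch_wins:.2f}")
```

Running this, the stay strategy wins roughly a third of the time and switching wins roughly two-thirds, matching vos Savant's analysis: the constraint that Monty never reveals the car is what breaks the apparent 50-50 symmetry.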
So the reason I went through it is partly just for the fun of getting people to realize that their first impressions are often incorrect, including first impressions that are maybe honed by a lot of experience in pointing out other people's fallacies, in kind of a gotcha, know-it-all mindset that says, oh, that's the gambler's fallacy, just as you shouldn't expect after a run of reds that the next spin will be black; it's 50-50 red and black. That kind of mindset, superficially applied to the Monty Hall problem (Monty Hall, nothing to do with Monty Python), would say, well, it's 50-50 for those two doors, based on the superficial analysis that in games of chance the options tend to be equiprobable. These options are not equiprobable, because you have information about the two of them. And information is critical to probability. The final reason I mention it is that it also reminds people that probability is itself a kind of many-splendored concept. It's not clear exactly what probability means; there are different definitions of it. But the one that is relevant to risky decision making pertains to information. And a simple way of explaining the contrast is: if I flip a coin and look at it, what is the probability that it landed heads? Well, for you, it's 50 percent. For me, it's 100 percent. Now, it's the same coin and the same outcome. How could there be two different probabilities? And the answer is that probability depends on information. It depends on what you know. You know, this book is obviously one in a series that you've written that really harkens back to the Enlightenment and defends the Enlightenment. And I think we'll come back to this later in the conversation. But part of what you're talking about is that this understanding of rationality, the San tribe uses rationality.
But when we can take advantage of the codification and, you know, a deeper understanding of the rules governing our intuitions or evolved sensibilities, we can be much more precise and much more effective in how we think about things. That's right. And probably all of my writings in some way deal with human nature as evolution shaped it, and the tensions between that and the world that we live in now. And I've moved away from the classic conception of that contrast, between hunter-gatherers on the savannah and post-agricultural societies. Now, that is a cleavage, but the more important cleavage, the more important tug on our intuitions, isn't from when we were hunter-gatherers. It is simply everything prior to the contributions of the Enlightenment, of Western, educated, post-Enlightenment societies that have access to data sets, the scientific method, tools of reasoning like probability theory, institutions that are dedicated to objective truth. That's what's really novel in human history. And that's what our minds are not quite adapted to. So you don't even have to go all the way back to the savannah. Just ask: do you or do you not accept, you know, the Bureau of Labor Statistics or Bayesian reasoning? So let's talk about Bayesian reasoning, because this is something which is really an incredible kind of conceptual breakthrough. And I know I have trouble understanding it. From, you know, the minute I read it, I'm like, I got it. And then I start to wonder what the hell I just read. Could you talk a little bit about Bayesian reasoning and why it's so important to understand? It's named for the Reverend Thomas Bayes, an 18th-century clergyman who conceptualized it. His main breakthrough was conceptual; forget the theorem or the formula named after him, which itself is actually not that complicated.
The basic idea is that you don't consider your belief to be a matter of yes, I believe it, or no, I don't, but a matter of degree, a degree of credence, ranging from zero, I'm sure it's false, to one, I'm sure it's true. For most things that we care about, our belief is somewhere in between, so you can couch it as a probability. Once you do that, then some pretty simple algebra dictates how you should adjust your credence in a belief depending on the strength of the evidence. There are basically just three concepts that go into it. There's the a priori probability, which has actually spilled over into everyday parlance; recently, more and more, you hear people talking about your priors. That comes from Bayes' theorem. And it's simply the credence that you give to an idea based on everything you know so far, before you look at the new evidence. That's the prior. Multiply that by the likelihood, and that's just: if the idea is true, how likely are you to see what you're seeing? And then you divide that by the commonness of the data, namely, how often do you see those data whether your idea is true or false, summing over the true and false cases. In the case of a medical test, for example, the false positives would go into that denominator. So, just three numbers. But the main import, the main thing that people tend to neglect, and this was one of many contributions of Amos Tversky and Daniel Kahneman, who were pioneers of the study of human judgment and decision making, is that in many circumstances we forget the prior. They call it base-rate neglect. Say you're talking about a disease that's pretty rare in the population. You get a positive test result. There's a nonzero false-alarm rate, a false-positive rate. Do you have the disease? Most people say yes.
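The three ingredients Pinker names here, the prior, the likelihood, and the commonness of the data, can be put directly into code. Below is a minimal sketch using the prostate-cancer numbers that come up a moment later (prevalence 0.01, false-positive rate 0.09); the perfectly sensitive test (likelihood 1.0) is my simplifying assumption for illustration, not a figure from the interview:

```python
def posterior(prior: float, likelihood: float, false_positive_rate: float) -> float:
    """Bayes' rule: P(H|D) = P(D|H)P(H) / [P(D|H)P(H) + P(D|~H)P(~H)].

    The denominator is the 'commonness of the data': how often you see a
    positive result summed over both the true and false cases.
    """
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

# Prevalence 1 percent, and the test flags 9 percent of healthy men:
p = posterior(prior=0.01, likelihood=1.0, false_positive_rate=0.09)
print(f"P(disease | positive) = {p:.2f}")  # prints 0.10
```

Because only about 1 man in 100 has the disease while roughly 9 healthy men in 100 also test positive, the positives are dominated by false alarms, which is exactly the base-rate point: the posterior is only about 10 percent.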
Often the answer is no, because if the disease is rare to begin with and there are some false positives, then if all you know is that there's a positive, it's probably a false positive. That is difficult for us to crunch through in terms of the numbers. Although, and this is one of the twists that I added in the book, if instead of saying the prevalence of prostate cancer in the population is 0.01 and the false-positive rate of a test is 0.09, you say: imagine 100 men. One of them has prostate cancer. And of the men who take the test, nine of them who don't have prostate cancer will nonetheless test positive. You turn it into actual numbers that you can visualize in your mind's eye. Then people are actually much, much better at taking into account base rates. The moral of that story is that instead of just dismissing our species as hopelessly irrational because we flub tests like medical diagnosis, a better way to think about it is that the mind is comfortable with certain kinds of formats, certain kinds of content, and that it's the formulas, like Bayes' rule with its H's and D's and a little bit of algebra, that we have trouble with. The power of a formula is that you can apply it to anything; you don't have to be familiar with the content. But our reasoning tends to be baked together with particular subject matter and depends on concrete visualizability. So what I'm getting from that is that we should really have all test results revealed by Monty Hall: sorry, you've got prostate cancer. But some of the other kinds of problems that you talk about are, I think, more familiar to people, like motivated reasoning and myside bias. Could you talk about myside bias? Because certainly, if we want to reduce things to partisan politics, which I wish we would never do, but it happens, myside bias seems to be a real stumbling block, a real obstacle to employing rationality in a way that will allow us to understand reality better.
Indeed, it's been called the most pervasive of the cognitive biases. It's uncorrelated with IQ, unlike the other biases and fallacies. This is the tendency to ratify conclusions that are sacred values, or talking points, or part of the platform of your own tribe, coalition, or sect. And it can lead people, including very smart people, to make errors, blunders in Logic 101 and Probability 101. They look at data from, let's say, a fake experiment that tries to establish whether gun control works or not. And if they're liberals, they conclude the study shows that gun control works, even if the numbers show it doesn't, and vice versa if they're conservatives. Right. How do you get around that? I mean, I guess the first step is identifying the propensity for it, but then what are the ways to build that into your analysis so that you're not merely ratifying what you already believe? It can't be done just by drilling the techniques into people's heads, because people who know the techniques just choose not to apply them when it comes to challenging one of their own beliefs. So it really does depend on a kind of ethos, a value, a norm that you should doubt your own beliefs, hold them up to scrutiny, be prepared to change your mind. There is a kind of nerdy clique that tries to promote these values, called the rationality community, which I think overlaps with the libertarian movement. These are people who, I think to their credit, try to spread these norms, saying: you should assign a degree of credence to your beliefs instead of saying an idea is right or wrong. You should think in Bayesian terms intuitively. You should be prepared to change your mind. When you're arguing against someone, you should steelman them; that is, the opposite of a straw man, namely, build the most formidable, effective version of the opinion that you disagree with and try to refute that.
One of the sections in the book that I found really interesting was about probability and randomness, and confusing one for the other, where you were talking about availability-inflated fear, and particularly murder. You talked about the George Floyd murder, or killing, by police in Minnesota, which obviously changed the political conversation in a way that exceedingly few events have. And you noted that about 65 unarmed people are killed each year by police, and I think it was 23 who were Black. But this is an interesting thing, where if you look at the data, and if we were all Mr. Spock from Star Trek, we would react in one way. But we didn't react that way, and you're saying we shouldn't have reacted the way we did. Can you tease out how rare events end up becoming catalysts for massive kinds of social change, and the way that intersects with your concerns about rationality? Yeah, and it gets back to the point that rationality is always in service of a goal. And if the goal is to save the largest number of Black lives, or the lives of anyone killed by police, then basing policy on a viral video is probably not the way to go. You should go to the Washington Post data set and look at how many are killed, and then compare it to how many people are killed in gang warfare and street crime, where if you hobble the police, you might actually increase the number of people of all races who are killed, which is in fact exactly what happened. But there's another goal that you can pursue. And again, you can't say it's irrational; it's a different goal, and it can be rational to pursue that goal. It's: how do you mobilize people for some social or political crusade that up to then they may have been apathetic about? Well, there's what Thomas Schelling (he didn't call it that) identified, a communal outrage; John Tooby and Leda Cosmides have also written about this.
That is, a conspicuous event that is perceived as an intolerable affront to your tribe or group, where not only do you know about it, but you know that everyone else knows about it at the same time. These can be very effective, for better or worse, in mobilizing collective action, as people respond to the collective outrage. So the George Floyd murder was certainly an example. Other examples are the Tunisian fruit peddler who set himself on fire and catalyzed the Arab Spring; 9/11, which led to, well, we're still dealing with the sequelae, including the invasion of Afghanistan. Retroactively that was seen as an attempt to impose liberal democracy on a tribal society, but really there was just such outrage at the attack; it was basically saying, don't fuck with us. You hurt us, we're gonna hurt you. Pearl Harbor. The explosion of the Maine. The Gulf of Tonkin incident. Leaders say never let a crisis go to waste, and use a conspicuous outrage. You mentioned the Gulf of Tonkin, where historians now question whether that incident ever actually happened. And that's not a conspiracy view; there's a real question about whether the incident that led to the massive increase of the U.S. presence in Vietnam happened at all. How do you tease that kind of stuff out? Or is that only something that can be done after the fact? If we're trying to use rationality to keep our best society and our best selves going, how do we check that, particularly in the heat of the moment? Yeah, it is a challenge, because there is some legitimate rationale for governments not exposing everything in intelligence, not publishing it on the front page of the Times the day that it happens. On the other hand, in a democracy, people ultimately have to know what the government is doing and what it's reacting to. So there's no simple answer as to how to balance those.
But, you know, in general, I think you and I would agree, and probably most people listening, that it should be tilted as much as possible toward access, and that governments should not be allowed to get away with shenanigans. Yeah. I guess this also raises the question of conspiracy theories, and of whether irrationality is actually on the rise, or whether rationality is getting scarcer. You say that it's scarce. But, you know, if you start from, say, the Vietnam War in an American context, there's been a massive decline in trust and confidence, not just in government but also in business and in kind of nonprofit sectors, including things like the Catholic Church or the United Way. And as trust and confidence continue to decline, people seem less likely to believe what they're told anyway. So are we in a golden age of conspiracy thinking and of magical thinking, or is it really kind of a steady state that just comes with being human? Yeah, I think there is a steady state, and conspiracy theories go way back. I mean, just a century ago there were the Protocols of the Elders of Zion and the Illuminati and the Freemasons. Well, the Illuminati are real, though, you know. Yeah, Action Bronson, the Albanian-American rapper, you know, believes in it, so I do too. Yeah. And it's hard to plot these things quantitatively, and I always resist the leap from "there's a lot of this stuff now" to "therefore it's getting worse," because often that's not true. There is one guy who tried to quantify it, Joseph Uscinski, in his book on conspiracy theorizing. He tried to find a data set where you could actually measure it going back, and he looked at a trove of letters to the editor, I think of the Times, and he found that at least through 2010 there was a pretty constant level of conspiracy theorizing.
By the way, on trust in institutions, you're right that it has sunk for most institutions since the '60s, but we should also keep in mind the fact that John Mueller has pointed out, which is that the '60s were kind of a high point, kind of a peak, and that before that, people didn't trust the government either. So that may also be close to a steady background state. Yeah, John Mueller has also just been great in documenting the amount of hype over terrorist violence in America and things like that. I think he fulfills most of your requests for more analytic and rational approaches to everyday life, for sure. Absolutely. Let's talk a little bit about why any of this matters. And I hesitate to put it this way: why should we be more rational, or why should we avail ourselves of our rationality? Are we actually in a situation where we need to be doing that? Yes or no, and then why? What's the main case for increasing the volume and quality of our rational behavior? Yeah, and again, it is a paradoxical challenge, because the only way that I can defend rationality is by using rationality, which itself is an argument that we're already committed to rationality if we're having this conversation in the first place. Because you're not offering me a bribe, or asking me to bribe your listeners by saying, I'll send you a $25 Amazon gift certificate if you agree with me. I'm giving reasons, which means you're already committed to reason. So there's that kind of inescapable background; it's the air we breathe. What is the alternative, though, to rationality? Is it just kind of a Nietzschean will to power or something like that? Yeah, that's one of the primary alternatives. There's the idea that you should go with your gut, that there's wisdom in inarticulate first reactions. And this overlaps a bit with Nietzsche: what's important is not to think but to feel.
I think it was Herder, the German Romantic, who said that. Or there's the view, tied to postmodernism but now part of social justice warfare, wokeism, the successor ideology, as Wesley Yang put it, that objective truth is an illusion, that it's just a claim to power, that there's a zero-sum competition among interest groups defined by race and sex, and that argumentation should serve to rectify the power balance, where it's been the straight white males who've had all the power. Is there a way, I mean, can you use rationality to say, no, in fact, we can at least aspire to a disinterested understanding of the world, which will allow us to have a fairer and more just world? Because when people say, well, you know what, talking about the Enlightenment, that's just a way to disguise your will to power and to make it complicated so it seems legitimate... You get that all the time, right? Yeah. In a way, these are incommensurable arguments. And in a way they are, in that if you believe that argument itself is just a subterfuge, then you're not gonna be able to get any other argument to change your mind. Nonetheless, I think there is a little chink, a little opening, which is: so, are you willing to say that everything you just said now is irrational? Then why should I believe it? Why should I support you? Are there any reasons? And as soon as they say, well, yeah, there are reasons, aha! You believe in reason after all. Now, of course, human nature being what it is, people don't lay down their tribal commitments at the drop of a hat. But on the other hand, onlookers might be persuaded one way or another. New babies are being born every minute, and they've got to acquire a worldview too. But let me also take it beyond the kind of foundational answer, which is that as long as you're discussing reasons for anything, you're already committed to reason.
Getting back to your question, why do reason and rationality matter? So that's the most obvious kind of answer. Another is that in fact there are some data suggesting that people who are less susceptible to some of the classic fallacies, like the gambler's fallacy, the sunk-cost fallacy, and so on, have better life outcomes. They're less likely to get sick, to miss a train, to get fired, to get divorced. Then there's getting back to the causes of progress, progress being a demonstrable empirical fact, namely that we live longer, we're richer, we're better educated, we're happier. How did that happen? Well, no thanks to the universe, where the laws of physics and biology try to grind us down. The only way that we can possibly carve out a better life for ourselves is by applying reason, applying our brainpower, to make ourselves better off. And I suggest, maybe tendentiously, that the social movements that have succeeded, and that should have succeeded, often began with an argument. There was a John Locke, there was an Adam Smith, there was a Mary Astell, there was a Cesare Beccaria, who articulated why some practice of the day was inconsistent with other values that people held, and it would go viral in pamphlets, in a kind of samizdat of the day, be debated in pubs and salons and coffeehouses, and eventually become the conventional wisdom. Yeah, and that's a fascinating section, and you talk about somebody like Frederick Douglass; there were also a number of early feminists who took Lockean logic, or the logic of the Declaration of Independence, and then used it to expand the circle of who counts as human, who gets full rights.
Do you think reason alone, or rationality alone, is compelling? Because I think about something like the abolitionist movement: there were abolitionist arguments which employed a kind of religious reasoning, that we are all equal before God, so no one should rule over another, that blacks are human just as whites are. But then there were also famous images, which seemed to be geared more toward the emotions, so the famous image of the slave ship and the way that the slaves were actually packed into the holds of ships, which by all accounts had a massive impact. So is reason just one of the tools? Is it necessary, is it sufficient, when you're trying to persuade people to a particular point of view or a vision of the good life? No, it's a really good point, because we are humans; we're driven not just by abstract principles of human rights and human equality, but by empathy, by horror, by disgust, by compassion, which are emotions, and an effective persuader knows how to use both. Frederick Douglass being a fine example: he had just a lightning-sharp mind and kept saying, I'm not going to argue that the slave is a man as if it required proof, and gave argument after argument, but he also appealed to heart-wrenching images of the suffering of slaves, yeah. This sounds like a good place to ask you to explain one thing. In the book you mention in passing the famous Hume line about how reason is, and always should be, a slave to the passions. Can you explain what he's actually getting at there? Because I think that sometimes leaves people wondering what the hell to make of it. Yes, right. It's easy to misinterpret him as saying that we should just shoot from the hip, reason by the seat of our pants, give in to the temptations of the moment, but he of all people was not saying that.
I think the point is ultimately a logical one: that reason is in the service of a goal, and it can't tell you what that goal should be; the goal, in practice, in humans, is provided by the passions, by the emotions. We deploy our reason to attain something that we want, and who's to say what we want? Those are the passions. So does that get us that far from a kind of medieval conception of man as an angel trapped in the body of a beast? Yeah, well, you know, I did appeal to an angelic metaphor in my book on the decline of violence, The Better Angels of Our Nature. And I say this as someone who's an advocate both of the idea that there is such a thing as human nature and of the belief that progress is possible. So you can't just say, oh, you can't change human nature, so we're stuck with all these sins. I don't believe that, because the mind does have multiple systems, multiple components, multiple modules if you want, and there's always tension between various desires, various better angels. And it's by mobilizing and empowering our better angels, so to speak, that progress is possible. So there's something to the medieval notion. Now, I want to ask you about Rebel Without a Cause. I think it's a 1954 or 1955 movie, right around the time you were born, and it's obviously a famous movie starring James Dean. You use it to illustrate a couple of points, but can you talk about how the chicken run, where the two characters are going to drive cars straight at each other, illustrates rationality for you? And also, why do you love that movie so much? In fact, in the movie, I had to remind myself, they didn't actually drive at each other; the chicken run there was driving over a cliff. Right, right. But let's take the simpler case of the two cars approaching each other at high speed, each driven by a teenager.
The first to swerve is the chicken; he loses face, he loses. It's an illustration of a game-theory paradox in which often the optimal strategy in situations of conflict is to be less rational, less in control. In this case, how do you win at chicken? Well, you put a U-lock on the steering wheel, you put a brick on the accelerator, and you climb into the back seat. The other guy has no choice but to swerve. So the guy who visibly sacrifices control is the one who wins. Now, of course, the problem is, if both teenagers hit on that strategy at the same time, it could be a recipe for disaster. Yeah, we are talking about teenagers, so the likelihood of that is probably higher than it is among people of more mature age. Exactly. And of course, in situations where the winner is the less rational one, the higher-level rational strategy is: don't play games like that, or don't deal with the hothead, the high-maintenance partner, the unreasonable person, because they are going to win, and so find someone else to negotiate with. Can I ask, one of the reasons I bring up Rebel Without a Cause is that it also created a different sensibility among a lot of American, I guess, global or North American teenagers; it created a sense of what it was to be an adolescent. People aped James Dean. I mean, they're still doing it in various ways. Is it conceivable that artistic works or large movements could suddenly make it cool to be rational? Is that a way of changing the grounds upon which we stand? Because your argument is that it's better to have more rational actors in a society than fewer. Is that one of the ways to build out the volume and the quality of rationality in society? Yeah, could you have a rational James Dean? Yeah. It'd be a tall order. I mean, it's hard to engineer any kind of grassroots cultural change from the top. But it kind of comes and goes in streaks, right?
Where, you know, in the Enlightenment, certainly there were movements where it was cool, although maybe not at the time, but looking backwards we categorize it that way: certain salons, certain areas of Paris and Edinburgh and whatnot. Yeah, and there is a kind of Silicon Valley, Bay Area rationality community that tries to do that. I don't know if they succeed in breaking out of the nerd stereotype. Right. But the thing to capitalize on probably is not how cool it is to be rational, but maybe how uncool it is to be irrational, to be caught making an obvious blunder. So it might have to come from that side. At one point you mention in passing, I wrote it down here, that universities have a suffocating left-wing monoculture. One would expect that universities would be the first place to be pushing rational discourse. What do you think explains the evacuation of rationality from universities, and what is the best way to try to bring it back to the places where that seems to be the whole point, and certainly was, at least in the origin story of universities? I think that everyone is tempted by a goal that is not the same as the goal of pursuing objective truth: the goal of being on the side of a prestigious coalition. We all have acute antennae for which way the coalitions are forming, a terror of being in a minority coalition. So it is a kind of herd mentality that no one is immune to. And as a kind of paradoxical byproduct of some of the progress that we've made, namely that racism really has gone down, and sexual harassment and sexism and so on, it creates an opening for a moralistic crusade in which you can be on the side of the angels, and you do your darndest not to be on the receiving end of the moral indignation. That's a game that didn't begin in just the last decade or so.
I remember from when I was an undergraduate the idea that academics are some kind of moral beacons, that beaconhood consisting of condemning the heretics and the reprobates. The attacks on E.O. Wilson, the biologist, by his Harvard colleagues Richard Lewontin and Stephen Jay Gould were an early example of that. But I wouldn't have predicted that it would become kind of the dominant ethos of the university: finding bad people to demonize. It's kind of interesting, by the way, just to stop on that for a second, that that really started in the sciences, because oftentimes political or ideological moralizing is seen as something that starts and stops in the humanities and social sciences. That is interesting, yeah. And Jonathan Haidt has been most explicit in saying that universities really have to choose whether their telos is advancing knowledge or advancing some political cause, in this case social justice defined as the relative standing of groups. You can't do both, because reality is going to sometimes confound your moralistic narrative. Can I ask, just as a bit of biography: you somewhat famously grew up in a Jewish section of Montreal, which kind of makes you an outsider. I mean, if you're English-speaking in Quebec, you're already an outsider, and then you're also not part of the Anglo-Canadian establishment. Do you think that there's anything in your biography that predisposes you to being comfortable speaking from a minority point of view, or going against that herd mentality you talk about? And how do we develop that in people who weren't raised in that kind of context? Yeah, it's possible.
I think I've also been practical and mindful of what I can get away with without being canceled and vanishing into obscurity; I've taken advantage of this peculiar custom of tenure, and I've been more outspoken when I know that I wouldn't lose my livelihood for saying unpopular things. And I think coming from the social sciences gave me the mindset that deep, interesting ideas about politics, human nature, and the meaning of life can be and ought to be shaped by empirical observations, by data. That's the overriding mindset that, I like to think, motivates me on these other questions. In the final section of your book, when you're talking about why rationality matters, you say that rationality is a public good, and you use Wikipedia as an example of how, in a kind of public sphere of debate and discussion, we can use rationality. But also, as you stress throughout the book and have said today, you need to constantly check that you're not baking in your own myside bias or confirmation bias or motivated reasoning. How does Wikipedia work as a good way of producing knowledge, but also of checking to make sure that people aren't just going down a rabbit hole of just-so stories that validate whatever their priors were? Yeah, it's come as a surprise to almost everyone how good Wikipedia is. Certainly in its early years, I, like most people, said, this can't possibly work. An online encyclopedia that anyone can edit? What a disaster. And in the early years it was a disaster. I mean, the articles in the first couple of years were embarrassing. But it did implement a set of rules, of checks and balances, under the overarching ethos or telos, if you want to use fancy words, of objectivity and truth. There are five pillars of Wikipedia.
These are principles that all the Wikipedians sign on to, and foremost among them is accuracy and objective truth. So there's the overall commitment, and then there are the mechanisms whereby one person doesn't get to bang his drum or ride his hobby horse. If he or she does, more often he, then someone else with a different hobby horse will erase that edit. It doesn't always work, but of course the Britannica didn't always work either. Right. And what it shows is that it isn't that digital media are inherently cesspools of fake news. It depends on the rules of the game. And yet another lesson is that you can't predict a priori what is going to work. You have to see how things unfold and measure their accuracy. At the beginning of social media, a lot of people thought that it would birth a new enlightenment, because at the time, and you and I are old enough to remember this, the problem with the press was, as they used to say, that freedom of the press belongs to those who own one. You had Noam Chomsky with Manufacturing Consent and an oligarchy of a small number of corporations, and the thought was, what we really need is for anyone to be able to publish. Well, we got what we asked for. Yeah. So, I guess to kind of end the conversation, are you optimistic about the ability of people to generate rules of the game? And obviously there are going to be many areas where people are doing this, and they're going to be overlapping and oftentimes contradicting each other among different groups. I mean, this is part of the Enlightenment, right? It's really a process; it's not a threshold that you cross or a plateau that you reach. Are you optimistic that we're going to get better at creating the institutions and the norms that will make us more rational, which will lead to more progress and also better social amity?
Yeah, I think, as someone who believes in human nature, I don't think we'll ever have an epistemic utopia. I think there will always be conspiracy theories and fake news. Will our institutions develop the right workarounds, or better ones? It'll probably vary by country, in that mechanisms of democracy can promote or demote data-driven decision-making, and different institutions will succeed or fail. The fact that so many people are aware of the hazards means that there is some hope that we'll develop some workarounds. But there will always be some amount of chaos and a background hubbub of nonsense. As a final question, do you have a particularly embarrassing belief that is obviously not rational? Well, I guess I'm probably not the right person to judge, because we are all subject to the bias bias: all of us think that everyone else is biased, but not me. And it's like the preface paradox. You open a book and every preface says, all remaining errors are my fault. You say, well, why don't you correct them? And of course the answer is, I don't know which ones they are. So you don't have any irrational beliefs? You don't believe in UFOs or vitamin C therapy or anything like that? I don't have any fringe beliefs, although some might say that my belief that progress has taken place is a fringe belief. Yeah, that's a strange one. I guess just as a final, final question: at some point over the years we've talked about Bob Dylan, and we both admire him greatly. How excited were you when he won the Nobel Prize in literature? You know, as much of a Dylan man as I am, I actually was not down with that decision, just because popular musicians already have so much fame, reward, and notoriety, whereas literature is a tougher game, and a prize designed to amplify those writers would have been better suited to one of them.
I mean, maybe you need a different kind of prize, but it would have been nice, I think, if a novelist had gained fame through the allocation of that prize. I guess he himself was ambivalent about it, right? Because it took him a couple of weeks or a month to get back to the committee. Oh, he'd have to be; if he wasn't, he wouldn't be Dylan. If he had immediately said, thank you, this is the honor of my life, you'd say, oh my God, what happened to Bob Dylan? Yeah, I was kind of hoping he'd join Sartre in rejecting it, but, you know, Dylan never fully satisfies his audience, so that's part of what makes him Dylan. And that's crucial, absolutely. Yeah. We're gonna leave it there. Steven Pinker, author most recently of Rationality: What It Is, Why It Seems Scarce, Why It Matters. Thanks for talking to Reason. Thanks, Nick, always a pleasure, and it's certainly fitting that this interview be for a site called Reason. Yeah.