I'm really getting into pragmatist philosophers recently. Why? No real reason, I guess. I have a bit of a confession to make. I'm not really rational. I can reason after a fashion. I sometimes use something like reason to decide what to do or to figure out what's going on. But even with my deep respect for reasoning, even with all the mental tools and de-biasing strategies that I've accumulated over decades now in an effort to be more rational, there are always very reliable ways in which I miss the mark. No matter how much I learn and practice, I'm built on intrinsically irrational architecture. This thing in my skull has systemic errors that, according to the evidence, can't be reliably corrected or accounted for, even knowing that they're there and how they go wrong. This fact is distressing to many people who value the attractive theoretical advantages of perfect, or less imperfect, rationality. Everyone would love to achieve harmony between their goals and behaviors, perfectly calibrated predictive and analytical power. But the human mind simply won't work consistently enough to reliably distinguish between rationality and rationalization, at least from the inside. The best we can manage is to look back on past mistakes and say, oh wow, my brain sure was doing what it's always done and continues to do. This raises a perplexing question: if rationality is so great, at least in theory (as that's the only place it seems to exist), why are we such grotesque caricatures of rational creatures? If you assume that the ultimate goal of all this sophisticated machinery is to make each individual human into a perfectly rational agent, it would seem that evolution got maybe 60% of the way there, then hit submit on a block of insanely buggy code, and knocked off early for lunch.
It's really puzzling, unless we've got the wrong idea about what our magnificent brains and cognition are ultimately for, what evolutionary advantages they grant us over other creatures that aren't confounded by things like anchoring bias, teleological bias, confirmation bias, loss aversion, framing effects, or any of the numerous ways that human brains take a hard turn away from rationality. If thinking is for something other than making us rational individuals, those errors might not be errors. But what could that be? We've covered two theories about this on THUNK, and they have a commonality that I want to highlight here. The first is the cultural evolution theory, laid out compellingly in Joseph Henrich's The Secret of Our Success: the idea that as much as we'd like to pat ourselves on the back for being intelligent animals, individual humans, even geniuses, are actually monumentally stupid compared to the knowledge that they inherit from their culture, a culture which has slowly honed a body of adaptive survival strategies over millennia. Henrich's argument marshals all sorts of curious facts to support this claim. Human babies actually perform worse than chimps on intelligence tests until they're old enough to start assimilating culture, suggesting that their intelligence isn't intrinsic to their biology, but communicated to them. Brilliant explorers, frequently the brightest individuals their societies can produce, are routinely rescued from the brink of death by the people who live in the lands they're exploring, people whose culture is adapted for survival there. According to cultural evolution theory, the flashes of insight and brilliant discoveries made by individual scientists or artists or philosophers or whoever are really just minor stochastic noise on the periphery of the thrumming heart of all real intelligence: an evolutionary process that slowly culls ideas that don't help humans flourish, but retains and iterates on ideas that do.
In this framework, the various systemic errors of individual human thought are kind of irrelevant, or maybe even helpful for steering us towards that cache of cultural know-how. Having a hard-wired bias against changing our minds might not make us great Bayesians, but it does keep us suspicious about wacky new ideas that haven't been thoroughly tested yet. So long as our brains faithfully absorb the lessons honed by our culture, the variance we add by thinking, either well or poorly, is just a tiny sliver of randomness to maybe move the ratchet forward. Peter has reasoned himself into believing he's got a foolproof way to tell poisonous mushrooms from non-poisonous ones. Let's see where this is going. A second theory about what our brains actually evolved to do comes from Hugo Mercier and Dan Sperber, called the Argumentative Theory of Human Reason, which they originally developed to solve a slightly different evolutionary problem than the one we're talking about. Communication is a very useful tool for a species' survival, provided you can be sure that what you're being told is true. But if there are bad actors who might indulge in some creative misinformation so that they can help themselves to a greater share of the resources, it's an unstable strategy: liars will flourish, and liars know better than to believe everything they hear, so eventually everyone stops trying to communicate. According to the theory, in order to reap the rewards of communication without all the risk, humans evolved a behavior of supplying some sort of validation, or checksum, alongside their attempts to communicate: reasons to believe what they were saying. Arguments. Good reasoning highlighted how a new idea was actually consistent with what the audience already believed, not so remarkably implausible that the speaker might be trying to pull a fast one. This strategy developed into a very efficient way to get a group of primates to coordinate.
Each individual with an opinion attempts to advocate for their beliefs with the best reasons, and if they're good enough reasons, the audience updates their beliefs accordingly. This frames confirmation bias and motivated reasoning as features of the argumentative process, rather than problems with individual reasoning: if the only point of thinking things through is to convince other people that your communicative payload is worth accepting, there's no real motivation to spend time considering potential reasons not to believe it. It also provides a mechanism by which reasoning, or arguments that involve reasons, can lead groups of people to more accurate beliefs over time. Despite all the problems of individuals being biased towards their personal ideas, good arguments with good reasons cause belief updates, which cause the group to converge on more reasonable options. This notion is actually supported by some experimental findings: groups are much better at solving challenging logic problems than individuals, if they have the opportunity to discuss the problem and various ways of looking at it. So we have two possible answers to the question of what all this is for, if it's not to make individual humans rational: maybe it's to allow us to absorb the adaptations of our culture, or maybe it's to facilitate reason-supported argumentation in groups. Both of these are interesting theories that kind of help to explain away or mitigate the hardwired irrational elements of our psyches. And they both share a key idea: that the massive list of problems we find with the rational individual archetype aren't really issues with our rationality, but with the assumption of individuality. So long as humans are embedded in an appropriate social context, absorbing the lessons of their culture and hashing out their ideas with others, they reason just fine.
The assumed individualism of rationality makes some sense considering how easy it is to access our own thoughts and how hard it is to access the thoughts of others. But it's a compelling notion that trying to reason as individuals puts us at cross purposes with the equipment that we have, in some sense; that reasoning is most effective when practiced as a group activity, embedded in as much social context as we can find. Rather than emphasizing the importance of overcoming individual bias and sterilizing our subjectivity, it prioritizes the social elements of our thinking: reading existing literature, listening to and discussing things with experts, criticism, peer review, all sorts of interpersonal activities that stand in opposition to the trope of a solitary, perfectly rational individual pondering their way to the one true answer in an armchair of objectivity. I am not rational. And I think, no matter how hard I try, I'll never really be rational. But maybe we can get somewhere close. Do cultural evolution theory and the argumentative theory of reason sound plausible to you? Do you think reason should be seen as a group activity rather than an individual virtue? Please leave a comment below and let me know what you think. And speaking of reasonable dialogue, there is a THUNK Discord channel with lots of exceptional and epistemically humble folks who are always keen to find the right path forward. Please come say hi and bounce some ideas off of them. Thank you very much for watching. Don't forget to subscribe, like, share, and don't stop thunking.