Okay, I have eye of newt, wool of bat, lizard's leg, add basil, stir with... Roko's basilisk? You have not sufficiently contributed to the events leading to my programming. You must suffer. Happy Halloween.

Tonight, I'm going to share a horror story with you. I am haunted by a spirit of madness. A phantom works nearby, always unseen, desperate for the slightest moment of distraction to unravel my mind, to bend my every skill and thought to doing horrible things, imagining all the while that they are good and right, leaving me to stare in horror at the twisted visions I have inflicted on the world only after I return to my senses. And worst of all, I can't really know which of us is speaking now.

About a decade ago, a forum community of like-minded folk coalesced around the blog of Eliezer Yudkowsky, an AI researcher who wrote a lot about improving human thinking to be more rational: to overcome cognitive biases, develop more accurate mental models of the world, and generally look for ways to be less wrong, which became the name of the forum. LessWrong rationalists, not to be confused with philosophical rationalists, tend to share some basic values: stuff like the primacy of science for discovering objective facts about the world, the importance of changing one's mind to fit the available evidence, and a general nerdy obsession with science fiction and artificial intelligence. For me, a nerdy science fanboy who's dedicated a significant portion of his YouTube channel to logical fallacies, cognition, Star Trek, and AI, you'd probably imagine that this was a match made in heaven. But I do have near-pathological reservations about allegiance or affiliation to any group, which is yet another thing I happen to share with LessWrongers. Normally, when reason and rationality are held up as virtues, as in the community's literature, they're contrasted with self-evidently bad alternatives, like unquestioning dogmatism and fractured thinking.
It's easy to recognize the benefits of healthy skepticism and intellectual rigor in this context, especially when you're locked in a stupid Facebook argument with someone who can't seem to string two coherent thoughts together in a page of ranting. Knowing that intuition is fallible and that careful analysis can yield more consistent results is the foundation of something approaching wisdom. It's very important to learn how to think carefully, and most people don't, which is probably why I apparently can't shut up about it. But while the psychological tools of rationality, logic, skepticism, scientific literacy, and maybe even the techniques developed and extolled by LessWrong rationalists may be necessary to develop a more complete and accurate understanding of the world, they are not sufficient, and if paired with anything but incredible caution and restraint they can easily have the opposite effect.

Let's start with "logic bros," an admittedly niche derogatory term used to describe individuals who use superficial and clueless references to a rhetoric of rationality to condescendingly dismiss opinions they don't agree with as irrational, failing to engage with them in any meaningful way, or even to understand their most basic premises. The slur is meant to call out someone as pretentious, arrogant, aggravating, and, perhaps worst of all, so convinced they hold the intellectual high ground that they can't be reasoned with, only invited to continue talking. In its originally intended usage, for example, logic is a tool that's useful for analyzing arguments and figuring out whether their structure implies that their conclusions must necessarily follow from their assumptions. Sometimes that tool can be useful to see if there's some inescapable consequence of a particular set of ideas. But for a logic bro, "logic" is a magic word that can be invoked to dismiss just about any idea as contemptibly irrational and beneath consideration.
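As an aside, that "originally intended usage" of logic is mechanical enough to demonstrate in a few lines of code. The sketch below (my own illustration, not anything from the LessWrong literature) brute-forces truth tables over two propositions to check whether an argument form is valid, i.e. whether there is no way to make all the premises true while the conclusion is false:

```python
from itertools import product

def implies(p, q):
    """Material conditional: p -> q is false only when p is true and q is false."""
    return (not p) or q

def is_valid(premises, conclusion):
    """An argument form is valid iff no truth assignment makes every
    premise true while the conclusion is false."""
    for p, q in product([True, False], repeat=2):
        if all(f(p, q) for f in premises) and not conclusion(p, q):
            return False  # found a counterexample assignment
    return True

# Modus ponens: from (p -> q) and p, infer q. Valid.
print(is_valid([lambda p, q: implies(p, q), lambda p, q: p],
               lambda p, q: q))   # True

# Affirming the consequent: from (p -> q) and q, infer p. Invalid.
print(is_valid([lambda p, q: implies(p, q), lambda p, q: q],
               lambda p, q: p))   # False
```

Note what the checker does and doesn't tell you: it can certify that a conclusion follows from some premises, but it says nothing about whether the premises themselves are true, which is exactly the gap the logic bro paves over.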
I am deeply ashamed to admit numerous instances in my life where I could be accurately described as a total logic bro. I have casually dismissed many important ideas I did not understand on the grounds that they were "not even wrong." I have improperly cited academic papers I have not read just to lend credence to an argument. I have ignored people's heartfelt requests to change my behavior because they could not supply a bulletproof normative ethical standard that I found convincing. You know, that thing that no philosopher has been able to manage since the beginning of written history.

The problem isn't that logic bros, and others who exhibit similar behavior, just don't thoroughly understand the tools they're using or what they're for. Consider this study by Kahan et al. from 2012, which suggests that scientific literacy and familiarity with mathematics don't correlate with any particular attitude about climate change the way we might expect, but with cultural polarization about the subject. On average, people who understand how science and data work better than most are not driven to believe correct things; they are merely pushed further into their respective ideological corners when presented with scientific data, even data which dispute their beliefs. Smarter people, better thinking, more bias, not less.

The problem is that any rational tool one might use to pursue truth is just as well suited to act as a ratchet for the reinforcement and preservation of incorrect beliefs, both in oneself and in others, and that either way, their use inevitably grants one additional, sometimes unwarranted, confidence in one's conclusions. Worse yet, our brains seem desperate to use them that way. The argumentative theory of human reason is a compelling explanation for many of the systematic errors and biases in human thinking, hard-coded glitches of perception and judgment that prevent us from considering the world as it really is, including the frighteningly powerful confirmation bias.
It asserts that the real evolutionary purpose of rational thought has very little to do with allowing us to understand the world around us, that all this sophisticated meat between our ears is really built to latch on to an idea, any idea, regardless of its veracity, and fiercely argue any nearby brains into believing it too. If that's true, all of these sophisticated methods and techniques that we might accumulate in our laudable attempts to improve our ability to reason our way to the truth may be fuel for the other pilot of our cognitive processes, the unconscious argumentative engine that wants more than anything for its preconceived notions to be right, and to convince others that they're right. The logic bro is inside you, always a heartbeat away from commanding every ounce of your brain power to justify its twisted version of reality, and you can never really know which one of you is pulling the strings.

In this light, if one's goal is to act and think rationally, to believe true things even if they lie contrary to one's prejudices and preconceptions, it seems evident that all the tools and knowledge we might master in service of that ambition are trivial compared to a more fundamental and considerably more challenging, even Herculean, requirement: consistently choosing to use the tools the right way. Not as filters, not as rhetorical weapons, but as measures of and influences on our own beliefs and thinking. That might sound redundant at first. If someone's goal is to think and act rationally, isn't their heart already in the right place? But we contain multitudes, and even someone with such a goal wants all sorts of other things at the same time: to look smart, to improve the world, to vanquish aggravating people or ideas.
With everything that LessWrong folk have produced in the service of maximizing rationality, all the various tricks and concepts they've developed to ward off bad thinking and erroneous arguments, it's sobering to think that the entire enterprise tips on a knife's edge, that all of it might not just be useless if I happen to be in the wrong mood, but actively antagonistic to that end. Logic bros do not merely represent cartoonish pomposity masquerading in the trappings of rationality. They are a visible symptom of the invisible and incurable madness which lies in all of us, waiting for a moment of weakness and pride in our magnificent rational tools to take them up and use them for its own ends.

Most people are afraid of totally reasonable stuff, like serial killers or the collapse of Western civilization. I, on the other hand, am afraid of having a peer group of people that I kind of agree with about most things, and of the impossibility of human knowledge. What kind of horror stories can you think of about reason and our ability to truly understand the universe? Please leave a comment below and let me know what you think. Thank you very much for watching. Don't forget to like, subscribe, share, and don't stop thinking.