Oh crap, it's THUNK day and I haven't thought of a pun yet. Well, sometimes I can come up with something clever on the fly. Maybe if I pretend that's what's going to happen this time, it'll actually happen. That should bias some time.

In episode 64, I was a little shy about introducing the argumentative theory of human reason, but it's grown on me. Humans definitely have a remarkable ability to model and predict the future behavior of their environment, like predicting when the next solar eclipse will occur and where it will be visible on Earth. But there are so many troubling bugs in human reasoning that it's possible its original purpose wasn't for anything remotely like that. As THUNK fans are no doubt aware, cognitive biases, specifically the ones categorized under confirmation bias, are a whole set of weird and consistent errors in how humans perceive and process information, errors which are totally beyond our conscious control and which reliably lead us to incorrect conclusions. Their existence is at least consistent with the idea that many of our sophisticated eclipse-predicting mental processes are actually optimized for something else: for locking us into a set of beliefs and arguing other humans into believing the same things we do.

According to the argumentative theory, all the cool, fancy, rational stuff we can do with our brains is actually a sort of misappropriation of those mechanisms, kind of like trying to drive a nail with a pipe wrench. Yeah, you can do it, and it might even work, but it's pretty clearly the wrong tool for the job. And that's a huge problem, because sometimes the underlying nature of human reason shines through. We've all known someone who has stubbornly refused to change their mind, even with a bajillion clear reasons showing beyond a shadow of a doubt that one of their beliefs is wrong. According to this theory, that's not really their fault. Their brain is just doing what human brains evolved to do best: buckling down, digging in its heels, and arguing vehemently for an idea, well past the point where a more rational organism would realize its error and update its beliefs to be more accurate. If you were in their position, with all that cognitive bias machinery churning away in the background, it would seem to you as though you were being totally reasonable, too. There have been a substantial number of times in your life when your brain has done exactly that. Not just when you were young and stupid, not just when you didn't have the facts yet. There is, in all likelihood, something that you currently believe that is demonstrably wrong, that you can't fix because you're programmed not to want to.

Now, in a typical THUNK episode, this is where I'd say, "So be sure to use critical thinking and logic so you don't screw up," and then do all the end screen stuff. But appropriately enough, it would seem that I was wrong about that. This 2007 paper by Daniel Willingham makes a compelling case that abstract critical thinking skills tend to be trapped in whatever context we first learn them. You might teach someone in a journalism class how to check sources and research claims, but as soon as they get on Facebook, all that training goes right out the window. It's not a case of being lazy or careless or stupid; it's just that people have to develop a familiarity with the situations that might call for more cautious reasoning.
And the red flags we might learn to spot in math class might not look the same in the context of a supermarket or a mortgage. Importantly, that argumentative machinery is still churning away behind the scenes. Even if I've successfully learned rules about how to analyze information carefully or navigate logically from premises to conclusions, if I only ever switch that stuff on when I'm trying to prove that I'm right, all that training just makes me a more effective arguer. The data suggests that's exactly what ends up happening: in many situations, people with high intelligence are actually less likely to update their beliefs in the face of new information, using all that horsepower to stay right where they started.

If we're really interested in fixing those wrong beliefs that our brains refuse to let go of, what we really need is de-biasing: a set of tools that aren't really about critical thinking in and of themselves, but about facilitating its use, by filling our critical thinking engines with all the available facts, even the ones that our cognitive biases would rather we not think about too hard. I've got all sorts of fascinating papers about de-biasing techniques and their efficacy linked in the video description, but I've picked out a few which seem like they might be easy to work into everyday practice. Many of them might seem familiar or even folksy, but there's decent evidence suggesting that they work, and they were infinitely better than what I was doing before I read about them, which was… nothing. These are things that you can do to potentially reduce your risk of bias. They in no way guarantee that you will be right; they simply lower the odds that you're being unreasonable. Let's get started.

Number one: get comfortable. If you've ever been to the grocery store when you're hungry, you're probably aware of just how much your judgment can be affected by how you feel at the time. Personally, I go into a sort of fugue state and wake up with a cart full of pasta. The main issue here is that System 1 cognition, the snap-judgment, gut-reaction, intuitive mode of thinking, which is highly subject to biases, tends to win out over System 2 thinking, the slow, methodical, rules-based, eclipse-predicting kind, whenever we feel stressed, rushed, uncomfortable, or, worst of all, threatened. And there's a ton of dumb stuff that isn't even physically threatening, anything from losing face in an online debate with strangers to ideologies and politics we disagree with, that switches our brains over from System 2 to System 1 in preparation to argue tooth and nail. In general, if you want to be in the zone to change your mind about something you might be wrong about, you want to feel confident, well-rested, and most of all, safe. Additionally, making decisions far in advance has been shown to reduce System 1 bias, because people don't feel rushed to make a decision. And don't try to change your mind on an empty stomach.

Number two: look like you're thinking hard. A bizarre extension of that principle actually works backwards. When we're trying to solve a tough puzzle, we do all sorts of stereotypical things: we read slowly and methodically, we wrinkle our foreheads, that sort of stuff. It seems that the body associates those physical cues with System 2 thinking and is more apt to switch it on when those cues are present.
As weird as it sounds, you might actually be able to reduce your odds of bias by pretending to be thinking hard: by consciously wrinkling your forehead as though you were deep in thought, or even by reading new and relevant information in a slightly hard-to-read font or a foreign language, which forces you to read it slowly and deliberately. It sounds goofy, but hey, if it works.

Number three: get into someone else's head. That argumentative engine just loves discarding relevant information in favor of scoring points for our team, regardless of who our team actually is. Political groups, fandoms, Mac vs. PC: you can divide people up into totally stupid, arbitrary groups, like by the letters of their last name, and they will still exhibit a clear bias for members of their group and a clear bias against members of other groups. Perspective taking is the psychology term for imagining what it's like to be someone else. It's easy to do in an insincere or sarcastic fashion, but legitimately imagining yourself in someone else's position is surprisingly effective at reducing the effects of in-group bias, no matter what that bunch of jerks told you.

Number four: consider the opposite. One of the primary mechanisms of confirmation bias is to restrict our attention to one possible chain of reasoning out of many. We just get locked in on something like: if it rains, the sidewalk gets wet; the sidewalk is wet, so it must have rained. The strategy of considering the opposite is meant to crowbar our awareness open just a skosh by imagining other potentially viable chains of reasoning. Okay, let's assume that it didn't rain. How else could the sidewalk be wet? Well, I guess it's possible that the neighbors are watering their lawn, or maybe the street sweeper just went through. Come to think of it, I don't see any cars parked here.

Number five: prospective hindsight. That same myopia of reasoning also makes it very difficult for people to imagine ways that their plans won't work out. I mean, once we've hit upon a feasible-sounding course of events that makes us happy, why should we waste brain power considering other possible paths? Of course moving in with someone after three dates is gonna turn out just fine. Prospective hindsight is a really fun way to get around that mental block. Imagine that there's a flash, and an older version of yourself steps out of a time pod and proceeds to explain to you exactly what went wrong. What do they say? "She never does the dishes, and she snores like a bear. A bear, Josh!"

Of course, even if we know all of these strategies, that only gets us halfway to changing our minds about an incorrect belief. The other half is actually wanting to know when we're wrong: acknowledging that bias is something that is constantly happening to us, not just to the people we disagree with, and actually taking the time to use those strategies to ferret out wrong beliefs, even when we're dead certain that only a stupid or a crazy person could possibly disagree with them. But if we want to actually think rationally, rather than just flinging poo at each other from mental fortresses that are totally impregnable to facts, we pretty much have to. As Oliver Cromwell put it: "I beseech you, in the bowels of Christ, think it possible you may be mistaken." I know that I certainly was.

Do you think you can incorporate de-biasing strategies into your thinking? Please leave a comment below and let me know what you think. Thank you very much for watching.
Don't forget to blah, blah, subscribe, blah, share, and don't stop thunking.