I read somewhere that the edge of Socrates' toga was always coming unraveled. I guess he really needed some sort of hem lock.

The human brain is a super complex and intricate mechanism that evolved to process, interpret, and then predict the behavior of its environment. It's really good at that, at least compared to the other brains we know of, but it does have some significant problems. Optical illusions are a fantastic example. They aren't the result of stupidity or misinformation; they're simply the result of intrinsic flaws in how brains work. We can show that these two lines are the same size, but no matter how smart someone is, no matter how many times they've seen this trick before, they still can't help seeing that this one is longer than that one. There are all sorts of these hard-wired errors in our brains which impede our ability to observe and interpret our world correctly. We have similar errors in our abilities to process sound, taste, touch, and, perhaps most worryingly, facts.

These vulnerabilities in how human brains process information are called cognitive biases. They're like optical illusions of thought that make certain ideas seem intuitively right, even if we can prove beyond any doubt that they're wrong. Availability bias is a great example: a quirk which causes our brains to treat information that's easier to remember or access as more significant than other information. If you ask someone whether a word taken at random from a book is more likely to start with a K or have K as its third letter, it's much easier to think of words that start with K on the fly. With availability bias skewing their reasoning, that person is likely to believe that blank-blank-K words are significantly less frequent than K words. In actuality, words with K as the third letter are almost three times as likely, but because they're not on the tip of our tongue the way kangaroo or kazoo are, our brains have a tendency to decide that they must be comparatively rare. That's not a failure of reasoning; it's just that our brains happen to have a blind spot of intuition when relevant facts are particularly easy or hard to come up with.

You've probably heard of another one of these quirks, because it's relatively infamous: confirmation bias. Our brains reliably treat information which agrees with our preconceptions as more important than information which conflicts with them. Even if we would like to think that we make up our minds according to the evidence, before we've even heard that evidence, any opinions we might have floating around in the back of our heads are going to color just how much the evidence can affect what we already think. It affects memory: it's measurably easier for a brain to remember the facts which agree with its existing opinions than the facts that don't. It affects attention: it's reliably easier for brains to pay close attention to facts which support their opinions, while they tend to lose focus and skim over the ones that conflict with them. It even affects certainty: you can feed a brain the exact same number of facts for and against its current position, and it will somehow come away more sure that that position is right. And again, this isn't a symptom of anyone doing anything wrong or being intellectually dishonest. This is just how broken brains are by default. Calling someone out for confirmation bias is like calling them out for breathing. It's just a fact of human life.
Unfortunately, it does make the ideas and ideologies we're first taught very difficult to dislodge, even if there's good evidence that they're wrong. Whoever got to our brains first got to decide which way our confirmation bias would push us, and would likely keep pushing us. Consider the fact that the vast, vast majority of people adopt the same religion, political leaning, and cultural values as their parents, and will defend them at length. That's not a coincidence.

So here we are, stuck with broken brains. They're supposed to be working hard to figure out how the world is, but they're prone to getting stuck rationalizing what they think they already know. How do we fix that? I mean, we could try to catalog and compensate for every single bias we discover, but that's a massive undertaking. We'd have to recognize them in a huge variety of multifaceted and complex situations, and then figure out how to balance for each one that might apply to each individual aspect of our evaluation of that situation. That's a lot of work.

Fortunately, there is a 2,000-year-old method we can use to prod our brains into a more active state of analysis, something slightly less prone to the effects of bias and preconception. It's called the Socratic method. Socrates was an ex-soldier in ancient Greece with a butter face and a knack for pissing people off. An oracle claimed that there was no person in the world wiser than he, which was odd, because he claimed that he didn't know anything. That's a fairly uncommon thing to hear. We're saturated with strong pronouncements every day, assertions of absolute certainty, whether it's who's going to win the World Series, or which dishwashing detergent is best, or which gods created the Earth, or which economic plan is going to result in the most prosperity. Rather than telling people what he thought to be true, Socrates wandered about the public marketplace asking questions. You've probably encountered a five-year-old who continually asks the question "why?" Socrates was kind of like that, only with enough intelligence and insight to point out if something you said seemed contradictory or unjustified.

That's fundamentally the basis of the Socratic method, a process that Socrates purportedly used to find gaps in theories that many people thought were unassailable. It's an easy thing to learn, and it's fantastic for jostling those convictions our brains latch onto. We start with a theory or a belief. Let's say it's something like: this baseball team will win the World Series. Next, we ask a question about some key part of that theory to clarify it and test it for strength, just like a five-year-old asking why. How do you know that team will win the series? By answering that question, we redefine the theory using more rigorous reasoning: they will win the series because they've hit more home runs than any other team. Now we've actually got a new, better, more complete theory: the team that has hit more home runs than any other team will win the World Series. We apply the exact same process again and again, further and further refining it. Is it always true that the team with the most home runs wins the series? Well, no, but it indicates the skill of the players, and you're not going to win without a skilled team. Is the skill of the players the only thing that determines the winner of a baseball game? Wouldn't that mean that the most skilled team would win every single game they play? And so on.
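If it helps to see the shape of that loop written down, here is a minimal sketch in Python. It isn't anything Socrates or this video prescribes; the function name, the prompts, and the interactive structure are all assumptions made purely for illustration. The only point is the cycle itself: state a theory, question it, restate it, repeat.

# A minimal, illustrative sketch of the state-question-redefine cycle
# described above. The function name, prompts, and interactive structure
# are assumptions for illustration only, not anything from the source.

def socratic_session(initial_theory: str) -> list[str]:
    """Drive a theory through repeated rounds of questioning and restatement."""
    history = [initial_theory]
    theory = initial_theory
    while True:
        print(f'Current theory: "{theory}"')
        question = input("What question tests a key part of it? (blank to stop) ")
        if not question.strip():
            break  # exhausted, or we've realized we really don't know
        theory = input("Restate the theory so it answers that question: ").strip()
        if not theory:
            break
        history.append(theory)
    return history  # the chain of progressively more rigorous statements

if __name__ == "__main__":
    chain = socratic_session("This baseball team will win the World Series.")
    for step, statement in enumerate(chain):
        print(f"{step}. {statement}")

Run it and it will just keep asking; the interesting part is how it stops, either because you run out of questions or because the theory doesn't survive them, which is exactly the point made next.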
This recursive cycle of questioning and redefining is a fantastic way to get at the substance of our beliefs, to trim away the lazy hand-waves our brains use to keep themselves complacent, and to look at the actual skeleton underneath to see where that chain of thought is tightly linked and where it isn't. Knowing what kind of questions to ask is definitely a learned skill, but you eventually develop an intuition for which angles of attack are most promising, for which lines of inquiry will yield the most insight into the theory.

Notice that I'm talking about understanding and gaining insight into a theory, not refuting it. It's easy to make the mistake of thinking of this process as a weapon to destroy bad ideas. Socrates probably wouldn't have been anyone of note if he had simply wandered about asking pointed questions to make others look foolish and to sell his own ideas. I mean, we've all known somebody who can only make a point sound convincing by calling somebody else's point stupid. But the Socratic method is meant to be an improvement mechanism, a way to get the absolute best form of a theory and to understand exactly what it can and can't tell us. The reason Socrates was labeled wisest was his genuine intellectual humility, an understanding that every theory about the world has its limits, and that true knowledge is knowing where those limits are.

If you're willing to do that work, you can practice the Socratic method yourself, or, even better, with a neutral partner, one who can ask good questions without trying to steer you toward a particular conclusion. State, question, redefine, and keep going until you can't anymore, either because you're exhausted or because, like Socrates, you've realized that you really don't know. Either way, you'll have gotten your brain out of its biased comfort zone for a bit, and you'll understand a little better why you think what you think. What are the limits of your theories about the world? Who is going to win the World Series? Please leave a comment below and let me know what you think. Thank you very much for watching. Don't forget to blah blah subscribe, blah share, and don't stop thunking.