I was getting into Cartesian skepticism for a while before my credit card company called me. Apparently I had maxed out.

Do you believe in leprechauns? I'm not being metaphorical here. I'm asking if you believe that little fairy creatures, who mend shoes and keep pots of gold at the end of rainbows, literally exist. If not, why not? Many stories in Irish folklore feature leprechauns, and there are plenty of people who actually believe in the wee folk, but you probably don't. Why is that? By not believing in leprechauns, you're expressing a form of skepticism: an attitude of doubt or reservation about leaping into bed with an idea, like the existence of leprechauns, even if it's presented as something that you might choose to believe.

Skepticism comes in many different forms and degrees. You can be skeptical about whether the events of a reality TV show actually happened, about the existence of ghosts or an afterlife, or even about how much you can trust your senses. Those are all pretty different, but at their core they all share a certain attitude of disbelief or uncertainty toward some statements about knowledge. The reasons motivating skepticism can also be pretty different, but they similarly all boil down to the suspicion that someone has been convinced of something that isn't true.

We talked a little about memetics last time: a way of looking at ideas as self-replicating, competing entities. In that light, skepticism operates as a sort of mental immune system, rejecting ideas that seem like they might cause problems. And just like an immune system, problems do happen if it's calibrated incorrectly. Too lenient, and you become a host to all sorts of crazy ideas simply because you were exposed to them. Too strict and, well, let's see what happens when we turn it up to eleven.

Usually, we express skepticism about singular assertions, using one fact to challenge the legitimacy of another.
If I tell you that I have a fast car and someone else tells you that I don't, those two assertions conflict with each other, so now you have a reason to doubt the truth of one or the other. But there's always a light at the end of the tunnel with that approach: gather more facts and eventually, with any luck, you'll be able to confirm whether I have a fast car or not.

Philosophers, on the other hand, will frequently group whole sets of like assertions together by some common characteristic, like "I know that X because of Y," and then call the legitimacy of the whole set into question, so that no matter how many Y's you have piled up, they'll never be useful for proving that X. For example, take the assertion "I know that miracles happen because there are so many stories about miracles." One might assert that it's always more likely that someone is lying about having seen a miracle than it is that something miraculous has actually happened. If that were true, then it wouldn't matter how many stories about miracles you were able to amass; it would still be more likely that they just never happened.

As philosophy fans are aware, there's an extreme form of philosophical skepticism that's most commonly linked with Descartes, sometimes called academic skepticism, which calls into question everything that we would normally claim to have knowledge about, including things like the existence of chairs or our bodies. Descartes suggested that anything that could possibly be untrue, even if it would take some bizarrely improbable circumstances, shouldn't count as knowledge. If there were the slightest chance that I was being hypnotized by evil leprechauns into believing in gravity, then I couldn't say that I knew there was gravity. Most people, even most philosophers, wouldn't accept that as the standard for knowledge.
The modern form of this skepticism turned up to eleven is much more flexible than Descartes' version, making use of an epistemic tool called the closure principle, which is just a way of getting from knowledge about one thing to knowledge about another. It's pretty easy: if someone knows one thing, A, and that thing necessarily entails something else, B, then so long as they can figure that relationship out, they can also be said to know B. So if I know that it's raining outside, and I'm not being thick, then I can also be said to know that there are clouds out there.

However, we can massage the closure principle into a slightly different but logically equivalent form: if I don't know B, then if A necessarily entails B, I must not know A either. If I'm sitting here inside wondering whether or not it's cloudy, then so long as I'm not being thick, I must not know that it's raining. That's fine. But what if A is everything that I think I know, and B is "I'm not dreaming"? If I absolutely know that I'm recording an episode of Thunk right now, that necessarily entails that I must not be dreaming. But I do sometimes have dreams about recording Thunk. It might be possible that I don't know whether or not I'm dreaming, in which case I must not know that I'm recording right now. My knowledge of the world has fallen to skepticism.

The closure principle doesn't work for everything, but it does seem to work for absolute knowledge, and, unlike Descartes' version of academic skepticism, it also seems to work in a more limited way for whatever threshold you care to set for belief. If I'm reasonably justified in believing A, and A entails B, then I'm equally (or maybe a little less) justified in believing B. And if I have some reservations about B, then I must have equal or greater reservations about A. Again, we're talking about an argument challenging the possibility that we can claim to know anything. That's a tall order.
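The closure reasoning above can be sketched in the standard notation of epistemic logic, where $K\varphi$ reads "I know that $\varphi$." This is the textbook formulation of the principle, not anything stated in the episode itself:

```latex
% Closure principle: knowledge is closed under known entailment.
% K(phi) means "the subject knows that phi."
\[
\bigl( K A \wedge K(A \rightarrow B) \bigr) \rightarrow K B
\]
% Contraposing gives the skeptic's version: if you don't know B,
% but you do know that A entails B, then you must not know A.
\[
\bigl( \neg K B \wedge K(A \rightarrow B) \bigr) \rightarrow \neg K A
\]
% Instantiate A = "I'm recording" and B = "I'm not dreaming":
% failing to know that I'm not dreaming undermines
% my claim to know that I'm recording.
```

The second line is exactly the move the skeptic makes: pick a B you can't rule out (dreaming, hallucinating, the Matrix) and let the contrapositive eat everything that entails it.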
Even when we've ditched Descartes for this more robust version of academic skepticism, there are still some loopholes that we might use to wiggle our way out of it. If we wanted to, we could just run the closure principle forwards: if I know that I'm recording Thunk right now, then I must know that I'm not dreaming or hallucinating or whatever. Done. If we were really dedicated to the whole academic skepticism thing, we might say that that's playing dirty. I mean, the whole point of the exercise is trying to establish whether or not I can know that I'm recording, so me just saying "yeah, I'm totally recording" is kind of missing the point. But we can use that same line of reasoning to get ourselves out of it again: if I'm not allowed to say "no, I just totally know that I'm recording," then you can't be allowed to say you don't know whether or not you're dreaming, because that's also missing the point.

In any case, the discussion of academic skepticism is, appropriately, academic. Nobody actually lives like they're constantly unsure whether they're hallucinating or in the Matrix. It's basically just supposed to be a jumping-off point for thinking about epistemology: knowledge, belief, that sort of stuff. Philosophers like to turn things up to eleven just to see what happens. It's fun, but it leaves us with a bit of a practical problem. It's pretty obvious that we shouldn't behave as though we don't know anything, but it's also pretty obvious that we shouldn't believe everything that we hear either. I mean, if you disagree, then I have a bridge I'd like to sell you. And unfortunately, things can get really, really messy when we're trying to strike a balance between those extremes of skepticism in our everyday lives. Deciding which things to believe and which things to doubt involves a ton of complications that all have to be accounted for, at least if we're interested in believing the truth.
First, skepticism has a social problem: it's not really sexy the way that fervent belief is. It implies a need to investigate further and think carefully about stuff, and that's just a lot of extra work. It's very difficult to get a frenzied mob of people chanting, "We don't know! We have to think about it!" That makes it a hard sell in a world where many people are very passionate about things that they can't always prove. In that light, skepticism can have serious strategic disadvantages in certain settings; the last person to make up their mind often doesn't get to influence the decisions that were already made by those with lower standards.

Also, unfortunately, skepticism has a deserved reputation as a sort of stalking horse: a front for people who've already made up their minds but are just interested in harassing their opposition with additional demands for evidence. It's not uncommon for someone to "just ask questions" without really caring what the answers are.

Finally, confirmation bias, the well-documented cognitive tendency that causes all of us to persist in believing things even when the balance of evidence shows that they're wrong, definitely affects when we apply skepticism and when we don't, in irrational ways. We're always going to be more skeptical of information that conflicts with our preconceptions and less skeptical of information that agrees with them. That's just how we're wired. The worst possible form of confirmation bias, though, combines with something kind of like the philosophical skeptic's strategy to render someone's beliefs totally immune to facts: if I dismiss all sources of information that conflict with my beliefs as unreliable, then it doesn't matter how many facts they're able to offer; I never have to doubt myself. This sometimes happens deliberately, as when someone labels a source of inconvenient contradictory information "fake news" in order to convince their followers to dismiss all that data.
But it also happens unconsciously, as in the echo chamber effect, where we unwittingly distance ourselves from people and sources of information that make us feel bad because we disagree with them.

All of this points to a somewhat depressing conclusion. Skepticism is an essential part of rational thought; it's what allows us to prune our beliefs of crap that other people have nonetheless managed to convince us of. But it's a hard sell, sincerely exercising it puts people at a disadvantage in arguments, and even when we do use it, we use it in such a way that we confirm the beliefs we already had to begin with.

Fortunately, there's a scientifically backed cause for hope. Dan Kahan and his team of psychology researchers at Yale Law School have been running a series of experiments examining how people with certain political beliefs are affected by confirmation bias. At first glance, the results are terrifying: not only is confirmation bias alive and well among both liberals and conservatives, but intelligence seems to amplify its effects, perhaps giving smart people the ability to rationalize away conflicting evidence. However, Kahan and his crew found an interesting variable that was negatively correlated with confirmation bias, making people more open to contradictory evidence and more inclined to exercise skepticism about their existing beliefs: scientific curiosity. Individuals who reported and demonstrated that they enjoyed consuming educational materials about science and scientific findings were resistant to the effects of confirmation bias. The greater the curiosity they displayed, the more willing they were to question their own beliefs when presented with contradictory evidence. That actually makes a lot of intuitive sense.
If you're curious about what the research says, rather than just looking for articles that support what you already believe, you're entertaining a form of skepticism about your existing beliefs from the get-go. You're already in the zone to be convinced by the evidence. It seems reasonable to suggest that curiosity is corrosive to preconceptions and opens the door for skepticism to prune away the ones that don't make any sense. But I'm not sure I totally believe that. What about you? Please leave a comment below and let me know what you think. Thank you very much for watching. Don't forget to subscribe and share. And don't stop thunking.