Earlier in the episode, and earlier in the course, we've seen that people are a little too eager when it comes to seeing patterns in noise. As Tom Gilovich said in the interview, if you take a bag of M&Ms and pour it on a table, it doesn't look random. You see pockets of colour here and there, you see faces in the things, and so on. And this happens all the time: we see patterns among seemingly random events. Yes, this is true, it's very common, and it's been studied for ages.

But something else happens, which is quite interesting: we also detect relationships that don't exist. So what do we mean by that? There's a really good example from Amos Tversky and colleagues, surprise, surprise. They looked into the claim that people report more arthritis pain when a storm is approaching. It turns out there's no evidence for that whatsoever. Another one is emergency room nurses claiming there's a lot more activity when there's a full moon than during regular days of the month. And athletes routinely engage in all sorts of superstitious beliefs: they tie their shoes in a particular way, they bounce the ball exactly five times, they wear a pair of lucky socks or lucky shorts before a game or a competition. This happens all the time. These are superstitious beliefs, and they arise from seeing two things as linked, a pair of lucky socks and how they perform, or the full moon and the activity that evening. But there is no link. There can't be any link between them. Yet people certainly see one.

These superstitions, I think, are related to something called the confirmation bias, which we briefly touched on in episode three when we talked about the interview illusion. Simply put, we tend to notice the things that confirm our beliefs, and we don't notice the things that contradict them. I chatted to Tom Gilovich about this.
And here's what he had to say.

So formally, what are the other kinds of cognitive mechanisms that are operating when we hold these beliefs or opinions?

I think one of the most powerful and most interesting ones is something that a colleague, a former student here at Cornell, Scott Lilienfeld, calls the mother of all biases: the confirmation bias. That's a term most people are familiar with, and they're familiar with the idea that if we want to believe something, we'll go and seek out evidence for it, and we won't seek out evidence against it. That is really true. There's a very pronounced tendency to treat information that's consistent with what we want to believe in a pretty friendly way, and to be really hostile to information that's consistent with something we don't want to believe. It's almost as if, for something we want to believe, we ask ourselves: can I believe this? Is there evidence for this? And there's some evidence for almost anything, even the most outlandish things. The real question is whether there's enough evidence, sufficient evidence, and we don't tend to ask ourselves: must I believe this?

So all of that's true, and people can relate to it. But the bias is even more pronounced than that. Even if you don't care about a particular belief, if you have no vested interest in it, you still tend to look for evidence consistent with the idea rather than information that's inconsistent with it. And of course, if we want a balanced picture, we've got to look at both.

So suppose I gave you some plants, a bunch of hostas, and said: you're a nice guy, here are some extra hostas from my garden. I think they probably need a lot of water, but you might want to test that. How would you test that? Well, if you're like most people, you'd give them a lot of water and see how they do.
And what you wouldn't do is give some a lot of water and some hardly any water at all, and see which do better. You look for evidence for the idea rather than against it. That's a very natural tendency, and at some level it makes sense, because it reflects a broader belief: look, if this thing's true, there must be some evidence for it, so let me look for some. You're doing a very reasonable thing. However, you're also doing an incomplete thing. You need to look not only for evidence for something, but for evidence against it. So if you believe that cheerful people are more likely to overcome a bout of cancer, you need to look not just at the cheerful people you know who've done very well; maybe you also know some dour people who've recovered. That latter step is the one we tend not to take.

In this episode, we started by introducing the intuitive scientist, and we spoke about how we can take some of the formal claims of science and bring them into the kitchen and into our everyday lives. We also chatted about our tendency to misperceive random events and random relationships. And we spoke a lot about how we can test claims: how we can convince ourselves and others that there's a real effect here, something genuine that we should pay attention to. Next week, in episode seven, we're gonna build on this. We're gonna talk more generally about finding things out, about testing claims, and about how to change opinions.