You open up some muscle magazine, some Men's Health journal, one of these things, and you see they've got ads making the most ridiculous claims possible. Do this one thing and you'll be huge, do this one thing and you'll have better sex, all these crazy supplements, products, et cetera, and you think to yourself, man, what a bunch of nonsense. And then you turn to the page about the best way to do a chin-up, you read the article, and you say, oh, clearly they know what they're talking about. The Gell-Mann amnesia effect is this: if somebody lies to you, or if somebody's full of it on one thing, you really ought to be skeptical about what they're saying on another thing, right? In court, if somebody lies once, their whole testimony is subject to doubt. And yet we have this effect where we read the media, we read journalists, we read muscle magazines, we trust doctors, and we say, oh, this guy is totally backwards on this one point, but man, has he got an interesting thing to say about how to do chin-ups. Maybe he does. It's possible. Maybe the doctor who says idiotic things in one case knows something about another case, but you have to be a lot more critical. You actually have to investigate the claims, think about them yourself, do the kind of cognitive work that goes beyond, hey, my heuristic says you can trust authorities. Now, probably most of you don't just automatically trust authorities. You wouldn't be here if you did. But you have to understand that it's really easy to fall prey to this: I don't trust the mainstream authorities, but I trust my alternative authorities. Well, again, you can't hand somebody that same kind of credibility just because they're going against the mainstream.
There's a very interesting phenomenon if you spend some time looking at any alternative health ideas, or alternative anything. There's a reputation effect that builds around somebody being able to criticize the mainstream. Reputations get built entirely on, hey, those other guys are wrong; oh wow, this guy knows what he's talking about. He's a good critic, sure, but does his actual positive case have anything to say? The more time you listen to somebody just criticizing something, the less they probably have to say positively.

And then finally, the halo effect. The halo effect is probably one you guys know already, not the game, obviously. The more attractive or muscle-bound a speaker is, the more a person or an authority displays the features you think they're promoting through their product or their intervention program, the more likely you are to believe that what they say is true, right? And there's no correlation. There's no correlation. A guy who's been wheelchair-bound since he was a teenager is just as capable of figuring out proper exercise methodology as a guy who's been working out in a gym since he was 12. It's an intellectual thing. Now, he may not have the same experiences. He may not be able to say, this is what it's going to feel like. But just because a speaker is big and muscle-bound doesn't mean he necessarily knows what he's doing. He just knows, hey, what I've done has worked for me, and you can't necessarily generalize that to the whole population.

One last thing that I wanted to point out. This again is from Charles Seife's book on understanding randomness, okay? Randomness is something that we have a very hard time dealing with.
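One way to see how badly our intuitions handle randomness is to generate some for ourselves. Here's a minimal Python sketch (mine, not from the talk; the grid size and point count are arbitrary choices) that scatters points uniformly at random and counts how much they clump:

```python
import random

random.seed(1)  # fixed seed so the sketch is reproducible

# Scatter 100 points uniformly at random over a 10x10 grid of cells.
GRID, POINTS = 10, 100
counts = [[0] * GRID for _ in range(GRID)]
for _ in range(POINTS):
    x, y = random.randrange(GRID), random.randrange(GRID)
    counts[x][y] += 1

cells = [c for row in counts for c in row]
empty = sum(c == 0 for c in cells)
crowded = sum(c >= 3 for c in cells)
print(f"empty cells: {empty}, cells with 3+ points: {crowded}")
# With 100 points in 100 cells, roughly a third of the cells end up
# empty and a handful hold 3 or more points: genuine randomness clumps.
```

An even spread, one point per cell, is what human intuition expects, but it is about the least likely outcome a uniform random process can produce.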
Which of these two is random, and which of them is human intervention? The random one is the one on the left: clumps, groups, big white spots. People tend to assume the one on the right is the random one, because the points are nice and evenly spaced, everything's scattered out. But if you've ever dropped salt, or done anything where you just let something scatter randomly, you know that you get clumps and groups and things like that. This does not compute well with our heuristics. If you really think about randomness in study design, or randomness in effects, all these kinds of things, you realize that people are susceptible to all kinds of biases.

If you guys like the philosophy dinosaur meme, it talks about the gambler's fallacy. The gambler's fallacy is assuming that the frequencies you'd expect in the long run have to show up in the short run, so any deviation must correct itself right away. People play games of chance, poker, roulette, and they assume that if they've had bad luck, a bunch of bad rolls of the dice, they've got a good roll coming, they're due, right? And I saw this in Vegas once, a guy at a roulette table. I remember the time before they installed the recent-history displays on the roulette tables. Have you guys seen these? They show the numbers that have recently come up. And this guy was studying them while making his bets, as though it had anything to do with the next spin of the wheel. Now, assuming the casino is legit and not corrupt, it has absolutely nothing to do with the next spin. It doesn't matter how many times 22 has come up; it's no more and no less likely to come up the next time. The best example of this is if you take students, and I used to do this when I taught college. The younger kids I teach now probably wouldn't be as susceptible to this, because they'd ask me too many questions about why I did it.
But if you ask students to go home, flip a coin 100 times in a row, and record the results, you don't even need to know the students or their work habits to tell which of them faked it and which of them actually ran the test. The usual intervention is to tell half of the students to just fake it and half to do it for real, then look at the results and pick out which ones were actually faked. Because the faked one, the one on the bottom here, is trying to be too random. It's trying to have heads and tails spread out evenly through the results. In reality, you're going to get long strings. It's hard to count, let's see: 1, 2, 3, 4, 5, 6, 7 heads in a row, 7 tails in a row. A hundred flips of a coin is likely to contain a long run of heads or tails. But that doesn't feel random to us. We don't process that through our heuristics as being random. So avoid the trap of thinking that random is supposed to look random, and recognize that random can actually look really, really different.

Now, I was talking to some of the guys yesterday and the day before. In preparing this material, I felt like I could teach a whole course, a whole semester, on all of these biases. I just pulled out some of my favorites for this list, which I barely have time to talk about. There are probably hundreds of these. And think about that for a second: there are hundreds of these cognitive biases, heuristics, algorithms that our minds run quickly and easily and that lead us astray from good reasoning. And that doesn't even count logical fallacies, poor forms of reasoning. So there are a lot of these. Some of my favorites: apophenia, the tendency to see patterns where they don't exist. This is the Jesus's-face-is-in-my-toast effect.
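Both coin-flip claims, that 100 flips usually contain a surprisingly long run, and that a streak of tails does nothing to make heads more likely, are easy to check empirically. Here's a minimal Python sketch (mine, not from the talk; the trial count and run lengths are arbitrary choices):

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible
TRIALS = 10_000

def longest_run(flips):
    """Length of the longest run of identical outcomes."""
    best = cur = 1
    for prev, nxt in zip(flips, flips[1:]):
        cur = cur + 1 if nxt == prev else 1
        best = max(best, cur)
    return best

# Claim 1: 100 fair flips usually contain a long run.
runs = []
for _ in range(TRIALS):
    flips = [random.choice("HT") for _ in range(100)]
    runs.append(longest_run(flips))
frac_run6 = sum(r >= 6 for r in runs) / TRIALS
print(f"P(run of 6+ in 100 flips) ~ {frac_run6:.2f}")

# Claim 2 (gambler's fallacy): after three tails in a row,
# heads is still no more likely than usual.
after_3_tails = []
for _ in range(TRIALS):
    flips = [random.choice("HT") for _ in range(100)]
    for i in range(3, 100):
        if flips[i - 3:i] == ["T", "T", "T"]:
            after_3_tails.append(flips[i])
frac_heads = after_3_tails.count("H") / len(after_3_tails)
print(f"P(heads | three tails just happened) ~ {frac_heads:.2f}")
```

The first fraction comes out around 0.8 (long runs are the norm, not the exception), and the second sits right at 0.5: the wheel, like the coin, has no memory.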
Brains have certain abilities to recognize patterns. It's part of our survival advantage. The more you can see patterns, the less likely you are to get preyed on by big cats and bears and things like that. Seeing patterns also helps you find food. But if you have this natural tendency to see patterns, you're going to start assuming there are patterns where they don't exist, right?

Or another one of my favorites, survivorship bias. This is a good one. Just because somebody has lasted doesn't mean you know anything. Jim Collins, unfortunately, wrote the Good to Great book. Anybody ever heard of this? This business guru studied businesses that went from average to great, claimed he understood everything the CEOs and leadership teams did to make them great, published the book, made lots of money, and guess what? Some of those businesses failed within the next ten years. That's survivorship bias. He didn't study the businesses that did the exact same things but failed, because he was selecting only the ones that survived and saying, in order to do well, you must do this. But what about all the ones that did the same things and didn't survive? He hasn't actually isolated the cause of success.

And you can look all of these up. There are a couple of good books; one is You Are Not So Smart, by David McRaney, which is a good one. But the ultimate thing is you need knowledge and experience, okay? You actually have to do the hard work, the hard thinking, to get through it. And then I'll just end with this little comparison between the Hedgehog and the Fox. It's actually an ancient idea from Archilochus that was later developed by Isaiah Berlin. The Fox knows many things; the Hedgehog knows one big thing.
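Before the wrap-up, the survivorship point can be made concrete with a quick simulation (again a Python sketch of mine, not from the talk; the firm count and survival probability are arbitrary choices). Every simulated firm runs the identical "strategy" and survives each year purely by chance, yet a study that looks only at the survivors sees nothing but perfect track records:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

N_FIRMS, YEARS = 1000, 10

# Every firm follows the exact same "strategy": each year it survives
# with probability 0.7, purely by chance. The strategy has zero effect.
survived = [all(random.random() < 0.7 for _ in range(YEARS))
            for _ in range(N_FIRMS)]
survivors = sum(survived)
print(f"{survivors} of {N_FIRMS} firms survived all {YEARS} years")

# A study that interviews only the survivors finds firms with perfect
# 10-year records and credits the strategy. But every failed firm ran
# the identical strategy, so the survivors alone tell us nothing about
# what caused success.
survival_rate_of_studied_group = 1.0 if survivors else 0.0
print(f"survival rate within the studied group: {survival_rate_of_studied_group:.0%}")
```

A few dozen firms out of a thousand make it, and within the studied group the "strategy" looks infallible, which is exactly the trap: without the failures in the sample, you can't isolate any cause at all.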
So the Hedgehog has a grand theory of the world. The Hedgehog is impatient, likely to argue, reluctant to admit error. Everything has to fit into the box, into the grand unifying theory, and if it doesn't, it must be wrong. Foxes are complex thinkers. They know many little things. They put things together. They realize that answers emerge out of complexity. They realize that the best part of learning, developing, growing, being the best person you can be, is that when you answer certain questions, they lead to even more questions, more interesting questions. That's what life is about. That's what knowledge is about. So, just to wrap up: try to be more of a Fox than a Hedgehog. All right, thank you. Eric Daniels, guys. So, we have time for questions.

First question: when you go to your doctor and they prescribe statins because your cholesterol is at a certain level, or when the US government pushes its food pyramid, the crux of your speech is basically saying, maybe they're not right?

Yeah, I mean, look, here's an interesting case. My wife hates going to the doctor, hates dealing with doctors. So I went to the doctor with her. She's a decent doctor, right? She's not horrible. But she looks at my wife's blood work and says a few things, and I know better. I've actually read the research on women and statins and all these other things. And I told her, look, if you want to keep us as patients, that prescription isn't going to happen. And she said, well, I'm not sure you're right. I said, look, I'll throw you the research. I pulled up a bunch of articles and sent them to her. They're not always right. In a sense, unfortunately, doctors today are practitioners. They're not scientists, right?
They learn what is true when they're in medical school. It's only the rare ones who are actually thinkers, and it's hard to find them. With today's medical system, given the kind of regulation and the limits on doctor choice you have, et cetera, it's really, really hard to find doctors who are thinkers and who are willing to think outside the box. But if you can, if you have the luxury of concierge medicine, great, pursue that: you choose a doctor, pay an annual fee, and get somebody who's actually thinking. And I find that more often than not, those kinds of relationships, concierge doctors or people somewhat outside the medical system, are the ones willing to look at the evidence and be thinkers. You have to identify them, though. You have to network and find them. And you can; there are practitioners who do this. But most medical doctors aren't like that. I had a medical doctor once who would just look things up on his little notepad, and for him it was just an algorithm, and not even one that he thinks about, just one that's on a page. So you have to be very, very skeptical of standard practitioners, especially in a field so infused with bad thinking. Unfortunately, medicine is like that. When I read the study claiming that over half of the medical literature in top journals, the Journal of the American Medical Association, The Lancet, all these things, is statistically just mumbo jumbo, I thought, wow. But that's what they learn in medical school. And these journals are coming out so fast and so often that doctors can't keep up, so they just trust what their professors told them. You have to be really, really skeptical of what doctors say, and really ask them whether they can explain it to you. If they can actually give you the arguments and answer the counterarguments, maybe they're a thinker.
But if not, well, just because I say that doesn't mean you should run out and Google everything yourself. You have to make sure that you're not engaged in those same kinds of biases and the same kinds of errors. Yeah, another question.

Thank you for the talk. You presented a lot of information here. I wanted to get back to the beginning, where you were talking about core beliefs and said that there are basically two ways of thinking, the expert versus the effortful. A lot of my thinking revolves around happiness. What would be an efficient way to go from where you're at in life, say, I want more happiness in my life, and make shifts so that your thinking becomes more expert, quick, innate, so that you're doing the behaviors and thinking of things in ways that promote that outcome, happiness?

Yeah. Happiness, for me, is fundamentally a function of the core areas of your life: career, romance, your avocations, what you pursue on the side, all those things. And the way to do it is to seriously think about and understand the best practices in those areas of your life. Then, to be more efficient at it, as you say, the thing you can do is develop those heuristics. Don't be afraid of them, right? As I said, you can't literally think through every decision. Should I kiss her now? How much pressure should I put on her lips? What should I say afterwards? You can't literally go through a checklist of pros and cons every time you act. You have to have that gut instinct. And the way to develop that gut instinct is to think about those problems and then actually start to figure out how these things are programmed. You already have the programming. You can't help it, right?
You already have those automatic, intuitive answers in those situations. What you have to do is spend time, working on a different problem every couple of weeks or whatever, stepping back and going through that deliberative process about the things you normally do automatically, just to double-check yourself. Am I really programmed correctly? It's the same way you sometimes check your arithmetic. You do a problem really quickly, but say it's a really important financial decision. You say, I'd better make sure I've got these numbers right. So you slow down, go through the process, think through the calculation, and you only need to do that in the contexts that call for it. But if your goal is to make sure your automatic, heuristic thinking is accurate, then what you need to do is spend time on some of those decisions, step back for a while, and actually do the long thinking and check yourself. Is the thing I do automatically the same as what I would do if I were consciously deciding? If it is, move on to the next one. But I guarantee, and I've done this throughout my whole life, you'll find ones where you say, why the heck do I do that? My automatic response is to think that that's random, but if I really look at the evidence, I think it's not random. So I have to reprogram, in a sense. I have to start developing the habit of being skeptical anytime I hear the word random. And then, because I've learned this stuff, I can very quickly start to assess: is it actually random or not? You just have to step back every once in a while and focus on one or two of the issues that you've automatized. Everybody has automatized these things; you can't help it.
But make sure that you've automatized them in the right way, in a way that agrees with your current thinking. And as you do that over the course of your life, more and more of them will be consistent with your conscious thinking. Because you pick these up when you're a kid, from your parents, from the quirks of the ways your teachers taught you. Nobody is going to have those automatic responses always correct. So what you want to do is check yourself. That's what's going to make you better at these things. The more you know and the more you check yourself, the more automatic it's going to be. But you always have the potential to make mistakes, so it's a continuous process.

One last question, really: what's the next one to read? You know, there are the books that I recommended in the talk, and I can give you a couple more later. I don't actually have a book of my own on this. It's something that I've thought about as a teacher, learning about this kind of stuff, that I think would actually be really useful: the process of teaching kids in fifth, sixth, seventh, eighth grade, who are just learning some of these things for the first time, and thinking about what I can do right now so that they don't have to worry about this later, and how to digest that. Maybe I'll try to write it up someday. I don't have anything I've produced in that sense, but there's Kahneman's book and Charles Seife's book, and I can recommend some others afterwards. Just reading that stuff and becoming aware of these biases is a great way to start recognizing them in your own thinking and then trying to get rid of them.

Awesome, let's give it up for Eric Daniels. Yeah.