Another illusion: the illusion of authority. This is a pretty easy one. Put a brain scan into a research paper, and people are more likely to believe the results. You can have the exact same text, text that says, in sophisticated language, that the research doesn't actually have any findings, and if you put a brain scan in it, people will believe it. The text basically says, yeah, we didn't find anything statistically significant, we didn't find anything. But there's a brain scan, boom, people believe it. Oh, if I eat more carrots, my brain will be healthier, whatever.

Even graphs alone, bar charts alone, in research papers and news articles will give the illusion that something sciency is going on. Don't fall prey to this. Just because someone can quantify something doesn't mean they have anything good to say.

I actually pulled this graph, and this is funny. I was just looking for public domain bar charts, and I happened to find this one on the website of a woman who's apparently a science fair consultant. All she does is hire herself out, probably for hundreds of dollars, to kids who are working on science fair projects. I thought, man, that's a racket, I need to get into this. And her advice about this chart is: you can present the data one way or the other way, and I recommend you present it this way, because it gives more drama between the two variables. I don't even know what the research is, something about the difference between coffee and water.

The worst kind of sciency stuff you see today is infographics. God, I hate these things. I mean, they're beautiful, and sometimes they're really, really creative. But I pulled this one just for James, because it happens to be about the number of UK health clubs and gyms. They've got a guy doing a bench press, and all of these weights out here are used to represent the numbers, with the size of the plates or the barbells or the dumbbells representing different variables. I looked at this chart and figured out maybe five or six different conclusions I could reach based on the data presented here, and those conclusions were contradictory. One conclusion: oh, Brits are getting healthier, because there are obviously more gym memberships being sold. Another: oh, private memberships are on the way down and more people are going to public gyms, because there's a change in the ethos of British working out, it's become a more public thing, and people are becoming more concerned about it. Or: oh my god, the absolute number of gyms is dropping, so Brits are no longer concerned about their health, no wonder we're all fat. You can come up with whatever conclusion you want, because the data is just presented so it looks kind of fancy. So don't fall prey to this, the illusion of authority.

Forget about reading medical journals. Oh, well, maybe I'll just go to the experts, I want the experts to tell me what's best for me. Well, not unless you're ready to become fluent in research design, in statistics, in epidemiology, and ready to be super critical of the fact that, what does John Ioannidis say, over 50% of medical research is basically bunk? Published medical research can't be replicated, and the statistical methods are so bad that the papers are riddled with errors.
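To make that science fair consultant's "more drama" advice concrete: the usual trick is a truncated y-axis. Here is a minimal sketch in Python; the coffee-versus-water numbers and labels are invented purely for illustration, not taken from the actual chart in the talk.

```python
# A minimal sketch of the "more drama" chart trick: the same made-up data
# plotted honestly (axis starting at zero) and dramatically (axis truncated).
import matplotlib.pyplot as plt

labels = ["Coffee", "Water"]
values = [72, 68]  # hypothetical measurements; roughly a 5% difference

fig, (honest, dramatic) = plt.subplots(1, 2, figsize=(8, 3))

# Honest version: bars start at zero, so the difference looks modest.
honest.bar(labels, values)
honest.set_ylim(0, 80)
honest.set_title("Axis from zero: modest difference")

# "Drama" version: truncating the y-axis makes the same gap look huge.
dramatic.bar(labels, values)
dramatic.set_ylim(67, 73)
dramatic.set_title("Truncated axis: dramatic difference")

plt.tight_layout()
plt.show()
```

Same data, two completely different impressions; that is the entire trick.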
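And the Ioannidis point is not hand-waving; it falls out of simple arithmetic about false positives. A rough sketch, with illustrative numbers that are assumptions of mine rather than figures from the talk: a 0.05 significance threshold, the low statistical power that is common in small studies, and a modest share of tested hypotheses that are actually true.

```python
# Rough sketch of Ioannidis-style arithmetic: how many "statistically
# significant" findings reflect a real effect? All numbers are illustrative.

def positive_predictive_value(prior, power, alpha):
    """Share of significant results that correspond to a true effect."""
    true_positives = prior * power          # real effects correctly detected
    false_positives = (1 - prior) * alpha   # null effects passing p < alpha
    return true_positives / (true_positives + false_positives)

alpha = 0.05   # conventional significance threshold
power = 0.2    # underpowered studies are common in biomedical research
prior = 0.1    # suppose 1 in 10 tested hypotheses is actually true

ppv = positive_predictive_value(prior, power, alpha)
print(f"Probability a significant finding is real: {ppv:.0%}")
# -> about 31%, i.e. roughly 7 in 10 "findings" here are false positives,
#    before even counting bias, p-hacking, or multiple comparisons.
```

Under these assumptions most significant results are already false, which is roughly the shape of Ioannidis's argument.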
And yet what do the people who report this stuff, the media, say about it? They tell you, oh my gosh, Dr. Oz says, do this and you'll burn fat, do this and you'll do that.

This is what they call the Gell-Mann amnesia effect. It's actually something that Michael Crichton came up with. Murray Gell-Mann was a famous theoretical physicist who won the Nobel Prize in the sixties, and Crichton named the effect for him because, as he put it, I had a conversation with him about it once, and he's much more famous than I am, so I named it after him because it'll give it a lot more authority, kind of as a joke. But the Gell-Mann amnesia effect, as Crichton describes it, is basically the following. You're reading the morning newspaper and you come across an article. The journalist is not only stupid, they've got it backwards: wet streets cause rain. I mean, they literally have cause and effect backwards, and you think to yourself, what a fool, this guy is such an idiot. And then you turn to the next page, and you trust what they have to say, the same paper, the same journalist, about the state of the world in the Middle East, or about something else entirely.

The Gell-Mann effect: think about muscle magazines. You open up some muscle magazine, some men's health journal, one of these things, and you see ads making the most ridiculous claims possible. Do this one thing and you'll be huge, do this one thing and you'll have better sex, all of these crazy supplements, products, et cetera, and you think to yourself, man, what a bunch of nonsense. And then you turn to the page about the best way to do a chin-up, and you read the article and say, oh, clearly they know what they're talking about, right?

The Gell-Mann amnesia effect says that if somebody lies to you, or if somebody's full of shit on one thing, you really ought to be skeptical about what they're saying about another thing. In court, if somebody lies once, their whole testimony is subject to doubt, right? And yet we have this effect where we read the media, we read journalists, we read muscle magazines, we trust doctors, and we say, oh, you know, this guy's totally backwards on this one point, but man, has he got an interesting thing to say about how to do chin-ups. Maybe he does, it's possible. Maybe the doctor who says idiotic things in one case knows something about another case. But you have to be a lot more critical. You actually have to investigate the claims, think about them yourself, do the kind of cognitive work that doesn't just fall back on the heuristic that says you can trust authorities.

Now, probably most of you guys don't just automatically trust authorities, you wouldn't be here if you did, but you have to understand that it's really easy to fall prey to this in another form: I don't trust the mainstream authorities, but I trust my alternative authorities. Well, again, you can't give someone that same kind of credibility just because they're going against the mainstream. There's a very interesting phenomenon if you spend some time looking at any alternative health ideas, or alternative anything: a reputation effect builds up around somebody's ability to criticize the mainstream. Their reputation is designed entirely around the fact that, hey, those other guys are wrong. Oh, wow, this guy knows what he's talking about.
Yeah, he knows what he's talking about, he's critical, but does his actual positive case have anything to say? The more time you spend listening to somebody just criticize, the less they probably have to say positively.

And then finally, the halo effect. The halo effect is probably one you guys know, not the game, but the fact that the more attractive or the more muscle-bound a speaker is, the more a person or an authority visibly has the features you think they're promoting through their product or their intervention program or whatever, the more likely you are to believe that what they say is true. Right? And there's no correlation. There's no correlation. I mean, a guy who's been wheelchair-bound since he was a teenager is just as capable of figuring out proper exercise methodology as a guy who's been working out in a gym since he was twelve. It's an intellectual thing. Now, he may not have the same experiences, he may not be able to say, oh, this is what it's going to feel like, maybe, but just because a speaker is big and muscle-bound doesn't mean he necessarily knows what he's doing. He just knows, hey, what I've done has worked for me, and you can't necessarily generalize that to the whole population.

One last thing that I wanted to point out: we have a returning speaker, from Austin, Texas, 2012. Anthony Johnson, the CEO of the 21 Convention, actually said it was one of his favorite speeches, the Austin, Texas one, and let's hope this one is too. Eric Daniels, let's do it.

All right. Thanks. All right, guys, hopefully you're fully caffeinated and ready to go. The two systems model. This is basically what I'm talking about: the two ways that you can think. Forget about the autoregulation of the brain, that's primarily how the brain works; once you start thinking, there are really two systems, he says, that you can think in, and they can be incredibly, incredibly distorting. So one of these is the illusion of precision: you had this result, and therefore it's going to apply to this group, right? You see this all the time, not the least of which is mice or lab rats being compared to humans, but even different groups of humans with different ages, different profiles, et cetera. The guy who wrote the book about taking good businesses to great businesses claimed that he understood everything the CEOs and leadership teams did to make them great, published the book, made lots of money, and guess what?