Think about what happens when you go to a foreign country where you can barely get by in the language. Maybe you don't even really understand the language, but you at least know a few phrases so that you can order food and do things like that. And think about what kind of effort is required for you to do even the most basic tasks. You have to think through: what are these foods? What am I about to order? What am I doing? Am I following the proper cultural protocol? Am I following the proper language protocol? And it becomes very, very difficult. At a point, when you're a tourist in a foreign country where you don't know the language and you don't really know the culture, you become cognitively exhausted a lot more easily. It's hard work thinking through every step that you normally would just do automatically walking through any American city. And that hard work ultimately isn't sustainable every day, all the time. So we develop heuristics, we develop algorithms. Basically, we develop shortcuts to help us do the things that we do every day. And the problem is that these shortcuts, these habits of thought, these ways of solving the same problems or the same kinds of situations over and over again, can lead to very, very characteristic errors. The danger is when heuristics, habits of thought, become biases. They become ways that we actually short-circuit the process of reasoning.

The best treatment of this problem is Daniel Kahneman's book, Thinking, Fast and Slow. Nassim Taleb said it's a book on the level of Adam Smith's The Theory of Moral Sentiments or, I think he said, Freud's work. He's very, very serious, and Kahneman is a Nobel Prize winning economist. Taleb said the book probably ought to win a Pulitzer Prize. I think he's probably right; it didn't end up winning the Pulitzer, but it's a really, really good book, probably the best book on this problem that's out there today. Kahneman, along with his colleague Amos Tversky, who has since died, pioneered all of this work on cognitive bias and thinking back in the 70s.

Now, Kahneman in this book presents what he calls the two systems model, and this is basically what I'm talking about. Forget about the autoregulation of the brain for a moment; that's primarily how the brain works. Once you start thinking, there are really two systems, he says, that you think through. There's an intuitive, automatic system, what he calls System One. This is the habits of thought: the automatic processing of information in a way that makes sense, that you've done before, that you understand, that you can get through very, very easily. Now, this can be either expert or purely heuristic-based. Experts: you know the famous story about chess players. You ask a master chess player to assess the moves on a board. You pull out a board with all the chess pieces in different places, you say, okay, take a look, you give them like five seconds, boom, you take the board away, and they can tell you basically how the game is gonna go. They're experts; they can very quickly assess the situation, think seven moves ahead or whatever it is, and figure out how to play that game. You show it to an amateur, or somebody who has barely played chess in years like me, and I'm gonna say, there's some pieces on the board.
So I might have a heuristic. I might be able to just see, yeah, it looks like I should move my knight, with no better reason than that I think I should.

System Two, the other kind of thinking that Kahneman talks about, is deliberative and effortful. It's the kind of thinking where you bring your mind into a state of conscious, hard focus on problem solving. Now, the key here that Kahneman points out, and I think this is really at the root of his book, is that you have to know when to use which. You cannot just say, oh, I'm gonna be perfectly rational, I'm just going to use System Two all the time. I'm going to deliberate with the same conscious effort about whether to order the steak or the chicken as I do about a whole lifelong decision like choosing this career or that career. You cannot do that. You have to develop these kinds of systems and be very confident about them. So the question is, if you have this automatic system where your brain has this intuitive sense that just gives you answers, how do you make sure that you're not short-circuiting yourself? Well, the answer is, as I said, knowing which is which.

So let's take an example from Kahneman's book. Suppose I select a man at random from a representative sample of Americans, and his neighbor describes him this way: Steve is very shy and withdrawn, invariably helpful, but with little interest in people or the world of reality. He's a meek and tidy soul. He has a need for order and structure, and he has a passion for detail. The question is: is Steve more likely a farmer or a librarian? Think about that for a second. Guess what the answer is. Some of you, at least if Kahneman's right, a lot of you, are thinking, oh, that sounds like a librarian. But guess what? Male farmers outnumber male librarians in the United States by 20 to one. You'd be very poorly advised to judge, just because of these characteristics, that this guy is more likely a librarian than a farmer. Even if you knew nothing at all about him, he's a guy, and there are 20 times more guys doing farm work than library work, so the odds are 20 to one that he's a farmer. The description would have to be overwhelmingly more typical of librarians than of farmers just to overcome that base rate. This is what he calls the representativeness bias, a simplifying heuristic. If you tend to simplify things into categories that give you those automatic answers, you're often going to be misled by your thinking. You're gonna easily be seduced into identifying what you already know in a way that conforms to a certain belief about what you think is representative.

Let's take another example. This one's a fun one. In your minds, think of the letter K, okay? And then try to think of whether there are more words that start with the letter K or more words that have K as the third letter of the word. If I asked you to start writing down a list of words that start with the letter K, and then a list of words that start with any letter but have K in the third slot, you would not be able to come up with many of the latter, and maybe if you're trying this right now, you're finding that out. Unless you're a Scrabble player, right? Like chess players, Scrabble players hone this skill of thinking of words in really, really weird ways. You think a lot more about the letter a word starts with than about what happens to appear in the third place. As it turns out, guess what? There are way more words with K as the third letter than K as the first letter, but you can probably think of a lot more words that start with the letter K.
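To put rough numbers on the Steve question, here's a minimal sketch of the base-rate arithmetic in Python. The 20-to-1 ratio is from the talk; the two likelihood figures for how "librarian-ish" the description sounds are invented purely for illustration.

```python
# Bayes' rule in odds form: posterior odds = prior odds x likelihood ratio.
# The 20:1 base rate is from the talk; the two probabilities below are
# made-up numbers, chosen generously in the librarian's favor.

prior_odds_farmer = 20.0                 # 20 male farmers per male librarian

p_description_given_librarian = 0.40     # invented: sketch fits 40% of librarians
p_description_given_farmer = 0.10        # invented: sketch fits 10% of farmers

likelihood_ratio = p_description_given_farmer / p_description_given_librarian
posterior_odds_farmer = prior_odds_farmer * likelihood_ratio

print(f"Odds that Steve is a farmer: {posterior_odds_farmer:.0f} to 1")
# Even a description four times more typical of librarians still leaves
# Steve five times more likely to be a farmer.
```

And the letter-K claim is something you can rough-check yourself. Here's a sketch assuming a Unix-style word list at /usr/share/dict/words; keep in mind that Kahneman and Tversky's claim is about words as they occur in ordinary English text, so a raw dictionary count is only a proxy.

```python
# Count dictionary words with K first versus K third. The word-list path
# is an assumption (common on Unix systems); results will vary by list,
# and a frequency-weighted count over real text would be the fairer test.

with open("/usr/share/dict/words") as f:
    words = {line.strip().lower() for line in f if len(line.strip()) >= 3}

k_first = sum(1 for w in words if w[0] == "k")
k_third = sum(1 for w in words if w[2] == "k")

print(f"K as first letter: {k_first}")
print(f"K as third letter: {k_third}")
```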
The same thing's true if you thought of words with R. You might say, oh, K, K's kind of a weird letter, K's don't appear that often, it's a high-value tile in Scrabble. Fine, try something like R or L. Again, there are way more words with R or L as the third letter than as the first, but it's a lot harder for you to think of them. Why? The availability heuristic. It's a lot easier for us to think about things that are immediately available, and to draw conclusions about those things based on what's available to us. Essentially, whatever evidence is immediately at hand and easily graspable when we're considering a problem seems like the dominant evidence.

This is the problem, if you wanna put it that way, that Donald Rumsfeld was talking about all those years ago. You have your known unknowns and your unknown unknowns. Everybody made fun of Rumsfeld for this statement because he was talking about how the war in Iraq was going, but in terms of the epistemological problem, it's incredibly, incredibly insightful. The idea is that there are things that you're aware that you don't know. I don't actually know how many words have K as the third letter. But what I really don't know is the stuff that I don't even know that I don't know. And if I could tell you what that was, then it wouldn't really be the stuff that I don't know, obviously. There's a lot of availability in knowing what you know, and even in being aware of, okay, I don't have the evidence because nobody's ever tested this, but I know that if somebody tested it, I would expect the results to be X, Y, and Z. But what about the stuff that you haven't even thought of yet? You have to be open to the possibility that you can't just take what's available as evidence and draw deep, deep conclusions from it. You have to think through what counts as evidence and what might be things that you're not thinking of.

So when Kahneman compares these two systems, he gives a number of examples, which I've taken out of his book here. System One is the system of automatic and intuitive thought. This is the stuff you can do automatically. Is one thing farther away or closer than another? What if there's a sudden bang in the room? Your brain processes it right away; you don't even have to ask, where did that come from? Your brain automatically directs your attention to that part of the room. Is it a threat, is it not? If I say, what's two plus two, none of you have to think about that. You've programmed your mind. That's an algorithm. You've programmed, hopefully, basic counting, basic addition and subtraction into your mind. Some of you may have even programmed more sophisticated calculations. It's automatic. Now, if you had programmed it incorrectly, if you had some really weird way of calculating things and the programming was off and you habitually said five when someone asked you what's two plus two, that's the kind of error I'm talking about. It's okay to have heuristics. It's okay to have algorithms in your thinking. The problem is when you misprogram them, just like if you misprogrammed basic addition. Now, if I say bread and... Yeah, some people, I knew somebody would say circuses. Some people will say butter, but one or the other, you have this automatically. You can't help yourself from coming out with it. It's already in your mind. The phrase is so familiar. Row, row, row your... Yeah, you guys, try as you might,
you will not be able to stop your brain from thinking of that word. You could sedate yourself or alter your consciousness in some way, maybe, but if you're fully aware, you can't help it. Reading words, detecting hostility in a voice, driving a car on an open road if you've got years of driving experience: all System One.

System Two, though, is that conscious, focused system. If I told you, on your way to the airport, try to count as many women as you can who have perfectly white hair, and you set yourself that cognitive task, you would be incredibly focused. You would be exerting effort. It would be easy to distract you; little things could pull you away from the task. What about when you brace for the starter gun at a race? You're focused on the idea of don't move, don't move, don't move, but as soon as that sound happens, move as fast as you can. Or if I asked you to calculate 17 times 24 in your head? Your pupils would dilate, your walking pace would slow down, and you'd actually have to work through it step by step (17 times 20 is 340, plus 17 times 4 is 68, so 408), unless you're one of these human calculator types. Count the number of A's on a page, calculate your tax forms, evaluate washing machines for overall value: all of these take deliberate, conscious thought, where you slow down, right? That's the "slow" in the title Thinking, Fast and Slow: where you slow down and actually deliberate and consider these things.

Now, a lot of what grows out of Kahneman's work, as well as that of a bunch of other cognitive psychologists, neuroscientists, and others, is a set of thinking biases. Kahneman and Tversky came up with these in the 70s, and people have been adding to them since. There are a lot of easy, simple ways that people think, algorithms and heuristics that they've programmed into their brains, that are incredibly, incredibly distorting.

One of these is the illusion of precision. Richard Hofstadter, the American historian, noted that the American mind seems extremely vulnerable to the belief that any alleged knowledge which can be expressed in figures is in fact as final and as exact as the figures in which it is expressed. There's a story that Charles Seife tells. You're walking through a museum with some kids, and everybody's saying, oh, look at the dinosaurs, look at this, look at that. The docent, the tour guide, is walking around, and one of the teenagers asks, how old is that skeleton? The docent says, it's 65 million and 23 years old. The teenager says, oh, how do you know that? He says, well, when I started working here 23 years ago, the paleontologist told me it was 65 million years old. That, of course, is an example of the illusion of precision. The docent ridiculously assumes that when the paleontologist said 65 million years old, he meant precisely 65 million years old. All of you who are laughing realize that when paleontologists say a fossil is 65 million years old, they mean give or take a few hundred thousand years, possibly; the dating isn't precise, so adding 23 years to it is meaningless.

Charles Seife's great book, Proofiness, is a great way to immunize yourself against all kinds of what is sometimes called innumeracy, or just susceptibility to statistical mumbo jumbo. He has a whole list of tricks that people use. Potemkin numbers, right? 78% of all statistics are made up; I'm sure about that, you can trust me. Potemkin numbers: if you know the story, Prince Potemkin once created a whole fake village in Crimea when the Empress of Russia was coming to visit.
He didn't want her to think they were just some podunk little town, which in fact they were. So they literally built a whole street that she could view, with fake storefronts like a Hollywood set; they built up this elaborate town, took her through just that one part, and then she left with the illusion that the place was much bigger and much more beautiful than it was. That's a Potemkin number: a figure that's simply made up but dressed up to look real.

Then there's disestimation. You can make numbers up, or you can create the illusion of precision around numbers. Disestimation is just that idea: when a number is expressed with more significant figures, people tend to think the number is more precise, more correct. If I tell you that this pointer clicker is, I don't know, about 67 millimeters long, you say, okay, maybe he measured it. If I tell you it's 67.325 millimeters long, you're more likely to trust that I've actually measured it, and with an exact instrument. Why? Because I added three digits to the end.

Another one is unwarranted extrapolation: a study had this result with one group, and therefore it's going to apply to this other group. You see this all the time, not least with mice or lab rats being compared to humans, but even with different groups of humans of different ages, different profiles, et cetera. There's the famous case of the author who studied businesses that went from good businesses to great businesses, claimed that he understood everything that the CEOs and leadership teams did to make them great, published the book, made lots of money, and guess what?
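Going back to the disestimation point for a second, here's a minimal numeric sketch of why tacking extra digits, or 23 extra years, onto an uncertain figure adds nothing. The 300,000-year uncertainty below is an invented, illustrative value, not an actual dating error bar.

```python
# Why "65 million and 23 years" is meaningless: the 23 years disappear
# inside the measurement uncertainty. The uncertainty value below is
# invented purely for illustration.

reported_age = 65_000_000       # years, per the paleontologist
uncertainty = 300_000           # years, illustrative dating error bar

docent_age = reported_age + 23  # the docent's "precise" update

low, high = reported_age - uncertainty, reported_age + uncertainty
print(f"True age is somewhere in [{low:,}, {high:,}] years")
print(f"The docent's extra 23 years is {23 / uncertainty:.5%} of the error bar")
# Both 65,000,000 and 65,000,023 fall inside exactly the same range;
# the added digits convey no additional information.
```

The same logic applies to the 67.325-millimeter clicker: extra decimal places only mean something if the instrument actually resolves them.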