Welcome. We're ready to go. My name's Barbara Drescher, and... it is cutting out, okay? I have to take the cues from the video, guys. Are we okay sound-wise? We're okay? Okay. I didn't quite get that. You just want me to start. My name's Barbara Drescher. This is Kyle Hill. What we're going to do today is hopefully help you figure out just how rational you are, or at least give you a ballpark figure.

I'm going to start with a few definitions, and then we're going to pass out a little test. So if you have a pen, something to write with, something to write on, now would be a good time to get it out. If you don't, you might want to take down the answers on your smartphone or whatever. Hopefully we have enough copies for everyone, but if you can share, that would be great. You thought you hadn't had to get tested since college or high school, but you were wrong.

We're going to give you about 10 minutes to do this. We don't want you to think a lot about the questions, so you're probably going to be angry. We just want you to answer the questions as quickly as you can stand to. Then Kyle is going to take some time to go over the quiz and talk about each of the questions and what they mean for this question of how rational you are.

Now, if you're familiar at all with the rationality or decision-making literature, you might see that some of the questions look familiar. Please don't read into them too much. Some of them might seem a little strange, like strange versions. I'm going to be talking about what these mean and working them into a framework that you probably aren't familiar with, so it's worth it to stick around for that part, even if you're familiar with all of the things these questions represent. Okay, so let's go ahead and give you 10 minutes starting... Yeah, you got the timer? Yep. Okay, time starts now.
Connor, we have to pass out the test, but we want to start the timer so we're running. Okay, you do this side, I'll do this side. Just come and pass them down. Once you get the test, you can start right away. Don't worry about it. Just raise your hands. If you can share, please do. We weren't expecting this big of a crowd this late in the day. Collect these, so if you can pass them back. I've got one more back there. Let's start back here.

All right, so we don't have any more questionnaires, because of how many of you are here, obviously. So if you want to share, that'd be awesome, but please don't pool your resources on the questions. Just look at the paper and do your own answers. And again, if you don't have pen or paper... They will be up on the screen. You can also work off the screen when we have it up, or just put your answers into your smartphone in notes or something like that. For those of you still filing in, find someone's questionnaire to share, or just look up for a bit; we will have the questions up, just to familiarize yourself with where we're going. No, please. You've still got a little bit of time left; that wasn't a klaxon or anything. Yeah, that wasn't meant to be an alarm. About one minute. Seconds left now. It's okay if you don't finish, by the way. I'm just going to use this. Okay, pencils down.

So, why are we doing this? We want to get at the difference between intelligence and rationality. A lot of people think these are joined at the hip, but really they're not. When we do certain psychological tests, we find that there's a chunk of performance that just isn't accounted for by IQ and what we commonly consider to be intelligence. So what are these other things we're looking at?
Some researchers consider that missing piece to be rationality, which factors into performance along with intelligence. So, a bit more formally: rationality is consistent belief structures and behavior that optimize goal fulfillment. That's textbook psychology lingo for you. In other words, thought processes and behavior that lead you to get what you really want. So, are you rational? That's what we wanted to find out here.

The first question, if you take a look at it real quick, demonstrates the myside bias. What do I mean by that? Well, we gave out two different questionnaires here, with slightly different versions of the same question. In psychology, we do that to test between subjects: without the subjects' knowledge, we see if differences pop up when just a slight change in wording occurs, and they often do. The natural myside bias is the tendency to evaluate propositions from within one's own perspective when given no instructions or cues to do otherwise.

So you see here the two versions of question number one that we gave you. The first version was about the U.S. banning the sale of a dangerous German car. The second version was just the opposite: Germany banning a dangerous American car. So, who got the first question? Who agreed that the U.S. should ban the sale of the German car? Can I see a raise of hands? Okay, now for the second version: who thought that Germany should ban the sale of the dangerous American car?

Assuming that you are mostly American and not German, the myside bias takes advantage of demographic variables. If you are American, you are less likely to agree that someone should be banning American cars somewhere. Likewise, if we did this in Germany, they'd be less likely to agree that America should ban a German car. There are differences that relate to the individual here that we can't always account for; these studies are really done with large sample sizes.
But we've got a lot to get through, and I know you're going to have issues with just about every question. All I can say is that there are variations among people in how they respond that are not accounted for in these questions; we're really looking at averages. On average, most people are not statisticians. You're going to see a much bigger representation of statisticians in this room. How many people are statisticians, or would consider themselves one? Yeah, that's about right. No? Just me? What about a mathematician? Yeah, okay. There you go. So that's more than the general public.

When we do large-scale psychological surveys, we can't account for everything, such as age, your background, and things like that. So we assign randomly, hoping that randomization will sort out the messiness of different backgrounds and different ages. There's one other thing you're going to see while Kyle's going through these: we're not going to see anywhere near as strong a bias in this room as we would see elsewhere. Hopefully. Don't disappoint me.

But it's not just little tricky questions like this. This extends to a lot of real-world implications. In psychological studies that have used these myside bias questions, men are more likely than women to think that salaries for men and women employed in the same position are basically the same. Why? Because it affirms the position of a man: it's okay, there is no problem here. That's your side: I'm a man, so I'm more likely to agree with that statement. The same thing happens with smokers: smokers are more likely than nonsmokers to think that secondhand smoke is not harmful. Also, people who consume a large number of alcoholic drinks during a month are less likely to think that college students who drink will become alcoholics.
And lastly, as I'm sure most of you will be familiar with, the more religious you are, the more likely you are to think that religious people are more trustworthy than non-religious people. This is myside bias: I'm agreeing with my demographics, with what makes me me. And creating a side is really simple. It doesn't have to be religion or smoking or not smoking or drinking or not drinking. In other psychological studies, you can create a group just by the flip of a coin. If everyone in here flipped a coin and split into a tails group and a heads group, you would give less hypothetical money to the other group. If I said you have $1,000 and you can give some to the other group, go ahead, you're less likely to give the other group money just because you're tails and not heads. And then you start rationalizing: well, I don't know about those heads guys; my friend Jennifer always calls heads and I don't like her. You get these weird afterthoughts that cloud your judgment.

So the next one... oh, oh, sure. [Responding to an audience question:] There's no correct answer to this; the rational response isn't the point. The point is to elucidate the bias, and that's why we did it between subjects. If more Americans agree that Germany can't ban an American car than agree with the opposite, then we see a difference between groups. If we had given both versions to each of you, you would probably have picked out the difference and then just been confused: why would you phrase it like this? Which is why we split it between the two halves of you.

So, moving right along: the sunk cost effect. This is another bias, something that pretty much everyone will have some experience with, I bet. The sunk cost effect is the increased tendency to persist in an endeavor once an investment of money, effort, or time has been made. Again, for this question, you received two different questionnaires. One poses a simple choice based on convenience.
Would you pay a bit more for a shorter drive or not? But the lower question here, this one, first stipulates that you bought a $50 coupon to get the lower price, and that discount is only good at the further store. So for the first question, without the coupon, who answered in the low range of videos, A or B? And who said more videos would be rented, C or D? Okay, so more of you. We're making just a choice based on convenience here; there's no real impetus other than, do you mind driving an extra few minutes? Now, for those of you who got the second question, who was on the low end of how many videos you would rent from that store? Come on, be honest. And who chose the high end, C or D? You guys are doing pretty well. That's because you're Canadian.

But we see this bias crop up again and again and again. You especially see it when people dive into a crappy stock and refuse to sell it because "it will eventually get better." If you have put $5,000 into a stock and it keeps diving, you will wait on it longer than probability or mathematics would dictate. And especially here in Vegas, it's the old axiom: you've got to know when to hold 'em and know when to fold 'em, right? Because if you have dumped $50 into a coupon, you might think, well, I already spent all this money, I might as well go to that store and use it, even though it's a little further away. You see this all the time with gamblers who have this compulsion to just keep going, digging themselves deeper into the hole as they try to dig themselves out. That's the sunk cost effect. You might have had it with a crappy car you used to own that you just would not sell or let go or trade up, because "it's still a good car; I paid a lot of money for this car in 1987."

So, moving on again. Questions three and four dealt with framing.
This is a classic communication and psychology tactic. A framing effect is said to occur when equivalent descriptions of a problem or a decision lead to systematically different answers. So, two versions of a question can have the same monetary value or outcome, but if you phrase it just a little differently, in terms of gain or loss, say, it will lead to systematically different answers: 90% of people choose one option, but with the same outcome phrased the other way, 90% of people choose the other option, just based on little changes in wording.

Before I get to the question: in one study, beef described as "75% lean" was given higher ratings than beef described as "25% fat." Those are equivalent statements, but when you phrase them just a little differently, most people immediately gravitate to the other answer. Similarly, research and development teams are given more funds from their companies when they frame what they have done for the company over the last year in terms of gains rather than losses: how much they made the company, not how much they lost it.

Now, typically in psychological questionnaires we would give you a gain frame and a loss frame: would you take a sure gain, or would you lose something for sure? But we couldn't quite do that with you, because we're trying to get at rationality here. If we gave you two versions of the same question, you would probably work out that we're trying to frame it. So we had... no, actually, you got both questions; they're in different contexts. That's what I meant. You guys are very rational. It's good. So, since we can't do it the between-subjects way that I got wrong a moment ago, what we are looking for is consistency. What is intelligent for you to do in one of these situations can be different for every one of you.
But what is rational is to act the same way in both, whatever that choice is for you. So we were looking for consistency rather than a correct answer. Who chose both 3A and 4A? Raise your hands. And who chose both 3B and 4B? Okay. What I also saw there was that most of you chose the sure thing: you'd rather lose something or gain something for sure than deal with a probability and maybe get no money at all, or less money. That's the typical response in other psychological tests as well.

Right, so... yeah, there are effects related to the size of the gamble as well. But we've got a lot of material to get through, so if we could keep the comments brief. I know you've got lots of thoughts going, but we've only got so much time. We want you to leave still asking questions; we want you to be talking about these things all weekend.

Okay, so, moving on: availability and risk, the availability heuristic. On top of most of us not judging risk accurately ever, many people, probably not most of you, sorry, are more afraid of flying in a plane than driving a car, when statistically, driving a car is the most dangerous thing you do in any given day. We fail to realize just how risky things are for us. So to get at that, and at the bias this shows, I want to ask you, as I've asked skeptics before: how many people think more Americans die from asthma attacks? And tornadoes? Nobody's going to raise their hands now. But exactly, you can see the difference here. Was it this year or the end of last year, the terrible Oklahoma tornadoes? This year? Yeah. If you have these kinds of devastating, always-in-the-news events, it plays into what's called the availability heuristic: the easier something is to recall from memory, the more often you judge it to happen.
So if it's easier to recall, it's judged to happen more often. If you're always watching the news and you just saw those terrible plane crashes happen in quick succession, that might bolster your belief that the plane is riskier than driving, even though it isn't.

So, who thinks more people die from suicide each year? And homicide? Okay, it was a bit more for suicide, and that is true. We tend to hear about murders in the news, in this terrible world that we live in, but statistically, suicide is a lot more likely.

I hope you do get this next one. Who thinks more people die from the seasonal flu? And car accidents? Okay, this might be contentious among some of you medical types, but from the numbers I looked at, the seasonal flu kills a lot more people than car accidents alone. I know the most recent statistics on car accidents are pretty grim, especially in the summertime, 40,000 people a year or something like that. But from what I found, the seasonal flu, influenza in all its types, affects and kills more people. I know I said Americans, but also around the world, and if I got that wrong... did I get that wrong? See, that wasn't so straightforward. So you were right.

This one's a little harder. Who thinks more people die from drowning each year? Who thinks Parkinson's? That's about 50-50. It is Parkinson's disease. We hear in the news about unfortunate kids drowning in pools during the summer, or someone getting drunk on a bridge somewhere and toppling off, and it's always news, but not so much Parkinson's. We don't hear about the people who are fighting and dying from Parkinson's disease. That is the availability heuristic: if it's not readily accessible in memory, we judge it less likely to happen, or to happen to someone we know. Many of you probably know someone who has been diagnosed with cancer.
If you know someone like that, cancer automatically seems more risky than if you don't, and that's because it's available in your mind and in memory. You can think about it like this: your mind is like a cornfield. It's always easier to navigate through a tall field of corn if the path is already well-worn. If you're already using those connections in your brain, "airplane crashes happen all the time" is much easier to navigate than something you have no information on.

So, moving on again: base rates. I know this one is hard, and we will go through it. Here, there actually are right and wrong answers. So intelligence definitely plays into this, but so does rationality, because we have to know what kind of information is important or not, what to consider or not, and what to do with the information. Don't worry about it.

Question nine illustrates base rate neglect, where one makes a decision based on specific information rather than the general information. For example, if you hear in the news that some activity increases your risk of a certain type of cancer by 200%, but your baseline risk of getting that cancer is only 1%, the difference is only 2 percentage points. So even though the specific information, a 200% increase, sounds scary, the base rate that many people ignore is what's important here. People who have only generic information act on that generic information, the base rate, which is the rational thing to do. But when they have both the specific information and the general information, like that 200% figure alongside the 1%, they use the specific information exclusively, forgetting about the base rate entirely.

So, this bias, this heuristic, probably skewed your answers to this question pretty high. Why don't I ask: who had it at 70% and above? Raise your hands. 60% and above? 50? 40? 30? 20? Well. Educated crowd here.
Either you did that or you're terrified to answer. Oh, you didn't take the time to do the math, okay. Well, if you didn't work out the probabilities, take your best guess now. 20% or above? 10% or above? 5% or above? 1% and above? Okay, so most of you are between 0 and 5. That's really interesting. No, it's good. The correct answer is 7.5%. The important thing isn't to get it exactly right; it's not to fall prey to this bias, and almost none of you did. I'm impressed. Why that's amazing: when this question was first asked of doctors, 8% of them got it right. People who should know how to deal with a false positive or a false negative. 8%.

But there's a different way to think about this problem. When you phrase it in a more natural way, counting instead of dealing with fractions and percentages, it becomes a lot easier to handle. So let's deal with frequencies instead of percentages: the number of times an event occurs out of a whole mess of events. When we have our percentages, let's transform them into actual women we're dealing with; we're actually thinking about who's affected here. Let's start with 1,000 women to make the math a little easier, and we'll follow along. Just based on the percentages in the question, 10 of the women, 1%, will actually have breast cancer. That means 990 will not have breast cancer. Now the testing: out of those 10 women who actually have breast cancer, 8 of them, or 80%, will test positive. Out of the 990 women who don't have cancer, you still have the false positive rate: 99 of them, or 10%, will also test positive. When you have it split up like this, it's a lot easier to see: how many women who turn up positive for breast cancer actually have it?
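Counting it out explicitly makes the arithmetic checkable. Here is a minimal sketch in Python, using only the numbers from the question (1% base rate, 80% true-positive rate, 10% false-positive rate):

```python
# Frequency version of the mammography question:
# out of 1,000 women, count who tests positive and who actually has cancer.
population = 1000
have_cancer = int(population * 0.01)      # 1% base rate -> 10 women
no_cancer = population - have_cancer      # 990 women

true_positives = int(have_cancer * 0.80)  # 80% of the 10 test positive -> 8
false_positives = int(no_cancer * 0.10)   # 10% of the 990 also test positive -> 99

# Of everyone who tests positive, the fraction who actually has cancer:
answer = true_positives / (true_positives + false_positives)
print(f"{answer:.1%}")  # → 7.5%
```

The same number falls out of Bayes' theorem, but the frequency count makes it obvious why the 10% false-positive rate applied to the large cancer-free group swamps the handful of true positives.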
Well, we take the number of women who test positive and actually have cancer, divided by the total number of women who test positive. So, 8 divided by (8 plus 99) is about 7.5%. You don't need to know Bayes' theorem off the top of your head to get this right. Bayes' theorem would give you the correct answer, with all the probabilities of this event given that event, but when you think about it in this more natural way, it's a lot easier to understand. I have to keep moving in the hope that, if you do have questions, we might be able to get to them. Oh, no, my Skype is popping up. Dan Welch is online. Okay.

So, a lot of you have probably seen questions 10 and 11 before; they're basically versions of the same question, the four-card problem. What is the minimum number of cards, and which ones, must you turn over to see if the rule has been broken or not? I don't quite know how to poll for every possible permutation of these, so I'll just ask: how many of you said just E? Okay. How many said E and 4? Okay. How many said E and 7? Wow, a lot. Anybody say E and K? 4 and K?

I'm actually surprised at how many people said E and 7. She's surprised because when she gives tests like these, almost nobody gets the right answer. Why is that? Why is it so hard to figure out which cards you need to turn over? Part of it has to do with confirmation bias, which I'm sure all of you are very familiar with. It's always easier to find confirming evidence for your beliefs than disconfirming evidence, and in some cases in psychology studies, disconfirming evidence is completely ignored. It's a lot easier to look for information that would confirm that the rule is still valid than for information that would discredit it. And falsification is obviously at the heart of good science, right, trying to falsify a hypothesis. Most people get this wrong, but you did surprisingly well.
Another theory on why this is so hard to get right is that we don't really think in terms of this kind of abstract information, which brings up the next one. There's not actually a slide for it. The next question dealt with what you need to check to test whether someone is illegally drinking in a bar. The question you were asked had to do with finding out whether somebody was cheating, whether they were drinking underage, right? So there were four options, and they correspond to the same things the four cards correspond to: you can check the age of someone who is drinking alcohol, check the age of someone who is not drinking alcohol, check what someone over the legal age is drinking, or check what someone under the legal age is drinking. Exactly the same structure as the card problem. And yet this version is actually much easier for people to do. How many people felt this was a lot easier to grasp when it was put in these terms? Even though they're basically the same question.

Why is this? Well, it's sometimes a contentious topic, but evolutionary psychology, in some of its fundamental hypotheses, and don't worry, I won't get very specific, says this is a social situation that we are adapted to deal with. We are dealing with a group of people, and we need to evaluate what they are doing. That is easier for us to grasp than "how can I falsify whether or not these cards have a certain symbol on them?" And so, as with the breast cancer question, where counting women was easier than dealing with percentages, it's easier to get to the rational response if the problem is framed a little differently. So, gotta keep going, gotta keep going.
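The card version of the task can even be settled mechanically. This is a sketch (my own encoding, not from the talk) of the usual rule, "if a card has a vowel on one side, it has an even number on the other," applied to the four visible faces E, K, 4, and 7:

```python
# A visible face needs to be flipped only if its hidden side
# could violate the rule "vowel on one side -> even number on the other".

def is_vowel(face: str) -> bool:
    return face in "AEIOU"

def is_odd_number(face: str) -> bool:
    return face.isdigit() and int(face) % 2 == 1

def must_flip(face: str) -> bool:
    # A vowel might hide an odd number; an odd number might hide a vowel.
    # A consonant or an even number can't break the rule, whatever is hidden.
    return is_vowel(face) or is_odd_number(face)

print([face for face in ["E", "K", "4", "7"] if must_flip(face)])  # → ['E', '7']
```

The 4 is the trap: finding a vowel behind it would only confirm the rule, never falsify it, so flipping it tells you nothing about whether the rule has been broken.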
So, numbers 12 and 13, and this is number 12, get at the representativeness heuristic and the conjunction fallacy. Before I get to the question: the conjunction fallacy is a formal fallacy in which it is assumed that specific conditions are more probable than a single general one. So how many people chose answers for 12 or 13 that involved more than just one description? You can raise your hands. Come on. For this question, who chose B, D, or E? For 12.

Fundamentally, probabilistically, two events occurring at the same time is less likely than one of those events occurring by itself. If two independent events each have a 50% chance of happening, the chance of both happening together is only 25%. So any single one of those attributes, C or A here, even though you don't know very much about Sheila, is more likely than additional conditions built on it. She may well drive a minivan, but "drives a minivan and also takes her kids to school" is less likely than just "drives a minivan." So "Sheila is a blogger and is active in the anti-GMO movement" is two conditions, which is fundamentally less likely than just one, C, B, or A. Sorry about saying B earlier; no, just A, because A is included in D, so D is just less likely than A. Man, you guys.

The same is true for 13. If you chose an answer that had more than one condition in it, that answer is less likely than one with just a single condition. And why do we tend to answer like this? It's not just the conjunction fallacy, where more information about Sheila sounds more likely: oh, she's active in the anti-GMO movement and she's a blogger, that sounds very specific, so that must be correct. We have the representativeness heuristic as well. When we have an idea of what someone described like Sheila might look like, we tend to answer to that picture.
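The probability rule behind the conjunction fallacy is easy to demonstrate numerically. A quick sketch; the 50% figures here are illustrative, not from the quiz:

```python
# A conjunction is never more probable than either of its conjuncts.
p_blogger = 0.50    # illustrative: P(Sheila is a blogger)
p_anti_gmo = 0.50   # illustrative: P(Sheila is active in the anti-GMO movement)

# If the two traits were independent, the conjunction is the product:
p_both = p_blogger * p_anti_gmo
print(p_both)  # → 0.25

# Even with dependence, the conjunction is bounded by the smaller conjunct:
assert p_both <= min(p_blogger, p_anti_gmo)
```

That bound is the whole point: no matter how representative "blogger and anti-GMO activist" sounds, it cannot be more probable than "blogger" alone.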
For number 13, it may sound like a description of somebody you know, and you might answer that way: well, I know someone who is an actress and yada, yada, yada. We have this representative in our head that we tend to answer to, instead of just evaluating what is more likely.

So, the last one, 14, is a little hard to get through, I will admit. You've read through the problem, so I'm not going to read it again. How many of you would choose both box A and box B? You can raise your hands. How many would choose just box B? And most of you didn't answer the question? Not everybody had time; that's fine. I'll leave this up here so you can read it.

There seems to be an element of risk here: however you previously answered, the computer is taking that into account and predicting what you will do. It seems uncertain, and it draws you away from just rationally thinking this out, with a lot of, I'll say it, irrelevant information. And I'll show you why. The key to understanding this is that it doesn't matter: it's already happened. The computer has already predicted whatever is going to happen. You don't know how it makes its decision, so given that you have no real information about what it's predicting, and it couldn't care less about you, why would you not choose both boxes and get whatever is in both? If it has already made its decision about what you will do, it doesn't matter; you should always take both boxes. Whatever is in box B, you're going to get it by choosing box A and box B, right? Do you see what I'm saying? If it predicted you'd choose box B only and loaded the boxes accordingly, then by choosing both boxes you get the contents of both. So, let me put this another way.
So, the classic version of this problem, which we didn't use because it's a little esoteric, deals with a superintelligence named Omega. It does the exact same thing: it predicts which box you will pick, and then loads the boxes, $1,000 in box A, and either nothing or $1,000,000 in box B depending on what it predicted you would choose. The rational reasoning is the same here. Omega has already left, just as the computer has already predicted: box B is either already full or already empty. If it is already empty, then taking both boxes nets you $1,000, and taking just box B gets you nothing. Conversely, if box B is already full, then taking both boxes gets you $1,001,000, whereas choosing just box B gets you $1,000,000. You see, either way you are up a grand if you choose both boxes. And similarly, in the question you answered, you are always better off choosing both boxes. You don't have to take into account all the irrelevant information that makes it seem like a risk. And this is hard because it gets at rationality, and not necessarily intelligence. So with that, I will turn it over, to get at, when you separate all this out, which questions account for intelligence and which account for something a little different. Tag team.

Okay. If you're a little scrambled, I'm going to try to unscramble you and tell you what these things mean, because they don't really seem like they're testing rationality, right? The very first question asked which answer is rational, and the truth of the matter is that in most of these questions, there isn't a rational or an irrational answer. That's not really what we're testing. It takes a lot of finagling to get at the differences between a rational and a non-rational answer, and an intelligent and an unintelligent answer. I don't even know if there is an unintelligent answer. So this is kind of a complex field.
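Kyle's dominance argument for the Newcomb-style question can be tabulated in a few lines. A sketch using the Omega payoffs he quoted ($1,000 always in box A; box B already holds $1,000,000 or nothing):

```python
# Two-boxing dominates one-boxing under this reasoning: whatever box B
# already holds, taking both boxes is exactly $1,000 better than box B alone.
BOX_A = 1_000

for box_b in (0, 1_000_000):          # the two possible, already-fixed states
    take_both = BOX_A + box_b
    take_b_only = box_b
    print(box_b, take_both, take_b_only)
    assert take_both - take_b_only == BOX_A  # always up a grand
```

Whether two-boxing really is the rational choice is a long-running philosophical controversy; the table only spells out the dominance reasoning described in the talk.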
So I'm going to go through them again a little and try to tease out why we're even talking about these bizarre questions, most of which you never have to encounter in the real world, especially since we started by talking about meeting your goals, and you live in the real world. So, you think you're smart, or maybe you don't after that. Actually, after seeing your answers, I'm really impressed with this audience.

Okay, I want to play a little game first: smart or not smart? Jimmy Carter, smart or not smart? How many think he's smart? Okay, you're right, he's smart. Lindsay Lohan, smart or not smart? How many people think she's smart? How many think she's not smart? You would be right: not smart. Arnold Schwarzenegger, the governor. Smart? Not smart? He's really smart. Jodie Foster, how many think she's smart? You're right. Andy Warhol, smart or not smart? How many think he's smart? Raise your hands high. How many think he's not smart? You would be right: not smart. I'm talking about IQ. Just IQ. George Bush, smart or not smart? How many people think he's smart? How many think he's not smart? You would be wrong; he's actually very smart. Mitt Romney, smart or not smart? How many people think he's smart? Just IQ. Just IQ, yes, this is what we're talking about. Mitt Romney is actually very smart. And yet, he tied a dog to the top of a car.

Smart people do stupid things all the time. You know it. You've done it. Here are some smart people who have done stupid things. Why do smart people do stupid things? Because the decisions we have to make on a daily basis to meet our goals involve more than just intelligence. Some of the questions you were given, like the base rate one, require a lot of intelligence to answer, and some knowledge, some education. There are a lot of reasons why we don't make good choices all the time. One of them is that we don't have enough information.
That's one of the reasons why we have workshops like Science-Based Medicine here: to sort through the good information and bad information for people who don't have the expertise yet, or don't plan to acquire it, to sort through it themselves. So we have to start with good information. We have to be educated. We need intelligence sometimes to understand what that information is. Those are factors in rationality, but then there are other factors, and even in smart, well-educated people, those factors don't necessarily turn out in favor of rationality. And that's how we get smart people doing stupid things. So let's go over a few more of the questions, but in a different context. Let's think of them a little differently. First of all, what does it mean to be smart? Well, we talked about at the beginning what rationality is. We're talking about consistent belief structures and behaviors that optimize your goals. These are good decisions that get you what you really want. What is intelligence? Anybody want to shoot for a definition of intelligence? No? Yes, very good. Thank you very much. Performance on an IQ test: what is measured by an IQ test. And that's actually a pretty good definition of what intelligence is. Scientists and psychologists have argued about this for years. We've looked for different types of intelligence and different things. We've tried to figure out exactly what intelligence tests test. And they test intelligence. They do. Some people have argued that there is no such thing as one measure, one thing that can capture intelligence, and part of that has to do with the definition. So let's talk a little bit about how this ends up being very complex. How we test tells you a little bit about what it means. We test in different situations. We have things called optimal situations. We have things called typical situations. In optimal performance situations, what we're talking about is your ability to solve problems.
The knowledge that you have, your approach to problems. In typical performance situations, we're talking about something a little different. In optimal situations, we want to know what you can do. That's what we're trying to tap into. In typical performance situations, anyone want to guess? Yes, what you will do. Which one of these things do you think IQ tests measure? Exactly. Cognitive abilities. They test cognitive abilities. They test a specific subset of cognitive abilities. Unfortunately, that's not all that's involved in rationality. That's not all that's involved in making good choices. There's another thing called thinking dispositions. Thinking dispositions are your approach to answering questions and things like that. They are kind of part of your personality. It's literally how you look at problems. There are several different factors involved that fall under this category. IQ tests are optimal performance situations. Rationality is best assessed under typical performance conditions, because what we want to know is what will you do. That doesn't leave IQ out of this. It means that IQ cannot capture the whole picture. In fact, it captures a lot less of it than you might think. A lot of what I'm talking about today is kind of born out of 30 or 40 years of cognitive psychology literature. A lot of this work was actually done by Keith Stanovich. I'm going to have some books up at the end for suggested reading for the future. He's kind of consolidated all this work into a model. This is really only part of the model, but I'm going to try to simplify it for you a little bit. Most psychologists recognize that we have kind of two different systems, type one and type two. Type one is kind of all the automatic processes that we do, the things that we do quickly. We recognize objects. We recognize faces. We can solve simple problems, things like that. We do those things without thinking. We can do multiple things at a time.
They don't require a lot of resources. Type two thinking usually requires our attention. We can usually only do one such thing at a time. If you're driving in the rain and you're looking for an address, you turn off the radio. Sometimes things that are normally type one can turn into type two when we have too much to sort through. But the point is that type two processes and type two questions require effort. They require some thinking and some effort, some attention and cognitive resources. He's broken type two processes into two different areas of individual differences. We find individual differences in the autonomous mind, which is those type one processes, but we don't find very many. There are very few differences among people in their ability to recognize faces and things like that. And most of those differences that we do find are due to things like brain damage and so forth. Although there is some variation among people. In type two processes, we can actually divide things into two major clusters. One of them he calls the algorithmic mind. And those are the things that IQ tests tap into. The algorithmic mind is the part of type two processes that allows you to do complex computations and think through problems and come to difficult answers. And then there's the reflective mind. And the reflective mind shows a lot of individual differences in what we call thinking dispositions. So the Wason four-card problem that we were talking about, one of the ones that you got on your sheet, is very similar to this one. If a person is drinking beer, then they must be over 19 years old. You have four cards that you have to turn over. Each one has what they're drinking on one side and their age on the other. And you can see four of these sitting on the table, and you've got to make sure that the rule is being followed. So which cards do you turn over? How many say the beer and the soda? Nobody. Good.
How many say the 24- and 26-year-olds? Okay. Who wants to venture the right answer? The beer and the 16-year-old. That's a relatively simple problem. And yet the abstract version of the Wason task, with the E and the K and the 4 and the 7, has the exact same structure. It's the exact same structure, and yet it's more difficult. It's more difficult because it doesn't provide us with an easy... that's sort of what Kyle was talking about. It doesn't provide us with a schema, an idea of how things work. It doesn't tap into something that we're very, very familiar with, where we can kind of follow that path. Now, people usually get this one right. Does that make them rational, because they get the rational answer? Not necessarily. There are many ways to come to that answer, and not all of them are to think it through. Okay. That's the point. So just because you get the right answer doesn't mean you're rational, and just because you get the wrong one doesn't mean you're not. Okay. Here's another problem that's kind of similar to the abstract version. Imagine that you've been employed by a manufacturer to ensure that their products are packaged properly. You're told that boxes with a circle on the side must be yellow. The log tracking this information is missing four entries as shown below. Which products must be retrieved from the inventory to ensure that the rule has been followed? Okay. This is a real-world situation, and yet it's one that most of you are probably not familiar with, unless you work in a manufacturing plant that has boxes with circles and colors on them. Yeah. So how do you figure out the answer? Well, if you take the time to think it through, you can think about this in the right way. This is actually just a conditional syllogism. The major premise is the rule. The rule is that boxes with a circle on the side must be yellow. And you have four options for a minor premise.
Two of them lead to a valid conclusion, one that will tell you the information that you're looking for, and two do not. If you pull the box with the circle and it is yellow, then the rule was followed. If you pull the box with the circle and it's not yellow, then the rule was broken. So you learn a lot of information from that first one. This corresponds to the E card in the abstract version. If you pull the box that's yellow and it has a circle, the rule was followed. If it doesn't have a circle, so what? Have you really learned anything? Have you learned that the rule is always followed? This is the most common answer, though. This corresponds to the 4 card. It's the most common answer, and yet it really doesn't tell us anything, because the rule doesn't say that if it's yellow, it must have a circle. It can be yellow and not have a circle. If you pull the box with the square and it's yellow, so what? If it's not, so what? Very few people actually choose the K card. And then finally, if you pull the red box and it has a circle, the rule has been broken. If it does not have the circle, so what? But the point is you need to know if the rule has been broken. That's the really important information. The important information isn't whether the rule has been followed; the important information is: has the rule been broken? We can do this a lot better when we have a schema like the drinking schema, because we're more interested in the kid who's not supposed to drink than we are in the adult who doesn't drink. That's the important information. We recognize that when we see it in something that's everyday and familiar. So the Wason four-card task is correlated with cognitive abilities. If you give the Wason four-card task to people, you will find that their performance varies with their IQ, okay? Especially when they're in situations where they know they're going to be tested and they have the time to answer the question, okay?
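That whole case analysis can be condensed into a short check. This is just a sketch, assuming the four visible faces from the packaging version (circle, square, yellow, red); an entry needs checking only if its hidden side could reveal a violation of "if circle, then yellow":

```python
# Which entries can falsify "if a box has a circle, it must be yellow"?
SHAPES = ("circle", "square")

def must_check(visible: str) -> bool:
    if visible in SHAPES:
        return visible == "circle"   # a circle's hidden color could be non-yellow
    return visible != "yellow"       # a non-yellow box's hidden side could be a circle

entries = ["circle", "square", "yellow", "red"]
print([e for e in entries if must_check(e)])  # ['circle', 'red']
```

The same function covers the drinking version and the abstract E/K/4/7 version: only the "circle" (beer, E) and "red" (16-year-old, 7) entries can break the rule.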
You guys did not have the time, and we did that on purpose, to kind of push you away from the normative answers, because I expect this audience to do better on these questions. So sorry about that. We're not trying to make you feel stupid. Okay. Conjunction errors. These are the questions that are related to the representativeness heuristic. The representativeness heuristic that Kyle talked about is so strong that it leads us to ignore the probabilities even if we're trained. This is the interesting part of it. This was one of the questions, the one that Kyle did not talk about. The typical question in the literature is something called the Linda problem, which is very similar to the first question that Kyle went over. It paints a picture of a woman, and it suggests something about her, and it kind of forces you to give an answer that you may not normally want to give, especially if you know that it's not really correct, okay? Because it's the only thing that fits. It's the only thing that fits the description of her. The context really matters. Well, there are a lot of people who argue that people simply don't know the conjunction rule, and that's why they answer this way. So we've done studies where we've actually told people the conjunction rule. Guess what? They still break it, okay? I've actually done studies like this with students that I have trained on probability, and I know for a fact that they understand the conjunction rule very well. But the problem is, I didn't give them an answer that was correct. There is no correct answer; there are just wrong answers, and the wrong answers are the conjunction answers. They want so badly to add that piece of information, okay? In this case, what I was interested in is: how many of you chose the stay-at-home mom or kids play soccer? No? Well, yeah, you guys didn't violate the conjunction rule a lot.
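The conjunction rule itself is one line of arithmetic: a conjunction can never be more probable than either of its parts. The percentages below are made-up illustration numbers, not anything from the talk:

```python
# P(A and B) = P(A) * P(B given A), which can never exceed P(A) or P(B given A).
p_minivan = 0.40                  # assumed: 40% of people own a minivan
p_soccer_given_minivan = 0.70     # assumed: 70% of those have soccer-playing kids
p_both = p_minivan * p_soccer_given_minivan
print(round(p_both, 2))           # 0.28 -- smaller than either part alone
```

However representative "owns a minivan AND has kids in soccer" feels, the combined probability has to be less than or equal to each component.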
The kids play soccer and stay-at-home mom are both options, and they're both representative. And yet when I did this with undergraduates, a lot of them chose one. The majority of them chose one. And my theory is that we're information gatherers. You know, when something big happens, when there's a disaster, when there's an earthquake, when there are tornadoes, what do you do? You turn on the television, you want the news. We'll watch the same thing; they don't update it for hours, they just keep going over the same thing over and over and over again until you're sick of it, right? You're looking for that one new piece of information. Every bit of information helps. So when we think we know more information, more information is good. Yes. The question was how do we know that any of them are related? And I'm going to answer this real quick, but please try to hold back on questions. You don't. The answer is: you don't. The question is which answer is the most probable? And you do know a bit of information about people. Yes, it's stereotyping to assume that that person has that characteristic, but we didn't ask you which characteristics they had for sure. We asked you what's the probability. And representative information really strongly pulls us to other representative information. You know, when things go together, when we know they're correlated. We know that more people who own minivans have kids that play soccer than people who own pickup trucks do. That's probably true. And so we use whatever information we're given, and that's the only information you were given; you had to make a choice. Okay, conjunction errors are actually correlated with thinking dispositions, okay? So you can get people to not make conjunction errors, but you can't get them to fight that urge to give the representative answer. Here's a good example. There's a classic problem. Tom W.
is of high intelligence, although lacking in true creativity. He has a need for order and clarity, and for neat and tidy systems in which every detail finds its appropriate place. Does this sound like anybody you know? His writing is rather dull and mechanical, occasionally enlivened by somewhat corny puns and flashes of imagination of the sci-fi type. He has a strong drive for competence. He seems to have little feel and little sympathy for other people, and does not enjoy interacting with others. Self-centered, he nonetheless has a deep moral sense. Okay, you're given this description, and then you're asked to rank in order of probability which major in college he's chosen. And this is your list, okay? What do you think the most probable major is? Anybody? We got computer science. How about engineering next, maybe? This is pretty easy, right? Okay, now imagine, rather than being given Tom's description, I just gave you a list of college majors and asked you to rank them in terms of probability, or in terms of popularity. So for the probability that any given person would choose that major, you would use the information that you have, which is the popularity of the major, and what would be first? Social science is actually usually most popular. It depends on the college; all colleges are different, but where I taught, two subfields of social science were the top two majors in the school. These are very popular majors, and yet the description doesn't suggest that. And so what did we just do? Anybody know? We ignored the base rates. We ignored the base rate. It's just like the problem that Kyle went over about the breast cancer screening. You have to consider how many people there are in those majors, and work that information into the probability that he chose that major, not just the characteristics of the person. Base rate neglect is correlated with cognitive abilities. It's probably also correlated with education on probabilities.
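Working base rates in properly is just Bayes' rule. Here's a sketch of the screening-style computation; the prevalence, sensitivity, and false-positive rate are assumed illustration values, not the figures from Kyle's question:

```python
def posterior(prevalence: float, sensitivity: float, false_pos: float) -> float:
    """P(condition | positive test), by Bayes' rule."""
    true_positives = prevalence * sensitivity
    all_positives = true_positives + (1 - prevalence) * false_pos
    return true_positives / all_positives

# 1% base rate, 80% sensitivity, 9.6% false-positive rate (assumed numbers):
print(round(posterior(0.01, 0.80, 0.096), 3))  # 0.078 -- low, because the base rate is low
```

Even with a fairly accurate test, a positive result mostly reflects the huge pool of healthy people; the same logic applies to Tom W. and the popular majors.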
Framing. This is actually the classic study that you may have seen before; this is Kahneman and Tversky's disease study. Some people are given this question: an unusual disease is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. If program A is adopted, 200 people will be saved. If program B is adopted, there's a one-third chance that 600 people will be saved and a two-thirds chance that nobody will be saved. What would you do? There's no right answer here. Right? Is there a right answer? Okay. What if instead of those two options, you were given these two options and this description: an unusual disease is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. If program C is adopted, 400 people will die. I don't think people actually hear that "die" in their head when they're reading it, but you know, I was imagining it. If program D is adopted, there is a one-third chance that nobody will die and a two-thirds chance that all 600 people will die. What do you do? The question isn't which answer is correct, because there is no correct answer. The issue is that people behave differently based on which question they got, and all of these answers have the exact same expected outcome. The question is, is it certain or is it only probable? When the options are framed as gains, we prefer the certain gain to the uncertain gain. When they're framed as losses, we use risk-seeking strategies: we prefer the uncertain loss to the certain loss. Okay. How it's framed dictates our answer, and that isn't the way it should be, because it really should be what we want that dictates the answer. That's the problem, and that's why rationality is so much more important than intelligence, because rationality allows us to get what we want and not what other people decide we want. Okay. And you see framing everywhere: advertising, politics, economics, things framed as...
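All four programs really do have the same expected outcome, which a few lines of arithmetic confirm (using exact fractions to avoid rounding):

```python
from fractions import Fraction as F

def expected_saved(outcomes):
    """outcomes: list of (probability, people_saved) pairs."""
    return sum(p * saved for p, saved in outcomes)

programs = {
    "A": [(F(1), 200)],                   # 200 saved for certain
    "B": [(F(1, 3), 600), (F(2, 3), 0)],  # 1/3 chance all 600 saved
    "C": [(F(1), 600 - 400)],             # 400 die for certain = 200 saved
    "D": [(F(1, 3), 600), (F(2, 3), 0)],  # 1/3 chance nobody dies
}
print({name: expected_saved(o) for name, o in programs.items()})  # all four equal 200
```

Since every program works out to 200 expected survivors, any preference between them is coming from the frame (or from your genuine attitude toward risk), not from the outcomes themselves.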
I'm drawing a blank, so I'm just going to move on. When we see the frame, though, we act toward our goals. These designs are different. Framing effects are correlated with cognitive abilities when people are given both versions... When people are given A and B and then C and D, and they can actually see that they're being manipulated, then they think about what they want and the differences go away. We call that within subjects, when people see all of the different conditions. We call it between subjects when we divide people into groups, give each group only one condition, and compare the groups. The first two questions that you got, we did in a between-subjects manner. So when it's between subjects, our thinking dispositions take over, and we actually have to tell people to think. Okay. This is another interesting one. Imagine that you need money to cover the costs of your son's wedding and you have to sell some stock. You have two choices. Blueberry Tiles was purchased at $3,000 and is now worth $4,950. Tiffany Motors was purchased at $7,050 and is now worth $5,000. If you have to sell one, which one will you sell? Which one do you want to sell? How many want to sell Blueberry Tiles? How many want to sell Tiffany Motors? Wow, you guys are bizarre. Okay. I actually manipulated this question a little bit. The original question had both stocks worth the same. I manipulated it a little bit to kind of trick you into there being a correct answer, or a rational answer, okay? There is no rational answer when they're both worth $5,000. It doesn't really matter, and I think the difference I added isn't enough. Most people, when the worth is the same, will choose... actually, wait a minute. Yeah, they'll choose Blueberry Tiles. Why do you think most people choose Blueberry Tiles? At least they've made some money, okay? The idea that Tiffany Motors might turn around is a really strong one. Here's the problem. These things are worth almost exactly the same.
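The stock decision reduces to this: the purchase price is a sunk cost, so only the current value (plus any genuine expectation about the future) belongs in the comparison. A sketch, with illustrative dollar amounts in the spirit of the example:

```python
def sale_proceeds(current_value: float, purchase_price: float) -> float:
    # purchase_price is deliberately ignored: it's a sunk cost
    return current_value

blueberry = sale_proceeds(current_value=4_950, purchase_price=3_000)
tiffany = sale_proceeds(current_value=5_000, purchase_price=7_050)
print(blueberry, tiffany)  # 4950 5000 -- the money already lost changes nothing
```

Whatever you paid, selling either one today gets you roughly the same cash; the purchase prices only matter psychologically.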
What we're talking about is being invested in past behavior when it doesn't matter anymore. It's gone. That $2,000 is gone, okay? It doesn't matter when they're both worth the same. So we've got a sunk cost here. We don't like having a loser on our record, because as soon as you sell the Tiffany Motors, you have lost money. Ah, the tax write-off, yeah. Sometimes we don't work everything into these problems. But if you sell the Blueberry Tiles, guess what? You have a winner, and that's on your record. You've got a record of winning. So sunk cost is correlated with thinking dispositions. These are those things in the reflective mind that help us to make rational choices. I'm not going to go into more of the myside bias and the Newcomb's problem in that last one. I'll let you toy with that. That one's a really big one. But those things are also correlated with thinking dispositions and not with cognitive abilities, after you've accounted for thinking dispositions. So, the sources of individual differences that Stanovich has identified, I've kind of put them into four different categories, okay, when you sort through all of the mess. You've got cognitive abilities, which is IQ. Education deficit: how much do you know? And the other two are the big important ones, because these are all about thinking dispositions. Overconfidence, which I like to call arrogance. And closed-mindedness and inflexibility, which are kind of married together. And these two things feed off of each other. If you are overconfident, you don't know that you don't know. And if you are closed-minded, you will never know that you don't know. Okay, so here are the thinking dispositions that are involved with this. There are actually quite a few. I've kind of lumped them together into composites that I think probably capture most of the variability. Again, I'll tell you a book you can read to get more of the details. Need for cognition.
Really, how intellectually curious are you, and how much work are you willing to do? You noticed that I mentioned some tasks where people, once they were told what to do or were shown that they were being manipulated, kind of turned on the switch and actually did the thinking. That thinking disposition is important. How much do you want to think about this? And that actually varies with the task. Sometimes it's just not important, you know? But how much do you need to know? How much do you want to know if you're correct? How many engineers do we have in here? Computer scientists, okay? The Monty Hall problem? How many of you have actually created simulations for the Monty Hall problem? Yeah, that's almost as many as the engineers and computer scientists we had. So we need to see it for ourselves. Okay, that's a good disposition to have. When we don't want to think a lot, when we don't want to put in the energy and we're not intellectually curious, we don't make rational decisions. And most people describe George Bush as somebody who is not intellectually curious. So he has the ability to make these good decisions and to think about things rationally, but even his supporters agree that he's not intellectually curious. Adherence to beliefs and open-mindedness: how open are you to the possibility of being wrong? That's the big one, in my opinion. And you'll see this a lot, gosh, even in our community. You get people that are just so certain about their beliefs that they'll defend them no matter what. And in my book, they're no different from the people that they say are wrong, okay? We could all be wrong. God might come down right now and tell us we're all wrong. I don't think so. I think that's a pretty low probability. But if you're open to the possibility, you're much more likely to hear what somebody else is saying. And it might apply to something else, even if you are right, or you might find out that you're wrong.
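Speaking of needing to see it for ourselves: a Monty Hall simulation like the ones just mentioned is only a few lines. Switching wins about two-thirds of the time:

```python
import random

def play(switch: bool, rng: random.Random) -> bool:
    """One round of Monty Hall; returns True if the player wins the car."""
    doors = [0, 1, 2]
    car = rng.choice(doors)
    pick = rng.choice(doors)
    # The host opens a door that is neither the player's pick nor the car.
    opened = rng.choice([d for d in doors if d != pick and d != car])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

rng = random.Random(0)
trials = 100_000
rate = sum(play(switch=True, rng=rng) for _ in range(trials)) / trials
print(rate)  # close to 2/3
```

Running it with `switch=False` instead gives a win rate near 1/3, which is what finally convinces most people.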
The point is that we get nowhere when we're not open-minded. We're just stagnant. We need to be open-minded, open to the possibility that we're wrong, so that we can listen to what other people are saying. And besides that, if you are really listening to what somebody else is saying, it's easier to form your own arguments against them. Myside bias. Nobody even laughed at that, so that makes me a little scared. Myside bias: how much do you favor your favorites? This is a really tough one for human beings. We like who we like, and we want them to be good, and we want them to be right all the time. And sometimes when they're not, or we have to admit something that's not pretty about them, or we don't want to disagree, you know, it's hard, because it sort of feels like we're ending a relationship or popping a bubble or something, and it really doesn't have to be that way. I think that it's horrible when heroes fall from towers, but if we don't build those towers to begin with, if we understand that everybody is human, then it's okay. And that's part of being open-minded: letting our myside bias exist, but maybe not run our lives. It's okay to love somebody and to stand by them, but don't pretend that they're not wrong when they're really wrong. I'd like to add something about need for cognition as well. Can we get closer to the mic? No, sorry. It's not always just needing to get more information. Sometimes, especially when we're talking about risks, maybe you don't want too much information. This happens a lot with... Sorting out what's relevant? Right. If you have a home that's in a floodplain and someone comes around: do you know that your home is likely to flood, and this could happen, and you need insurance? No, I don't want to know about that. You know, whatever happens, I don't want to think about it. It would cause me too much personal stress. That goes into this, too.
And when you're talking about cancer: well, you have X probability of getting this cancer; would you like to know more information about it? No. That's scary. I don't want to deal with that right now. I can't deal with this right now. So it's not always just trying to update yourself. Sometimes it's closing yourself off as well. And that is something that we need to know about when we're deciding whether or not to update. I think that's a good point. And especially with myside bias. Personally, you know, I grew up on Jim Carrey movies, and then when he started the whole anti-vax thing, you're like, man, I can't laugh at anything you do anymore. You know, it makes you feel bad. Like when you meet an actor in real life and he's a dick. You feel so betrayed, and it's the same thing here. We try to avoid that feeling. You don't want to experience this dissonance. Yeah, you just reminded me of something that I did not emphasize on that slide that I think is really important. Probably the biggest thing that people don't like is to think that they've been fooled. That's part of that open-mindedness. It's not just about being open to the possibility that you may be wrong, but being open to the possibility that somebody fooled you. That's even harder for people. That's true. I think that is why a lot of people who get fooled don't report it to the cops: because they're embarrassed. How can you be more rational? And if I can get through this one, maybe we can have five minutes for questions. You can remember that it's okay to be wrong. In fact, people will think more of you, not less of you. That's what the psychological research has said for decades: people actually think more of you, they have a higher regard for you, if you admit that you're wrong than if you don't. Concentrate on your future behavior, not your past behavior. Recognize that you're just as human as everyone else. Notice that this is all about you, not about other people.
Be charitable when interpreting what other people say. Well, that one was about other people. Consider possibilities and alternatives. A lot of this need-for-cognition problem is not being willing to sit down and actually consider all of the different possibilities. Or maybe you want to, but you don't recognize that that's what you need to do. And that's actually a big one in a lot of these tasks. And it brings me to the thing I want to leave you with. And I'm going to tell you right now, I hope that most of you have never heard this one, because every time I give it to an audience, nobody seems to have ever heard of it. And I learned this, I won't even tell you how many years ago. So I'm a little surprised, but I love this problem, and I'm not going to talk you through it. I want you to talk about it amongst yourselves. The problem is called, well, I call it the blind man, but it's about three men. And these three men are in prison. The prison warden likes to play with them. So what he does is, every Tuesday, he erects the gallows, and he plays a little game with them, a little gambling game. They have an opportunity to go free. They don't have to play, so he gives them a choice, but they do get an opportunity to go free. On this particular day, there's a risk involved, though, as you'll find out. On this particular day, he erects the gallows, and then he gets these three guys set up. Now, one of the men is blind, one of the men has one eye, and one of the men can see perfectly fine. He walks up behind them with a box, and he says, I have five hats in my box. Three of them are white, two of them are red. I'm going to put a hat on each of your heads. Now, they can't see their own hat, because it's on their head, but they can see what the other guys are wearing... well, except the blind man. So he puts a hat on each of their heads, and he says, okay, now I'm going to ask you one at a time: can you tell me what color your hat is?
If you can tell me correctly what color your hat is, you get to go free, and they want that, because all three of these guys have been in here a long time. If you get it wrong: gallows. They're dead. So that's a pretty big risk. They've been in there a long time; they may want to die, I don't know, but let's assume they don't, that they want to live. But they do have the option of saying, I don't want to play, in which case they just go back to their cell. So he goes up to the first guy, who can see just fine, and he says, okay, what color hat are you wearing? And the guy says, no, I don't want to play. And he goes back to his cell and lives to see another day. He goes to the second guy, who only has one eye, and he says, okay, what color hat are you wearing? And the second guy says, I don't want to play. So he goes back to his cell and lives to see another day. And then the blind man tells him what color hat he's wearing and goes home. So your task is not only to identify the color of the hat, but you have to be absolutely certain. You've got to be able to explain why he knows. Okay, because he wouldn't say if he didn't know, because he wouldn't want to die. And nope, nope, nope, not going to take answers now. I want you guys to actually think about it, and I've given you all the information you need to solve the problem. Sorry, you want me to repeat the hats? There were three white hats and two red hats in the box. Any other information you need repeated? That was it? The hats the first two are wearing don't matter. And they're not guessing; the blind man can actually tell what hat he's wearing. Okay, we've got two minutes, and I know you guys have a ton of questions. We'll be around all weekend. You can come and talk to us. But I think we can probably take one question over there. Just the former slide. Why don't you get to the books? Oh, yeah, I did want to quickly show you that slide. These are books that I recommend. They're books by well-published cognitive psychologists.
Most of them kind of summarize a body of literature. Actually, one of the authors is here at TAM. And a lot of what I was talking about today in this workshop comes from What Intelligence Tests Miss by Keith Stanovich. I highly recommend all of these books. Carol Tavris is actually a popular TAM speaker. So these are fantastic books. If you're interested in more about this, especially about the difference between intelligence and rationality, these are the books I recommend. And I'll back up to that slide if you want to hang around a little bit. And we're actually out of time, but if you want to come up and talk, we can hang around for a few minutes. Okay.