If you ever find yourself trying to figure someone out, try mentioning boxing. If they start talking about Christmas, they're British. If they start talking about Muhammad Ali, they're a sports fan. If they start ranting angrily about one-boxing and two-boxing, they're probably a nerd.

In a moment, we're going to play a little game where you have the opportunity to win some cash. But first, let me tell you about this amazing genie. It's made a number of super accurate predictions about your behavior, nailing it 99 times out of 100. It correctly guessed what you were going to order for dinner the last time you went out. It forecast what shirt you were going to wear today. And it knew that you were going to watch this particular video. It seems to be remarkably good at knowing what you're going to do.

For our game, I have two boxes: the genie box and a bonus box. To play, simply decide whether you'll take the genie box alone or both boxes. The genie box's contents depend on the genie's prediction about your decision, which it made yesterday. If the genie said that you're going to take both boxes, I'm not going to put anything in the genie box. But if it told me that you'll skip the bonus box, I'm going to put a million dollars in the genie box for you. The bonus box is always the same thing: a clear enclosure with $1,000 inside. That's the whole game. What do you choose? Do you just take the genie box, or do you take both? Feel free to pause the video if you want some time to think about it.

If you're like most people who hear the question, you probably have a strong intuition about what any sane person should do in this situation. But you might be surprised to hear that a lot of people have equally strong intuitions in the opposite direction: for every five one-boxers, there are around two two-boxers. Let's experiment with that gut feeling a little bit. Does your answer change if I change those numbers? If the bonus box had, say, $50,000, or $900,000, or just $100, would you choose differently? The relationship between the amounts in the two boxes and the rules by which they operate isn't actually changing, but many people will flip-flop as the numbers go up or down, which seems telling. How sure are you of your reasoning if you're willing to jump ship on it once the stakes are high enough?

This is called Newcomb's Problem, dreamt up by physicist William Newcomb and popularized by philosopher Robert Nozick. It's been an inspiration for numerous papers in mathematics, economics, and game theory, and it's a fun thing to drop into a group of people who've never heard about it. As Nozick points out in his paper defining the problem, the issue is that we have two analytical mechanisms, two different lenses that we use to try to make sense of the world, and while they normally agree, in Newcomb's Problem they're pitted against each other.

In our genie problem, the odds that the genie is going to guess correctly about you are 99 out of 100, call it 0.99. If you choose to take only one box, the expected utility is 0.99 times a million, or $990,000. If you take both boxes, you're gambling that the genie is wrong about you, so the expected utility is 1 out of 100 (0.01) times a million, plus the thousand in the bonus box: $11,000 total. $990,000 is way more than $11,000, so you'd have to be an idiot to take both boxes, right?
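If it helps to see that expected-value arithmetic spelled out, here's a small Python sketch. The dollar amounts and the 99% accuracy come straight from the setup; the names and everything else in it are just illustrative.

```python
# Expected payoff of each strategy against a predictor that is right with
# probability p. Amounts are the ones from the thought experiment.

GENIE_PRIZE = 1_000_000  # goes in the genie box only if one-boxing was predicted
BONUS = 1_000            # always sitting in the clear bonus box

def ev_one_box(p):
    # You get the million only if the genie correctly predicted one-boxing.
    return p * GENIE_PRIZE

def ev_two_box(p):
    # You get the million only if the genie *wrongly* predicted one-boxing,
    # and you pocket the bonus either way.
    return (1 - p) * GENIE_PRIZE + BONUS

p = 0.99
print(round(ev_one_box(p)))  # 990000
print(round(ev_two_box(p)))  # 11000

# One-boxing has the higher expectation whenever
#   p * GENIE_PRIZE > (1 - p) * GENIE_PRIZE + BONUS,
# i.e. whenever p > 0.5 + BONUS / (2 * GENIE_PRIZE), which is about 0.5005 here.
```

Tweaking BONUS in that sketch is the same move as the $50,000 / $900,000 / $100 variations above: under this expected-value lens, the bonus would have to climb all the way past about $980,000 before two-boxing pulled ahead.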
Well, let's look at this in a different way. In this scenario, the genie has made its decision already. Either the money is in the genie box, or it isn't. Nothing you do now is going to somehow go back in time and change whatever the genie guessed about you yesterday. So you're faced with two options: either take the money that's in both boxes, or only take what the genie decided to leave you. If you only go for the genie money, it could be either a million dollars or nothing. If you go for both boxes, you'll either have $1,001,000 or just a thousand. In both scenarios, taking the bonus box results in a better outcome. So you'd have to be an idiot to just take one box, right?

These two strategies highlighted by Newcomb's problem have been considered in depth, and philosophers have realized that they actually represent two subtly different approaches to optimal decision-making: evidential and causal decision theory. Evidential decision theory doesn't really worry about the causal relationship between a given choice and its consequences. It merely looks at the likeliest states of the world after the decision is made and asks how desirable each world is. Another way to think about it: if a friend of yours heard that you had made that decision, would they be happy to hear it? If so, that's what you should do. Causal decision theory, on the other hand, draws a line at now and asks what decision will probably cause the most desirable effects in the future. Any implications of the decision itself aren't rolled into that calculation, because causation doesn't usually work that way. You don't really care what the news of your having made a particular choice might suggest, just what the choice will actually do.

You probably use both of these principles instinctively without issue. It's not often that we have to choose between the two, which is what makes Newcomb's problem such a divisive thought experiment, and why it's prompted several interesting variants on the theme, collectively called Newcomb-like problems. Some of these problems actually lead people to different decision-theoretic conclusions, switching back and forth between the two principles willy-nilly. For example, say that researchers discover a gene that makes people more likely to get headaches, but also gives them a strong urge to eat chocolate. Should you eat chocolate? Even if you were a one-boxer in the original problem, following evidential decision theory, it seems a little silly to use that strategy here. It's pretty obvious that you either have the headache gene or you don't, and that eating chocolate doesn't actually cause the headaches or cause you to have the gene, despite the fact that if a friend heard news that you'd chosen devil's food cake for dessert, they might worry about you. In this case, even if eating chocolate is possible evidence of a world in which you're doomed to migraines, it's pretty obvious that you might as well have some brownies.
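The two rules can also be written down side by side. Here's a toy formalization of the chocolate-gene example in the same spirit as the sketch above; every number in it (the strength of the correlation, the prior, the utilities) is invented purely for illustration.

```python
# Toy formalization of the chocolate-gene example (all numbers made up).
# The hidden state -- having the gene -- is fixed before you act; chocolate
# doesn't cause it, but eating chocolate is *evidence* of it.

CHOCOLATE_BONUS = 1   # utility of enjoying the chocolate
HEADACHE_COST = -10   # utility hit if you carry the gene

def utility(action, has_gene):
    base = CHOCOLATE_BONUS if action == "eat" else 0
    return base + (HEADACHE_COST if has_gene else 0)

# Evidential decision theory: weight states by P(state | action) --
# "what would the news of my choice suggest about the world?"
P_GENE_GIVEN = {"eat": 0.9, "abstain": 0.1}  # assumed correlation, not causation

def edt_value(action):
    p = P_GENE_GIVEN[action]
    return p * utility(action, True) + (1 - p) * utility(action, False)

# Causal decision theory: the gene is whatever it is; weight states by the
# prior P(state), because the action can't reach back and change it.
P_GENE = 0.5

def cdt_value(action):
    return P_GENE * utility(action, True) + (1 - P_GENE) * utility(action, False)

for a in ("eat", "abstain"):
    print(a, round(edt_value(a), 2), round(cdt_value(a), 2))
# EDT prefers "abstain" (-1 beats -8); CDT prefers "eat" (-4 beats -5).
```

The only thing separating the two functions is which probability they plug in: evidential decision theory conditions on the action as news about the hidden state, while causal decision theory treats the hidden state as already settled. That single substitution is the whole disagreement in Newcomb-like problems.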
How about this one? You're handed a button, which you're informed will instantly kill all dangerous psychopaths in the world if pressed. You're sure that it would be better to live in a world without any such folks in it, but you're almost certain that only a dangerous psychopath would press the button, and you definitely don't want to die. Should you press it? Evidential decision theorists would say no: the most likely world in which you would find yourself pressing the button is one in which it kills you, which you want to avoid. But causal decision theorists would say yes: either you are a dangerous psychopath or you're not, and pressing the button isn't going to change that fact either way, so go nuts.

The numerous examples of Newcomb-like problems where one principle or the other seems to fall apart have led many people to guess that neither is totally correct, that there's some other axiomatic principle which governs all rational decision-making and will reliably produce intuitive results in every possible situation. Some have attempted to patch EDT or CDT to fix their seeming inconsistencies, but Newcomb's problem is still a point of contention among philosophers and decision theorists, and, I'm pretty sure, THUNK fans. Would you take one box or two boxes? Please leave a comment below and let me know what you think. Thank you very much for watching. Don't forget to blah blah subscribe, blah share, and don't stop THUNKing.