The World Economic Forum is perhaps an unusual place to share scientific evidence that money won't necessarily make you happy, but here we are, and it gets worse: I'm here to suggest that certain kinds of money might even make you less happy. At first, this may seem to contradict the classic economic model of human nature, Homo economicus, whose happiness depends solely on maximizing material self-interest. But research in psychology and neuroscience suggests a better model may be Homo moralis, whose happiness depends not just on self-interest but also on the welfare of others.

In my lab, we've been studying how the value of money itself depends on whether it was earned morally. Now, the idea that money can be tainted can be traced back at least to biblical times, where we find the first mention of the term "filthy lucre": money earned in a dishonest or dishonorable way. We've been testing this idea in the lab with experiments where we give people the opportunity to earn some money for themselves by delivering mildly painful electric shocks either to themselves or to a stranger. The shocks feel a bit like a bee sting that lasts for half a second. They don't cause any physical damage, I assure you, but they are unpleasant. So when people in our experiments make these decisions, they face a moral dilemma.

Now, crucially, the decisions are totally private and unobserved. So we might expect people to be pretty selfish in this situation, because there are no extrinsic incentives, like fear of judgment or punishment, to prevent someone from shocking a stranger for money. In these studies, we observe a wide range of behaviors. We see people who refuse to deliver a single shock to another person in exchange for even $30. We also see people who will deliver 20 painful shocks to another person in exchange for just $0.10.
What's especially interesting, though, is to compare the average price per shock for a stranger versus oneself, and here we see that ill-gotten gains are less valuable than money earned decently: on average, people require about twice as much money to shock a stranger as to shock themselves.

So how can we explain this behavior? To find out, we put people inside a brain scanner, like this. By measuring blood flow in the brain as people make these decisions, we can ask how the brain's reward circuitry responds to an ill-gotten gain. Or in other words, does the brain treat ill-gotten gains as less valuable? This is the striatum, a brain area that consistently responds to rewards like money. And here's the response in the striatum to money earned from shocking oneself versus a stranger. As you can see, ill-gotten gains are worth less in the brain's reward circuitry. And individual differences in these brain responses track with individual differences in behavior: those who show a larger devaluation in the reward network's response to an ill-gotten gain show larger differences in the price of pain for oneself versus a stranger. In other words, we can predict moral behavior by looking at the brain's response to an ill-gotten gain.

Now, you might be wondering at this point: if ill-gotten gains are truly less valuable, how can we explain corruption? We were curious about the same thing, and we have a hunch that if people believe, or can convince themselves, that the money goes to a good cause, this might restore the value of an ill-gotten gain. To test this in some preliminary studies, we added a twist to our experiment. In the profit condition, as before, people can earn money for themselves by shocking themselves or a stranger. In the charity condition, the money goes to a children's cancer charity instead. The differences between these conditions were quite striking.
As we saw before, with profit, most people would rather personally profit from their own pain than from a stranger's pain. But this was not the case when the money went to charity: if anything, people seemed to be slightly more willing to shock another person than themselves for a good cause. What this suggests is that the value of money depends not just on its moral consequences; it might also depend on the stories we can tell about our choices. It's pretty difficult to justify hurting a stranger for your own personal profit, but much easier if it's for the greater good, and this might counteract the corruption of value. Or in other words, ill-gotten gains might be laundered with good storytelling.

So what does this mean for our models of human nature? On the one hand, we clearly care not just about our self-interest but also about the interests of others. At the same time, we seem to be able to trick ourselves back into selfishness with a good story. Maybe a better model is one we might call Homo narrans, the storytelling human. If so, what kinds of stories are you going to choose? Thank you.