Our next speaker is Julia Galef. You saw her yesterday on a panel discussion. Her topic is how to change your mind. Come on in, come on in, come on in. Here's the limerick: When shown proof that counters belief, most folks get filled up with great grief. Now Julia's here to say loud and clear what you can feel is joy and relief. Please welcome Julia Galef.

It's great to be here. This is my first solo talk at TAM; I'm really excited. Thanks. You've heard a fair amount already this weekend about cognitive biases, which are the focus of my organization, the Center for Applied Rationality. And just in case you want a refresher definition of what the field of cognitive biases is all about, this is my favorite definition: cognitive biases, the study of how other people think. And I'm just curious: how many of you, when you read that definition, thought to yourself, yes, exactly, that's exactly how other people think? I've gotten that response before, just saying. And to be fair, I think it's not actually that unreasonable, because it's so easy to see how other people are wrong and it's so hard to see how we're wrong. Not just emotionally, but cognitively, it's hard. Our brains are wired to see the evidence for our beliefs and not to see the evidence against our beliefs.

So what do we do about it? Well, I think that we can train ourselves to notice clues in the world, little trails of breadcrumbs that we can follow to find the things that we might need to change our minds about. Those clues I call anomalies.

Take a look at this picture: raccoons on a hillside. So how many of you noticed that that's not actually the sky above the raccoons, it's a body of water? Some people? Yeah. So if your mind works at all like mine, this is the process you go through. You see the picture: raccoons, a hillside, sky. Boom. And then a moment later, this weird little detail catches your eye: there's this rock that seems to be floating in the air above the hillside. So your eye lands on that and it confuses you. You go, huh. And then there's this churning of gears in your head, and one or two beats later, depending on whether you're quicker than me, you notice: oh, that's not actually the sky, that's the reflection of the sky in a body of water, and the rock is poking up out of the water. And then perhaps you start to notice other, more subtle anomalies that you hadn't noticed before you switched to the water model instead of the sky model. Like, I don't know if you can see from where you're sitting, but there are little ripples in the sky. And once you're viewing that as a body of water, the ripples are obvious, but they certainly don't jump out at you when you think that that's the sky.

And this process is, in miniature, what science as a discipline does to advance. This is how scientists collectively notice anomalies and change their minds about how the world works: about how human psychology works, or how the solar system operates, or just how the fundamental laws of physics work. But I think that this is also a great model for how we as individuals can learn to change our minds.

So this is the classic example, from the 16th century, of scientists noticing anomalies and changing their minds. This is the picture from my title slide. It's actually the path of Mars across the sky.
And that little kink in Mars' path across the sky was, to 15th- and 16th-century astronomers, what the floating rock in the raccoon picture was to us. It didn't make sense in their model of the world. Their model was the geocentric model, in which the sun and the various planets all revolved around the earth in circular orbits, and under that model it did not make sense why Mars would be traveling that way. And then Copernicus put forward the heliocentric model, in which the earth and the other planets revolve around the sun, and suddenly the little kink in Mars' path snapped into place: that's exactly what Mars would look like to us if we were orbiting the sun in a tighter path than Mars.

And interestingly, in the decades after Copernicus put forward his model and it started getting more widespread and accepted, astronomers started noticing other, subtler anomalies that they hadn't seen before. Under the geocentric model, the heavens were immutable: same stars past, present, and future. And now they started noticing comets, and new stars appearing, none of which would have been possible under the geocentric model. Interestingly, by contrast, Chinese astronomers had never been operating under this immutable-heavens model, and they had been recording new stars and comets for centuries before Western astronomers got around to it. So this is basically the classic example of science noticing an anomaly and, eventually, changing its mind in response.

And this is the same process that I think we can go through as well. You have a built-in anomaly detector in, for lack of a better word, your gut. You are constantly predicting what's going to happen next on a subconscious level. If you see a glass fall off a table, you are predicting that it's going to fall and probably shatter. If you watch a Disney movie, you are predicting it's going to have a happy ending. If you see George Hrab walk out on stage, you're predicting he's going to say something funny. And when you read words on a page, you're actually predicting what kind of word is going to come next based on the words that have preceded it, which is why our brains can read so fast and so fluidly.

And you don't notice that you're making these predictions. They're not at the level of conscious attention until and unless reality violates those predictions. So if the glass falls and bounces instead of shattering, or if we get to the end of the Disney movie and the heroine dies a gruesome death, or if George walks out on stage, says "You're all dumb and I hate your stupid faces," and then leaves, you might feel surprise, and confusion, consternation perhaps. Those are the signs of your expectations about the world having been violated. A psychologist I love named Dan Gilbert has a nice way of putting this. He says that surprise tells us that we were expecting something other than what we got, even when we didn't know we were expecting anything at all.

This is one of my favorite things about rationality, and I think it's woefully underappreciated: people think of it as this very conscious, deliberative process. And it certainly is sometimes, but I think if you're doing it right, a lot of the time rationality consists of these felt senses in your gut that that doesn't quite sound right, or I don't quite believe that, or that's kind of weird or surprising.

So, some of the things that I've been surprised about, that I've trained myself to notice. I was surprised reading C.S. Lewis.
I was surprised at how thoughtful and sophisticated he was, which I guess was because I didn't expect a Christian apologist's writings to be so well thought out and argued. My model of the world, although I hadn't been thinking of it consciously at that point, did not allow for someone who is both a Christian apologist and as skilled a thinker and philosopher as C.S. Lewis.

I was surprised recently when a friend of mine who's super analytical and empirical came back from a meditation retreat raving about how much value he got out of meditation. And this may not have surprised you, but I just never saw much value in meditation, and I guess I'd assumed that the people who got a lot out of it weren't that empirically minded. So it threw me a bit that my friend was so positive about it. And in general I feel that sense of surprise when someone who I expected to agree with me disagrees with me. There are plenty of people who I know are going to disagree with me; that doesn't surprise me. But the interesting ones are the people who I expected to be on my side of the issue. Those are the ones I lean into and try to find out more about, because what that means is either that my model of them was wrong, which is kind of interesting, or that there's some new and interesting reason not to believe what I believe that I wasn't aware of already.

Early on at CFAR, when I first started teaching classes, I was surprised to notice that the ratings for my classes in the follow-up survey weren't that great, which was surprising because I'm an excellent teacher. The first time, I told myself I must have had an off day or something. The second time, I told myself that I guess the people who came to the workshop this time happened to not really mesh well with my material. And I forget what I told myself the third time. But eventually I asked: why do I think I'm a good teacher? I had never actually taught before founding CFAR. And when I thought about it, I think what my brain had been doing was: well, I'm good at speaking and I'm good at knowing stuff, and teaching is basically speaking and knowing stuff, so of course I'm a good teacher. Which is not quite right. So this was actually a very useful realization, because I hadn't been trying to improve my teaching, since I knew I was already good at it. And I think that beliefs about ourselves, about our traits, our strengths and weaknesses, and just our personality in general, are particularly sticky, so it's particularly useful to be able to notice when reality is telling you that those beliefs aren't actually true.

And you might think: you know, I hear you about anomalies, and I can see how that would maybe be useful, but I just don't think that I'm going to notice that many. I don't think they're out there. Well, I disagree with you, and here's why.

In 1954, some residents of Washington State started noticing that their cars' windshields had these little dents in them, as if someone had shot a BB gun or buckshot into their windshields. So they complained to the local police department. The police, I don't know, rounded up the usual suspects or whatever, but they couldn't find anyone who seemed to have done it. And then the pitting epidemic started to spread to neighboring cities; more and more people started noticing that their windshields had these pits in them.
Still the police couldn't find the culprits. Eventually it spread to the greater Seattle metropolitan area; at its peak, over 3,000 people had noticed that their windshields suddenly had these pits in them. The governor of Washington started taking it seriously. At this point it was clear it wasn't vandals, so people started bringing up other, scientific explanations for the epidemic, like gamma rays, or a change in the earth's magnetic field. And according to one newspaper article from the time, whose headline described the pellets as running amok: aliens kicking rocks onto Tacoma, Washington, from the sky.

Anyway, the governor of Washington convened a panel of scientists to try to figure out what was causing these pits to appear in people's windshields. The scientists studied it, and after great deliberation they concluded that the pits were just ordinary wear: that's what happens when you drive a car around for more than a year or so, little bits of debris hit your windshield. In other words, the pits had been there all along. Everyone's car that was more than a couple of years old had little pits in the windshield, but people only started noticing them when other people started reporting pits in their windshields. The pits were there all the time, and people had been looking right through them.

And honestly, this describes how I feel about noticing anomalies: it was only when I started actively looking for them that I noticed, oh, there are all these things that make me go "whoa, I didn't expect that" and "huh, that's kind of weird," and I hadn't been looking for them before. Which I think actually has a kind of poetic justice, or delicious irony, to it, because the problem that we're facing, the problem of our beliefs being so entrenched, is due to the fact that our brains see what they expect to see based on our beliefs. And so what we're doing here is we're learning: hey, there are anomalies all around us, and now that I know that, I can see the anomalies all around me. So we're basically fighting confirmation bias with confirmation bias, which I just kind of like.

I also think that you start to notice anomalies more and more, the more you do it. This is Paul Graham. He's a programmer known for his work in Lisp, as well as a venture capitalist and entrepreneur, and he wrote this great essay about learning to notice feelings of surprise. He compared the process to learning history. When you first start learning history, stuff doesn't really stick, because it's just a bunch of names and dates and places. Then as you learn more and more, you have existing facts to hang the new facts on, and they fit into the system, so they're more memorable and they're more meaningful as well. And so he says: the more anomalies you've seen, the more easily you notice new ones, which means, oddly enough, that as you grow older, life should become more and more surprising.

He also, by the way, said that he used to be very scared of flying, so he would live vicariously through his friends' trips. He would pepper his friends with questions when they got back from some exotic place to find out what it was like, and the one question he said was most useful to him, for his purpose of living vicariously, was: what surprised you about the place you traveled to? How was it different from what you expected? And he said even the most unobservant people are able to report interesting things when you ask them this question.
So another option you have, aside from sitting back and waiting for anomalies to appear in your path, is to invite the world to surprise you, by making explicit predictions about what you think is going to happen. This is something I do a lot, on a very informal level. Before I ask someone a question, I'll try to guess what I think they're going to say. Before I go to an event, I try to predict how much I'm going to like it. Before I check my teacher ratings, I try to predict what they're going to be, and I'm very often wrong; I'm still surprised. And for extra practice, it's also helpful to notice how confident you feel in your predictions, so that over time you can see: what kinds of things am I overconfident about, and what kinds of things am I underconfident about?

This is a clip from right before the 2012 presidential election. It's from The Colbert Report, and appearing on the show is Nate Silver, the statistician who successfully predicted basically all of the state races in the presidential election just using statistics. So here he is on The Colbert Report.

"Now my buddy Joe Scarborough, of Morning Joe, you bet him a thousand dollars that your predictive model was correct. A thousand dollars that would go to the Red Cross." "He didn't take the bet." "He didn't take the bet." "The idea is that, look, I used to play poker, and in my world it shows you have integrity to be putting your money behind an idea that you have, right? It means that you're being serious, and you actually have incentives to be accurate, and you're not just standing there blowing hot air, potentially. And Joe, to his credit, did make a donation on my behalf, and I donated as well, so it wound up being for a good cause. But I think a lot of people get on TV and they say things and they don't really have any conviction behind them. It becomes a game just to entertain half the audience, or the other half of the audience, or whatever else, right? And we're trying to say something very simple, which is: hey, go and look at the polls and take an average, and then add up the states and see who has 270 electoral votes. It's not really that complicated, but people treat it like it's Galileo or something, right? Something totally heretical. It's just, no, look, the polls are pretty simple little facts, right? There are many things that are much more complicated than looking at the polls and taking an average and counting to 270, right?" "That is the longest possible way of calling him a bullshitter." "Indeed."

So my takeaway from that is not that bets help you call other people out on their bullshit, but that our own brains bullshit us all the time, and betting is a great way to call your own brain on its bullshit. So this is extra, extra practice, if you want to invite the world to surprise you about where your expectations were wrong: make bets with people. This is something we do at CFAR all the time, if we disagree about how long something is going to take, or about how well a workshop is going to do, or about whether so-and-so is going to change their mind if we try to argue with them, et cetera. And it is striking how often, when I go to put my money where my mouth is, I am surprised to notice: whoa, I don't actually believe that.
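To make the arithmetic Silver describes a bit more concrete, here is a minimal sketch, in Python, of the "average the polls in each state, then count to 270" idea. It is only an illustration of the idea, not Silver's actual model; the candidate labels and poll margins below are made-up placeholders, not real 2012 data.

```python
# Sketch of the "take an average of the polls, add up the states, count to 270" idea.
# The margins are invented placeholders, not real polling data.

# state -> (electoral votes, poll margins in points: candidate A minus candidate B)
polls = {
    "Ohio":     (18, [1.0, 3.0, -1.0, 2.0]),
    "Florida":  (29, [-0.5, 1.5, 0.0]),
    "Virginia": (13, [2.0, 1.0, 3.0]),
}

totals = {"A": 0, "B": 0}
for state, (electoral_votes, margins) in polls.items():
    avg = sum(margins) / len(margins)      # take an average of that state's polls
    leader = "A" if avg > 0 else "B"       # whoever leads the average gets the state
    totals[leader] += electoral_votes      # add up the states
    print(f"{state}: average margin {avg:+.1f} -> {electoral_votes} electoral votes to {leader}")

print(totals)  # whoever reaches 270 of the 538 electoral votes wins
```

With all fifty states filled in, that is roughly the "not really that complicated" calculation Silver is contrasting with pundits who are just blowing hot air.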
By the way, in case you were wondering what the picture of almonds on the previous slide about prediction was: this is something we do at our workshops. We make predictions about just random things, to let people practice that process. At one of the workshops, someone set up a prediction market about whether an almond would sink or float if you dropped it into water, and this one guy bet that the almond would float, and he bet it with 99.9% confidence. We play with poker chips, so he basically wagered all his poker chips on that bet. So we all got together and did the test: we pulled an almond out of the jar and, with everyone watching, dropped it into a glass of water, and it sank, and this poor fellow lost all of his poker chips. And of course everyone wanted to know: why were you so confident that the almond would float? It turned out that he had snuck away ahead of time and done his own secret test, when no one was looking, with an almond of his own. But I guess almonds have different densities. So now he's our parable about overconfidence and the dangers of a sample size of one.

So far I've just been talking about how to notice anomalies, how to draw them out from the world. I want to talk a little bit about how to react to them: how do you respond when you see anomalies? I think one very common way to respond is to say, you know, I can explain this without having to change any part of my model of the world. To be honest, the first time my friend told me that meditation was awesome and totally changed the way he thought, my first reaction was to say: he probably just really likes the idea of meditation, and so he's convincing himself that it works.

And some people will one-up this process: they'll explain to themselves not just why the anomaly doesn't contradict their theory, but why the anomaly actually supports their theory. This is a picture of Japanese Americans heading off to an internment camp during World War II. At this time, Earl Warren, then California's Attorney General and later its Governor, testified to Congress that he was very confident that Japanese Americans were a grave threat to our national security. And when one of the congressmen pointed out that, you know, there hadn't been any signs of subterfuge from the Japanese American community, Warren replied: that only increases my confidence that they are up to something big, because they are so well coordinated that they can avoid giving off any signs of subterfuge, so whatever it is, it's going to be serious. Which is obviously stretching it. But I feel very sympathetic to the urge to explain away anomalies rather than changing one's own model.

What I think is going on is that people feel like they have two choices: they can either change their mind about something serious, maybe an entrenched belief, or they can find a way to reject the new evidence, the new anomaly. And changing their mind about a big thing on the spot, in response to one new observation, just seems overdramatic and drastic, and also maybe unpleasant, so they have to find a way to reject the anomaly.

I think there's a third option. It's called putting anomalies in abeyance, and what it means is: you discover an anomaly, you acknowledge it, you notice that, okay, this might indicate there's something slightly wrong with one of my beliefs about the world, or maybe not, maybe there's another explanation, and then you move on. This might seem like a cop-out, but I don't think it is, because this is actually what science does. Scientists discover anomalies all the time that seem to contradict established scientific theories, and they note them with interest and maybe curiosity, and continue on, and
sometimes it turns out that the anomalies build up together and there's a shift in scientific theory. This, in fact, is a picture of Mercury's orbit, which got placed into abeyance as an anomaly, because the orientation of Mercury's orbit was rotating, precessing, faster than Newtonian physics predicted. This was very confusing to physicists, but Newtonian mechanics was very well established, so they just said, well, that's confusing, and didn't try to toss Newtonian mechanics out the window. They figured eventually there would be some resolution, and there was: when Einstein came along, general relativity resolved that discrepancy.

So I think the notion that you don't have to immediately change your mind or reject a theory, that you can just acknowledge an anomaly and pay attention to it, is actually very helpful, because it eliminates some of the resistance to noticing that there are anomalies. Just noticing is a wonderful first step, and if you feel like you have the freedom to notice without being forced into anything, you're much more likely to actually do it.

This, I think, is a phenomenon that probably all of you have encountered in some way or fashion before. Let's say you have a relative, maybe your parents, who you don't call as often as they wish you would, and when you finally do call them, the first thing they say is: "So, you're finally calling. You never call. Why don't you call more often?" And you just feel worse than you already did. What your parents are doing, although they don't mean to, is punishing you for doing the very thing that they wish you would do more often. And I think this is what our brains do to us all the time when we notice things that might contradict our theories about the world and about ourselves. So having the ability to shed that stress or anxiety about a potential threat to your theories is going to allow you to actually notice those threats, rather than training yourself not to notice.

I have this model that I like, called the soldier and the scout. These are two different ways of thinking about anomalies. In soldier mode, there are your ideas, and then there are the enemy's ideas, and you go around with your shield and your sword to protect yourself from ideas that might threaten you, and to try to knock them down. As a soldier, the way that you win, or at least don't lose, is by not letting those enemy ideas in. I think this is a very natural, intuitive model, and you can see it in the way that we talk about arguments and disagreements. We use battle words: we talk about attacking an argument or defending an argument; we talk about ceding ground in an argument, the same way you cede territory; we talk about strong or weak arguments, about winning or losing an argument, and so on.

Alternatively, you could be a scout, and the way that a scout wins is by coming back with as accurate a picture of the landscape as possible. Scouts are in exploratory mode. Scouts are curious. If there's something that seems like it might be wrong, scouts want to find out: well, of course it would be great if there were a bridge over that river, but I'd better find out whether there actually is a bridge over that river, rather than just putting my shield up against any evidence that there might not be one. So being in scout mode is, I don't want to say strictly necessary, but certainly crucial, super important, to actually, successfully changing your mind.
I like this quote from Saint-Exupéry, who wrote The Little Prince. He says: if you want to build a ship, don't drum up the men to gather wood, divide the work, and give orders. Instead, teach them to yearn for the vast and endless sea.

This is the mentality that I see in the people whose judgment I respect the most, the people who I think are right the most often, the most thoughtful. They're looking for anomalies, and they're not doing it out of a sense of duty, or a sense of "this is what I have to do to be a good skeptic or a good rationalist, I have to go out and question my beliefs." They do it because they're curious. They want to know what the right answer actually is.

So this is my exhortation to all of you: get into scout mode. Care about what the right answer is. Learn to notice things in the world that are surprising, that make you go "huh" or "that's strange," and then lean in to that surprise and confusion and curiosity, and follow that trail where it leads.

And finally, remember: the most exciting phrase to hear in science, the one that heralds new discoveries, is not "Eureka!" but "That's funny..." Thank you.