Last year I got on a hotel elevator in Toronto with three men: a staff member of the hotel and two others. I hesitated before getting on, but I thought better of it. There was a staff member; it was a nice hotel. But I did something that should have clued me in: when I got on the elevator, I pushed the button for the wrong floor. The staff member got off on the fifth floor. I had 14 more to go. The doors closed, and I felt the large man behind me grab my backpack. He picked it up slightly and asked why I was wearing this cock block. I froze. I made myself small. I tried to ignore him. Eight more floors. He persisted. What's in there? My heart raced much faster than the floors were ticking by. Eight, nine, ten. I realized that if they wanted, those two men could drag me off the elevator and into their room. And I would be more or less helpless. The doors finally opened, and the man to the right, who was much less drunk than the man behind me, dragged him off. The doors shut. I was alone. But I felt anything but safe. I felt vulnerable, exposed, fragile, scared. I went up in the elevator to the wrong floor, found the stairwell, walked down. It was eerily quiet. Found my door, fumbled with the key, got inside, locked the door. I called my mom, and I sobbed.

I got lucky. That story could have taken a much darker direction. I travel alone a lot for my job, to conferences, and that experience fundamentally changed how I act on the road, the precautions I take. After that experience, I became extremely interested in the instinct of fear. Because the clues were there for me. I hesitated before getting on the elevator. I had a bad feeling. I pushed the button for the wrong floor. Why? What did my gut see? And what was it trying to tell me? Why didn't I listen?

So I did what every developer does: I went to Google. I discovered Gavin de Becker's book, The Gift of Fear. The book delves into dozens of stories about violence and the precursors to violence.
The hope is that we can begin to recognize the warning signs and avoid harm. De Becker opens his book with an incredibly gripping story. A woman named Kelly had gone grocery shopping. She arrived at the front door of her building. It was unlocked. She was frustrated at the neighbors who consistently left the door to the building unlocked, but also a little relieved that it would be easier to get in with those groceries. As she headed up the stairs to the fourth floor, one of the bags broke, and her groceries tumbled down the stairs. As this was happening, she heard a voice from the floor below: Got it. I'll bring it up.

De Becker goes on to say that from that moment, Kelly didn't like that voice. She knew instinctually something was wrong. The man who walked up the stairs had a very friendly face. He handed her the item and went to take her bag. But niceness isn't kindness. Kelly didn't let go of the bag, and for a moment they both held on to the handle. It was a moment of tension and decision. The man had pushed himself into Kelly's space, and she had to choose in that moment whether she would trust him or ask him to let go. Kelly went against that internal guiding voice; she let go. The man took the bag up to her apartment and stopped at the door, and she turned to him and said, thank you, I've got it from here. It was the second time she had asserted herself. She made herself clear. And a kind person would have walked away. But this man wasn't kind. He insisted he'd help her take it inside. Everything in Kelly said no. But then she convinced herself he was nice. He was just trying to help. It was probably fine. She let him in.

He held her at gunpoint and he raped her. Afterward, he got up, got dressed, shut the window in her bedroom, and said he was going to the kitchen to get a drink of water and that she should stay right where she was. She didn't. She knew in her gut he intended to kill her. And this time she listened.
She got up off the bed and walked down the hall as his shadow, right behind him. When de Becker tells this story, he says that the man could have felt her breath if she'd been breathing. She went out her front door, across the hall, opened the unlocked door of her neighbor, walked inside, locked the door, and told them to be quiet. We know now that that one incredibly brave decision saved her life.

That story is every woman's worst nightmare. We think about it on a weekly if not daily basis. But it's not just women who have intuition. In 2006, Army Reserve Staff Sergeant Martin K. Richburg was in Iraq, talking to his wife on the phone outside an internet cafe. A man approached with a blue bag, and Richburg had a bad feeling. Something didn't seem quite right. His suspicion led him to watch the man closely. The man set the bag on the cafe's air conditioner and quickly walked away. Richburg, alerted to the danger, chased after the man and found out that an IED was in that bag. The building was evacuated before the bomb detonated, and his quick action saved the lives of 12 soldiers and five Iraqi civilians.

Military field reports cite a sixth sense, or what they like to call a spidey sense, so frequently that the Office of Naval Research has put nearly four million dollars toward research on intuition. That's the military, a group of people who aren't exactly known for being in touch with their feelings. The Department of Defense calls this sensemaking and defines it as a motivated, continuous effort to understand connections, which can be among people, places, and events, in order to anticipate their trajectories and act effectively. Which is military speak for listening to your gut. Not a lot of writers end up at West Point, apparently.

Fear is not a gut feeling. Fear is your brain delivering critical information, distilled from countless cues that have added up to one conclusion: stop, get out, run. But sometimes fear isn't life or death.
Most of us, thank God, don't experience threats to our safety in our everyday lives. Instead we experience a more subtle, less threatening form of fear. Sometimes our gut feelings are as simple as code smell or that shiver of hesitation before a deploy.

I'm Emily Freeman. I'm a developer advocate for Kickbox. Our API is a smarter, better reCAPTCHA that ensures your users are real people with legitimate email addresses. I'm thrilled to be here with you all at RailsConf. It's my first RailsConf, so I'm very excited. Great conference.

This talk is going to explore fear, instinct, and denial. We'll focus on our two brains, what Daniel Kahneman calls system one and system two. And we'll look at how we can start to view our feelings as pre-incident indicators. There are endless dark alleys in our code bases. I think we all have that one feature that really shouldn't work the way that it's built, but it does. And you don't want to touch it. And you're pretty sure that it's going to explode everything at some point, but you're just kind of hoping that you won't still work where you do when it happens.

I'm lucky enough to have never experienced anything like Kelly did, and I've never served in our military. But I think most of us drive, right? So imagine you're in your car driving down the highway. There's light traffic, but you're moving along at a pretty good clip. Suddenly you tap the brake and back off. You don't like something about the SUV in front of you. Now, you've got no real cognitive reason to take your foot off the gas. They haven't signaled or swerved, but then they do what your gut said they would do: they move over into your lane. If anyone here rides motorcycles, you know it's even more intense. You develop a sixth sense for other drivers.

So what's happening here? Well, I have some bad news. You're not actually clairvoyant. That would be cool. I don't recommend you quit your probably lucrative day job, change your name to Miss Cleo, and open up a stand at the mall.
Instead, your brain did what it does best: take a thousand inputs and distill them into one conclusion within seconds. What did your brain see that you didn't? Maybe the driver checked the mirror. Maybe the car moved ever so slightly over in the lane. The signals are much less important than the result and your discernment of the basic instinct of fear. Because fear is one of the reasons humans have survived as a species. Fear has kept us alive, and it's an incredibly useful and powerful gift, a skill honed by thousands of years of evolution. This feature of fear comes out of the box for us. We have it, all of us, built in, and we don't need special training to experience its effects. But you can become an expert in your own intuition by purposefully training yourself to recognize the warning signs and precursors to danger, even the risks lurking in our code bases.

Now I want to take a minute to distinguish anxiety and worry from true fear. I suffer from anxiety, and I'm sure many of us in this room deal with some kind of anxiety or depression. Even those of us who don't still experience occasional worry. The difference between fear and anxiety is rather subtle, sometimes indistinguishable. Both feelings elicit the same physiological response. Your brain is alerted to a stressor, and the hypothalamus arouses the autonomic nervous system. Your body begins to preserve fluids and controls the release of saliva, tears, and gastric acid. It creates cortisol, which actually assists your blood in clotting, and noradrenaline is released to strengthen your skeletal muscles. Your body literally prepares to experience harm. Your heart rate intensifies to increase the amount of oxygenated blood in your system, your pupils dilate, blood vessels constrict, and your mental acuity increases, making you hyper aware of yourself and your surroundings.

The difference is time. Fear is a response to an immediate threat. Anxiety is rooted in anticipation.
And that's the key to discerning fear from anxiety: time. Fear will hit you like a ton of bricks. Worry brews and ruminates.

Psychologist Daniel Kahneman was awarded the Nobel Prize in Economics in 2002 and the Presidential Medal of Freedom in 2013. He was a man who intimately knew fear. Kahneman's parents were Lithuanian Jews who had immigrated to France in the 20s. His father, believing that Jews were safe in France, chose not to flee to Tel Aviv, then in Palestine. Kahneman prayed daily to live just one more day as his family hid in the woods during this time. His father died of untreated diabetes just six weeks before D-Day.

His research partner was Amos Tversky. Nearly everyone who interacted with Tversky talks about him as the smartest person they had ever met. And he had an incredibly sharp humor. While leading a group in the Israeli Army, he dealt with men who didn't want to wear their helmets. They felt that if a bullet had their name on it, it was God's will. He remarked: what about all the bullets that say "to whom it may concern"?

In Thinking, Fast and Slow, Daniel Kahneman describes the work he and Tversky conducted over their careers researching judgment and decision making. It's brilliant. I highly recommend it, though at times it's a little bit dry, so if your brain's like mine, the audio version will help you work through the particularly thick statistical bits. Kahneman and Tversky became extremely interested in the cognitive biases and heuristics that humans use to make decisions. Heuristics are shortcuts that your brain uses to make complex problems much simpler. And while it mostly works, your brain can sort of deform logic in order to come up with a conclusion faster. It's a speed-accuracy tradeoff. Cognitive biases are assumptions our mind makes based on existing patterns.
For example, the availability heuristic makes all of us more likely to believe that we may die in a plane, sucked out of the window because the engine blew, especially if we're flying Southwest this week, than in a car on the way to the airport. But we know statistically that air travel is the absolute safest way to get from one place to another.

Over their decades of research, Kahneman and Tversky theorized that the human brain has two styles of thinking, what they call system one and system two. System one is quick, instinctual, and driven by emotions. I like to think of this as our mean girl brain, the 13-year-old hormonal monster that lives inside all of us. On the other hand, system two is highly logical and quite deliberate. I'll refer to this as our nerd brain. Now, most of us in this room are engineers, and if you're not, you probably work fairly closely with us, so you know the type. We, I think, like to think we're always running on our nerd brain, right? Because we measure ourselves on intelligence. We're smart, rational, data-driven. We're not. Bad news. We are driven by emotions. We make snap judgments all day long. Humans, all of us, default to system one, your mean girl brain. I know. It's pretty hard to hear. I'll give you a minute to mourn. Listen, system two is expensive. So it's most efficient to run on system one most of the time. It's unconscious and processes information quickly.

All right, this is audience participation time. The Ruby community is fantastic, so just shout out the answers. Don't be shy. Which line is longer? Light blue, dark blue. Good job. Two plus two is? I see what you did there. And complete the phrase: war and... Good job. Congratulations, you've accessed your system one. It's pretty easy. So if we run on our mean girl brain most of the time, what's the nerd's job? Well, system two is conscious and effortful. So let's really exercise our brains. What's 24 times 19? Significantly harder than two plus two, right?
Funny story: when I was giving this talk to my boss as practice, he had done it on a calculator and just blurted it out, and I was like, I know what you did. How many golf balls fit in a school bus? It's apparently 660,000. Which chemical element has the shortest name? No, tin. Not as easy, huh? We're just a whiteboard away from a full-on technical interview there.

If you access and utilize your nerd brain often, you will begin to feel run down. It takes much more energy to think with system two than with system one, and system two is rather lazy. But what if you could use both to your advantage? Remember Staff Sergeant Richburg? System one alerted him that something was wrong, and that curiosity, that innate suspicion, triggered his system two. Your lazy, emotional system one is an alerting system. This cat would probably not be a very good watch cat. And the buttoned-up, rational system two is the investigator.

Now, you're never going to be able to fully trust your system one. It is rife with bias. System one runs purely on existing patterns. Snakes are bad. Homeless people are dangerous. Hackers live in their mothers' basements. Only not all snakes will kill you. I mean, a few will, and RailsConf wants to make it very clear that we do not endorse you picking up any kind of snake. Most homeless people are absolutely harmless, and less than 50% of hackers live in their mothers' basements. Probably. While you can't trust the conclusions your system one makes, you can trust its ability to trigger your system two. And system two is what delves into the details, forms hypotheses, and creates algorithms.

Michael Lewis, the author of Moneyball, became interested in the Oakland A's impressive performance against teams with triple, sometimes quadruple, their budget. You'd think that the team with the highest player salaries would get the best players and be the most successful in the league, right? That would be an efficient market.
And economists, like developers, love efficiency. Only it didn't work out like that. Major League Baseball at that time wasn't efficient at all. The Oakland A's held their own against teams, even the most wealthy teams in the league, like the Yankees. His interest piqued, Lewis dug in. Or you could say his system two dug in.

Like in most sports, baseball players are picked by scouts. And scouts, like all humans, run on their mean girl brain. They fall victim to the biases we all do. For example, a handsome player with an athletic build will most likely be overvalued because he just looks the part. Due to their budget limitations, Oakland realized they had to find players that were undervalued. So they turned everything upside down. They approached player selection with an entirely new perspective. The scouts for the Oakland A's looked at statistics no one else did. Other scouts leaned heavily on stolen bases and batting average. Oakland looked at on-base percentage and slugging percentage to gauge offensive success. They even went so far as to seek out unattractive players, often with some kind of physical oddity. The players, on average, were a bit fat, actually. Lewis likes to say that if you lined them up against the wall, they'd look more like accountants than professional athletes. One player had two club feet. I'm not kidding. They employed a catcher with a busted elbow and a pitcher with this very odd, long and lanky, submarine-style pitch. The scouts used their system one to make those snap decisions, for example: that shortstop runs with a limp; maybe he belongs on our team of misfits. And then they defaulted to their nerd brain to test those assumptions. They measured every player against the statistical algorithm they produced.

Engineering isn't baseball. It's not dangerous. Thank God. If your job is dangerous, like, on a daily basis, we should probably talk. How do we make this instinct of fear intelligent?
And how do we apply it to our everyday, not-so-dangerous engineering jobs?

First, pay attention. When your gut sends you a message, listen. Think of it as a precognitive fire alarm. If you find yourself thinking, it's probably fine? No, it's probably not. And if you find yourself thinking, what are the chances? The odds are high. Stop equivocating. Your brain has noticed something. So take action. But don't freak, not yet. Action does not mean panic. It doesn't mean alerting your CEO. It does not mean deleting the entire service you just wrote because you think it sucks, or canceling the scheduled deploy. It simply means acknowledge your curiosity and allow it to transition into suspicion. Now, suspicion has a bad connotation in our society, but it's derived from the Latin suspicere, which simply means to watch. So observe. Look more closely.

Finally, trigger your system two. Ask questions. The more detailed and difficult questions, like 24 times 19, will actually trigger your brain to alert system two, and it'll jump into action. Investigate. Get to the root of it. Sometimes your gut feeling is nothing. It's a false alarm. Sometimes it's misdirected. Maybe you went home because you thought your oven was left on, only to discover your phone was left on the kitchen counter. And sometimes you're about to delete dozens of servers and bring down S3 in US-East-1 for hours, effectively breaking the internet.

Articulate your feelings to someone else. Talk to a real person, in person, or on video if you happen to be remote. Communicate your concern and ask their opinion. Now, this can be tough to do, right? Our industry isn't known for respecting feelings. We measure each other on our intelligence and our technical precision. But remember, you are an expert. We have technical intuition as engineers. It's the thing that enables senior engineers to debug something in half the time it would take someone with less experience. It's code smell. It's the squint test.
It's knowing that when you see your third nested loop, shit has gone wrong. Our brains are fantastic observers. They collect and process unfathomable amounts of data. Your intuition is not a feeling. It's a powerful mechanism derived from thousands of years of evolution and enhanced by your years of experience as an engineer and as a human. We would be wise to trust ourselves and to respect the gut feelings of those around us. Thank you. My name's Emily.