 Thanks for coming and I am here on a continuing series, Michelle's continuing series of Psychological Principles and how they apply to the social engineer. Do I need to move? Tommy, do I need to move? Okay, and it occurs to me that I should have gone to the gym a lot more before I squeezed my FUPA into this dress. However, we'll move on. All right, so the topic of today's talk is decision making and how and why we make bad ones. So, let's see, Peter, make the derp noise. Make the derp noise. Does anyone know the derp noise? Derp, derp, derp. What is the definition of a derp? Can someone tell me? See, Peter, it's just making a decision that you kind of regret later, making a stupid decision that you feel bad about later. And here's the thing, we've all done it, all right? So, we're not making fun of these people, but we're making fun of the following decisions. Post it on Facebook. I'm so proud of my son. Here's a picture of his credit card right here. Derp. Second example, the road to success. There are no shortcuts. Bad decision. And I'm gonna give you three here. Derp, derp, derp. Yeah, bad decisions. All right, so here's the thing. It's really easy to sit around and talk about stupid people, but really, I wanna talk today about stupid behaviors and the reasons why we make stupid decisions because we really have all done it at some point in time, I promise. The smartest people in here have made a dumb or bad decision. And the really interesting thing to know as social engineers is that if you understand why and how people make these decisions, you can affect their choices, whether that is for good or for evil, depending on your profession. So, let me give you an example, all right? The most wonderful time of the year, I love Halloween. There was a guy named Dan Ariely. We're gonna talk about him later. He does a lot of interesting stuff. 
This isn't like a real psychological experiment, but what he did was, one year for Halloween, he gave out Hershey's Kisses to kids. And kids all love getting Hershey's Kisses, but he twisted it up a little bit. He said, okay, now that I've given you the Hershey's Kisses, we got a deal. We can make a trade. I can give you one Hershey's Kiss for a fun size, which I think all of you agree that this is not fun size by any metric, okay? Promise you. So, the deal is you can trade one Hershey's Kiss for a fun-size Snickers, or you can trade two Hershey's Kisses for a full-size Snickers. So, here's the interesting thing. What do you suppose the kids picked? For the most part, because kids are pretty good at maximizing chocolate, yeah, the two kisses for the full-size Snickers makes perfect sense. At some point in the evening, he switched up the deal a little bit. What he did was, he said, all right, for the rest of you trick-or-treaters, I have another option. You can trade one Hershey's Kiss for the full-size Snickers, or get a fun size for free. Now, what do you suppose the kids did? Yeah, what he found was a vast majority of kids elected for the free fun-size Snickers. Although, you know, if you think about it in terms of pure chocolate volume, obviously the first trade is a much better deal. Why do you think kids messed up on that choice? Free. Avoiding loss, thank you very much. There is no possibility of loss when it's free, right? So that is the deal: kids get anxious, and they get excited, and if something's free, they're not gonna lose a thing. Well, you know, we adults, we can sit around and say, oh, we're so smart, but actually, he did a very similar study on adults, and we made the same stupid mistake. So it's not just kids, it's all of us that make these kinds of errors. So I have four considerations for you as social engineers. The first is that we tend to make the same kinds of mistakes over and over in the same kinds of ways.
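[Editor's note: the chocolate math behind the two trades can be sketched in a few lines. The gram weights below are rough assumptions for illustration only; the talk does not give exact candy sizes.]

```python
# Rough net-chocolate comparison for the two Halloween deals.
# Gram weights are assumptions for illustration, not figures from the talk.
KISS = 4.6        # grams of chocolate in one Hershey's Kiss (assumed)
FUN_SIZE = 17.0   # grams in a fun-size Snickers (assumed)
FULL_SIZE = 50.0  # grams in a full-size Snickers (assumed)

# First deal: one Kiss for a fun-size, or two Kisses for a full-size.
deal1_small = FUN_SIZE - KISS      # net chocolate gained on the small trade
deal1_big = FULL_SIZE - 2 * KISS   # net gain on the big trade; kids rightly prefer it

# Second deal: one Kiss for a full-size, or a fun-size for free.
trade_net = FULL_SIZE - KISS       # give up one Kiss, gain a full bar
free_net = FUN_SIZE                # "free" option: zero loss, but far less chocolate

print(deal1_small, deal1_big)
print(trade_net, free_net)
```

Under any reasonable weights, the paid trade nets roughly three times the chocolate of the free option, which is exactly why the kids' preference for "free" counts as a systematic bias rather than a calculation error.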
What that means, then, is that we are predictable in our decision-making process. So if you have done any sort of reading, or are interested in doing any sort of reading, into decision-making, Tversky and Kahneman are sort of the granddaddies of decision-making. They've studied decision-making for years and years and years, and what they have basically come up with is this idea that people use shortcuts. We all take shortcuts in our decision-making process. It makes life a lot easier, right? We can make quick decisions in a very brief amount of time, and most of the time we don't die when we do it. So if it continues to work for us, we continue to do those things. So because we rely on these shortcuts, again, we are fairly predictable in the kinds of mistakes that we will make in our decision-making. So there are three different kinds here, which I'm gonna talk about briefly. All right, so the representativeness heuristic is the tendency to estimate the likelihood of something by how much it resembles a specific category. So I know a beautiful girl named Jessica: blonde hair, green eyes, a great figure. She loves good wine, jewelry. She loves going to the hot springs. So let me see a show of hands. Is she an accountant? Any hands? Any hands? Okay, we've got a few. We've got a few. All right, is she a farmer? Farmer, got a couple. Is she just an experienced scholar? Got a few more here. All right, is she a talk show host? Probably the most participation we've gotten at this point. You guys are so lame. Okay. All right, so who picked talk show host? All right, got anything short. Stand up for me. Come here, baby. All right. Why did you pick talk show host? Okay, she seems to fit the category. Thank you very much. Bubbly personality, she's attractive. You know, she likes to do things. She's sociable. And what it seems like is that she is representative of this category that we call talk show host. Now, what do you suppose the right answer is?
No, actually she's an experienced scholar. Okay, I promise, I promise, this is a true story. I would not lie to you, this is actually a true story. Now, the problem that we have here is that we use these categories, and we try to decide if people fit these categories to make these decisions. Why is there a problem in this particular case? Say that again? Okay, so you don't really know what those would look like, and so you find it most likely? Well, it's interesting, right? And we do this all the time. We judge the likelihood of a person or a thing or a decision fitting into a specific category because that's what they look like. And really, whether or not she is any of these things has nothing to do with her personality or what she looks like. We haven't even taken into account how many accountants there probably are in the United States versus farmers versus talk show hosts. I think it is fair to say that there are probably more accountants than talk show hosts, and yet we may be more likely to choose one or the other simply based on how she looks. Let me give you a real-life example. Who can give me, in their mind, the definition of a kidnapper? Wait, what's the definition of a kidnapper? Somebody who steals a kid. Somebody who steals a kid, anything else? Usually a parent. Usually a parent? Ow. Okay. Let me lead you a little further. Is there anything else that we sort of have in our minds that kidnappers do? Please, PG-13. Yes, ask for ransom. So here's a really interesting thing. People who take kids and ask for ransom are much more likely to be charged with kidnapping than people who just take a kid. And yet they're both kidnapping, all right? But in our minds, we have sort of the stereotype that kidnappers take a kid, they ask for ransom, and that's what they do. So it even affects our decisions and our behavior. All right, so think about that.
Think about how you have these little categories in your mind, and how they affect how you see people, how you judge the likelihood of things happening, and how really that will impact your decision-making process. So that's the representativeness heuristic, okay? That is the shortcut that we take to make a decision based on how closely something or someone fits into a specific category. All right, let me ask you another question. Now, the availability heuristic is decision-making based on how easily we can recall something happening. How easily we can recall something happening. And really, the bottom line is, if you can think of it, it must be important. And that makes sense, right? If we can recall something quickly, well, really, it might have been life or death, it might be something really important, so we can think about it right away. So the question is, what is more likely to kill you? Your dog or your sofa? I know what you guys are, you guys are so smart. All right, how many people can think of a story or know somebody who's been bitten by a dog? Okay, probably the vast majority of us. Dogs can be pretty, pretty scary. And so we can think about that immediately. We can think about people who have been bitten by dogs. I remember there was a big story in the U.S. of a female dog catcher who was bitten by a pit bull really badly. I saw the video, it was very scary. And so if you were to ask me, I would say, well, you know, it's much more likely that a dog's gonna kill me than my sofa. However, the statistics actually say that people are 30 times more likely to die from falling off a piece of furniture than to be killed by a dog. All right, so that is the power of the availability heuristic. We can recall something, and therefore it must be true, or therefore it must be important. Again, think about how that impacts your decision-making. Now, why do you think it is that we can recall things so quickly? What are some reasons?
Survival, who said that? That guy, brown shirt, why do you say that? Okay, so if something is really scary or something is risky, it would behoove us to remember it, okay? But why do we remember stuff like this? It's really simple: a false positive costs very little. A false negative is death. Okay, let's explain that in ways that the rest of the class can understand. It's not even a class. Sorry, the people in the room. Why is it that things, what is the reason that things are easy to recall? Repetition. Repetition, media, okay. What's that? Emotion, very nice. Yellow shirt. Instinct. Instinct, okay. I always ask this question in class, it's the same kind of thing. Are you more likely to be killed by a shark or a soda machine? Soda machine. Soda machine, you guys know the answer now. But why would the vast majority of people answer shark? Jaws, Sharknado, Shark Week on Discovery Channel? Absolutely, right? I do not ever remember a news story of a guy getting squashed by a soda machine. However, again, actuarial tables indicate that people are much more likely to be crushed by soda machines than eaten by sharks, unless you live in Australia, from what I hear. Okay, so that is point two under this whole idea of how we make mistakes in sort of consistent manners. Tversky and Kahneman also talked about the anchoring effect. So the anchoring effect is the idea that, for the most part, people have to have a way to make a decision, and they use a number as a starting point. That starting point has a way of pulling your answer toward it. So for example, you go to the store and the salesman shows you a beautiful leather jacket. Okay, I'm from the 80s. I love my leather jackets. And he tells me, you know what? This leather jacket is $1,000. But today, it's on sale for 400, okay? What does that do to my decision-making, right? It gives me an anchoring point. In my mind, that leather jacket was worth $1,000, so if I'm only paying 400 for it, that is fantastic, all right?
So anchoring is a very interesting way to think about the decision-making process. And the reason why I have this annoying wheel of fortune here is that Tversky and Kahneman also did an interesting study where they had people estimate the percentage of African nations that are members of the United Nations, okay? And they had a wheel of fortune going, and they had it actually rigged to stop on either 10 or 65. And what they found was that it affected people's answers. The people whose wheel of fortune stopped on 10 gave a median estimate of 25% of African nations, and the people whose wheel of fortune stopped on 65 gave a median estimate of 45% of nations belonging to the United Nations. So anchoring points are important to us. Again, these are shortcuts that we make. We have to have some way of making a certain decision. If someone throws out an answer to us, it will affect our decision-making process whether we think it is important or not. There was another really interesting study done where people were told to write down the last two digits of their social security number and then asked what they would pay for, like, a good bottle of wine or tickets to a basketball game. And the people who were willing to pay the most for a good bottle of wine, their socials ended in, like, 99, all right? So again, we have an anchoring point. We wanna take these shortcuts. We don't wanna think about all this information. We wanna take a shortcut that's quick, that's easy, and that will make sure we don't die, okay? So that is consideration number one. Consideration number two, I am going to lead in with a video from a movie that I'm sure we all know and love. So I'm gonna show it to you first and then we're gonna talk about what it means. And no, we're not gonna see the whole thing. I don't have any sound. Evan, sorry. And my tech support crew comes in. You just want to go and help your mic again. You need to adjust my mic again.
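[Editor's note: one way to make the pull of the anchor concrete is a toy "anchor-and-adjust" model, where the final answer is a weighted average of the anchor and the respondent's own belief. The weight and belief below are back-fitted so the toy model reproduces the medians quoted in the talk; they are illustrative parameters, not values from the original study.]

```python
def anchored_estimate(anchor, prior_belief, anchor_weight):
    """Toy anchor-and-adjust model: the answer is pulled part-way
    from the respondent's own belief toward the anchor."""
    return anchor_weight * anchor + (1 - anchor_weight) * prior_belief

# Parameters back-fitted to the medians in the talk (illustrative only):
# anchor 10 -> median 25%, anchor 65 -> median 45%.
w = 4 / 11        # how strongly the anchor pulls (assumed)
belief = 235 / 7  # underlying belief, about 33.6% (assumed)

print(round(anchored_estimate(10, belief, w)))   # low anchor drags the answer down
print(round(anchored_estimate(65, belief, w)))   # high anchor drags it up
```

The point of the sketch is just that the same underlying belief yields two very different answers depending on which irrelevant number was shown first.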
Yeah, put that right in your USB there. Just put that right in my USB. Nothing bad will happen. She works for me. They look trustworthy. Sorry guys, should have done troubleshooting earlier. So I'm gonna go and help out. All my settings are open to the universe. It's all right. Are you wanting to start the video? Yay, thank you. He knows what movie this is, right? Okay, this is The Dark Knight, the second one. Anyway, it's the reboot of the Batman. We have the decision where we've got the two boats, and they need to make the decision whether or not they're gonna blow each other up, or they both get blown up, right? Okay, decision making at its most critical. [From the clip:] We're still here. That means they haven't killed us yet either. But you don't know how to take a life. Give it to me; these men would kill you and take it anyway. No one wants to get their hands dirty. Fine, I'll do it. Those men on that boat, they made their choices. They chose to murder and steal. It doesn't make any sense for us to have to die too. You can tell them I took it by force. Give it to me, and I'll do what you should have done ten minutes ago. Arguably a very dramatic and difficult decision. Now what does this video say about our decision making as it gets more complex and more difficult? Are we super on it? Are we better at it? We would like to think so, for sure. The basic thing to understand is that as decisions get harder and more complex or more important, we really don't get better at them. We really will often take the path of least resistance, or allow the system or other people to make the decisions for us. So think about how many of you probably, well, all of us, I think, should probably be saving for retirement, and we should probably know what's being done with our money. It's a really important decision to make sure we're not eating cat food when we're in our 70s, right? However, a lot of us leave the decision making to money managers, or we don't save at all. We just sort of avoid the situation because, again, money is, like, scary.
It's your life's blood. It's how well you're gonna do when you're older. And so really what we know is that this is a hard decision. It's a complex decision. It becomes too much for us, and oftentimes we just sort of let things slide. Dan Ariely of Duke University, the guy who did the trick-or-treat experiment, did a really good TED talk, and this chart was actually made by an organization in the UK that looked at organ donations, okay? You can see there is a dramatic difference in organ donors by country. We've got some countries here that are not so great, countries over here that are awesome in terms of donating organs, willing them after they die. Now, do you think it is because Austria, Belgium, France, Hungary, Poland, Portugal, and Sweden are so much cooler than all the rest of us? The US is over here too, by the way. Do you think that's why? You know, maybe? But yes, yes, you guys got it. It's an opt-in versus opt-out system. So it's opt-in: check the box if you want to participate in organ donation. The countries with the gold bars and low participation were under a system where you actually had to opt in to participate. Countries that were very generous with organ donation, you actually had to opt out if you didn't wanna donate. So what does this tell you? Does this tell you, again, that the countries to the right are much more generous? No. As a whole, this demonstrates that we're all lazy, and we will all just sort of allow the default. Organ donation is kind of, you know, it's sort of an important decision. It's philosophical, it's what you want done with your remains. It is a complex decision if you really think about it. For the most part, we don't. And this demonstrates that we are all lazy, and we all will resort to the default when those decisions get really hard. All right. Consideration number three: we hate uncertainty. I will actually give you a real-life example. I had an old phone for a really long time.
Mike, our guy back there, Mike is the nicest guy in the universe. He threatened to throw my phone into the fountain at Caesars one year because the phone that I had was so old and horrible. So Mike and Chris, in exasperation, took me to Verizon to get a new phone. And it took me over an hour to make a decision about getting a damn phone. Okay, and I promise it was not life and death, but I felt like, man, you know, I didn't realize they were gonna make me go, and I hadn't collected all this information, and I felt really anxious because I wanted to make the best decision possible. And so I waffled and I waited, and you know what? I ended up getting a phone that's a piece of crap, and I hate it. So here's the bottom line. We hate uncertainty, but it really doesn't make us any better. When information is missing, though, we feel like it's really important. When all of you guys visit places, where do you look to find the best restaurant? Yelp, right? You feel like that's gonna give you just that little bit more information that you need to make a really important decision. And the reason that we do this: we don't like uncertainty. We don't like randomness. What does randomness imply, sort of in life, in the larger picture? Out of control, and why is out of control so scary? It is stressful, why? The unknown, why is that scary? Because we could die. Yes, thank you very much. This is a very uncertain situation. Who can see anything? You know, if I tell you there's something here in this picture, what do you see? Well, you see the little goats down here. Cute little goats. But this is sort of a vague, ambiguous situation. If we were here, we might feel a little uneasy, and the reason is, in uncertain situations, there's always danger. I think that's a snow leopard up there. It is a big cat waiting to eat somebody. So let me give you an example of how uncertainty and the need to collect as much information as possible affect our decision-making process.
All right, actually it was Bastardi and Shafir who did this study, a really, really interesting loan officer study. In group one, you're a loan officer and you're going through an approval for a mortgage request. You find out that the applicant is a college grad, has a solid credit score, and has a really good job. Now, during the credit check, you find out that he hasn't paid a bill in the amount of $5,000 for three months, okay? What do you think they found in terms of who approved versus who rejected the application? More, more, did more people reject or approve? Yeah, yeah, what they actually found was only about 30% approved the application. The vast majority said, hmm, this guy's not a good risk, and so they rejected the application. This is a very, very simple condition. What they did was, they were really mean to the second group and made this a lot harder. Now, the applicant is a college grad, has a solid credit score, a stable job; during the credit check, you figure out that he hasn't paid his bill in the last three months, but you're not sure if that bill is for $5,000 or $25,000. These people were given three choices. You can approve the application, you can reject the application, or you can wait until tomorrow, because that's when we're gonna get the additional information. What do you think happened? They waited, yeah. Two percent said he's good to go. Only 23% said, hmm, no, bad deal. Again, the vast majority said, hey, why don't we wait to figure out what the amount actually is? Part two of this very complex decision-making process: the next day, you find out that the amount owed was $5,000. So what do you think they did? Yes, the majority approved, okay? There were others that didn't make the decision, but so this is really interesting, right? We have a couple of factors at play, but let me compare the numbers for you. Regardless of the condition, if you think about it, the amount owed was the same, right?
It was that $5,000. The fact of the matter was that the guy hadn't paid his bill in three months. However, that additional information, that promise of information, made Group 2 kinda waffle a little bit, and it actually impacted the quality of their decision, so they no longer saw the big picture. The big picture is the guy didn't pay for three months, regardless of how much he owed. So that speaks to this idea that we think information's really important, especially if it's missing. Yes, question? Yeah. Yeah, that is a great point. And so this gentleman here said that that $25,000 might have affected their decision-making, and what was that principle that we just talked about? The anchoring effect, exactly. So you can see how all these sort of bleed together. They're not nearly as clean-cut as I make them sound. So again, we wanna make these good decisions, but now we've got these numbers that we use to make our decision, and if information is missing, it really stresses us out, because missing information must be really important. All right. So in terms of comparing the results of the study: in condition one, once again, only 29% of people approved; they were like, no, he hasn't paid in three months. In condition two, approval was much more likely because, again, they were looking at, well, it was actually $5,000 that he owed, not the $25,000. So they anchored on that $25K. Very good point. All right. Again, point number four: people can definitely be nudged. This is according to Richard Thaler of the University of Chicago. He wrote two great books, Nudge and Misbehaving. He's a behavioral economist, writes very, very well. And his idea is that we don't make decisions that are necessarily in our best interests, because of all the reasons that we talked about. You know, if decisions are complex, we tend to let other people or the situation make the decision for us. If information is missing, we'll waffle and stress out about collecting too much information.
And because of that, the situation should be framed in a fashion that nudges people to make the best decisions. And he came up with the term choice architect. And this is definitely a choice architect. If any of you have done any sort of driving out West, you've probably seen this gas station like I have. And it used to stress me out, right? Last chance, this is your last chance for gas and a restaurant for 300 miles. And you may have half a tank, but you may stop simply because of this choice that's been provided to you. So framing is the last sort of point about nudging people in a direction that you would like them to go. I'm gonna throw a picture up here. Now, Colin, Colin, don't blurt this out. And anyone else who has seen this picture, do not blurt this out. This is Colin, by the way. Raccoon suit guy. What do you see in this random group of dots? Does anybody see anything, or does it look like a random group of dots? Oh, some people see it. Does anybody see a dog? Yes. Does anybody see a dog? Are people seeing dogs now when they didn't see the dogs before? Okay, we've got a Dalmatian here. His nose is here. He's, like, sniffing the ground. Here's his body. Here's his legs. Now, the problem is, we have framed this for you, and you probably can't unsee it no matter what. Now, if I tell you this is just a random group of dots, you're still gonna see the dog. So the situation can be framed in a way to nudge you in one direction or another, because, again, we know that people don't necessarily make the best decisions, and people will often default and let the situation take control. So if we are able to frame a situation as social engineers, then this can guide behavior in a way that you would like. Framing. Ladies, do you feel different if I tell you that you're wearing this at the beach versus having to walk down the hallway in your underwear at DEF CON? Okay? It makes you feel different.
It makes you think differently about a situation, affects your decisions, affects your emotional response. If she's in a bikini, she's still very attractive, not a big deal. But if she's in her underwear, I may feel differently about this. Amos Tversky did an interesting study about surgery. Would you rather have surgery with a 90% survival rate or a 10% mortality rate? Now, most of us can do math in this room, and we know that the chance of dying is precisely the same, but clearly people were much more likely to pick the 90% survival scenario. So framing is an important concept when it comes down to it, because, again, we're not always aware of our decision-making processes, and if it is a complex decision, not underwear or bikini, but surgery versus radiation, versus whether or not you're gonna die, versus investing your money in all these complex schemes, think about the load that that creates and think about what our general reactions are to complex decision-making. All right, so to sum it up: we tend to make the same mistakes in the same ways over and over and over again. That makes us relatively predictable, because we have biases. We use shortcuts to make those decisions. As decisions get more complex or important or critical or life-threatening, we don't necessarily get better at them. A lot of times we're overloaded, and we will allow people or the situation or the frame to make the decision for us. We hate uncertainty. I feel like if I had a little bit more time to research phones, I would have made a better decision on the phone that I decided on. You would have kept the old piece of crap. I know, and I would have been happy, because I had the slide-out keyboard that I could use. I also had a full-size, pretty keyboard. It was awesome, and I never made mistakes texting. Absolutely, that's what I feel like, all right?
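[Editor's note: the surgery example is worth underlining, because the two frames describe the identical outcome distribution; they just name the complementary event. A one-line check, using the rates quoted in the talk:]

```python
# Two framings of the same surgery, as in Tversky's example.
survival_frame = 0.90    # "90% survival rate"
mortality_frame = 0.10   # "10% mortality rate"

# Probability of dying implied by each framing.
p_die_a = 1 - survival_frame
p_die_b = mortality_frame

# The frames describe the identical outcome distribution
# (compared with a tolerance to sidestep float rounding).
print(abs(p_die_a - p_die_b) < 1e-9)  # prints True
```

Since the numbers are identical, any preference for one phrasing over the other is driven purely by the frame, not by the information content.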
And lastly, people can definitely be nudged, whether that is to their benefit or their detriment, by the situation and how it is framed, how you frame the situation. Okay, implications for social engineers. Now, here's what's important to understand. Security decisions aren't always easy, even for pros. Clearly, we even make mistakes. The example that I want to use for you is from 1983. Stanislav Petrov was an officer in the USSR responsible for early warning detection of nuke launches. This was when the US and the USSR were in full sort of cold war mode. We were very tense. We were waiting to see who launched first, because the response was gonna be, well, somebody launches, we're gonna launch, and we've got mutually assured destruction and nobody's gonna survive. In September 1983, the Soviet system actually warned of two launches by the US, one with one missile and one with four missiles. Imagine being that dude on alert and having to make a decision about whether or not to report that a launch has been made against the USSR. Fortunately for the entire world, he decided that it was an error in the system, did not choose to escalate the situation, and probably saved us from mutually assured destruction, nuclear war, death, all that kind of stuff; otherwise we wouldn't be here today. So the implication for you all is understanding that making these decisions isn't easy, even for people in security, and it's surely not easy for your populations, okay? So understand how that works and why that works. If you understand how and why people make decisions, you can become a choice architect. Again, if you are in charge of security for your companies, think about ways to make people's decisions easier and safer, and think about the ways in which they're likely to make mistakes, or policies that you have in your companies that force your people to choose or make a decision when really they shouldn't have to. And finally, don't be a jerk, okay?
Now that you understand how all this works, don't use this against your populations unless you've been paid to do so. Yeah. All right, so my final thought to you is: Dirty Harriet wants you to make a bad decision. Always think about the Dirty Harrys out there. [From the clip:] I know what you're thinking: did he fire six shots or only five? Well, to tell you the truth, in all this excitement, I've kind of lost track myself. But being as this is a .44 Magnum, the most powerful handgun in the world, and would blow your head clean off, you've got to ask yourself one question: Do I feel lucky? Well, do ya, punk? So think about that as you go back to your jobs. The Dirty Harrys are always wanting your populations to make a bad decision. All right, folks. If you would like to call me, or, well, actually, don't call me, email me. Or come to our website. I will take questions if any of you have them. Thank you so much. Questions?