Our next talk is the Cyber Threat Intelligence Mindset. Our speaker is Cheerio. She works for a Fortune 500 financial services company. She's a graduate of the 2017 SANS Women's Academy, has an MBA in IT Management, and currently holds just about all the SANS certs: CFR, GSEC, GCIH, GCFE, GMON, GDAT, and GPEN. She's a member of the Financial Services Information Sharing and Analysis Center, the YARA Exchange, and Fuzzy Snuggly Duck. Cheerio serves as an advisor for a cybersecurity apprenticeship program in Chicago, and she's on the advisory boards of SANS EMEA CyberThreat 2019 with the National Cyber Security Centre in London and the SANS Purple Team Summit. So please give a big round of applause and a Blue Team Village welcome for Cheerio.

Hi everyone, thanks for being here on a Friday, because I know there are a million other things you could be doing right now, like pool, or drinking, or God knows what else. I am not speaking on behalf of my employer, just FYI, so don't be calling my director up saying, "well, you know, she said so-and-so's better." If you want to live-tweet this, just use the hashtag #BlueTeam and tag Blue Team Village and myself in it.

Here's my set list: put a ring on it; think about thinking; cognitive biases and fallacies; solutions and ideas; and then there might be some crowd surfing, so be forewarned.

In the immortal words of Beyoncé, put a ring on it, because you already do a lot of the basic CTI analysis. After multiple conversations with different Blue Team people, it seems there are some questions that would be extremely helpful to bring up to a more conscious part of the analysis you do. So I'm here to ask you to commit: commit to performing analysis, asking questions, and using some of the techniques in this talk.

So why think about thinking? It basically boils down to this: we have different perceptions, different experiences, different belief systems. So many different things go into how we think, how we view data, how we perceive it. Some people think in symbols, some people think in words. Here's a great example. Check out these three triangles with well-known phrases in them; this is from Psychology of Intelligence Analysis by Richards Heuer. A lot of people don't see the repeated article. In "once in a a lifetime," for instance, most people don't notice the double "a," because their previous experience with that particular saying fills it in for them.

Our short-term memory holds about seven plus or minus two items that we can actively work with; that's our working memory. With that kind of limitation in your thinking process, there has to be something to help offset it. Our long-term memory forms through associations, models, and analogies. So how can we leverage the way short-term and long-term memory work in order to improve our analysis? Because essentially, experienced analysts have baggage. They come with previous experience, previous biases, preconceived notions, and established ways of performing analysis. Unlearning is a little hard sometimes for people who have been in the industry a long time, or who have been focused on working alerts in the SOC and working with the SIEM. But the good news is that there's no substitute for experience.
We need you; we need the senior folks. So it basically comes down to: what kind of analyst do you want to be? Do you want to be one who focuses on how they're thinking about thinking, or do you want to just continue your analysis the way you're doing it? It's up to you whether you want to step up your game, and here are some tools to do that: logical fallacies and cognitive biases.

Here's a good one: cause and effect. There are physical properties versus everything else, and that's where the fallacy comes in. If an animal walks through the woods, it leaves tracks, right? It makes perfect sense that there are tracks in the mud. But people tend to extrapolate that cause and effect and apply it to pretty much everything else, where there isn't a one-for-one relationship. You can't necessarily apply simple cause and effect to things like the economy, or an APT attack, or anything that isn't "I drop an apple, we have gravity, it falls." Not everything works that way.

Here's another fallacy: analogy. This is an example I pulled that's related to SOCs. Just because a certain type of training works really well for some of your analysts doesn't necessarily mean it works well for all of them. Previous success with something, whether it's alerts, or a way of presenting to the board, or whatever else you do that touches the Blue Team, doesn't necessarily mean it will be a success, or a failure, the next time.

This one's a big one: just because many alerts produce false positives doesn't mean the specific alert you're looking at is a false positive. What I've seen is that alerts sometimes get closed out when they shouldn't be, because analysts assume that a high overall probability of false positives means all of these other alerts are false positives too. So definitely pay attention to that, and to how you're thinking about it (there's a quick back-of-the-envelope sketch of this at the end of this section).

The other one is anecdotal. You're an experienced analyst, you're presented with compelling evidence to take a different path with your analysis, but you say, "no, my experience is more important; I'm sticking with my experience no matter what, and I'm completely ignoring everything that evidence has to say." That's the anecdotal fallacy.

So I took the fallacies that seemed most applicable after multiple conversations with Blue Teamers and put them in language that isn't psychology-speak, CTI-speak, or military-speak, so you can see they have tangible consequences: closing tickets early, doing mass closures on things that probably should not be mass-closed. These also feed into cognitive biases.
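Before moving on to the biases, it's worth putting a quick number on that false-positive fallacy. This is an editorial back-of-the-envelope illustration, not from the talk, and every rate in it is made up: even when the overwhelming majority of a rule's alerts are false positives, any single alert can still carry a real chance of being a true positive, which is exactly why mass-closing is dangerous. A minimal Bayes'-rule sketch in Python:

```python
# Hypothetical numbers, purely for illustration: suppose 1% of events seen
# by this detection rule are real intrusions, the rule fires on 90% of real
# intrusions, and it also fires on 10% of benign events.
p_intrusion = 0.01               # prior: base rate of real intrusions
p_alert_given_intrusion = 0.90   # true positive rate of the rule
p_alert_given_benign = 0.10      # false alarm rate of the rule

# Total probability that the rule fires at all.
p_alert = (p_alert_given_intrusion * p_intrusion
           + p_alert_given_benign * (1 - p_intrusion))

# Bayes' rule: probability that this specific alert is a real intrusion.
p_intrusion_given_alert = p_alert_given_intrusion * p_intrusion / p_alert
print(f"P(intrusion | alert) = {p_intrusion_given_alert:.2%}")  # ~8.33%
```

With these invented numbers, about 92% of the rule's alerts are false positives, yet any single alert is still roughly a 1-in-12 chance of being real. The alert's own evidence should move that number, in either direction, before you close it.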
The first one is anchoring. To sum it up as simply as I can: it's the analyst relying on the first piece of information they receive and then performing all their analysis from that point. Out of all the reading and research I've done, merely having awareness of the anchoring bias, knowing what you're biased toward, doesn't really help you not do it. The way to avoid that particular cognitive trap is to practice something called empty mind, where you come to the analysis, or whatever it is you're doing, without heavily relying on the initial information you were given. The other suggestion I found is that people need to slow down when making decisions. Everything happens so fast, so rapidly, but if you can just take a second, even ten seconds, count down from ten to one, take a deep breath, you can make a better-informed, less biased decision.

The next big one is confirmation bias. That would be me trying to find all the information that backs up my pre-existing beliefs, which is generally not helpful for an unbiased approach to analysis. The way people have gotten around it and managed it is through structured analytic techniques and mental models, which I'll get into in a couple of slides.

This is from the master himself, Richards Heuer: analysts should be self-conscious about their reasoning process. I really took that to heart, and ever since I started working as a CTI analyst, I've asked: how can I be more conscious of how I'm thinking and how I'm interacting with the data and the analysis? Because you don't know how serious the alert or the situation is until you do. And on top of that, which doesn't help the analysis, time is limited. The executives, the managers, the directors, everyone above you, they want the answer yesterday.

So I asked: how can I leverage this idea of a mindset? In CTI-speak, mindsets are generally frowned upon because they're quick to form but resistant to change. I'm proposing the CTI mindset as a technique, not as something you embody: an extra tool in your tool belt, with specific parameters, that you can pull out to cut through the analysis like a knife and feel more confident that your results carry less bias and fewer logical fallacies. Basically: work out your analysis muscles. It takes practice.

Here's a great suggestion: break the problem down into easily digestible parts. When you do that, it's easier to consume. Say you're dealing with some crazy alerts, and there are these hosts and these IPs and these domains and these malware alerts and so on. If you break it down, it's easier to consume and to get a full picture of what's going on. Another suggestion is to get the problem out of your head. In the Structured Analytic Techniques book, Richards Heuer suggests a multitude of options: a decision tree, diagrams, models, mind maps, lists. The ways of getting the data out of your head and into something you can actually discuss with other people are pretty much endless (there's a small sketch of this after this section).

And break up with your habits. At least for myself, I sometimes get caught in a rut where I'll look at a threat report and view it the same way every time, so more often than not I consult with my coworker and get feedback on my ideas and on the route I'm taking with my analysis.
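Here's the small sketch referenced above: one way to break a pile of alerts into digestible parts and get them out of your head. The alert format, field names, and values are all invented for illustration; in practice the records would come from your SIEM.

```python
from collections import defaultdict

# Invented example alerts, standing in for a noisy SIEM queue.
alerts = [
    {"host": "ws-0142", "ip": "203.0.113.7", "domain": "bad.example.net",
     "signature": "Trojan.GenericKD"},
    {"host": "ws-0142", "ip": "198.51.100.23", "domain": "cdn.example.org",
     "signature": "Suspicious PowerShell"},
    {"host": "srv-db01", "ip": "203.0.113.7", "domain": "bad.example.net",
     "signature": "Trojan.GenericKD"},
]

# Break the pile into digestible parts: one bucket per entity type.
buckets = defaultdict(set)
for alert in alerts:
    for key, value in alert.items():
        buckets[key].add(value)

for key, values in buckets.items():
    print(f"{key}: {sorted(values)}")
# Two hosts, two IPs, two domains, two signatures: a picture small enough
# to hold in working memory, and a starting point for a mind map or diagram.
```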
Observe your thoughts. The way I've found easiest to do this is to pretend I'm in a movie theater, just watching my thoughts arise without being attached to what they are, so I can be more aware of how and what I'm thinking and how it's impacting how I do the analysis.

So here are some questions you can ask yourself as a blue teamer working alerts. The first one: what hypotheses am I forming? One, two, three, four? Am I focusing on just one and feeding all the evidence and data toward it, or are there competing hypotheses? That's a good way to bring to your conscious mind how you're performing the analysis and what you're thinking about while you do it. Another good question: am I looking for data to prove or to disprove the hypotheses? Depending on how you answer that within yourself, you'll learn something: if you're trying to prove your hypothesis, that can lead to confirmation bias, and the way to offset it is to try to disprove the hypothesis instead. Others: what assumptions am I making? What information am I ignoring? Am I juggling more than seven plus or minus two items, and if so, how can I get this out of my head and onto paper so I can have conversations with my other team members?

Something helpful that I've read in pretty much everything about the analysis of competing hypotheses and structured analytic techniques is to actually put something on paper. There's a story about physicians who said, "we don't need a checklist, we're doctors." And the nurses said, "no, no, no, we're going to run a checklist every time you engage this patient," and just having that checklist, making sure every single step was met, magically saved hundreds and hundreds of lives. If it's beneficial in healthcare, I believe it can be beneficial for our blue team analysts as well. We may not be saving lives, although if you work in healthcare, you might. (There's a small checklist sketch after this section.)

Another one is brainstorming; I do this all the time, coming up with potential solutions with my coworker. Another is starbursting, which is kind of the opposite of brainstorming: you come up with questions surrounding a particular situation, and you can do it solo or with a group.

This is a little tidbit I added. I don't know if you've ever heard of Gestalt therapy, but I've found it really helpful whenever I get kind of set in my ways. I'll pretend I am my coworker right now, like trying on a different hat, or pretend they're sitting across from me and we're having a conversation, kind of like how programmers use the rubber duck technique. You have to do what works best for you. I believe there's also something called the Six Thinking Hats technique, where you put on a black hat and look at things from a negative perspective, then a yellow hat where everything's bright, sunny, and optimistic and everything turns out fine, and so on, stepping through the data and the analysis from different perspectives.
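Here's the checklist sketch promised above: a hypothetical pre-closure gate built from the questions in this talk, in the spirit of the medical checklist story. None of this comes from a real ticketing system; the function and structure are assumptions for illustration.

```python
# A hypothetical pre-closure checklist, modeled on the questions above.
CHECKLIST = [
    "What hypotheses am I forming? (list at least two)",
    "Am I trying to prove or disprove them?",
    "What assumptions am I making?",
    "What information am I ignoring?",
    "Am I juggling more than 7 +/- 2 items? If so, what did I write down?",
]

def ready_to_close(answers: dict[str, str]) -> bool:
    """Return True only if every checklist question has a non-empty answer."""
    missing = [q for q in CHECKLIST if not answers.get(q, "").strip()]
    for q in missing:
        print(f"UNANSWERED: {q}")
    return not missing

# Usage sketch: the analyst (or a ticketing macro) fills this in.
answers = {q: "..." for q in CHECKLIST[:4]}   # fifth question left blank
assert not ready_to_close(answers)            # ticket stays open
```

As with the doctors, the value isn't in the software; it's in forcing every step to be acknowledged before the ticket closes.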
This also feeds into cultures and nation states. For instance, my family's from Mexico City, and I can't expect people who don't have family from Mexico City to understand my cultural background and what I grew up with. That's kind of how it is with nation states. I'm not going to try to guess what China and its culture are thinking; same with Russia and a lot of other cultures. It's one thing to put on different hats and see things from different perspectives, but under no circumstances do I recommend trying to put yourself in the perspective of a nation state, because their culture is so different and so foreign, at least from my own. I don't want to make those mistakes.

The other suggestion is to think in graphs, not lists. Some research found, and I actually have a link to it, it's from Microsoft, smart people, that red teamers tend to think in graphs, whereas blue teamers tend to think in lists. So they said: hey, blue teamers, if you start thinking in graphs, you can get a better idea of where the red team people are coming from. I'll illustrate this: you have your WAF, you have your SIEM, you have your proxy, you have this and that, and it's all listed straight down. But when you think of it as a graph, it expands, kind of like Maltego, if you've ever looked at Maltego. Yes, graph analysis. So you think: okay, I have this device, whatever it is, but how could it be pictured as a graph in my mind instead of a list? You think of all the things connected to it and how you could pivot from it to something else. Just think of pivoting, for lack of a better explanation. (There's a small graph sketch at the end of this section.)

Don't fence yourself in: pick multiple hypotheses. This is how a lot of the biases that come up in analysis get dealt with. Pick at least two or three, however many you want, and leverage them to approach the data and the analysis from a different perspective than you initially had. You want to build evidence both for and against your hypotheses, and this, of course, helps with the biases.

There's a matrix you can build. For instance, say you have four hypotheses. For each piece of data that comes in, you mark it as consistent, inconsistent, or neutral, meaning it doesn't bear on that particular hypothesis. Then you make a really pretty matrix and map it all out so you get the full picture, and you attempt to disprove your hypotheses. This, of course, is the analysis of competing hypotheses (ACH). Now say you have all this evidence, all these critical items, and you come to a conclusion. How dependent is that conclusion on those particular critical items? For instance, if one of the critical items turned out to be wrong or misleading, or someone had a different interpretation of it, would it completely change your conclusion? That's something to think about when you're dealing with analysis. Another practice is to discuss the likelihood of all the hypotheses, not just your favorite one, not just your pet hypothesis: all of them. Another is to identify clues that events are taking a different course than expected. A great example of all of this is the Digital Shadows ACH breakdown of WannaCry. It's really quick; you can Google it, and they lay out basically everything I just said with a tangible example. I reference it from time to time, when I need to understand something better or get a different idea. They inspire me. Thank you, Digital Shadows.
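Here is a minimal sketch of that consistency matrix, in the spirit of Heuer's ACH; the hypotheses, evidence items, and ratings are all invented for illustration. The key design choice is that hypotheses are ranked by fewest inconsistencies, because ACH scores by disconfirmation rather than confirmation:

```python
# Ratings: "C" consistent, "I" inconsistent, "N" neutral / no bearing.
# Hypotheses and evidence are invented for illustration.
hypotheses = ["Commodity malware", "Targeted intrusion",
              "Misconfigured scanner", "Insider activity"]
evidence = {
    "Beaconing at fixed 60s interval": ["C", "C", "I", "N"],
    "Only one host affected":          ["I", "C", "C", "C"],
    "Traffic to known sinkhole":       ["C", "I", "I", "I"],
    "Activity during business hours":  ["N", "N", "C", "C"],
}

# Heuer's heuristic: the strongest hypothesis is the one with the FEWEST
# inconsistencies, not the one with the most confirmations.
scores = {
    h: sum(1 for ratings in evidence.values() if ratings[i] == "I")
    for i, h in enumerate(hypotheses)
}
for h, inconsistencies in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{inconsistencies} inconsistencies: {h}")
```

The sensitivity check falls out of the same structure: flip one critical rating and re-run; if the ranking changes, your conclusion depends heavily on that one item.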
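And here is the graphs-not-lists sketch mentioned earlier: the same inventory you might keep as a flat list, stored instead as an adjacency list with a pivot walk, Maltego-style. Every entity name here is invented.

```python
from collections import deque

# The same inventory you might keep as a flat list, drawn as a graph:
# each entity points at the entities an attacker (or analyst) can pivot to.
graph = {
    "WAF":            ["web-server"],
    "web-server":     ["app-server", "proxy"],
    "proxy":          ["workstation-42"],
    "app-server":     ["db-server"],
    "workstation-42": ["db-server", "file-share"],
}

def pivots_from(start: str) -> list[str]:
    """Breadth-first walk: everything reachable by pivoting from `start`."""
    seen, queue, order = {start}, deque([start]), []
    while queue:
        node = queue.popleft()
        order.append(node)
        for neighbor in graph.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return order

print(pivots_from("WAF"))
# ['WAF', 'web-server', 'app-server', 'proxy', 'db-server',
#  'workstation-42', 'file-share']
```

The list view shows you seven devices; the graph view shows you that a WAF bypass is five pivots away from the file share, which is much closer to how a red teamer sees it.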
So this requires practice; it's not something that will just happen. You have to think about thinking in order for this to work. You can't just say, "oh, I think I'll start observing my thoughts," and apply it directly; you have to work up to it. It's a skill you develop over time. A good way to start, if your organization is up for it, is scenario-based exercises; that's what's suggested as the best way to actually internalize the information and leverage it in your day-to-day analysis.

I'm also suggesting getting a referee, of sorts, for tabletops. I've found that office politics creeps in a little and hinders people from speaking up and engaging with the tabletop, for fear they'll be made fun of, put down, or talked about for years because they couldn't remember some stupid command. So suspend the hierarchy, suspend the bullies, suspend all of that if you can, and just have a candid, honest conversation with the people in your organization you want to help become better analysts. Basically, Beyoncé will hurt you if you don't. So don't be hating on your teammates, and remember it's teamwork: all of us working together, all of us in here who aren't the bad guys, because I know bad guys come to this stuff too, working to protect our organizations. And if your organization is heavily steeped in a toxic environment, then reach out to a different blue teamer so you can build the skill set, move on to a different organization, and provide value that way.

So, in summary: think about thinking to up your analysis game. Use the CTI mindset as a technique. Ask yourselves questions when you're performing analysis. Attempt to disprove your hypotheses. And practice, practice, practice.

I have a little game here. Anyone who knows the word in the blank when I read it, just shout it out, and I have stickers, so you can come up at the end and I'll hand them out. Are you ready? Okay, yes, that was right. Number two, yes. Number three, observe your thinking. Yes, you got it. Number four, think in... Yes. Number five, pick multiple... Yes, that's right. And here's a little bit of Taylor Swift, because you want to take a picture, and that's it. So do you have any questions at all? If not, we're done. Thank you.