 and welcome back to the Career Hacking Village. We talk a lot about doing technology, but we never really talk about how we present, how we get things moving forward. And I'm really excited, I was really excited to see the proposal submission from my friend Peter Keenan to talk about hacking security leadership. So Peter, take it away. Thanks so much, Kathleen. And thanks everybody for watching. Yeah, this is my first DEF CON presentation, so thanks for accepting it. All right, the obligatory who-am-I slide. I have been in the information technology and information security space for about 30 years. You can see from these pictures there are at least three decades of bad haircuts there. I'm currently the chief information security officer for a global financial services company. You all are hackers, you can figure it out if you really want to, but I'm not allowed to use the name, so I'm not going to. Prior to the five and a half year stint I've had there, I was the global head of information security governance at Citigroup, where I owned a bunch of things like global information security policy. I was a director with PricewaterhouseCoopers in their information security consulting group. I owned a firm that did information security consulting for about a decade, primarily for military and intelligence agencies, and spent a little time working in government service before that. Like I said, a lot of time in the space, and I have hopefully boiled down what I've learned in the last 30-some-odd years into about 30 slides, which sounds terrible when I say it that way, that I'm getting like one slide a year, but sadly it's true. Yeah, two kids, 24 and 21, 31 years of marriage. All right, so I'm going to start with this CISO mind map. I think it's a great slide. I think this chart is really informative about the things that somebody in my position is thinking about on a daily, weekly, monthly basis. The area of this mind map that I'm going to focus on is selling information security. I think it's the area that people struggle with most as they transition their career from individual contributors or engineering types or very technical roles to a leadership role. And it's probably the most important part of information security. If you boil information security down to its core nugget, the thing that probably describes it best is influence without authority. The ability to drive security through an organization without directly being able to control people's comp or their performance reviews and all of those sorts of things is really the challenge that we face. I mean, some things we have direct control over, but by and large, we don't. And we need to be able to convince people that this is the right thing to do, for their own personal gain, the company's gain, society's gain, all of those things. And that's why I think that's probably the most important function within information security, because it's a force multiplier. If we can get the folks in the company all pulling in our direction, it multiplies our effectiveness dramatically. The thing I don't like about this part of it here, and maybe it's just me misreading it, because I think the authors did a great job, is that it feels like it's a lot of top-down selling, right? And I think that that's good and it's important.
And it's what lets you keep your job as a CISO: convincing the board and the CEO that you're doing a good job and that security is important. You've gotta do that. But if you actually wanna fix security at an organization, you've gotta sell it from the bottom up, because it's the people on the ground, the people at eye level, who are actually doing the things that will make you more or less secure. And you've gotta convince them that these are the right things to do and these are the changes they need to make in their processes to be better. And you've gotta provide them with ways of doing it that make them more efficient, not less efficient. And that's a hard sell in a lot of cases with security because, being frank, we're quite often introducing overhead to do things securely, right? I mean, the act of typing a password, as opposed to just being on the system, introduces overhead. But that's short-term overhead; not having passwords obviously creates a lot of long-term problems. Amazon S3 buckets, yes. All right, so jumping ahead: the key language of business, and again, something that for me is one of the really big differentiators between an individual contributor and a leader in the information security space, is the language of risk. Business leaders understand it. They may not understand your specific technical domain, and they may not understand what a router or a switch is, but they understand the language of risk, and it's a language we can meet them in the middle on. We can meet in the middle with lawyers, with accountants, business people, the CEO; they will all get the idea of likelihood and impact. It's a problem for us in technical fields. The words that I've highlighted on this page, uncertainty and probability, we struggle with those as a group. We think about these things in absolutes, like engineers, and I know that's a rough way of saying it, but too many times it's really a struggle to drag an estimate out of somebody who's in a technical discipline. They want to have the right answer. With an estimate, you're basically saying, I'm giving you the wrong answer, but I hope it's close. And for somebody who's got engineering training or mathematical training, that makes the hair on the back of their neck stand up. What do you mean, I don't know the answer? There's no right answer, I'm just putting my thumb in the air, and I'm putting that in front of the CEO? Yeah, you are. That's what risk is: you take the best educated guess you can, sometimes, about the likelihood, which is a soft number, and the impact of these events occurring. And that's the way that you have to describe them. I often hear people say this phrase and it drives me nuts: you can't predict the future. With 100% certainty? Absolutely, you can't. But why else would we eat right and exercise? Because we don't think we're gonna die in the next hour. We don't think we're gonna die tomorrow. We don't think the world's gonna end tomorrow. We don't think they're gonna have miraculous cures for diabetes and obesity tomorrow. That's why we save money, because we think that we're going to live to an age where we're gonna need money. There are a million predictions about the future we make every day, all of us. And I think we just have to get comfortable with that. You can't predict it with 100% certainty, but that's okay. You have to be able to get comfortable with the uncertainty and probabilities and presenting your risks in that way. And yeah, it's gotta be done.
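To make the likelihood-and-impact framing concrete, here is a minimal sketch of the arithmetic behind a risk statement. The scenarios, probabilities, and dollar figures are invented purely for illustration; they are not numbers from the talk.

```python
# Hypothetical expected-loss math: likelihood x impact, nothing fancier.
# Every number here is made up purely for illustration.

scenarios = [
    # (scenario, estimated likelihood this year, estimated impact in dollars)
    ("public website defaced",       0.30,    50_000),
    ("ransomware on file servers",   0.10, 2_000_000),
    ("cloud storage bucket exposed", 0.20,   500_000),
]

for name, likelihood, impact in scenarios:
    expected_loss = likelihood * impact  # the "soft number" you put in front of leadership
    print(f"{name:30s} ~{likelihood:.0%} chance x ${impact:,} = ${expected_loss:,.0f}/yr expected")
```

The point isn't precision; it's that a lawyer, an accountant, and a CEO can all argue with those two inputs, which is exactly the conversation you want to be having.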
If you look at the way businesses report these things, they all know they're not 100% accurate, and people yell at you when your risk estimate is way off, but at the end of the day, you were clear it was an estimate. All right, two other things. Well, the last thing on this one — this is one of my pet peeves about information security folks when they're talking to non-security folks, to lay people: FUD is the biggest thing that drives me nuts. And you see it. The sales people do it, the engineers do it when they get frustrated. They just throw out these FUD bombs of, well, you never know, could be the Russians, could be the Chinese — just stop, don't do it. It may work in the short term to win that short-term argument, but you will lose respect from everybody in the room. And one of the things that defines you as a leader is being able to hold and command that respect over time. All right, risk strategies. These are some of the terms that you have to understand if you're going to communicate in risk. Risk reduction — this one's pretty straightforward, right? I have 1,000 systems that are MS08-067 vulnerable. I patched 999 of them. Have I reduced the risk? Yeah, probably, right? That's risk reduction. There's still a risk out there that I missed one, or that the patch isn't fully effective, or somebody didn't reboot after the patch, all of those things, but that's what it's about: reducing the risk associated with some adverse event out there. Acceptance — this is another one we struggle with as technology folks. Sometimes the business is just gonna go, yep, that's okay, I get it. You say there's a 10% chance that this website will get hacked? I'm okay with it. It's only up for 30 days, I'll take that risk. And we all go — you know, it makes our heads explode — but absolutely, that's their call. Your job is to identify, quantify as best you can, and report that risk. It's their job to say yes or no to that risk, right? The CEO, whoever's in charge of that business function, they get to decide that. And I know there's a great cartoon out there — what's the magic phrase to get what you want? Risk accepted, right? And there are some people who will overuse it, but in a well-functioning organization, that's how it should work. Avoidance — we're gonna go through some examples later, but the classic one is just don't be in the line of business that's gonna create that risk, right? Don't launch the website that tweaks its nose at some hacker collective, any of those sorts of things. Avoidance is pretty straightforward. And there are times when we're gonna wanna advise that, but again, the business may just say, yeah, I understand, avoidance is probably better, but we're not gonna do it because there's too much of a gain for the business. And I've seen a million cases where they were right. I counseled, don't do it, and they said, well, we're gonna make $300 million off of that, so we're gonna do it anyway. And, you know, they were right. Transfer — risk transfer. This is the classic cyber insurance, right? I think there's a lot of uncertainty around cyber insurance at this point, but this is the classic way of dealing with risk, and I think it will get better over time. My last one is probably the most common in this area, and that's hope. You just hope you don't get hacked. And you see it over and over and over again.
Every day you see tons of people putting stuff on the internet and just hoping that bad things don't happen. All right. So I have a cat and she shares this office with me, so there's nothing I can do — yeah, there's no risk avoidance on that. This was her office for the last five years and I've only been working here for the last couple of months, so she'll pop in and out of frame in the background. Her name is Schrodinger. All right, so risk appetite and tolerance. These are some other terms that you're gonna need to know to speak the language of risk. Risk appetite is the amount or the type of risk that your organization is willing to take. Thank you. Every organization is gonna have a different risk tolerance level. It's very much organization-specific, right? Financial services tend to be more risk-averse. Startups or tech companies tend to be more risk-tolerant, where they'll just take chances — particularly if you're a startup, right? I mean, what have you got to lose? A lot less than a hundred-and-something-year-old global investment bank, right? Those are the sorts of decisions that they're gonna make. Your job is to understand what the risk appetite of these folks is, and the description below is great, because this is the tricky part. If you could just ask, what's your risk appetite, and people could say seven, and you'd do seven — but there is no numeric value like that. Everybody's gonna have a different definition, and you're gonna have to speak to a lot of people, and you're gonna have to deal in fuzzy terms to get a sense for how people wanna deal with this, right? Whether they wanna express it as a dollar value, or as none at all. The CEO's gonna be like, no — what do you mean by getting hacked? You're asking how often I wanna get hacked? The answer is never, right? I can get hacked for free. What am I paying you for if we're gonna get hacked? The CFO is probably gonna set aside some money for this and say, yeah, I put aside $8 million a year, so if we're gonna exceed that, let me know and we'll budget some more for next year. And everybody else is gonna have a different view of it. Your job is gonna be to consolidate all of that and come up with security controls that are appropriate to those risks. Yeah, thank you, thank you. All right, risk impact levels. This is an interesting diagram, and I think it's a great way, after you've had all those conversations, to communicate this back to senior leadership and say, based on what I heard from all of you — and this is just an example from the site that's listed below — these are the sort of numbers or metrics I've come up with to calculate our risk levels, right? Whether something's a low impact, medium impact, or high impact. And get everybody to just sort of nod yes so that there's at least a common understanding of the terms — a lexicon that will help you build common understanding. So when a bad event occurs and everybody sort of freaks out initially, you can go back to this and say, listen, this is what we agreed to, this is in line with what we've communicated to the board and to the CEO, and it gets everybody back to at least a level set. All right, managing cybersecurity risk and preparation. This is just a framework that I like to use for how you would do all of these things. Gotta build a team, and diversity on that team is so important, from all aspects of it, right?
The type of people, the type of education, the type of functions that they've served — the more viewpoints that you have on your team, the better you're going to be. And this is another thing that I see a lot of technical folks do, and I've been guilty of it plenty in the past: we hire people that look a lot like ourselves, right? And I don't necessarily mean that physically. I'm an infrastructure guy, that's where I come from. I love routers and switches and networks, and early on in my career, I thought those were the people that knew the most, because they understood the things that I understood. And as I've grown, I think I tend to gravitate more towards people who know things that I don't, and I just ask them questions every time I meet with them. And I think that maybe makes for a better leader. And I think that's probably the most important part of building a team when it comes to risk, because you've got to understand it from every angle if you're going to manage it effectively. Structure: you've got to assign formal responsibility for these things. And I know a lot of people say, risk is everybody's responsibility. Yeah, but if you say it's everybody's job, then it's nobody's job. It's got to be somebody's job, it's got to be somebody's responsibility, and you've got to give them the freedom and the ability to actually manage around that risk. Managing the risk — we talked about a bunch of these things already: understanding your company's cyber risk profile, how likely you are to get attacked, how often, who's going to come after you, what your risk appetite is, what the impact of getting hacked would be. Is it the type of business where Congress is coming after you? All of those things have to factor into your management of risk. Get as many viewpoints from the outside — law enforcement, your peers, industry groups — as you possibly can, and then constantly update your profile and your processes so that you're integrating the latest and greatest techniques for managing this stuff. The last part of this is probably the most important: what do you do when you're wrong? When you've calculated all of these things and the worst still happens, have a crisis response plan. This year, if anybody is arguing that point, you've got to go, what are you thinking, right? We've had all of this craziness this year. I live in New York City and we were waiting for locusts, right? Everything went wrong. And if you can't sell this now, you should probably think about moving jobs. This is as important as anything could be at this point: what do you do when you're wrong? All right. So I'm gonna jump into a little bit of psychology, and I think this is one of the areas, again — I think technology people, and I hate making generalizations, but I feel like I'm talking about myself here, so I've lived with this my whole life — we're not great sometimes at empathy, and we're not great at understanding people's motivations. We know what they did, and maybe technically we know why they did it, but we don't know emotionally why they did it. And I've spent probably the last 15 years really focusing on trying to develop that part of my personality, so I can understand why people make the decisions that they do. Instead of just going, what the hell is wrong with you, what were you thinking, I'm actually trying to ask, hmm, that's interesting — what were you thinking?
So this chart — I think it came from a Bill Gates presentation a while ago — I love it. If you ask people what they're worried about, what scares you, they'll say terrorists, murderers, all these things that are very, very rare. Sharks — people are worried about sharks, not kidney disease. The chances of getting killed by a shark? You stand a better chance of buying two winning lottery tickets while getting hit by a meteor than you do of getting attacked by a shark. If you look at this chart, you can see some of the reasons for it, right? The media tells you. The media doesn't report about people having heart attacks; they report about terrorism. Terrorism sells newspapers, right? And it's such a rare event. Those aren't the things you should be worrying about. And in the information security space, there's lots of this, right? People talk about Stuxnet a lot. There's one Stuxnet. There are APTs out there, and some of us have to deal with that, but it's really rare. Chances are you're gonna get owned by a mediocre ransomware crew. Overwhelmingly, by probability, that's who's gonna own you, and we certainly had a lot of evidence of that this year. And I think there are a couple of biases at play here that I'm gonna talk through in a little bit more detail. A public service announcement on this stuff, though: it freaks people out. When you reveal to them their cognitive biases, it pisses them the F off. People will get really, really mad at you when you highlight this stuff to them. They killed Socrates for it, basically. So this is not new, but it's absolutely true: people do not like to deal with the fact that they have cognitive biases. So public service announcement — use this internally, in your inside voice; don't necessarily communicate it to the folks in the room. I love this graphic up in the corner here, though. People have been talking about how social media is being used to change our politics, and this — most of you are probably too young to even know what a TV Guide is — but this TV Guide from the 1980s basically had the same message about television, right? It's not new. The foreign state actors are controlling the elections — eh, old hat, we've seen this a million times. All right. So if I can get my presentation to work again. All right, here we go. Why are people so bad at estimating risk? The optimism bias. This is the first of the biases that I'm gonna talk to you about. It's really funny — it's not really funny, I guess — but when you ask people, what are your chances of getting cancer, estimate your percentage chance, they'll say something like 10% or 60%, because most people have no idea. The answer is 30%, right? We have so much data on this. We know how many people get cancer, we know what percentage — it's pretty solid, right? 30% of people get cancer. But if people underestimate and say 10%, and then you give them the answer and say it's 30%, they'll go, nah, for me it's 10%. You go, why? I just feel that way. But if they say it's 60% and then they learn it's 30%, they'll be like, oh, no, yeah, no, I'm better than that. I'm like 20%.
They'll adjust dramatically if they overestimate, but not if they underestimate. And I think this bias carries through in all aspects of our lives, where nobody thinks that they are below average at anything. And this is a psychological finding you can reproduce over and over again: 90% of drivers think they're above-average drivers, right? Which is, you know, demonstrably false, of course, but very few people will admit they are a below-average driver. And anybody who has spent any kind of time on New York City's roads will know that it is definitely not true, particularly among the taxi set. So how does this translate to information security? I've linked to a report that I did some work on — I was part of the advisory team for this ESI ThoughtLab piece, which was published about a month ago. It's a great report. I'm not just shilling it because I was involved with it; I actually think it's great, but you can read it and make your own determination. There's some great data in there about what the chances are of having a breach. They did a lot of great work looking at the entire universe of companies and what percentage of them actually suffered breaches during the previous year. And if you go with the moderate or material level, people estimated their chance at around 45%. The real number is way higher — almost double that for a lot of industries. So people do not believe they're gonna get hacked. Even after they've been hacked, they will still believe their chances of getting hacked again are way lower than they actually are. And it's just a cognitive bias that you have to get people past. Why are people so bad at estimating risk? Availability bias. This is the second one, and here are a couple of my favorite examples of it. Australia's national terrorism threat level is "probable." And if you look at the data, I think something like two people have been killed in terrorist incidents in a 20-year period in Australia. It's like, I don't think you know what the word probable means. These are terrible things, terrible tragedies, every one of them horrible and they shouldn't happen, but that's not what the word probable means, right? And I think we struggle with things that are emotional and impactful like that — estimating them in cold, calculated terms so that we can get an accurate perception of risk. Nobody wants to say, that's ridiculous, a terrorist attack is not probable, right? It just doesn't feel good, emotionally, to say things like that. And there's a bunch of other examples here. The shark attack one is my favorite: 186 deaths in 20 years, among six billion people. I don't even know how to describe that, it's so infinitesimally small. I guess I could do the math and actually come up with a number, but it's really, really small. And if you look at the huge warnings there were about Zika — the actual risk was very, very small. We have to keep those things in perspective, and this is what keeps us from getting into the FUD area when it comes to information security. Look at the numbers, look at the data, and figure out what the real risks are, what really is probable, what really is likely, and what really isn't.
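Since the speaker says he could "do the math" on the shark figures, here is that arithmetic spelled out, using the numbers quoted above (186 deaths over 20 years) and a round six billion people as a ballpark world population — the population figure is an assumption for illustration.

```python
# Back-of-the-envelope odds of dying in a shark attack, using the
# 186-deaths-over-20-years figure quoted above and a round six billion
# people as a ballpark world population (an assumption, not a precise census).
deaths = 186
years = 20
population = 6_000_000_000

deaths_per_year = deaths / years             # ~9.3 deaths per year worldwide
annual_odds = deaths_per_year / population   # ~1.6e-9 per person per year

print(f"{deaths_per_year:.1f} deaths/year -> roughly 1 in {1 / annual_odds:,.0f} per person per year")
```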
And yeah, the good news is we generally don't have to deal with people dying and all of that kind of thing. It's just a computer getting hacked and some guy losing money or something, so it's not as serious and it's not as emotional — but it still is. If you get hacked in a material way, people lose their jobs, right? That's sort of the way that works, and people will get emotional and defensive about it. Like I said, when you expose a cognitive bias, it makes their head explode and they freak out on you. All right, learned helplessness. This is the other one. Oh my God, I see this one so often. Why are we patching? Everybody gets hacked anyway. What's the deal? Who cares? It's not like I can stop these hackers. There's a big group of people who think the hackers are all-powerful and can't be stopped anyway, so why are we bothering? Every time I teach information security awareness, at least one person comes up to me and says, why are we bothering? Everybody's getting hacked anyway. You can't stop people from clicking. Why are you bothering teaching this? And it's just the wrong way to look at this, right? Nothing is ever perfect. Nothing in life is perfect, but you can make it better. And that's the message that you have to communicate to folks: it's never gonna be perfect, but we can get better every day. I'm not sure whose quote that was at the end of this, but it's great: you can't spend your way out of cyber problems. It's like exercise and eating right — you just have to wake up every morning and do it. And that is absolutely the truth. You can't buy enough things to fix cybersecurity. You actually have to do the work. And that's hard for people to accept. It takes a while to get people over that, but there's no "I'm gonna patch and be done." You're gonna patch today and you're gonna patch tomorrow, and they're gonna feel like Sisyphus pushing that rock up the hill and waiting for it to roll back down. But that's sort of the game. Every Patch Tuesday is the Microsoft Sisyphean rock coming back down the hill to roll over you. This is so weird — I feel like I should stop and ask for questions, but there's nobody to ask questions. All right, so we're gonna keep going. Anybody who hasn't read Cliff Stoll's book, The Cuckoo's Egg: please stop listening to me right now, go buy the book and read it. This is absolutely required reading for anybody who wants to develop a career in information security, particularly if you wanna be a leader. It's still unbelievably prescient — this happens in the 80s, and people are using analog modems and dial-up terminals and TTY green screens, the kind of stuff that I grew up on — but it still really, really applies today. All of the same principles apply. And he had a great quote from a SANS conference not too long ago, this was in 2017, where he was talking about how he found these guys in his system and was trying to convince people they needed to do something about it. And he said, well, I thought all I had to do was show them the data and they'd understand. But it turns out I had to tell a story, and that's what people will respond to — the story. And absolutely, he learns that lesson, and I think as we read the book, we do too. It's a great read, cannot recommend it highly enough.
Absolutely, you should go buy and read that book. All right, here are some thoughts I had about selling security to IT. We think, well, we're all technology folks, we've got a similar mindset, so this should be easy. And it's not impossible, but it has its challenges. Very different mindsets. IT people care about uptime, costs, and user experience. That is what they get comped on, right? That's what they get paid on. What was the uptime? Did they deliver the features? Are the users happy? I don't wanna say they don't care about security, but it's not one of their goals, right? They'll tell you it's one of their goals, but it's really not; it's one of the things that's adjacent to their goals. And we have to convince them that security will help those three. And there are a few strategies to do that. You gotta get them over the hurdles of some of these things here: I got a firewall. I bought a firewall, I plugged it in, I'm secure — but there's an any-any rule sitting at the front of it. Why would they sell me a firewall that isn't secure? Why would Cisco do that to me? That's not how firewalls work. That's not how life works. But yeah, you have to do the work to get them through that. They'll tell you about their audit and regulatory reports, all of those sorts of things. And your strategy for selling to them is going to be to show them — it's gonna be a story, but it's gonna be a story with data, right? I think pentests are a great way to do this, where you show this guy walking through their environment, and you gotta make it as close to real life as possible. There's a big part of this where you're gonna need to convince them that that data is not manufactured, all of those sorts of things. And you're gonna have to partner with them, right? You can't throw them under the bus. You can't sell them out to audit. Sometimes you have to, but generally you want those to be your last resorts. You wanna help them and enable them to make themselves more secure without affecting their uptime too much, without affecting their ability to deliver features, and without driving their costs through the roof. I think this is one of the things we're really bad at in security: we push that cost onto them. Hey, we were cost neutral this year, but we caused IT to double their budget. The strategies: like I said, fear, uncertainty, and doubt — awful, don't do it. Rat them out to audit, rat them out to management. The availability bias — just put it in their face every day; that kind of works, believe it or not. Demonstrate their fallibility — this is one of those things that will help with the optimism bias. If you demonstrate clearly that they are capable of making mistakes, they'll be angry at first, but generally, if they're professionals, they'll get over it and wanna be better. Metrics — these are hard to argue with. Patching metrics, fantastic. Just cold hard facts about patching, and then a story to go with it that says, here's MS08-067, here's MS17-010, and if you have these open, these are the horrible things that will happen to you. And here's how many systems we have open — and show the trends so that people get a story out of it. Pay for it — put it in your budget, the more tools that you can get into your budget. And listen, right now we've got a huge advantage in security: cyber is the cool thing at the board table. If you ask for money, it's very hard for them to say no. If IT asks for money, it's really easy for them to say no.
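Going back to the patching-metrics point above, here is a rough sketch of the kind of trend-plus-story reporting being described. The host counts and months are invented; only the bulletin names (MS08-067, MS17-010) come from the talk.

```python
# Hypothetical monthly snapshot of hosts still missing two well-known patches.
# The counts are invented; the point is the downward trend, not the numbers.
snapshots = {
    # month: (hosts missing MS08-067, hosts missing MS17-010, total hosts)
    "2020-05": (14, 212, 5_000),
    "2020-06": (9, 130, 5_000),
    "2020-07": (3, 41, 5_000),
}

for month, (ms08_067, ms17_010, total) in snapshots.items():
    covered = total - ms17_010
    print(f"{month}: MS08-067 open on {ms08_067} hosts, MS17-010 open on {ms17_010} hosts "
          f"({covered / total:.1%} of the fleet covered for MS17-010)")
```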
So budget is one of the ways that you'll win people over. And of course, you're gonna have to have air cover from the board and senior management. All right, selling security to the board. This is a challenge, right? I think it's not as much of a challenge now as it used to be. Boards have really focused on this; they all know this is a risk, and they wanna know that somebody's covering them and making sure they're doing — not the bare minimum, but the things that they need to do to make the company secure from cyber attacks. Most of them don't have a technical background. Almost none of them have a cybersecurity-related background. And you're gonna need to figure out ways to communicate to them in fairly simple terms. This is the way that I do it: I've got these pretty simple four questions, and I find they really help me get the message across in a succinct, one-slide presentation. I usually have some details behind it, but I don't wanna inundate them with data. I know some people's strategy is to throw everything in the deck so you're ready for any question. I think these are the only four questions that a board member really cares about, but every board's gonna be different. So: are we compromised right now? This is one that scares security people — like, how would I know? You're gonna give them a high, medium, or low, and some reasoning for why you think that. How vulnerable are we? Again, this is even fuzzier — high, medium, or low. Who do you think might attack us, what would they use, and how are we defending against it? Just a couple of sentences. And because there's so much uncertainty in this, how are we gonna address that next level of threats? And this comes early on in the presentation. I love those old maps with the dragons at the edge of the earth, the falling off the end of the earth — the uncertainty bit. That's why they used to draw those, right? Because nobody knew what was on the other side of those maps, and that's fun. The dragons at the edge of the map are fun. And then: what's our plan if we do get compromised? Like I said, you have to have thought through this, and you have to be able to communicate it. Some of the things I've got at the back end of this deck are examples of folks who didn't, who struggled with that. All right, I'm gonna move along a little bit here. Here are some other examples of things that you can communicate to the board. Metrics: agree on metrics, make your metrics consistent, and show a trend over time. That's what they're gonna understand. They're not gonna know what an attempted attack means, but they're gonna wanna know: am I getting more, am I getting less, are they getting better, are they getting worse? Those are the sorts of things a board and senior management are gonna wanna understand. I think this is another great way to report risk: what's the high, medium, or low likelihood and impact on our core risks? What are those core cybersecurity risks that we face? What's the trend? And what do you view as the residual risk after the controls you've put in place? That's the difference between inherent risk, which is the risk as it stands naturally, and residual risk, which is the risk after the controls and mitigations you put in place to deal with it.
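One common way to put numbers behind that inherent-versus-residual distinction is sketched below, with invented figures and the usual assumption that controls knock the likelihood down by some estimated fraction; this is a generic illustration, not a formula from the talk.

```python
# Hypothetical inherent-to-residual calculation for a single core risk.
# The control-effectiveness figure is a judgment call you defend, not a measurement.
inherent_likelihood = 0.40     # estimated chance of the event with no controls at all
impact = 5_000_000             # estimated dollar impact if it happens
control_effectiveness = 0.75   # estimated fraction of the likelihood the controls remove

inherent_risk = inherent_likelihood * impact
residual_risk = inherent_likelihood * (1 - control_effectiveness) * impact

print(f"inherent: ${inherent_risk:,.0f} -> residual: ${residual_risk:,.0f}")
```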
All right, so here are some case studies. Everybody probably remembers this one: HBGary. The CEO of the company got up there and said really bad things about Anonymous and how he was all up in their business, and this ended really, really badly for him. He taunted Anonymous and they went through him like a hot knife through butter. They owned him three ways from Sunday, right down to wiping his personal iPad at the end of the attack. He was just completely unprepared for the onslaught that came his way. There's a great video on YouTube if you ever have a chance to Google it — I can't remember the guy's name, but one of the late-night comedians does a bit about the HBGary hack. It's fantastic. The other one that I'd like to talk about in the same kind of group is Sony Pictures. This got all wrapped up in politics, but at the end of the day, they made a movie that poked the dictator of North Korea in the eye. And you just go, what about risk avoidance? Somebody should have said, you could just call it something else, not North Korea, and you wouldn't piss these guys off so much. Again, the attackers went through them like a hot knife through butter and owned them three ways from Sunday — released their movies, dumped all their emails. It was just horrible. If you look at the common elements here: they were engaged in an activity that was not universally admired — I mean, you pissed somebody off. Management didn't understand how dangerous that adversary was. The information security controls were not up to the task of defending against somebody who was very well prepared to attack, and the attackers have all the advantage in that space. And then, obviously, neither of them was prepared with any kind of response once this had happened. Neither of them had any kind of disaster recovery or recovery strategy at all, it seemed, to deal with these things. They were both on pen and paper for months afterwards, supposedly. And the responses followed the standard script there at the bottom. So here are our lessons — this is my view of it. Don't pick fights with people who have nothing to lose. Literally, they have nothing to lose and you have everything to lose; that's dumb, right? Private companies with profit and loss calculations going up against people who don't have them — that's gonna end badly for you. They've got all the time and energy in the world, and if you give somebody that much leeway, they're gonna get through. Know your enemy and yourself — no cybersecurity presentation is complete without a Sun Tzu quote. All right, just some questions to think about. There's a great statement in the Sony case that came out maybe a year before: it's a valid business decision to accept the risk of a security breach; I will not invest $10 million to avoid a possible $1 million loss. That's a valid statement. The problem is, he was totally wrong about the amount of loss — hundreds of times off base, orders of magnitude off. And that was the issue. WannaCry and NotPetya. I think I'm wrapping up my time here, so I'm gonna talk through these next few slides fairly quickly, and I'm happy to chat about them afterwards — I'll be on chat. Obviously everybody knows what these were from 2017: big worms, MS17-010, spreading through networks. WannaCry was the amateur version; NotPetya was somebody who knew what they were doing saying, hold my beer, watch this, and just letting it loose on the world.
I don't think it was meant to spread as far as it did, but it sure did, and boy, was it a whopper. It taught me a few things. Patching is not optional. IT will tell you, why would I patch? Every time I patch, I lose 2% of my systems, or whatever the percentage is. And here's why you do it anyway: you just gotta patch everything. It's not the same as it used to be, when you had security through obscurity. Shodan's out there, man, and they can scan your whole network in seconds, and they will find every single one that you didn't patch. So patching 99% is only marginally better than patching nothing, because they'll find that 1%. And if you got hit with NotPetya, that was it — they just needed to find one. And there are a bunch of other things here that are pretty relevant, like email. It's not connected to the internet, but it's got email? That's the same thing. Maybe it's a high-latency connection, but it's still on the internet. Why is patching so hard? I think we covered it. Some other questions to think about — to me, these questions are what let me get inside the heads of the people in IT and the people in management, so that I can better understand the problems they're dealing with. Hopefully this is a helpful exercise for you. Again, I'm willing to talk and chat through any of these things with anybody, but this is the sort of thinking that will get you to that high-empathy state, so that you can relate to why they're having problems and maybe help design solutions instead of just calling out vulnerabilities — which is fun, absolutely — but actually helping solve the problem will do far more for your company's risk posture. All right, Equifax. Everybody knows this one. I'm not going to spend a lot of time recounting what happened, but let's just say Congress gets involved, and if you're a private company and Congress starts interviewing you on stage, that's about as bad as it gets, right? You're up there and they're wagging their finger at you. That's not a good thing. And sort of the same results we were talking about before: patching 99% is only marginally better than patching nothing. The CEO at one point was saying it was this one guy in the test unit who didn't patch his server, and that's why all of this happened. And it's like, dude, you've missed the point if you think that. You don't have a process that covers your entire organization — that's on you. And I think he went to jail, and a couple of other people too, because they also sold stock as soon as they heard about the breach. That's lesson number one: if you're involved in a breach, stop selling or buying stock. Absolutely do not. Planning for incidents — this is another one where they were totally unprepared. It was almost comical to watch those couple of weeks afterwards: oh my God, what are they doing over there? Some more big questions to think about. The one that I like here — and I'm as guilty of it as anybody — is victim blaming. Everybody was like, Equifax, you're horrible. But what if somebody showed up at their offices with a gun and stole the files? Would we say the same thing? I don't know why this is fundamentally different, but I guess it is, I don't know.
But I mean, you wouldn't expect them to have armored personnel carriers outside their office and all of their staff wearing bulletproof vests so they could fight off the guys who came in with a gun to steal the files, yet we expect that from them in the cyber realm. But, you know, read the terrain — that's the way the world is right now, I guess. Those are the sorts of things I think about when I read these stories. All right, a couple of documents, and then I'm gonna wrap up because I'm probably gonna get yelled at pretty soon. This is a great document on what it takes to become a leader — there's a link to it at the bottom: the Cybersecurity Guide for Leaders in Today's Digital World, from the World Economic Forum. Fantastic document. These 10 things just absolutely nail what you need to do to get to a leadership role in cybersecurity. Best practices — these are the best practices from the ESI ThoughtLab piece. Again, I think it really nails it, and it's a different take because it comes from CISOs themselves as opposed to folks on the outside. Fantastic — it gives you some great examples of what you can do to improve any cybersecurity program. This is a method that I like and have used for a long time for how to build a cybersecurity strategy: what does the funnel look like, what goes into that process, what are the levers that drive your cybersecurity strategy. Right, putting it all together — these are my closing thoughts. Everybody talks about APTs and nation-state elite hackers. Yeah, they probably could get through, because they have unlimited time and budget, but they probably don't care about you. That's not where I would spend most of my time. I mean, you've got to do it, but the chances are overwhelming that you're gonna get attacked and owned by some chintzy ransomware gang out of Malwareville. If management makes bad decisions, there's not a lot you can do other than be prepared with a response plan. Your job is to tell them when they're making bad decisions, even if they yell at you and say, why do you always shut down my plans? Because your plans always suck. You have to do that. You have to have the emotional courage to say, this plan is stupid, and I am going to tell you this plan is stupid. I'm not gonna stop you from doing it, but I'm gonna tell you that it's stupid. The vast majority of this stuff, almost all of it — almost all of these incidents happen because we suck at inventory and patching. We just suck at it. And the basics — everybody says this, and it drives me nuts when I hear it, because it's like a platitude — but if we could patch our stuff and we knew where our stuff was, 99% of this stuff wouldn't happen. If we did that and put MFA on all of our logins, it would be almost impossible — well, it would be really, really hard. I say that and then they'll figure out 10 ways around it, but if we were just effective at these basics, we would take away so much of the low-hanging fruit. And these are the five rules for how to manage cybersecurity risk: assign responsibility; identify and quantify the risks; have mitigation strategies; communicate — like I said, tell them when they're doing something stupid and make sure you highlight the risks; and plan for when you fail. Because everybody's gonna fail, and you're gonna be wrong. You're gonna be wrong, they're gonna be wrong, everybody's gonna be wrong sometimes. You gotta have a plan to deal with that.
And that's it for me. I'll hang around and chat and answer questions. Peter, thank you so much. That was absolutely wonderful. And I miss the cat. I wish the cat would come back more, but... Thank you, thank you so much for having me. No problem. In addition to Peter being in the chat to talk about this topic, he's also going to be available for career coaching. So be sure to check out the schedule that's pinned in the Discord channel to sign up to be coached by Peter. Peter, it was great seeing you. I'm sorry I have to cut off and go to the next one. Take care. Thank you very much. Bye-bye.