Thank you for joining us. Our next talk is going to be on horse racing, and it is my pleasure to introduce Russell Baturini. We're having a one-hour analysis of this year's Kentucky Derby. Wow. Well, thank you. Thank you, Ming, for that wonderful introduction. I appreciate that. Now, everybody knows I'm a gambling degenerate, and I also want to say, sir, this outfit in the front row is fantastic. You have Vegas, DEF CON, it's all captured there, so very well done. Very well done. So this is Phishing for Economics. Like Ming said, my name is Russell Baturini. I am the senior security architect at a top-20 CPA firm that I'm required to not name. Don't you love that? Like when you go to a hair salon and they're like, "I have my secret employer," and then you go to LinkedIn and you can look it up. Very good OSINT, sir. I appreciate that. My voice has had about as much of Las Vegas as it can handle for the week, so hopefully it'll hold out; you may see me chugging water copiously during this talk. The abstracts, I guess, weren't in the book, so if you didn't have a chance to look: this talk is largely about user awareness training, and about correlating data from different sources to figure out who your highest-risk users are and how to remediate that. But I think there's more to it than that. If you're a penetration tester, or even an attacker, and you want to know which users you should be targeting as part of your efforts, then there's a lot of value here for a different audience too. So we'll get started. A little bit about me, and I have to clarify this, just to make sure everybody knows: last year I was here, sitting at dinner one night, and this girl who had had a lot to drink comes up to me and says, "Are you one of the property brothers?"
So just for clarification, I'm not one of the property brothers. I don't know where that came from, but it's a good Vegas story to tell, so I thought I'd throw it in. Like I said, I'm over all the IT security things for a CPA firm; we're the 19th largest in the US, and it's a lot of fun. I have my security and compliance director, Chris Christie, who is a lot of help and really good to work with; we have good complementary skill sets, and we've had some times. That's me on Twitter, and you can find me there. There are occasionally security things there, and a lot of horse racing things too. There are also horse racing things in this talk, so be warned. I've presented here before, five years ago, and I've presented at BSides, DerbyCon, things like that. And the standard disclaimer: these are my opinions, my employer is not sponsoring this talk, and this is all my own research.

So I always like to start my talks with why I think the talk is important to give, and these quotes on the board, I kid you not, are things I've actually heard said inside talks at different security conferences I've been to: "It's always just phishing." "It always works." "Users are stupid." "It only takes one." And the thing that's always interesting to me about that is, I've been around this community for a number of years now, gosh, probably going back to 2004, 2006, and we've never been a community that accepted something as a universal truth or unfixable, right? So why are we perceiving the user awareness problem and the phishing problem this way? Why do we just say this is always going to work, it's never going to change, let's give up? So hopefully this talk gives you some ideas on how you can fix this in your own environment. I'll tell you, I like stories, and this talk will largely be a story. I came in to my current employer a little over two years ago.
There really was no training and awareness program in place. There really wasn't a formal information security program in place: never any testing, never any annual training. You know how many places have the big annual thing where you have to sit down for three hours and watch videos? None of that existed, which, given the type of data we handle and the things we recommend a lot of our clients do, really didn't make a lot of sense. It was one of the first things I wanted to get addressed. And there wasn't a lot of structure. If a user got a suspect email, sometimes they would forward it to their boss, sometimes to my boss, sometimes to the help desk. So we started to correct all that. We picked a training vendor and started to unify some things. We set up a mailbox where we would review the messages and send feedback, and I'll talk a little more about that later. It was a lot to fix, but I thought we really got it off the ground nicely.

So we did a handful of phishing tests, enough that I thought I had a pretty good statistical baseline for where we were as a firm, and we had a 5.6% failure rate. 5.6% isn't too bad, you know. It's a little above where I'd like us to be; I'd like us to be in the 2% to 3% range. But I thought we were doing pretty well. So we did all the things, right? You give people the training. You print out the funny posters and put them all over the office: "Passwords are like your underpants." That's one of my favorites. Make sure to change them regularly. Some people at DEF CON could use that advice. Three-to-one rule, folks. That's all I'm saying. Quick PSA. But 5.6% wasn't too bad, and we were ready to do another one, and I was telling one of my colleagues, man, people are really grasping this.
We've hit them with everything. They're loving it. Everybody seems really engaged. We're going to get down to like 4% next time. It's going to be awesome; we're going to kill it, and we're going to be in really good shape. So we gave the next test, and that happened. We got worse. And that's the kind of thing that makes me want to scream. I don't know about you, but when you put a lot of effort into something and the result gets worse, it makes you want to throw your hands up and give up and say security's hopelessly broken, we're never going to do this. And it made me think back to those quotes I showed a few slides ago. So, gosh, what happened here?

I started looking through and thinking holistically about how we were approaching user awareness in our environment, and I came up with a few things. Number one, we didn't really have a clear goal for what we should be doing. Should we be trying to fail people? Should we be trying to inform people? Should we just be trying to collect numbers? How should we be doing this? And then it became less of a what, a quantitative analysis where the percentage went from this to that, and more of a why: why is this happening? One of the things I like best about our training vendor is that they give us access to a lot of data. The other thing, and I'll talk about this a little more later, is that I'm a data pack rat. I keep data from other jobs where I've done this same sort of work. I also do a lot of side penetration testing work; my background is actually in penetration testing and social engineering. And so I keep it all.
And I started to think: what if I take all this data, put it together into one unified repository, and start to analyze it, and start to get to that "why" I mentioned in the second bullet point? By the way, I really don't think this presentation is going to last an hour. Ming generously gave me an hour, or 50 minutes, to talk, and I don't even like to listen to myself for 50 minutes, much less subject somebody else to it. So if you have questions or comments, feel free to just throw them out. It's fine.

So anyway: redefining the goal of our phishing testing and our user awareness training. I think we get into the mentality that we should try to be tricky, that we should constantly be raising the bar and making it harder on our end users. But if it's just constantly trying to fail people, that's a very negative approach, you know. I think you really have to take a more positive approach, where you have a positive feedback loop for the users, where you're actually teaching them instead of just telling them "you failed, you failed, you failed." If a user learned from the last one, but you made it harder this time, and then they failed again, you will never get anywhere with improving those percentages and improving people's knowledge; they just get frustrated and tune you out, right? I don't know if you have experience like that, but that's kind of been mine. [Audience question] Tell you what, can I address that at the end? Because I'm going to kind of talk about that. That's a good point. I would say, anecdotally, off the top of my head, the positive approach works better, and I've got some more information on that that I'm going to talk about later. But yeah, great question, great question.
So I talked a lot about stats, and before I dig too much into this: that's the best picture of a data scientist that Google Image Search has, but that is not me. I made lots of Ds in math in college. D is for diploma. So you will probably see statistical anomalies, things I didn't analyze right, or things where there are other reasons for the way they are that tie back to the math. So please, please don't take it out on me or be too hard on me. Anyway, I'm not a data scientist, but I do spend a lot of time at this place. I say that jokingly, but it is literally looking at data to try to predict an outcome. I mean, that's what it is. And that's kind of where I learned to do my data analysis: I was going to the racetrack, playing, and trying to predict.

So I had this idea, and I'll show you. If you've never been to the racetrack, they give you a program, and you get something like this on every horse in every race. It has every piece of information: the color of the horse, where the horse was born, how fast it ran its last race, a description of the last race. I mean, anything, and you can take as little or as much of that data as you want, or you can throw money away betting on them, which is fantastic. But I started to think: can we profile users like this? When there's some kind of incident, or they fail a test, or something else happens, can you build a profile to predictively model that past performance and say, these are the users who are going to have the worst problems, who are most likely to create a security incident, fail a test, or need to be trained in a little bit different way? That's all it is: it's really trying to predict an outcome. And I had kind of started reading these two books at the same time. The first was Picking Winners, by Andy Beyer.
Andy is one of the best handicappers in America, and he built the Beyer Speed Figures, which are a way of taking data points and creating a metric for analyzing race data and how a race looked. And then Freakonomics. Who's read Freakonomics, by the way? Yeah, I read it after everybody else did. It was one of those things that sits on your reading list and you keep putting off. If you haven't read it, and somebody else can probably speak to this better than I can, it's about taking things that seem like they wouldn't be related and using them to compare and predict outcomes, and also to analyze why certain things happen: the hidden why of everything. I really liked the ideas both of these books presented around trying to analyze the data and figure this problem out.

So I came up with two key questions. I wanted to decide what the important data points were to predictively model who's going to fail a phishing test, or fail a real-life phish and do something bad in the environment. And I should clarify: I'm trying to predict everything, including phishing tests I didn't schedule, that somebody else scheduled for me, right? You all get those. So: can I predict who has a higher likelihood of falling victim, and are there things I wouldn't think of that are related to a user's performance on phishing tests? The first repository I started thinking about where I could get that data was human resources data.
That's where we had a collection of data on an individual, and I'm fortunate because I've kept in touch with the HR folks from the places where I've done this work, and we also have some very generous HR ladies at my current employer who provided me data. Nothing sensitive, not performance reviews or salaries or anything like that, just data on people's resumes, job levels, years of experience, things like that, and it was really helpful as I started to dig in and take a look at this.

One of the things I would say is that this is a little different from a typical stats talk, in that we're not dealing with something static. People are unpredictable, right? No matter how many times you tell them to do something, they're liable to do the opposite. Human nature is very, very unpredictable. And the other thing I'd say is that I'm going to present what I found from the data I had. Your environment is probably very, very different. The idea behind this talk is not to say everything I found is universally applicable. It's to give you ideas: hey, I've got this awareness program, I've got this training program, how do I take the data I have and look at it in a different or unique way, a way we hadn't thought about before, to make improvements in our own program? Again, no matter how much we analyze this, somebody is always going to screw up, and I think we all know that in this room.

Yeah, so this is kind of where I started. I've got about 10 years of phishing test results across three different large environments. I've got side test and consulting work I've done. I've got data on security incidents, which were the aforementioned unscheduled phishing tests where people failed, and I kept those. I took some HR data.
I took a survey of all the folks who had failed our tests internally, and I'll get into that a little more later: just a simple little two-question survey. And some other miscellaneous data. I took it all, threw it into Elasticsearch, which we'll talk about, and came up with a sample size of about 3,200 events that I started trying to make assertions from. Now, not everywhere did I have HR data, and not everywhere did I have certain other pieces of data, but I was able to query in subsets where I had different types of data, which Elasticsearch is great for if you've ever used it. I came up with about 60 working theories on what I thought would be important correlations. Not all of them panned out to be anything, or panned out to be statistically significant, but I started with about 60 and worked down to the ones I found had the most statistical significance.

So like I said, I used Elasticsearch, and number one on the list is that it's free. Anybody here ever used Elasticsearch? Okay, yeah. We like it, we don't like it. I love Elasticsearch; it's so cool, and for the price, it's right. Like I said, using something schemaless, where I could have events where not every document had the same data points, was very, very helpful in analyzing the data, since I had variances in the data. And I was able to use Python to dump data into it really easily, which worked really well. So let's dive into the data now, and I'll start showing you some of the things that I found.
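As an aside, for anyone who wants to try the same thing: a minimal sketch of the kind of event-shaping step I'm describing might look like this in Python. Everything in it, the index name, the field names, and the example records, is illustrative and not the actual pipeline; the point is just that heterogeneous events can go into one repository, with missing fields simply omitted so you can later query the subsets that have them.

```python
# Sketch: shaping heterogeneous phishing-test events into Elasticsearch
# bulk-index actions. Documents need not share a schema; fields that an
# environment didn't provide (e.g. HR enrichment) are simply dropped.

def to_bulk_actions(events, index="phish-events"):
    """Turn raw event dicts into bulk actions, omitting empty fields."""
    actions = []
    for event in events:
        doc = {k: v for k, v in event.items() if v is not None}
        actions.append({"_index": index, "_source": doc})
    return actions

events = [
    # A failure from an environment where HR data was available...
    {"user": "u1001", "result": "fail", "category": "HR",
     "years_experience": 1, "source": "internal-test"},
    # ...and one from a consulting engagement with no HR enrichment.
    {"user": "u2002", "result": "fail", "category": "IT",
     "years_experience": None, "source": "consulting"},
]

actions = to_bulk_actions(events)

# With a live cluster you would then hand these to the bulk helper:
#   from elasticsearch import Elasticsearch, helpers
#   helpers.bulk(Elasticsearch("http://localhost:9200"), actions)

print(len(actions))                                  # 2
print("years_experience" in actions[1]["_source"])   # False
```

Because the second document simply lacks `years_experience`, a query filtered on that field's existence returns only the subset where HR data was available, which is the "query in subsets" trick mentioned above.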
So the first thing I did was look at failures by years of experience. My working hypothesis going in was that the people who have been around a long time, and that's a polite way of saying "old," would fail more frequently, and I saw there was actually some truth to that. In the top left of the graph, the green slice, about 26% of the failures I analyzed came from people with more than 10 years of experience. No surprise there. What really did surprise me, though, was that the biggest slice came from people with 0 to 2 years of experience. I was thinking these would be the younger people: they're tech savvy, they've been around security events and technology for a long time, they're going to be up on this stuff, they're not going to fail at a very high rate. Boy, was I wrong. You can also see the sweet spot at about 4 to 6 years of experience: these are folks who have been around long enough, and are professional enough, to be security conscious, and they don't fail at a very high rate, so those seem to be your lower-risk users. But the people at opposite ends of the spectrum in their careers tend to fail more than anybody else.

So I subdivided that 10-plus group further, and 40% of that group came from people with 22-plus years of experience. Again, the old people, and I hope nobody here has that much experience while I'm sitting here calling you old. When I started looking at the other slice, the folks with less experience, I was able to correlate it back to the timing of the test where the failure rate went way up, the one I showed you at the beginning. I gave that test in January, when everybody had just graduated from college in December, so we had a lot of people with no experience who had just started. And I was like, hmm, okay. Then I looked at who failed, and it was a lot of the new hires. So the timing of your test, when you have mass numbers of new employees onboarding, changes your company's risk profile against phishing attacks drastically, and I'll show you something else interesting about that in a minute.

So I talked to our office managers, and we have a pretty distributed environment, and said: hey, would you work with the folks who have been around a while? Will you help them take the training, and make sure they understand the high points? The ladies were fantastic and very good about working with them, and I got some really good feedback on that outreach. This is a tangent, but some of the older folks, with the rapidity of technology changing, get intimidated by it, and having somebody sit down with them and say, "Look, here's what's important, we've given you this training and we really want you to know this," I think actually meant a lot to them. The other thing I did was change our new-hire training to add some new content, and I also changed the way we give it. I used to tell people to knock it out in the first 30 days; now they get it immediately when they're onboarded, and they get a reminder every day until it's done. That makes sure they complete it, since the data says they're apparently a higher-risk group.

So, the survey I mentioned: I gave this real quick little two-question survey, and I've got a couple of funny stories on this. I asked the people who failed these tests whether they feel our internal communications, from HR, IT, and so forth, are easy to identify as legitimate, as in, "I can tell the difference between a phony one and a real one." And one of the things I found was that almost nobody had had security awareness training or phishing testing prior to us starting to do it. It was a new concept for them, and
clearly not enough companies were doing this. So the action I took out of that: since this is new, we want to continue to aggregate data, we want to continue to drive participation in our programs and get people used to it, and because they haven't had much previous training, we may need to increase the frequency of our trainings; one round may not be enough for a lot of folks. That was pretty interesting.

So, correlating to that question on the survey, and this is why I wrote the question about whether you can tell if our internal communications are legitimate: I looked at the categories of failures, spread across all 3,200 samples I had, and you see the top two failure categories are HR and IT. The HR one actually made a lot of sense, because it ties back to the new hires we talked about, who were failing a lot. If you start a job and you get an email from payroll or benefits or whatever: "Yeah, I need to go ahead and do that," and you're highly likely to click on it. So if you're a pen tester and you want to do your OSINT, get on LinkedIn and see who just started a new job somewhere. That might be a good vector.

[Audience question] Yeah, total failures; this was the category they failed on. So 19.8% of the total number of failures came from emails in the HR template category, about 1 in 5. [Audience question] Correct. No, no, not departments, I apologize. Yes, thank you, that's a good clarification: this is not who failed by department; it's the category of the template they failed on. Thank you for asking, that's a really good point. [Audience question] Exactly, the Office 365 ones, those are fantastic. That's a really, really good question; can I come back to that? Because I'm going to talk about it a little more. [Audience question] Right, you're talking about statistical skew there, and that's a good question. Again, I was looking at a body of work across multiple environments and multiple engagements, I guess you'd say, so this isn't just one environment where we tested every three months and sent X number of messages. That's a good question; I'd have to go back. I honestly don't have those numbers, because I draw from a random pool, so there wasn't necessarily an even balance. What I would tell you is there's essentially the same number of templates for each category, so the odds of one being drawn over another are about the same. Okay, does that help? Okay, good question, thank you. Anything else? Oh god, that one. This chart was actually built from all 3,200, because I had the category data on all 3,200 failures.

[Audience question] Yeah, and that's fair, because you could have different populations of colleagues inside those companies: some were younger and had less experience, some had more. I would say our company skews a little older, but some of those environments, one in particular, a fairly large environment of about 3,000 employees, skewed more toward younger people. If I had to come up with a distribution... you guys are challenging my data science and statistics skills here. I made a C in that class, so one up from my math grade. C is for college. But honestly, I didn't look at that data. That's a really good way to dive into the data more: ask where the skew is and start trying to adjust the numbers for it. I think it's a fair conclusion, and I would say, probably because the sample size was so large, and I don't want to say 100% because I don't have the data in front of me, the skew would be insignificant; it's going to be about the same out of every category of age. Good discussion. Was there anything else? Okay, one more. Oh, okay, we've got two more here, that's fine. [Audience question] Yeah, actual security incidents, yes, because I categorize those. No, that would be a good thing to do. I just wanted to look at the entire body of data at a macro level when I was doing this, to get an idea of what it looked like, but that would be good for further research.
Absolutely, some great ideas. Yes, ma'am? [Audience question about tagging external email] Ha, we did try that. They did not like it; they said it looked unprofessional when we replied to emails. And that was on me. Okay, we'll go off on a little tangent here. That was one of the first things I did when I came in, saying we need to do this, and we hadn't really built the trust in the security program yet, to where people understood that if Russell's saying something, we probably need to do it. It was more just me coming in with the jackhammer: we need to do this, this, this, this. One of the things I've learned, and it's a cultural thing, is that you have to slow down and bite things off in small chunks. When you start trying to grab and grab and grab, like "we need to do everything all at once," they don't respond well. So that was backed out. Now we are revisiting it, and maybe it's something we wind up doing, but yeah, we determined they didn't like it.

Anything else? I think... yeah, one more, then I really need to move on. [Audience question] Do we use those? Yes, we use the phishing button for reporting email. We started with just a generic forwarding mailbox, and then we added the button, and they love the button. One of the great things about the button, which we get from our training vendor, and you may know who it is, I think they all do this, is that when it's a test and they hit the button, they get a little "hey, congratulations, you passed the test." That works really well. Thank you all so much for the discussion; that was very good discussion.

So, moving on. This is where I found a huge, huge disconnect. I mentioned the survey: I asked them, do you think you can tell a legitimate email from our IT department or HR department apart from a phishing one, and they were all like, yeah, we totally can. But then you go back to this chart, and there is some huge disconnect here that we have yet to figure out. Now, getting back to one of the questions, and I apologize, I forgot who asked it: it was about the trust in HR. At our firm, we don't have a good standard look and feel for internal corporate communications. They just kind of come from a person. I guess it's that small-business mentality we still have, because we were small and then got really big really fast: everybody knows this is the HR lady, so when she sends you something, it's legit. There was no consistent way the communications were supposed to look and feel. So I actually talked to our marketing manager about that, and she is working on a more standard template for communications, so if they look a certain way, people know they're real, and if they don't, they're fake. It's kind of a way of mitigating the missing external tag: at least the email body should look a certain way, and if it doesn't, you know it's fake. I've heard she's had some challenges getting that done; they don't feel like it's really necessary, but that's another one I'll fight to the grave on.

The next thing I found statistical significance in was timing. I love this picture. How did timing around certain events affect people's performance on phishing tests, or their ability to identify phishing messages in the wild? This actually surprised me. I started gauging the number of people who failed based on the time it took them to complete security training. It could have been their annual training, a refresher after failing a phishing exercise, or something else we gave ad hoc. Generally, I always give 90 days to complete training, and I thought that people who did it really, really fast would do really, really well on phishing tests. What I found out was that they're doing it really fast just because they want to get it done, and based on this data, they don't care. You see that people who completed the training within the first 30 days of a 90-day campaign accounted for, what is that, 43% of the failures.

[Audience question] Yeah, okay, it breaks down pretty evenly: if I do a 90-day campaign, it's generally about a third, a third, a third. I do push on them, of course; that may be a byproduct of the fact that I push them every 30 days with a reminder to get it done. [Audience question] I give an informal evaluation, yeah; I send out spot checks to some folks. Anyway, this surprised me. The gray block is people who took a really long time to complete the training; they waited until the last minute, before their network account got turned off or whatever, to get it done. That was expected. The blue block I didn't expect: people who fell into the middle, who were waiting for the right time to do it and digest it, did fairly well. And look down there at the orange slice, 30 to 60 days in; I think they did fairly well too. So the conclusions I reached: completing training early is not necessarily an indicator of how effective the training was, because those people still failed at a fairly high rate, and people who wait for the right time to do it are less risky than the early group. One of the things I did change after this was the communication of the training reminders: I swapped them to remind people every 30 days or so, generally about a third of the way through, each time before the training deadline. The other thing I did was talk to our office managers and say, send me the busy times. We do tax, and it's not just April 15; there are various IRS deadlines
throughout the year and what I wanted to do was make sure people completed training outside the busy times because I knew if it hit busy time they just wouldn't do it so um so anyway uh computer based training and uh yeah yeah it's it's all right so so when I say training that's a broad it's a broad category so we do an annual training um and we do refresher trainings um and then in other events I got some of the data uh from and also in our current environment we do um just kind of ad hoc you know a new firm merger or company comes on we'll do a training then uh we'll do specific targeted training as requested for various business units that kind of thing and so oh that's gauged on this so um do we follow up with testing? I'm going to do the testing quarterly that's how I've watched them the testing is just quarterly and it from the time that they completed the training yeah yeah yeah I mean it can create a little skew I agree with you yeah but um the training occurs quarterly okay let me say that generally there is some kind of training for a population of users quarterly and then there's there's phishing uh quarterly so um as well so um this gets back to what you're just asking about too so I also looked at the days since a user took a training before they failed the test and um as you can see about 60 days 60 to 90 days out from the last time they took the training is where the largest body of failures occurred so um I thought this was pretty interesting I thought this actually told us pretty much the lifetime of our training is 60 days and when I talk about training one of the things I would mention is you only have to tell them to hover the mouse over the link so many times you know you would think but I think any you can give that's a little hint or just something about security awareness is it gets at top of their mind and um and I think that's very very important um so yeah yeah that's about right the phishing test occurred quarterly um I what I said was 
What I said was, some kind of training occurs quarterly. Our big training, the long one that everybody has to take, we do outside tax time, during non-busy season, which runs about June 1st to September 1st. That was actually one of the things I improved from last year: when we did this last year, it overlapped IRS deadlines and I got a little heat for that. My thought was, you should have done it in the first week, when I sent it out, because that wasn't busy time; but they waited until the last minute and then it became busy time, and that's how that worked.

So anyway, the conclusions I reached from that sample were that the training lasted about 60 days, and that if we don't reinforce it frequently and refresh people (and it could be refreshing them on any security topic), they'll fail more. We do a lot of reinforcement now; we actually started sending out weekly messages, just little tips and things like that. I've talked about doing multiple small trainings, getting away from one large session where you've got to sit down and do it for a couple of hours every year. I was told that for us that's actually an issue, because we use our training as internal continuing-education credits for some of our tax and accounting folks, so it has to be a minimum length. I would rather do four very small 20-minute trainings than one large training, but I'm told we can't do that. Still, something to think about for your environment if you don't have that challenge.

So, other weird stuff, and I hope you're not weirded out by clowns; I thought a meme of Ronald McDonald talking on the phone about environmental factors completely captured "weird stuff." I looked at a couple of other odd things, just threw these in here for fun at the end of the presentation. First, failures by time of day: you can see that early in the morning, lunchtime, and going-home time tend to generate the most failures. One of
the things I think happens here is that the phishing campaign generally starts at 8am. I didn't think about this, and I've done it that way with every campaign I've ever run: it starts about 8am, a bunch of people fail, and then they talk. "IT's testing us again." They figure it out, so it drops way down. Then there's a shift change, or people go to lunch and aren't paying attention, or they didn't get the water-cooler gossip that I was coming after them, and they fail a lot, and then it goes back down, and at the end of the day it goes back up. I actually think it's a good habit to get your users into, because they communicate.

Yeah, I did not find a statistically significant correlation based on day of week; it was all pretty even. Time of day had the biggest impact. That's a good question, though, and I appreciate it; I did look at that. And yes, I think getting users into the habit of talking is valuable, because one guy will say, "Hey, I got this email from the CFO," and another says, "Oh yeah, I got that email too, he's asking me for iTunes gift cards," and they start to figure out something's not right, which is really good.

I also looked at the weather, just because I thought it would be fun. I wanted to see if it was raining or not raining, and I found that people did a little bit worse, not statistically significantly worse, but a little bit worse, when it rained versus when it was sunny outside. Something to think about; I just thought it would be interesting to look at. Real Freakonomics-type stuff; I figured maybe I could be in the next book.

So anyway, a few more thoughts I want to share about things we did. We've already talked about some of this, and I really do appreciate all the questions and discussion.
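The "statistically significant or not" calls above (day of week, rain versus sun) can be made with a standard two-proportion z-test. Here is a minimal, standard-library-only sketch; the rainy-day and sunny-day counts in the example are invented, not the speaker's data:

```python
import math

def two_proportion_z(fail_a, n_a, fail_b, n_b):
    """Two-sided two-proportion z-test: are the two failure rates different?
    Returns (z, p_value), using the pooled-proportion standard error."""
    p_a, p_b = fail_a / n_a, fail_b / n_b
    pooled = (fail_a + fail_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Invented counts: 30/200 failures on rainy days vs 25/200 on sunny days.
z, p = two_proportion_z(30, 200, 25, 200)
print(round(z, 2), round(p, 3))
```

With counts like these, p lands well above 0.05, matching the "a little worse, but not significantly" conclusion about rain.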
I've found, and Christy and I have found, that the more we talk to people on a personal level, the more of an impact it makes, and if you make them feel like they're contributing and making a difference in security, they do better. I told you we have the phishing buttons and the spam box: we reply to every one of those messages personally. They don't get an automated form letter back; we reply to all of them, "Hey, thanks." Sometimes I'll even cut up and joke with them, like, "Oh man, look at this guy, he sucks," or, "What an idiot, he misspelled your name in this email." Something silly that makes them think it through and see that you're really paying attention and that they are contributing. I've found that's really effective.

Shaming never works. Never. "You failed, you failed," walls of shame for the people who clicked on the most messages, that just doesn't work. I've also tried running contests in the past, and that creates some real animosity, because apparently a twenty-five-dollar iTunes gift card is very coveted. Now we use the button, and it has a positive feedback loop where they get a message that says, "Hey, congratulations, you've passed the test," which is great.

Yeah, in terms of rewards, like I just mentioned, when they report things we reply and say, "Hey, you did a great job catching this, that's awesome." I would love to have the budget to hand out a tangible reward like a Starbucks gift card, but our volume is way too high. Yeah, that's a good idea, that's a really good idea. And yes, we use security incidents inside the sample as well. Quarterly is what I prefer, because otherwise they get worn out.

So that's a good story. On the survey, I actually did leave a third, open-ended question where I said, write me
comments about the phishing test or whatever, anything you want, and it'll be anonymous, you don't have to tell me who you are. That became a field where people explained to me why they failed the phishing test, or why they hate the phishing test. I had one who was just like, "These are a distraction from our customer business," and I had, "I was buying my son a cell phone and got an email about a cell phone." The users do not like to feel tested, and that was one of the things our training vendor was very helpful in recommending: give them the training first, make sure they're comfortable, make sure they're trained, and then start testing them, because if they don't feel like they've been adequately prepared and you give them a test anyway, they'll hate it and they will never want to participate again.

Yes, in the back, you've had your hand up for a little while. Yes, we did, and we have a few who have been repeat offenders, and I'll tell you what, this gets back to the shaming thing, so this is a great time for that question. In previous lives, if we had repeat offenders, I would go talk to their manager and say, "You need to get these guys in line," and that was not very effective, because they got mad; they felt like they were under the gun, strung up a little bit, hung out to dry, tattled on. In our environment here, where I work now, I just reach out to them personally, getting back to that personal-contact thing, and say, "Hey look, we've had the training, you failed a couple, what can I help you with?" I've found that approach works way better than trying to create some sort of shaming environment.

I got a number of different responses on the survey. Some of them gave me the excuse for why they failed, and some were just complaints about the IT department, you know, I don't
understand what's legitimate and what's not, which we saw in some of the data. But mostly they were just embarrassed; they said, "I'm so sorry I failed a couple of these, how can I do better?" They reached out, and I gave them the little flyer on hovering over the link and taking a look, and I'll tell you, the folks we reached out to personally have never failed another one.

Oh gosh, could you hold your questions for the end? I'm going to run through these last couple of slides and then we'll field some more; I've just got two slides left. On the results: I don't feel like I have enough data yet. We haven't implemented everything, and I've had some challenges getting some things in place, but we did drop 2% from the bad test to the next test, so I felt like that was really good. Now I want to run a couple more tests to make sure that wasn't just some anomaly. In particular, with the timing changes we talked about, I want to see that the next batch of new grads who come in do better, and make sure it's a continued downward trend, before I say conclusively that this is the panacea for everything in our environment. But I will tell you, the feedback I've gotten means more than the stats: when people email me and say, "I really like the training, I really like the stuff you guys are doing," that makes me feel better than seeing a percentage tick down. I've gotten a lot of that, and it's been very, very helpful.

So anyway, that's it, that's the material I've got. Thanks again, Ming, for introducing me, and thanks to the Packet Hacking Village guys for letting me come talk. That's me; I'll throw these slides up later if you want them. Let's grab some questions and have some more discussion. Alright, yeah, no, I
think that would be a good next place to go. So the question was, did I separate the actual phishing emails from the tests? I wanted to look at the whole body of data, because I like to think my tests are so good they're like real phishing emails, I guess.

Yeah, I'll get to him next. A 15-minute refresher training? Yes. My licensing for the training platform doesn't let me get to the games, because I'm on a budget, but that's something I would like to do: actually analyze printed material versus an interactive exercise. I like our training vendor; I'm not here to sponsor anybody, so I won't say who it is, but the training they provide is highly interactive, so I'd like to think that accounts for some of it as well.

Okay, yes, ma'am, and then I'll grab you back here. Right, and I think that gets back to the data on technological aptitude, where the younger folks, who I would expect to be more technologically apt than our older folks, still failed regularly. That lines up exactly with what you just asked; that's a great point.

That's a good question. Are you asking two parts, like did I compare failures to successes, or just strictly around holidays? Okay. No, I looked strictly at failures in this exercise, and comparing to successes would be another really good angle; I only have success data for my current environment, not for previous environments, so that would be a little more difficult. Holidays would be a good timing-based thing to look at as well; I just looked at day of week and time of day and things like that.

Oh, never. No, some people like to take their junk folder and just phish, phish, phish, phish. Yeah, they do that, and we get a lot of those reports. Yeah, they
could; there should be a distro for it, that's actually a good point, but most of them just email me directly. We have a fairly small security team, it's the two of us, so they find one or the other of us.

Oh gosh, yeah, go ahead. You guys fight it out amongst yourselves who wants to talk first. We do that too, yeah. The good thing about the vendor we use is that it includes the original URL as a query-string parameter, so I actually teach people, "Hey, look, it's not right here where you'd expect it, where the training says it is; it's over here." So yes, correct, it's still in there.

Sure, that's a good question. Our environment is about 2,200 users, so we get 100-ish reports a day, maybe; it just depends. One thing we do: if a broad campaign hits, like 400 people get the same email and they all report it, we have a little WordPress site I stood up, and we'll just post it there, and people are referred to that first: "Hey, before you send something in, check the phish board, see if it's already on the site." So really, what we're replying to personally is a lot of the one-offs. We really need about five interns for that so we could do better stuff, but yeah.

And we train them on multiple signals; the hover is one of the last things we tell them to look at. It's really: who's the sender, were you expecting it, does it have an attachment, all those things, and then hovering is sort of a last resort. We're also fortunate that we use a vendor that does link rewriting, so they've blown up the link and scanned it, and it lands users on a splash page saying this may be malicious. I've found the splash pages do stop our users from clicking on things. I mean, we've layered up God knows how many URL filters, and they still somehow go around all of it. But anyway.
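Teaching users to spot the original destination inside a vendor-rewritten link, as described above, comes down to reading a query-string parameter. A small sketch; the `protect.example.com` host and the `u` parameter name are hypothetical, since real rewriting vendors differ:

```python
from urllib.parse import urlparse, parse_qs

def original_url(rewritten_link, param="u"):
    """Recover the original destination from a vendor-rewritten link.
    The 'u' parameter name is a placeholder; check your vendor's format."""
    query = parse_qs(urlparse(rewritten_link).query)
    # parse_qs percent-decodes the value and returns a list per key.
    return query.get(param, [None])[0]

link = "https://protect.example.com/v2/url?u=https%3A%2F%2Fevil.test%2Flogin"
print(original_url(link))  # https://evil.test/login
```

This is the same thing the speaker shows users by hand: the real URL isn't where the training says to look, it's tucked into the rewritten link's query string.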
So, anything else? Oh, okay. Yes, it can. Ooh, as valid? No, that's a great point, that's actually something to think about. Anything else? Awesome.

What do you mean, in terms of multiple email domains? Yes, ours right now is spread way out; I work from home and I'm not near any of our offices. We have about 80 offices, and they're spread across the lower third of the US. Actually, that's a really great point, and this is not my current company, this was a previous employer with a similarly decentralized environment: we had an office out in Seattle that we never visited, just because they were out in Seattle and I'm from Nashville. We found that they did not do well on the phishing tests, repeatedly. They didn't have local IT, and we didn't do any security testing with them; they were kind of a man on an island out there. So yes, the amount of interaction you have with an office does make an impact, and I think it's really important to try to make everybody feel part of the program and feel that personal touch. We actually took a trip out there and sat down for a two-hour security brown bag, just me and our director of compliance, and let them ask whatever questions they wanted.

This is all domestic data, only domestic data, yeah. And yes, we do track statistics on the device that was used, and I have found that with mobile devices, people will open crap on their iPhones every single time.

Well, thank you, guys, I know this is a tough time slot. Oh, I'm sorry, there are a couple more questions. Am I using templates? No; these days I stick strictly to what our training vendor provides, but in the past you could clone them, and I've cloned a few. Good. Well, thank you again for coming, appreciate it, guys.