Thanks for taking the time to join us. Securing digital networks is one of the most important challenges all businesses face. Today's Expert Connect, "Educating for Cyber Mindfulness: Building a Community of Cyber Allies," we hope will provide a practical approach to solving one aspect of good cybersecurity for any organization, and that is teaching users safe computing habits. Before we begin, a few housekeeping chores. We will try to leave 15 minutes at the end of this session for Q&A. If you have questions for our presenters, you will find a button at the bottom of your screen marked Q&A. Please post your questions there. When we're ready for your question, we will call on you. If we don't have a chance to get to your question, we will email you a response at the end of this webinar. One other point, too: we are recording this Expert Connect. It will be available on the CGE YouTube channel by end of day tomorrow. Now to the reason we're here. We are fortunate, and wanna thank, the researchers and practitioners from the University of Dayton Center for Cybersecurity and Data Intelligence for sharing their strategies, tactics, and the lessons they learned developing and implementing a cyber mindfulness campaign. Universities have long been among the most vulnerable and frequent targets for cyber intrusions. They are, in a sense, the canaries in the coal mine for the latest cybersecurity threats. And after all, most of their customers, or students, are on the cutting edge of technology, often checking out the latest apps, tech products, and websites with hardly a thought to security. No wonder the school came up with a strategy for turning its users from security risks to allies. I'm gonna turn the mic over to Thomas Skill, Associate Provost and CIO of the University of Dayton, who will introduce the rest of the presenters and get on with explaining the cyber mindfulness program. Thank you. Great, thanks Ira, and welcome everyone. We're really thrilled to be here today.
I wanna express a special thanks to CGE for inviting us to share our work today. We're gonna be sharing with you our model for developing cybersecurity awareness. We think it might be helpful as you think about ways that your organization might implement awareness training. And so what we're gonna do is, first, I'm gonna start off and introduce a few of our folks here. As I said, I'm Tom Skill, Associate Provost at the University of Dayton and Chief Information Officer. With us today also is Dr. James Robinson, a professor of communication and our resident social scientist who's helped us work through a lot of the behavior issues we are trying to grapple with. And then also with us is Kim Condi; she's our IT coordinator for communications. And Elizabeth Timmons, who serves as our web services manager. And of course, they are the co-directors of this project, so they have a very special role in helping us understand some of the tactics that we'll look at today. Not with us, but a key part of our team, is Dean Halter. Dean is our IT risk management officer and played a huge role in helping us deal with a lot of the logistics around this. As we start into the next slide, what we thought we would do is begin with a quick poll. What we're sharing here is the SANS Institute Security Awareness Maturity Model, fancy words for where does your organization stand in terms of its readiness for cyber awareness training and education. So we're gonna bring up a poll and ask you to respond: where do you see your organization landing? This will really help us as we go through our conversation today. So we're gonna run that poll; go ahead and pick a point. Just a quick explanation there: obviously, "non-existent" means you're not doing much of a program at all.
"Compliance focused" is for those organizations that are looking at the kinds of audit requirements; perhaps you do once-a-year training and track that in kind of an ad hoc fashion. The third area, "promoting awareness and behavior change," would be if you're doing something much more intense. That model, of course, begins to deal with engagement around interesting communication practices, ongoing activities, those kinds of things. And if you're in the next area, "long-term sustainment and culture change," that would be where you really begin to look at a role where you have strong leadership support and you're doing ongoing engagement in very regular ways. And then finally, the "metrics framework." If anybody is in this category, you can join us up here and help us give this presentation, because the metrics framework is one where you're not only using robust metrics to capture what's happening, but you're applying those in a continuous improvement model. There are not many organizations that have achieved that level of sophistication, but you might be among them. So select one of those, and we'll share that poll back with you in just a moment. You know, one of the interesting things for us is that as we look at the SANS Institute maturity model, we believe we land in that long-term sustainment category, but we also know that we have some gaps there. For example, we haven't quite refined our onboarding for new users yet in a way that is systematic and sustainable. So we are still dealing with some of our gaps. I don't know if we have those results up, but... And there we are. So this is interesting. We have about 30% of our audience out there in the non-existent category, 15% compliance focused, 25% in that promoting awareness and behavior change category, and 35% fitting into long-term sustainment.
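For readers who want to run a similar self-assessment poll, the five maturity levels Tom just walked through can be treated as an ordered scale. A small Python sketch of the tallying (ours alone, purely illustrative; this is not an official SANS tool, and the function name is invented) might look like:

```python
# The five SANS Security Awareness Maturity Model levels, lowest to highest.
MATURITY_LEVELS = [
    "non-existent",
    "compliance focused",
    "promoting awareness and behavior change",
    "long-term sustainment and culture change",
    "metrics framework",
]

def tally_poll(responses):
    """Return each maturity level's share of the poll responses (0.0-1.0)."""
    counts = {level: 0 for level in MATURITY_LEVELS}
    for response in responses:
        counts[response] += 1
    total = len(responses) or 1  # avoid dividing by zero on an empty poll
    return {level: counts[level] / total for level in MATURITY_LEVELS}
```

A webinar platform's built-in polling does the same arithmetic; the point is just that the levels form a single ordered dimension, so one question suffices.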
So this is a great mix of folks out there, and we're gonna look forward to sharing some of the practices that we do. So thanks, Toby, for that. We can take that poll down and move on to our next slide. What we're gonna do today is give you a little bit of insight into where we started on this, then how the research we've looked at in the social science arena has helped guide some of the work that we've done, and ultimately how that drove us in the development of our cyber mindfulness model. Then we're gonna talk a bit about how various tactics, such as phish training, are being used to help us develop awareness and sustainable practices out there. And of course, the one thing that many organizations are very concerned about is: how do you measure the results of what you're doing? And finally, we're gonna share a bit about the lessons that we've learned. So as we start through this, we'll begin at the very beginning. In about 2015, the University of Dayton began looking at the possibility of moving towards a much more intense cybersecurity model. That model was gonna include the implementation of two-factor authentication on most of our systems. As we began thinking about that, the one concern that emerged for us is that our user community might very quickly assume that with the implementation of two-factor authentication, all security's taken care of: "I don't have to worry about this anymore," and they can move on. And as we began talking that through, we realized that before we went to two-factor authentication, we really needed to do something that would engage our community in much more thoughtful cybersecurity behaviors. And so we decided to build out a year of what we called our year of computer awareness, or a year of safe computing. And that project led us into the development of the model that we're gonna share today.
One of the things that we realized as we went into this is that we had a great many folks on our staff who were very traditional in the way they came at cybersecurity awareness and training. For the most part, those folks were used to scattershot kinds of approaches to cyber training and awareness. We tended to do what many IT organizations do, and that's to issue requirements or mandates and rules, and not really think much about how that engagement might work. And so our beginning stages began with saying, okay, what can we do to change that piece of it? On the other side, with our faculty, staff and students, we realized that most of them were relatively uninformed or misinformed about the real threats that were out there. We were at a point where we didn't pay a lot of attention to telling people about what's happening or what they might do about it. And so we began thinking about how we transition from an organization of employees that may not be very aware of cybersecurity, and know little or nothing about what they can do about it, into an organization that is much more responsive and active in that area. So with that in mind, we moved towards this idea of realizing that there's a group of folks within our organization who fit into this category of cyber carelessness, the cyber careless. Those are the folks who thought: IT is in charge of security, I don't have to worry about this; or, I'm much too small, nobody really cares about my stuff, so there's no real need for me to worry about these kinds of problems. And that was a group we thought we really needed to begin addressing in some meaningful ways. But on the other side of this is another group of folks that we discovered, which is what we call our cyber cynics, or our cyber fatigued group.
These are the folks who are paying a lot of attention to the media stories that are told as horror stories about what's wrong with cybersecurity and how you really can't stop the bad guys. And so this group of folks feel there's nothing they can really do to secure their data or their technology; they're certain that if anybody wants my stuff, there's nothing I can do to stop that. We felt it was very important that we balance those two perspectives in moving towards a model that would help us really address the risks that are there. So what we wanted to do is make sure that as we began our awareness training, we were sensitive to this balance: that we didn't ignore those on the carelessness side, and we didn't create a group of folks who felt the bad guys were so dangerously powerful that there's little they could do to impact that. And so we began looking at social science, and in those terms we wanted to know what evidence-based research we could identify that would tell us what's most effective in the ways we would impact the behaviors of our users. And so with that, we will have some comments from Danny Robinson, and he's gonna talk a bit about what we've done in that arena. Thanks, Tom. Today I'm gonna talk a little bit about social influence theory and how social influence theory applies to cybersecurity. There are a number of relevant social science theories that I would like to discuss, but I'm gonna cover them pretty quickly because of our limited amount of time together. But if you are interested in more detail on a specific theory, or all of the theories, just let us know and we'll forward you a handout that has more detailed information. It also has a reasonable bibliography at the back, so you can do some deep diving into this on your own time. What these theories tell us is that communication is a complex process.
For communication to occur, it requires that individuals are exposed to the message, that they attend to the message, that they comprehend or understand the message, and of course that they retain the message. If any of those events don't occur, there's little chance of a message being effective. Now, if you keep that in mind and you think about designing a message campaign, you begin to see why message campaigns are so difficult. They're difficult in large part because so many things have to happen in a particular order for there to be success. These theories also tell us that a critical element of message effectiveness has to do with the relationship between the communicants. And when we talk about the relationship between communicants, one of the things we wanna make sure we don't do is think about it in terms of everybody having to be best friends or something like that. Rather, what has to happen is that people have to understand the nature of their relationship with the other person, so that they can have a better understanding of how whatever message they send will be understood by their target audience. In fact, this contextualization and relationship is so important that it's pretty safe to say that if you change the nature of the relationship, and that's just a perceptual variable (if people perceive their relationship to be a particular way), it actually changes the meaning of the message. Obviously, this theory helps us understand some important implications for the idea of staff and user relationships. So we'll talk more about that, and we'll also talk about the development of a sense of agency, throughout this presentation. Finally, people oftentimes think that we process information in a very, very careful way.
Most social science theories are predicated on the assumption that people are rational, and while people are indeed at times rational, there are other times when they're not. When they're not behaving in a particularly rational way, what they do is rely on peripheral cues. So they use things like the credibility of the speaker or the attractiveness of a speaker, and they use those heuristics to decide whether they agree or disagree with the message. And so one of our goals in this campaign is to help people move from that peripheral kind of processing (oh, I have an email, it says do something, I click on the link to get this out of my inbox, and lo and behold I've made a tragic mistake) into processing email messages carefully, scrutinizing those messages more closely. So they start looking for things like: is anybody making a request of me here to perform some behavior? And if they are, then as a matter of practice I need to double down on my efforts at being mindful. The basis of many, many of the theories behind campaigns, health information campaigns, and frankly any kind of campaign, comes from this idea of the health belief model. In the 1950s, the health belief model was developed to explain how public health officials could get people to engage in healthy behaviors, things like getting people to get an X-ray or get a vaccination. Now, the government was offering these programs at the time, and for free, and they were getting very, very low turnout of people actually taking advantage of these opportunities. And so these researchers came up with this model, and what they found out was that if you want to get people to enact a healthy behavior, the first critical element is that you have to make people feel susceptible to the health risk.
In addition, they also have to believe, and you have to be able to demonstrate in your message strategy, that the consequences are severe, or significant enough to merit the person's actual behavior. Get them off the sidelines, so to speak. Third, in our message campaigns we need to make sure that we offer people solutions that they believe will be effective. So it's critical that we can demonstrate the effectiveness of our solutions if we're gonna try to get people to adopt them. And last but not least, and this is really important, one of the weaker areas in information campaigns and training is the development of that sense of efficacy. By efficacy, what we mean is that people feel like they can successfully enact the behaviors that you're trying to promote. So if it's an anti-phishing campaign, people need to feel like they can identify phishing messages when they see them, and they know the appropriate behaviors to avoid those problems. So you can see, then, that the basic message strategy is kind of a rewards-cost ratio examination, where people need to see that the potential problems out there are more costly than the solutions we offer them to avoid those problems, okay? Now, a couple more things that almost sound like they're outside the campaign, but they really, really strengthen the campaign: within our message strategy and our training, we have to develop reminders and cues that'll activate thoughts about cybersecurity when people are not engaged in training. We've all been part of training campaigns where we do something at the beginning of the year and then we probably forget about it and avoid thinking about it until the next time we have to go through that training.
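The four elements Danny just described (susceptibility, severity, response effectiveness, and self-efficacy) are often framed as exactly that rewards-cost calculation. Purely as an illustration, a toy Python formalization, with a function name, weighting, and scale that are our own invention rather than a validated psychometric instrument, might look like:

```python
def adoption_likelihood(susceptibility, severity, effectiveness, efficacy, barriers):
    """Toy health-belief-model score: perceived threat (susceptibility x
    severity) weighed against the net benefit of acting (response
    effectiveness x self-efficacy, discounted by perceived barriers).
    All inputs are on a 0-1 scale; the output is a rough 0-1 likelihood."""
    perceived_threat = susceptibility * severity
    net_benefit = effectiveness * efficacy * (1 - barriers)
    return perceived_threat * net_benefit
```

Note the multiplicative structure: if any one factor is zero (no felt susceptibility, no belief the fix works, no self-efficacy), the predicted adoption collapses to zero, which mirrors the model's claim that all four elements have to be present.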
And so for campaigns designed to promote behaviors that are, we could call them, chronic (to stay with our health belief orientation), things that people have to be aware of all the time, it's critical that we offer them opportunities to be reminded on a regular basis. Finally, one of the things the health belief model has taught us that's really, really important is that, contrary to popular belief, people are not particularly concerned about their health until they don't have it. You would think everybody would tell you, oh, I'm very, very concerned about my health, you don't have anything if you don't have your health, but the truth of the matter is that until people don't have it, they kind of take it for granted. So you have to ask yourself, why in the world would we be talking about health campaigns when we're supposed to be talking about cybersecurity? And the answer is because we found that essentially the same thing holds true with cybersecurity safety behaviors: people don't value cybersecurity very much until they lose it. So in our training efforts and our campaign, one of the things we try to do is demonstrate susceptibility to attack. We put this into action in a variety of ways. One of the things we did was provide campus phishing exercises through a company called KnowBe4, and what that allows people to do is actually have the experience of being phished if they in fact fall for a phish. In addition to this, we also have newsletters that reinforce this notion of susceptibility.
In those newsletters we talk about how often breaches happen, the famous breaches that have happened around the world, and in a sense we provide people with that susceptibility information. But in our newsletters we also provide people with information about the effectiveness of the strategy, by putting in things like: here's an example of a person who didn't fall for a phishing exploit on our campus. So it's important that people have that experience, and that they have it under circumstances where they don't feel like they're gonna be punished, or that the consequences are gonna be punitive. We try to think of these phishing exercises like any other kind of exercise. It's important to understand that this is something we do to increase and improve our strength or our skills, and so when we fail, we try to use that failure as a way to strengthen us in the future. One of the things we have found really important to communicate to people is the fact that everybody falls for phishing campaigns if the message is good enough. If a message resonates, people will click, or open files, or provide information. That's just the nature of human behavior. You're never gonna eliminate that 100%, but the good news is that we can still dramatically reduce it. By providing people with not only the skills and the information that they need, but the practice that they need, ideally what happens is it becomes a more common topic of conversation on campus. People begin to generalize those principles and skills into their everyday lives, and that makes them less susceptible to new phishing or new kinds of exploits that they might face at some point in the future, that maybe we've never even had an opportunity to train them up on. So we talk about severity.
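KnowBe4 handles the mechanics of these exercises as a hosted service. Purely to illustrate the data involved, a homegrown sketch of the core bookkeeping, with class and method names that are hypothetical and not KnowBe4's actual data model, might track who was sent a lure, who clicked, and who reported it:

```python
from dataclasses import dataclass, field

@dataclass
class PhishingExercise:
    """Minimal sketch of simulated-phish bookkeeping for one monthly round."""
    recipients: set = field(default_factory=set)  # everyone sent the lure
    clicked: set = field(default_factory=set)     # fell for the phish
    reported: set = field(default_factory=set)    # reported it to IT

    def send(self, user):
        self.recipients.add(user)

    def record_click(self, user):
        self.clicked.add(user)

    def record_report(self, user):
        self.reported.add(user)

    def click_rate(self):
        """Fraction of recipients who clicked; the trend across months matters
        more than any single round's number."""
        return len(self.clicked) / max(len(self.recipients), 1)
```

Tracking the report set alongside the click set matters for the non-punitive framing above: reports are a success metric to celebrate, not just a complement of failures.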
We talk about susceptibility, and when we talk about severity, we try to promote the idea that there are personal consequences to these behaviors, but also organizational consequences, and even work group or more local group kinds of consequences to these problems. By doing that, by making people feel that they're part of the organization and that they're actually the first line of defense for our networks, and by promoting that sense of agency, we have found that people feel a lot more responsibility not just to be careful with their own behavior; they feel the need, and they want, to remind or warn other people if they've fallen for mistakes. So we get stories all the time about people who say, "I fell for this," or "I got an email that looks like a phishing exploit," and they'll crank out an email to 10 or 12 of their colleagues saying, hey, be careful, this is kind of going on. And so this is the kind of influence that is hard for organizations to target sometimes, but it's clear evidence that we are improving people's awareness of cybersecurity on campus. It's critical, and in our newsletters we do this too, that we demonstrate that our solutions are effective. We talked about that a little bit already. So what we try to do is provide a variety of tips, some of which are relatively easy for the end user to enact or engage in. We do that for obvious reasons, so that we can have success, but we also do it because when people have some experience, say for example in these phishing campaigns, and they gain some success ("oh, I identified a phishing exploit"), then what happens is that it increases their sense of personal efficacy.
And so one of the things we know about changing people's behavior is that not only do they have to believe that the solutions or the changes they're considering are gonna be effective, they also have to believe that they will be able to perform those behaviors effectively. And again, that's one of those byproducts of a body of literature on fear appeals, where we find that when we try to scare people straight, or scare people into making sure they don't make some kind of mistake, if we really, really, truly scare them, what ends up happening is they end up feeling like there's nothing I can do about this, and so they just ignore it and try not to think about it. But when they feel efficacious, when they feel like they can do something about this, and when they feel like the solution we're offering is something that can help them avoid it, then they're more likely to enact the behaviors we're trying to promote. Perhaps one of the most important things we do is provide those cues for our end users, to make sure that computer security and cyber mindfulness is at the front of their minds as much as possible. We do this in a variety of ways, and Kim's gonna talk more about that in a few minutes. But one of the things we've done is try to brand and use colors and logos and those kinds of heuristics, so that, we hope, when people see an orange pylon they think about cybersecurity, because we used orange as one of the colors in our cyber mindfulness campaign. But you have to be careful with these strategies, because anytime you try to use logos and branding and those kinds of priming tricks to get something to the front of people's minds, those cues can lose their effectiveness over time.
And so once again, it comes back to that development of a sense of relationship, and a sense of agency, between the end users and the IT department and the organization, because the exact same message that can be viewed as nagging when one person sends it can be viewed as somebody trying to provide me with service or help, or can even be viewed as contextualized caring. And so if you can get your end users to that stage where they think the people in IT actually care about them and are trying to help them, and are not just trying to tell them what they're doing is wrong, then I think what you'll see is an increase in awareness. You'll see an increase in discussions about cybersecurity at your place of business, and you'll see that people take these skills home with them and find that they improve their personal security as well. Okay, so the last thing I wanna talk about, as a kind of summary, is this idea that, in general, what we know from social science research is that the knowledge of the risk, a sense of efficacy, and frequent cues, in constructing an awareness program, can help you be more effective in changing attitudes and behaviors, particularly when those messages come from somebody we have a positive professional relationship with, and we feel like we're in a partnership to help protect everyone else's data as well as our own. So to further discuss the cyber mindfulness model, I'm gonna turn this back over to Tom Skill. Yeah, thanks, Danny. So we saw here that evidence-based social science research drove us to think differently about the ways we train and educate for cybersecurity. And part of this model begins in an area that is very frequently used by pretty much anybody doing something in cybersecurity, and that's the awareness stage of cyber education.
At this stage, what we're looking at is that our users have a sense of the personal and institutional risks. They're learning about the threats that are there, and they're continuing to learn about them. The way we would capture success at this stage is that our users are saying: hey, I know that cybersecurity threats are real, persistent and dangerous. And so with that, that's a pretty standard piece. Here's where things change a bit for us. Typically in those awareness programs, we jump right from awareness into the assumed stage where people will move to action. At the action stage, you're looking at: are they actually able to enact behaviors? Are they looking at those threats, and are they taking preventive actions against them? Here, at that stage, people would say: hey, I will take actions to reduce risks in my organization, and I have practiced them. That's where we all want to land; that's the goal there. But the struggle we realized is that this particular model was missing a very important middle stage. And that's the stage Danny talked a bit about: this agency approach, where people are really developing an attitude of personal responsibility, accepting that responsibility, and seeing it as shared among the community they're members of. And so success at this stage would have our users saying: hey, I believe these risks are important and meaningful, and I can do something about that. So creating a sense of agency in your user community is critical to achieving sustained attention to cybersecurity. And that's what brings us into this next stage of saying: okay, so if cyber mindfulness is about awareness, agency and action, how does this model fit in here?
And this is where we come back to what we call our engagement fulcrum, where we're saying that we've got to make sure we're paying attention to the way we engage with our user community, so that we're moving them from cyber carelessness to a midpoint, and from cyber fatigue to a midpoint. And of course, that midpoint is cyber mindfulness. So for ways of achieving this kind of outcome, Kim's gonna walk us through the tactics that will help you establish cyber mindfulness in your community. Thanks, Tom. Sure. Having identified cyber mindfulness as that quality we wanted to encourage within our user community, we wanted to build an awareness campaign that would put that social science theory into action to motivate that kind of change. And we approached this project as communicators rather than IT technicians. We wanted to build an information security awareness program, rather than kind of a traditional information security training program per se. So we started by considering the facets that most communications professionals are gonna consider when they're planning a campaign like this. We're not gonna spend a whole lot of time on the nuts and bolts of our planning, but just to give you an idea of how we approached this: we took some time thinking about our audience. What did they already know about the topic? As Tom mentioned earlier, we suspected this was probably all over the board. And we wanted to know, too, what their current attitudes were. Were they intimidated? Were they disinterested? Were they confused? So we conducted a survey at the outset, and we found that a good portion of our respondents felt that cybersecurity was definitely IT's job, it wasn't something they should have to deal with themselves, and they didn't really have a role to play, which told us that we definitely had some room to improve with this awareness campaign.
We also thought about the messages we needed to reiterate throughout the campaign to make our community more likely to think and act more cyber-mindfully, particularly messages that concentrated on agency and efficacy. And then we came back to those messages over and over again, in the hopes that they'd really start to permeate our campus culture. For message delivery, we're thinking here about both tone and the communication channels we're using. Very intentionally, we took a tone that was casual and friendly in the language we used, so we didn't intimidate our less technical colleagues, or worse, bore them, because we need their attention. So we tried to infuse some humor and keep things light and interesting throughout. And of course, as we know, people consume information in different ways, so we wanted to use a variety of channels to get this information out as well. Our monthly e-newsletter was the primary vehicle, but we also used some face-to-face outreach, the monthly phishing exercises that Danny mentioned and that we'll talk about more in a bit, and some giveaways. So for instance, we had pens, mugs and clips, all these kinds of tangible cues that would, again, like Danny mentioned, try to remind people in a real physical way to think about safe computing. And as you can see, we used a nice bright orange, so these would be unavoidable; they would have to see them. At the outset of this, also, we worried about what kind of measurements we would do at the end, and Elizabeth will talk about how we measured whether this was working or not. So how did this end up being different from a traditional information security training approach? As I mentioned, the considerations we just talked about are pretty standard for communication planning, but we found that this did lead us to a different kind of approach than if we'd done a compliance-focused, annual-training kind of model.
So first, we were really intentional about inviting interaction with our campus audience. We wanted to, in our newsletters, for instance, offer opportunities for them to interact with us. So we offered to send them swag if they replied with input or answers to a question. And a lot of these questions we asked weren't even related to cybersecurity, per se. So for instance, we asked them in one newsletter to send us something that was the same cyber mindful orange as the swag we sent out. And we got these kinds of responses back. That data security poster you can see there at right, one of our community members actually created that. She spent the time cutting up all that confetti. She said it took her forever. And the bottom, if you can't read it, says no personal or identifying information was harmed in the creation of this photograph. So you can tell people kind of enjoyed this. That car one guy sent in was his wife's car, he said. So again, these didn't teach people about safe computing, but they were an incentive to open and read their emails, because they might find something kind of random and fun in there. And that's worked really well for us. We also offered face-to-face opportunities. We did a secure disposal event for old equipment. We did lunchtime information sessions, those kinds of things. And also, as I mentioned earlier, we took a tone that was different from the other IT communications we'd sent out in the past. So instead of being kind of preachy or technical, we looked for ways to put a little energy and life into our messaging, because we knew that even if all the facts were there, if it was boring, no one's gonna pay attention to it. And attention is really what we were looking for. So for an example, this is just the header to one of our newsletters. You'll see that this monthly topic, Home Safe Home, is gonna focus on digital assistant devices.
And we also included some enticements to engage, like that cyber mindful summer camp section, where they had a chance to respond to something. And then we'd give an update on the campus phish training in that Phish Commission section. That was short for Phishing Commissioner, our personification of the mysterious IT staffer who ran the monthly training exercises, kind of like the banker in Deal or No Deal that you see up there in the heavens during that game show. We also had Elizabeth design a custom image for each month's theme, and we peppered in some comics or some memes along the way to keep things interesting. So again, the goal was to keep this approachable and interesting, and to help build our community's confidence about their own ability to actually take the right actions to protect themselves. We provided chances for practice through our monthly phishing exercises. Like any other skill we learn from a book or a class, until we actually put it into practice, we don't really absorb it. So this was a really good hands-on way to do that. And we'd hear back from many folks about how they caught this one or they clicked on that one. So they seemed to kind of enjoy these as a chance to show off their skills. Here you'll see an example of one of the landing pages they'd hit if they clicked on a message. It would give them a couple of reminders about what they could look for in the future if they received another suspicious-looking message. And as I showed you earlier, we did have a batch of these low-cost swag items that we could use as incentives for interaction. But we've also been really deliberate about responding to reports of potential phishing messages or questions with sincere appreciation.
So for instance, during our monthly phishing exercises, all of us on the safe computing team were bound to get many emails just like this, where they'd forward us something and say, hey, found this, is this something I should worry about? Or, I just wanted to let you know this hit my inbox. And these would go either to our help desk or to all of us individually. And our appreciative responses to these reports really are sincere. We know that if a live phishing exploit hits our campus, these kinds of early reports from our most alert users are going to be the early warning system we need to take action quickly. So we want our community to know that we recognize they're taking time out of their day to pay attention and reach out to us to protect the whole campus. And that's a great service to the organization. And while annual compliance-based information security training certainly has a role to play in a lot of organizations, we decided early on that that's not where we wanted to start this conversation with our campus community. As Tom mentioned, we hadn't done any kind of intentional training on this topic. So we didn't want to start it with a one-and-done approach. And we didn't think that was going to move the needle on attitude and behavior change if we had our users do one training exercise and then didn't say anything about it till the next year. So we committed to a steady stream of information doled out over the course of an entire year. You'll see here, this is just the way we charted out that first year's worth of information. We borrowed a lot of resources and ideas from organizations like Stop.Think.Connect., the SANS Institute, and Stay Safe Online. They've got a treasure trove of great stuff you can borrow from. The whole fourth quarter of our campaign, though, focused around the two-factor authentication implementation that we were getting ready for at the time.
And of course, at the end of the year, we realized that if we wanted to build on the momentum and culture change that we'd started, we were going to have to continue this. So it couldn't even be a one-year-and-done program. It was going to be an ongoing thing. And finally, one last way we thought this was different from a traditional approach was that we worked with our campus IT staff really deliberately to make sure that whenever a user reached out to ask one of these questions or check about a suspicious email, they'd receive a welcoming, helpful response. And we cannot emphasize this point enough. We know that if someone reaches out and they feel like they're an annoyance or they're asking a dumb question, that's likely going to be the last time they reach out to us. And if we really want to activate our community as first-alert allies when that zero-day exploit hits, they've got to feel comfortable coming to us. And this can be hard some days, particularly when you're busy or you know that the monthly phishing exercise just went out and you're bound to get a whole bunch of emails that day or the next few days. But we see this investment of time by our IT staff as an integral part of building those relationships with our community that are going to pay off if a social engineering attack really hits our campus. So in a nutshell, it really boils down to this: we try to give our community permission not to be the experts. Our message consistently throughout this campaign has been that everyone has a role to play in keeping our campus data and systems safe, but that role doesn't require having lots of technical know-how or knowing all the answers. We just want them to reach out to us if something seems strange and trust that their IT team will help them figure out what to do from there.
And we think that that's a lot more manageable for our non-technical users especially, because we're setting a reasonable expectation for how they can help keep the campus secure, one that reinforces that sense of agency. It's something that's really doable for them. They can just say, well, I don't know what to do about this, but I know I can raise the alarm about it. And that, we think, makes them much more likely to engage. So an important part of running this campaign is making sure that you can measure the hoped-for change. You want to keep those measurements so that you can justify the effort to your leadership and also so that you can learn what works and what doesn't work for your ongoing efforts. So how do you know something is working? If someone doesn't click a link, how do you know they're behaving cyber-mindfully? How do you measure that? It can be tricky, but there are indicators that are measurable. There are things you can track, things that will demonstrate change. And so I just want to quickly talk about some of those. An obvious one is the phishing statistics. I do want to say a little bit about how we ran our phishing campaign. As Kim and Danny already mentioned, we presented this as an exercise. And I just want to point out that this was no-shame, no singling anyone out, except for the very first test that we did as a benchmark. We always told people that we were going to do this monthly, and we would report back to them how that progress was going. This shows you very briefly that over time your results are not going to be a steady decline. Ours certainly weren't. What we found was that it really depended on the nature of the phishing exercise itself. So I want to show you just some of the kinds of phish we ran. We did phishing with inline links. And if you look at the bolded boxes, you see that we did one where it was an Amazon order or a FedEx package delivery, and people were asked to click a link to track their order.
But over time, so we ran this three times, we did see a decrease in the click rates responding to that Amazon order. And we also saw declines in other kinds of phishing with inline links. We did tests that included an attachment. You can see an example of one of the tests we ran with a Blue Cross donation, where we thanked people for their donation and asked them to click the receipt. This would require them to open the attachment and even enable macros. And we did have people go that far and do it. But over time again, by running those kinds of tests repeatedly, we got to see a decrease. We also ran a phish that would ask for login credentials: so, open a doc, and it would require your Google credentials to view that doc. So with comparative phishing exercises over time, we can gather statistics that show our messaging was having the desired impact. But there are other ways you can measure your success. We really relied on anecdotal evidence. This is a legitimate way of measuring effectiveness. And so we would hear stories. Tom would be out in meetings across campus, and administrators, staff, leadership would tell him stories about phish they got, and they were so proud because they hadn't clicked the link. Even a single anecdote of success can be really powerful to share with your community or with your leadership as a demonstration of your awareness campaign taking hold. Proactive reporting is also another thing you can track. So when you get reports to your help desk, or you're hearing reports of people saying, I saw a phish and I started alerting other people in my office about it, you can track that sort of reporting as another measurable indicator that the awareness campaign is effective. Also, from the beginning, we decided to track that one-on-one engagement that we were really focusing on. So discrete interactions with people. If we sent them email, if we sent them swag, if they attended an event, we kept track of every single one of those people.
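That kind of comparative tracking can be as simple as tallying per-run click rates for each lure. Here's a minimal sketch of the idea; the lure names and every number below are invented for illustration, not actual University of Dayton data:

```python
# Hypothetical sketch: comparing repeated runs of the same phishing lure
# so you can see whether click rates decline over time.
# All lure names and numbers are made up for illustration.

def click_rate(clicked, delivered):
    """Percentage of delivered phish messages that were clicked."""
    return round(100 * clicked / delivered, 1)

# (lure, run number, messages delivered, clicks) -- invented data
runs = [
    ("amazon-order",   1, 2000, 340),
    ("amazon-order",   2, 2000, 210),
    ("amazon-order",   3, 2000, 120),
    ("credential-doc", 1, 2000, 260),
    ("credential-doc", 2, 2000, 150),
]

# Group rates by lure so the same phish is compared run over run,
# rather than mixing easy and hard lures into one misleading average.
by_lure = {}
for lure, run, delivered, clicked in runs:
    by_lure.setdefault(lure, []).append(click_rate(clicked, delivered))

for lure, rates in by_lure.items():
    trend = "declining" if rates[-1] < rates[0] else "not declining"
    print(f"{lure}: {rates} -> {trend}")
```

The grouping step reflects the point made in the talk: results won't be one steady decline overall, so comparing each lure only against earlier runs of itself gives a fairer picture.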
So that we knew over time how many people we were directly impacting. Also, you can take a look at your help desk tickets, or lack thereof, as an indicator, maybe of a reduction in malware compromises or other reports of issues. And then, finally and importantly, surveys. So at the beginning we did a survey, and a lot of it was focused on trying to get a sense of where our audience was with their attitudes and their sense of agency. And I do just wanna show you some of the things we asked in that survey, as maybe a way to help you think about how you survey for attitude. And so we asked people to agree or disagree with some claims. For example: everyone has a responsibility to protect their computer from hackers and to ensure that stored info is secure. Do you agree? Do you disagree? Another type of attitude question: if hackers wanna break into a computer system, there's little I can do to stop them. And we certainly had strong agreement coming from a part of our audience on that. Computer security is really a technology issue that should be handled by IT. Again, that was a question we saw a lot more agreement with than we wanted. And we knew that meant we needed to address it. And finally: I am confident that I would recognize a suspicious email message. Again, what is their knowledge? What is their confidence? What is their attitude? And so over time, we're hoping to move the needle on those attitudes and sense of agency. And so we did a follow-up survey after a year so that we could measure the change in attitude and agency. And now I'm gonna turn it over to Tom, and he's gonna talk about our lessons learned. Great, thanks Elizabeth. So you're getting a sense for some of the tactical practices and the deep evidence-based research that we've been driving a lot of our work around. One of the things that we wanted to spend a little bit of time on is talking a little about some of the lessons that we learned. And there's several.
I mean, one of the advantages I'll start off with that's not on the list is that as a university campus community, we have the great challenge of a much more open and risk-prone environment where people are doing lots of things. And that gave us some struggles. But on the other hand, we also had the opportunity to connect with our academic colleagues, like we did with Danny Robinson from the Department of Communication, where we were able to apply a lot of that research. So the partnering there is one lesson we learned, something that, in organizations like higher ed, made this a much more engaging and fun project for the team of folks involved over a long stretch of time. So that's the good side of the lessons. Some of the challenges, though: one thing you have to understand is that there's a very, very significant time and talent commitment that you have to make to this. This is not something that you can simply do as a one-and-done kind of effort. In fact, the SANS Institute reports that this takes a minimum of one FTE, a full-time equivalent, so it's more than just one person's part-time effort on an ongoing basis. The other piece that we believe is core to the success of what we were doing is the commitment level that we were able to get from our campus leadership. And I think the important piece here is that with this campaign, no one was exempt from being engaged in it. That means the phishes went to everybody, including presidents and provosts and IT staff. And leadership within the campus community supported us doing those kinds of things, reaching out to the community and making this a serious endeavor on our part. But the primary piece that we wanna have you walk away with in terms of lessons learned is the importance of the relationship.
I think what we did is we understood that this is not a campaign that comes from some anonymous, unknown place. It's not "the IT department" that's doing this. It is our organization, it is our people, by name, building these closer, more significant relationships with our campus community. And one of the consequences of that is that you have a much higher rate of what we would call false positives. But in many cases, I call that an engagement opportunity: you have folks contacting you on a regular basis, wanting to know, is this real, or should I be worried about this? And what we discovered there is that we needed to spend a lot of time working with our staff to make sure they understand the importance of preserving and building that relationship. We don't wanna walk away from any of those opportunities to connect. So relationships really are one of the most important pieces there. As we look a little bit further down the road, and we're gonna jump into another poll in a second, but as we look down the road, we know that we have to work on building soft skills in our IT staffs around this. This is something that many organizations would wanna think about doing. I think the other part, too, is that we have to realize that as an organization ourselves, thinking about cybersecurity, we have to develop a maturing attitude toward it, much like we saw on that SANS maturity scale we shared earlier on. So I'm gonna jump ahead, because we have another quick poll that we wanted to share with our group out there. And this is where we wanted to test the waters: given what you're hearing from us, what are the biggest challenges you feel you face in building out your security awareness program? You know, is it in the management support area? Is it in the resources area? Is it in the staff skill sets?
Perhaps organizational culture is part of it, in that you have folks that are resistant to training and education in these areas. Or is it the fact that you may be looking at some real challenges around employee engagement? And on this poll, you can click more than one item if you feel that several of those are hitting you at this point. And we're gonna just give you a second there to do that. And while we're doing that, I'm gonna look to Ira to share with us if he has any questions he might want to kick back to us to deal with. And then we can share the results of this poll in just a couple seconds here. Sure, Tom, thanks for a great presentation. It's very interesting. I do have a couple of questions. And I also wanna remind people that, please, if you have questions, enter them in the Q&A section you'll find at the bottom of your screen. So one thing I'm curious about: obviously this is an investment in money and time. Why did you decide, or how did the university come to the decision, that you wanted to invest in educating your user community as opposed to buying more armaments to protect against cyber intrusion, building up bigger firewalls, virus protection? Yeah, great question. And so I'd say that it's both-and. I mean, we do spend quite a bit of money on technology to secure both the perimeter and the interior. But what we've realized over a long period of time is that no amount of technology is gonna effectively reduce our risk if we have users that aren't being thoughtful in what they're doing. And the university was really thoughtful about this. And we were lucky, because we have a few board members that have a great deal of knowledge in this area. And the support we got in pushing this effort forward came not just from our senior leadership on campus but even from our board. And they began discussing with us ways that we could enhance this.
And so building out the smart community strategies around cyber-mindfulness was mapped as a dual track: let's make sure we're giving sufficient technology strength to our infrastructure, but at the same time, there are so many things that the technology appliances can't protect you against that we really had to spend time thinking about how we can help our staff become smart and engaged in this process. And I don't know if my other folks have any other comments to add on there, but do we wanna share that poll? All right, great. So here we're seeing, and this is not a surprise, that we're all over the map on this, but resources is always a huge issue. And of course, culture is the other one. And so many times we think about the fact that culture is king, and therefore, if you don't have the culture with you, you're not gonna be successful. And in many cases, one of the things that I shared with our leadership, when we talked about the investments that we've had to make in cyber awareness, is that I said to them: if we ever had a breach, you would wish you could throw that amount of money at it to solve the breach problem. And so the context there is that the amount of money you're investing upfront in awareness is substantially less than you would spend if you had to actually address a true data breach of some type. And the consequence of a data breach, of course, is that you will lose things in terms of reputation that you will never recapture, no matter how much money you spend. So I think that a lot of our organizations are beginning to understand that allocating resources in those areas is important.
And so I'd encourage folks out there to perhaps characterize it this way: while this doesn't guarantee that we're gonna prevent all breaches, it certainly gives us, as my colleagues in the legal community say, an opportunity to tell the right story if something does go wrong, given the effort that we put in. Now, I'm told that we have a couple of questions that came in, so I'm gonna kick it over to the team. We have one question about how we might in the future extend some of this cyber-mindful awareness campaign to our student population. And that is something that we did dip our toe into the pool on this fall. In October, when it was National Cyber Security Awareness Month, we, with the help of a very talented GA we're lucky to have, put out a campaign to our student community to try to open up this topic that was a real focus for our campaign. And we're looking at ways we can reach out to them; they're much trickier to reach. They're all over the place, technically, and social-media-wise and everything. So that's definitely something we're looking at as a next approach. And particularly because we know they're gonna be going into workplaces where they're gonna be expected to have these kinds of competencies to help out with their employer's security risks as well. So we wanna make sure we prepare our student body to be successful in that as they go forward. And Tom, I'm gonna read this question to you and have you field it; it's a great question. Were users introduced to a risk-based approach to cybersecurity? For example, using more resources to protect high-risk assets as compared to other low-risk assets. So that's a good question. Part of it is that we began the campaign as a broad-based, everybody-is-all-in effort. But as the campaign progressed, we began to realize that we needed to focus resources around those who have the greatest privileges, the elevated-privilege teams.
And we've begun. Now, we haven't fully enacted this yet, but it is part of what's on our planning board, and some of it has actually been done on a pilot scale. We're now looking at specific user communities at the university that, because of what they can do or what access they have, we have judged to be a much higher risk. And so what we're doing is we're gonna be actually targeting those groups with a much more focused level of training and awareness. And hopefully we'll learn that they've actually figured this out better than some of our average users. That would be where you'd hope that those with elevated privileges are much more sensitive to this. But that's great, because actually targeting those groups is something that's absolutely critical to do. So wonderful. Any other questions, Ira? Yes, I have one, actually. I'm curious, and I want you to elaborate on this point, because I'm not sure that it is widely grasped and understood. One of the roles of universities, obviously, is to educate young people to be successful and wonderful members of society who contribute positively. What role do higher education institutions have in making students, let's say, aware of cybersecurity, so that they take that knowledge to their organizations as they join the workforce? Wow, that's an outstanding question. In fact, as Danny is looking at me, this has been an area where we've been spending a good bit of time. And we began part of this by looking at the generational differences. How are millennials and Gen Zs responding to this? We know that the perception out there is that they may not be as thoughtful or as privacy-focused.
But the thing that we really believe is key is that as students move through university, developing their cyber-savviness skills is absolutely an enhancement to their resume. With the students we've had go through some of our classes on social engineering and cybersecurity communications, they found that to be a really valuable asset, and with many of the organizations that interviewed them, it was clear it was a distinguishing value-add to their repertoire of experiences. And so we've been pushing this very thoroughly across campus and have had a very good response from our colleagues. All right, we've got one more question I think we'll be able to take. Did you have any legal problems regarding the implementation of the program? You know, actually, we did not. Our assumption there was that all the university personnel are part of our community, and engaging them in this awareness campaign was part of employees' responsibilities. So we didn't have any issues there. And we weren't really ever sharing any private data, so we were fine on the legal side around private data, particularly because we do not even try to call people out if they're frequently falling for our phishes. It's really more of a proactive approach, where we're trying to educate them and be positive about it. So we did not have any legal issues with that. But we are a private university, so we don't have as many constraints, perhaps, as some of our public colleagues. But I know that we're almost out of time, so I'm gonna kick it back to Ira. Okay, well, you know what, Tom, I'm gonna take the prerogative and throw one more question to you. Okay. Great. 'Cause I'm curious, you know, we've all had the experience where you call up the IT help desk and the first response is, turn off your computer and turn it back on.
So I'm curious, how do you instill that kind of softer skill into the IT department so they really connect with the users? We have really spent a lot of time talking to our IT staff about the importance of these campaigns, and about any kind of engagement being our opportunity to build a positive relationship. And that, to us, has been the key to having folks feel comfortable engaging with us around real risks that are out there. So we think that the staff has been very responsive to that. I'm sure there are days when they're grumpy and things don't go as well. But I'd say that for the most part, our staff has been very responsive. But part of it is that, you know, we've done this as a leadership-level initiative. We said, this is really important to the university. This is really important to the key directors within IT. And it's important to many of our faculty. So we find that the staff has been very responsive to that, and the community feels much more comfortable in bringing any kind of a challenge to us. We don't want to appear to be the cops on the block. We wanna be their colleagues and their partners in solving these very big problems. And that's why we really defined the whole idea around the phishing as an exercise for us to build up our muscles against the bad guys. And that's how we really turned our user community into our early-alert allies. Great. Thank you, Tom and everyone at University of Dayton, for a wonderful presentation. And for those of you whose questions we didn't get to, we will respond by email. And for those of you that requested more information, we will get that to you shortly. And you can see on the screen here the contact information for University of Dayton. And as I said earlier, we will have a recording of this posted on the CGE YouTube channel by the end of day tomorrow. And again, thank you for taking the time to join us. Great. Thank you everyone. Look forward to hearing from folks. Bye-bye.