Thank you very much. Good afternoon everybody, it is a pleasure to be here. I'm talking today about cyber security, particularly the human element of cyber security. I'm going to try and talk about cyber security in a slightly different way, using myths and monsters and various stories that you'll probably be familiar with, as a way of trying to explain various things connected to cyber security. Before I get started, I actually can't see, like, any of you; the lights are very blinding here. But I'm imagining there's lots of people here who don't know who I am. So before I get started, I wanted to take a couple of minutes just to introduce myself. I am Dr Jessica Barker. I am not one of these. I'm one of these. So that means do not come to me today or tomorrow if this happens, because I won't be able to help. My work is far more serious than that. I work in cyber security, but I am not one of these. I am not one of these. And I don't often go to work dressed like this. That's purely for casual Fridays. I am a sociologist by training. So I work now on the human side of cyber security rather than the tech. So my background is in sociology, politics, civic design, the lovely humanities. And I work now as a cyber security consultant. So if you ask Google Image Search what that is, you'll find out I create success. And I solve puzzles. So all day long I get high fives and fist bumps. When I'm not getting my high fives and fist bumps and solving puzzles and doing jigsaws, I get to do stuff like this. So a fair bit of speaking and some media stuff, talking often about the latest data breach. So you will sometimes see me on the news. You'll see me looking shocked. You'll see me looking disappointed. You'll see me looking angry. And sometimes you'll see me looking really sad. Because working in cyber security can be quite tense and stressful. I relieve stress by doing this. You thought I was joking about casual Friday. Stuff like this. Stuff like this.
And I like to attempt to make some stuff like this, which is an alarm clock, not a bomb. This kitchen timer and a little bit of jewellery. I have also, probably without you even knowing, just now done something that I'm guessing nobody else in this room has ever done before. And that is I have stolen from the man who used to rob banks. If you don't know what I'm talking about then come to Freaky Clown's talk. I think it's at 7 o'clock today and you will see. So cyber security myths and monsters. Why am I talking about myths? Why am I talking about myths and horror and things like that as a way of explaining a complicated thing like cyber security? I'm doing that because we have long used myths and stories as a way of trying to understand things that are complicated, as a way of trying to comprehend good and bad and people doing bad things in society. And also, when we use stories, people tend to remember them. They tend to relate to them and they tend to latch on to them a little bit more. So one thing I do in my consultancy work is a lot of awareness raising. It's a lot of translating technical messages to a non-technical audience. And so I find that using stories, using analogies, really helps. But why are we talking about cyber security? Well, it has exploded in the last few years, not just in terms of number of attacks, but also, I would say, in terms of awareness. People are a lot more aware of cyber security. They see it in the news. They may be victims themselves, with a huge rise in identity fraud and companies being attacked. So it's on people's radar so much more than before. And you would think that would mean that people would have changed their behaviours. Exactly. That sceptical laugh that some of you may not have heard is very apt, because of course people have not changed their behaviours.
We still see the most common passwords the same last year as they've been for the five or six years that SplashData have been doing their analysis. So people are still using 123456 and password1. People are still engaging in all the kinds of behaviours that unfortunately make them more vulnerable. So, as a person interested in the human side of cyber security and as a sociologist, I like to do a lot of research. And I get paid to do research. I do research for companies, but I also do it for fun. And I do it for conferences and blog posts and things like that. So this is a bit of research I did a couple of years ago now, and it was only a sample of 100 people. I was interested in where there is a difference, where there is a gap, between people's attitudes and behaviours. Do people actually care about cyber security? And what I found is that most people are worried about it. So the vast, vast majority of my sample said they were worried about cyber security. They did think it was important. People rated their worries, as you'll see, from nought to ten, with ten being the highest. Twice as many people were seven and above. So people are pretty worried about cyber security. So, again, you would hope that the more worried people are, the more they take action. And so I looked at behaviours around passwords. If people are more worried about cyber security, are they more likely to have unique passwords? Are they less likely to share their passwords? All these good things that we keep telling people to do. And what I found is that people were slightly more likely to use unique passwords and to not share their passwords if they were worried, but not as much as we might like. So when we talk about the threat in cyber security, when we keep telling people about the dangers online, that's not actually really changing behaviour among most people. It's just worrying them.
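Those perennial favourites are exactly what a simple blacklist check catches. As a minimal sketch, assuming a tiny illustrative list (real SplashData-style lists run to thousands of entries, and production systems check against far larger breach corpora):

```python
# A tiny sample of perennially common passwords; illustrative only.
COMMON_PASSWORDS = {"123456", "password", "password1", "qwerty", "12345678"}

def is_weak(password: str) -> bool:
    """Reject passwords that appear on a common-password blacklist."""
    return password.lower() in COMMON_PASSWORDS

print(is_weak("password1"))                     # True
print(is_weak("correct horse battery staple"))  # False
```

A check like this catches the worst offenders but says nothing about overall strength; it is a floor, not a measure.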
Another behaviour that I've looked at is around two-factor authentication. So I did a sample of a thousand people in the UK to find out whether they actually get two-factor authentication, whether they know what it is and whether they use it. 70% of people said they don't understand it, and 80% of people don't use it. So it's something really simple and actually pretty effective, and the vast majority of people don't know what it is. And for me, part of that probably comes down to its name, two-factor authentication, which sounds really dull and really technical and really complicated. So I think we've got a long way to go when we're engaging with the human side of cyber security. But what are we actually trying to change? What threats are we trying to get people to understand and engage with? And we have our first monster. So for me cyber security can obviously be split into the malicious threat and the non-malicious threat. And for me the malicious threat is summed up by Dracula, someone who takes and exploits others as a way of building themselves up. And that malicious threat takes lots of different forms. So we see the hacktivist threat. Hacktivists are loose collectives, and sometimes we may not completely disagree with their target, like ISIS, but they will attack it in their own way. And what we've seen with this, for example with Anonymous attacking ISIS and taking over Twitter accounts, is the argument from law enforcement that although it seems helpful on the outside, in fact law enforcement try and track those Twitter accounts; they try and use them as a way of getting information. So actually taking them down can be counterproductive in trying to target the threat.
We also see financially motivated criminal syndicates in cyber space and cyber security. And of course we've seen a huge rise in the last year or so in ransomware. So this is a story of 407,000 attempted ransomware infections last year. The number probably is a lot higher, because of course with statistics in cyber security, it's very hard to measure. But this was £255 million extorted from ransomware attacks last year. So there's a huge amount of money to be made by criminal gangs in terms of malicious cyber security. There is of course the nation-state threat. And this latest story is about Estonia setting up a UK data centre, basically backing up all of their data. It is government records. It is house and property deeds, banking credentials, birth records. All of the important data in Estonia being backed up to the UK because of rising tensions with Russia. Of course, as most people in the audience will probably know, in 2007 Estonia was basically brought offline, reputedly by Russia, as part of the tensions between the two countries. And of course, we have the malicious insider. So somebody working, it may be as part of a group, it may be to assist a group, or it may be a lone disgruntled individual, working in an organisation and using their position to attack the company in whatever form. And some research by Ponemon, and this has been backed up by Kaspersky, has found that the insider threat is actually one of the biggest, if not the biggest, malicious threats facing organisations. So that's the malicious threat. In terms of the non-malicious, we're going to take a look at Frankenstein. Frankenstein created what is sometimes called his monster, his creature. He created a creature in the hope of creating an Adonis. He wanted to create the perfect man, and instead he created this. So he was not satisfied with his Adonis and he cast him out. He refused to take responsibility for him, and so the creature went out into the world.
He attempted to socialise, he attempted to educate himself. Because of his appearance and manner he was cast out by society and became violent. So for me, Frankenstein's creature sums up the non-malicious threat. So there's been the hope that we will introduce technology and technological advances to people and create this wonderful marriage between humanity and technology, and that's not quite playing out as we'd hoped it might. So what we need to do is take responsibility for that. We need to engage with that. Otherwise, we see things like this. This is the non-malicious threat in cyber security, and it's arguably the biggest threat. It's just people making mistakes. It's not necessarily the most costly, but it's a slow drip, drip of errors. What we have here is some examples of data breaches last year. There's the London sexual health clinic where somebody sent an email out and made that classic CC/BCC error: they put the email addresses of everybody who attended the sexual health clinic into the To field rather than the BCC field, and so exposed the names and the details of everybody attending that clinic to a very large number of people. Potentially quite embarrassing and quite distressing. There is the Welsh police force that had two discs of unencrypted video footage of a sexual abuse victim talking about the attacks on them. The discs weren't encrypted. The discs were lost in an office move and not found. That's really personal, sensitive information that the police force didn't recognise the value of, didn't recognise the sensitivity of, and failed to protect in the way that they should. The Serious Fraud Office: it was a temporary worker packaging up some files of evidence, and they weren't supervised, they weren't trained, and they sent them to the opposing party in an investigation. So they exposed a lot of quite important evidence to entirely the wrong party. Simple mistakes that people have made, usually with not bad intentions.
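The CC/BCC slip above is the kind of mistake tooling can guard against rather than relying on training alone. A sketch of how a bulk mailing might be addressed so recipient addresses stay hidden (the helper name and addresses are made up for illustration):

```python
from email.message import EmailMessage

def build_bulk_message(recipients, sender, subject, body):
    """Address a bulk mailing so the recipient list stays hidden.

    Every recipient goes in Bcc; the To field only ever shows the
    sender's own address, so no recipient can see who else got it.
    """
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = sender  # or an "undisclosed-recipients:;" placeholder
    msg["Bcc"] = ", ".join(recipients)
    msg["Subject"] = subject
    msg.set_content(body)
    return msg

msg = build_bulk_message(
    ["patient1@example.com", "patient2@example.com"],
    "clinic@example.org", "Newsletter", "Hello all.")
assert "patient1@example.com" not in msg["To"]
```

Python's `smtplib.SMTP.send_message` does not transmit Bcc headers, so the addresses never reach recipients; the safer pattern still is to send one individually addressed message per recipient.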
They just haven't been trained, they haven't thought about what they're doing, and the security hasn't been strong enough. So when I talk about cyber security I think about Cerberus, and I think about Cerberus protecting the gates of hell. Cerberus is a three-headed dog. Can anyone see where I'm going with the three prongs of cyber security? We've got people, we've got process and we've got technology, and they all need to be pretty powerful, and at the minute we've got the Achilles heel of people. So it's often said that the user is the weakest link in cyber security and that people are stupid. You'll see The Register there saying middle management are infosec's biggest problem. It's not the attackers, it's not the criminals. It's the people using the computers and making the mistakes that are the biggest problem. We have this somewhat harsh tweet that the user knows nothing, has a disease and is an idiot. So yeah, we all have a laugh and we all go, yeah, the user is really annoying. But the problem with that is Pygmalion. So Pygmalion here is in reference to what's known as the Pygmalion effect in sociology. And the Pygmalion effect comes from some research in the late 1960s, that glorious time when there were no ethics committees and sociologists and psychologists could do some really questionable research but get great results that were really fascinating and have told us a lot. So, the Pygmalion effect. A couple of researchers went into a school in America and they randomly split the students in a year group into half. They did a kind of IQ-type test on all of them, and then they told the teacher that this half had got fantastic results. They are really smart kids, and no matter what tests you've done before, your inferior tests, we're from a university, we have far superior tests, and our superior tests have said that this group of students are the ones to watch, they're really smart.
This group of students, not so much; they've performed poorly, even though some of them had performed really well in your other tests. Discount that, these are the dummies. And they did a long study, and they found that this impacted how the teacher engaged with the kids. The ones the teacher was told were really smart, had performed really well and were going to get good grades, the teacher gave more support to, gave more encouragement to, expected more of. The group that the teacher was told were going to be performing poorly were neglected, were told that they weren't going to do well, that they weren't smart. And what happened at the end of that study, when more tests were done, was that the researchers found that the group who they'd said were going to get the highest marks, regardless of what marks they had actually got and how well they'd performed in previous tests, did far better than expected. And the group that they'd split out, completely randomly, and said were going to perform poorly, performed worse. So the Pygmalion effect is basically the self-fulfilling prophecy that if you talk to people like they're smart and they're going to do well, they do better. If you talk to people like they're stupid, then you have the opposite of the Pygmalion effect. You have the Golem effect. And it's golem, by the way, not Gollum. It comes from Jewish mythology, the idea of the Golem effect. And it's the opposite of the Pygmalion effect: if you talk to people like they're stupid, and if you say things like the user is stupid, the user is an idiot, the user doesn't understand anything, then what you're going to do is drive poorer behaviour. And the myth behind the golem is that golems are made out of mud and they're inanimate creatures until a rabbi places knowledge and truth into their mouths. And when the rabbi does that, they become human. So we talk about the users being stupid.
They don't do the simple things that we want them to do. But we also don't recognise that sometimes the threat can be harder to recognise, harder to spot. So who can tell me the difference between these two URLs? Yes, well done. So there is a capital I and a lowercase l. Can you tell which is which? Whoever shouted that out, and I'm sorry I can't see you, I'm being blinded, I lied. The bottom one is genuine. The bottom one is the lowercase l, and the top one is the uppercase I. So thank you for proving my point. The human eye, we can't tell the difference when they're side by side. And I'd guess that the people who worked that out, you worked that out because you're smart people and I've just put those two next to each other. But on their own, if you see those individually, you cannot tell the difference between them just by looking at them. And you're all smart, you're all educated, you're all interested in technology. So for the average user, how are they supposed to recognise these kinds of threats? Speaking about communication, I'm going to play a little clip. And I want you to just have a listen. The clip will explain what I want you to do, and I'm going to try and pause it at a point and ask you a question. So what you'll hear is a sentence, a spoken sentence, that's been transformed by a computer to sound like gibberish. Any idea what they said? No. You can hear it one more time. Did anybody, can anybody tell me what that sentence said, apart from anyone who's heard it before? The right stuff. Any more takers? Nope, okay, I'm going to play it all the way through; if by some miracle you do work out what it says, or you think you have an idea, put your hand up. But I'm going to play it all the way through now. So what you'll hear is a sentence, a spoken sentence, that's been transformed by a computer to sound like gibberish. Any idea what they said? No. Okay, you can hear it one more time. Okay, now we'll hear the real sentence.
The Constitution Centre is at the next stop. Does it make sense that time? Yeah. Wait, was that the same? It's the exact same sentence that you heard the first time. No way. It's the exact same sentence. Your brain is always using prior information to make sense of new information coming in. So once you know what the sentence is, when you go back and hear the distorted version, you can apply that information and it makes sense. And it's amazing, nobody ever hears it the first time. When I first heard that clip, I did not hear it, I heard it as gibberish. Now I play it sometimes in talks and stuff, and I hear it straight away and I can't unhear it. Now my brain has been conditioned to make out those words. You can't not hear it. So it wasn't 'The Right Stuff'. You did make me very tempted to sing New Kids on the Block, and I'm glad I didn't because I've just remembered I'm being recorded. Maybe later. So as she says in the recording, the reason that you can't hear it the first time is that your brain needs layers of information to make sense of things. If someone starts talking to you in a foreign language, or a language that you're not attuned to, then you won't be able to make sense of it unless you have information as building blocks to be able to translate it. And this is where I think we are with cyber security. When we talk about it, a lot of people hear gibberish. And not just that, but actually what we call our discipline, whether cyber security or information security or infosec or any other term, can cause a barrier between us and the user. And this is from another piece of research I did, for a talk I gave at B-Sides London last month. And I asked people on Twitter. Most of my followers on Twitter are into cyber security. And I asked them, you know, what do you call it? And of course I could have phrased the question differently and all of that stuff, but I went with a fairly simple, straightforward question. And I got over 400 responses, which was really nice.
Thank you to everybody who responded. And the vast majority of people said they call it information security. And you can imagine some of the responses I had back about cyber security: you know, it's the dirty phrase, don't use it, I hate it, it's just for people who know nothing, somebody uses cyber security and I know they're an idiot. You know, there was a lot of hate directed towards cyber security. But I asked the same question of people in the general public. And I had 737 responses to that. And I don't know if you can see there at the bottom, but e-security was more popular among people in the general public than information security. So even when we say 'I work in information security', people are like, what's that? People are more likely to engage if you say 'I work in e-security'. I don't know anyone who has ever used the term e-security. That was my ringer in that survey. And I didn't even put it in the Twitter one, because you can only have four responses to a poll, and I just thought no one's going to pick e-security because that's ridiculous. But people in the public think information security is more ridiculous. And they all engage far more, unsurprisingly, with cyber security. It's what the media uses. It is what the board uses. It's what people in business use. So I gave a talk about this at B-Sides London, and if you want to hear more about it, check it out or look on my blog. But why does it matter what words we use and how we talk about something? It matters because of the fluency heuristic. So heuristics are used in psychology as a way of understanding how people make decisions. Our brains will often make decisions without us consciously making them. They will just make a decision by themselves, because we're lazy.
And the fluency heuristic is basically that the simpler, the more concise, the more straightforward, the more engaging the way you explain something, the more people will take it in and listen. So using terms that people actually get and actually engage with means that people are more likely to listen, rather than just switch off and think we're talking gobbledygook. And we already use myths in cyber security. The Trojan is very familiar now for people. And people engage with that, they get it, they can relate to it. And we've seen recently a new reported form of Trojan that actually turns this non-malicious threat into a malicious threat. So has anybody heard about Delilah? Delilah has come out in the news recently. Of course Delilah comes from mythology. She is the woman who gave the Philistines access to Samson, telling them to shave off his hair and you'll take his strength, and she did that for money. So that's where the name Delilah comes from for this Trojan. And this Trojan enables remote access to people's webcams. It comes from people going to certain adult websites and gaming websites and trying to download certain files. You can see where I'm going, then, with the remote access to the webcam. The attackers then look for information on family contacts. They look for information on where a person works. And they attempt to turn the target, the victim, into a malicious insider in their place of work. So why do people go to dodgy websites and download things that they shouldn't? Why do they click on links when we've told them to be careful of links? Why do 30% of people, when they're told a link is malicious, still click on it? One reason is we are all innately curious. We all have that little Pandora's instinct inside of us, which is to see something that looks a little bit tempting and to think, what is that? What's going to happen? What's the worst that can happen?
And we just can't help it, and curiosity is a great thing. It helps us with our technological advances. It's amazing to see in children. But in terms of cyber security, it can be a very damaging part of human nature. Another damaging part of human nature is the little Homer Simpson that is also in our brain. So people often tend to think about brains as being sensible, ordered, rational things that we can pick apart and understand. The philosopher Karl Popper drew a distinction between clocks and clouds. We all expect people to have brains that work like clockwork, that we can look at and think, yep, that went from there to there to there to there, and that's why they made that decision. But actually our brains are like clouds. They are hard to predict, they're hard to map, it's hard to see which way they're going to go. So this idea of the Homer Simpson in your brain comes from behavioural economics and a book called Nudge. And in Nudge the authors talk about how you can make small changes to get people to make better decisions without them even really knowing about it. And they say we need to do that because all of us have two sides to our brains. We all have a Homer and we all have a Spock. And Spock is really tired trying to battle Homer all the time. So this is why, when your alarm goes off in the morning and you know you have to get up, you still press snooze like four times and then you're really late and you're thinking, why didn't I just get up? It's why you eat doughnuts for breakfast even when you're on a diet. It's why you do all those things that are just, you know, damaging yourself. So in cyber security what we need to do is try and make small changes that people don't notice, and try and engage a little bit more with their Spocks, so that they can fight their Homer a little bit more effectively. One way that Homer takes form is in being unable to resist temptation. So this of course is the idea of the siren song.
Sirens would lure men out at sea with their beautiful voices, and it led to this idea of a siren song: something that is tempting, and you know it's dangerous, but you just can't quite resist it for whatever reason. So we see cyber sirens all the time on the internet. For example, we see them spreading spam. This was some work done by a researcher at Symantec who kept seeing on Twitter these spam tweets coming out from accounts that looked like celebrities but weren't, and so he did some research into it. And he found that one guy was behind this network of 700,000 Twitter accounts spreading weight-loss spam, and it was quite a sophisticated network. He had the fake celebrity accounts that were sending out the spam, which he called mockingbirds. He then set up a network of parrots: accounts that looked real, that had pictures, usually of pretty girls, stolen off the internet, and that used stolen content from other people's tweets, popular jokes and stuff. They were there to bump up the number of followers of the mockingbirds. And then to bump up the number of followers of the parrots he had eggs, and they basically looked like new users: they had egg avatars, they didn't really tweet, but they kind of retweeted and they followed and they were just there for bulk. So we see that kind of thing on social media a lot, and on the internet a lot, encouraging people to click on links. We also see more serious and damaging forms of cyber sirens. So this is the very sad case of Daniel Perry a few years ago.
Daniel was a teenager in Scotland. He was on Facebook, I believe it was Facebook or another social media service, and he was befriended by a pretty young girl his age. He got talking to her, they became friends, he really liked her. She wanted photos, she wanted web chat, it got explicit. And then this pretty teenage girl turned out to be a criminal gang who tried to extort money out of Daniel. And Daniel was a teenager; he was threatened with his family finding out about it, all his contacts on Facebook, his friends. And very sadly Daniel, like a few other teenagers involved in similar circumstances, committed suicide. So we see some really damaging cyber sirens on the internet as well. And fear is a key thing. When we talk about cyber security we're often accused of scaremongering. We're often accused of talking about threats in ways that make people feel really worried, that make them feel worried for themselves, for their kids, for their grandparents, for whoever it might be. We scare people, and sometimes in this industry people do it to sell products. But also, inevitably, we can't help but talk about scary things, because we're talking about threats and dangers online. So I did some work a few years ago, and I'm not going to talk about this too much now because I gave a full presentation on it at EMF Camp a couple of years ago, but it's about the psychology and sociology of fear and the impact that we have when we talk about something scary. Talking about scary things in one way drives people to just engage with the fear, with the emotion, and they will bury their heads in the sand, they will think hackers don't want my data, or they will go into avoidance, they will stop using the internet altogether. Whereas we can talk about threats in another way, to encourage people to engage with the danger, with the actual threats online. And we do that by talking about the threat in a way that people can understand, that they can see as a real thing that applies to them,
and then we talk about what people can do, and we give them the tools to be able to go and do that. So this model here is called the extended parallel process model, and it comes from some more research from the 1960s, where a researcher called Leventhal looked in America at why people weren't taking up the tetanus inoculation. Tetanus was a really big problem; people were dying of it, people were incredibly ill from it, but people weren't getting the inoculation. So Leventhal was tasked with trying to find out why, and he decided to scare people into action. So he took a big group of students and he showed them the horrible side effects of tetanus. He talked about what happens, and the fact that there was an inoculation they could get. And people went out and they all scored their fear as being really high. They said they were really worried, they were scared, just like my sample at the start said they were really worried about cyber security. He then found, pretty much like I did with my sample, that despite people being really worried about it, they didn't go and get the inoculation. Only 1% of the sample group went and got the inoculation. So it's that irrational link in humans where people worry about something but they don't do anything about it. So Leventhal got more students in, and this time he split them into two groups. With one group he did the same thing again: he scared them, he gave them all of the information, the pictures, the horror stories about tetanus, and he told them there was an inoculation available. With the other group, he scared them in just the same way, he gave them all the same scary information, but he then had a member of his team sit down with each and every one of them with a campus map and say: this is where we are now, this is where the health centre is, this is where the library is, the students' union; where do you live? Oh, you live over this way. Okay, so this is where the health centre is. What journey will you take to get there? When are you going to go? These are the
opening hours; do you want to ring and make an appointment now? So he went through all the different ways they could take that action. He made them see that their response, their efficacy, was doable, and he encouraged them to plan it out. And what he found from that group: it certainly wasn't 100% of people that went and did it, but it jumped from 1%, which is where it stayed with the other group; only 1% of those who were just scared went and got the inoculation. The group that he scared, and then gave a roadmap for how to respond, jumped up to 30%. So when we actually give people a response they can engage with, they do change their behaviours. So why have I talked about myths today, and what are the lessons I want you to take away? I want you to recognise that the enemy is the one which exploits others to create themselves. People, process and technology are the three pillars of cyber security, but you're only as strong as your weakest link. Things online aren't always as they seem. Pride comes before a fall. Curiosity is a human condition, so you need to think before you look. Educate about the monstrosities. And if you expect a lot, you get a lot back; but if you expect a little, then you get a little back. So empower people, because if not, we might unleash a monster. Thank you very much for your time and attention today.