[Opening remarks delivered in Welsh; the transcription is garbled, but the speaker introduces herself as Dr Jessica Barker and greets the audience.] I will change it up a little bit. My background is in sociology and politics, and so I look at cyber security very much from the human side. I run a consultancy. People who don't work as consultants will often say to me, like, what does that actually mean? What do you actually do on a day-to-day basis? Well, I write the word success on transparent walls. I'm really good at jigsaw puzzles, and I'm even better at high fives and fist bumps. 
Because I'm so good at the high fives and the fist bumps, I get to do a lot of stuff like this. So I am sometimes on the news talking about cyber security. I come and speak at lots of events like this. So if you're ever in the UK, or you turn on maybe BBC World News, sometimes you might see me on there when the latest cyber breach has come to light. And that means, of course, you will see me looking shocked. You will see me looking disappointed. Sometimes you'll see me looking angry. And on a really bad day, you might catch me looking really, really sad. Working in cyber security, I can guarantee, does make you shocked, disappointed, angry and sad. So it's good to let off steam. And I do that by doing stuff like this. And stuff like this. And stuff like this. And I like making stuff like this, which is an alarm clock, not a bomb. Honest. I make stuff like this. I make stuff like this, which is an excuse to use a hammer. And really, my favourite is to make stuff like this. The Kwik-E-Mart, if you haven't got it, is the best Lego set that there is. So there we go. That's who I really am. But I'm talking today not about building Lego, alas, but instead about how to build up the human defences when it comes to cyber security. Cyber security, of course, is big news at the minute; it has been for the last couple of years. You can't have escaped cyber security in the news. I get so many gigs on telly. Attacks have increased, and certainly our awareness of attacks has increased. And so the news has an unrelenting appetite for cyber security. News stories have increased at least 400% over the last year or so in mainstream media. So we're hearing about cyber security all the time. One of the most important things to understand about cyber security is that even though it sounds very technical, it's actually at its heart a very human subject. So it's long been said that cyber security is about people, process and technology. 
And yet the industry has focused much more on the technology part than the people and the process. And so you will often hear it said, when it comes to the human side of cyber security, oh, people are the weakest link. If only there were no users, then everything would be secure. So this is something that I've always tried to challenge. Yes, people are the weakest link in cyber security, but that's because we have made them the weakest link. It's because, as an industry, we have very much neglected to look at the people side of things. And we've focused so much more on the tech, because to some extent that feels easier. If you can get a piece of technology and you can put it in place and you can turn it on, you can be like, right, there we go, we are safe, we're secure. If only it were that simple. So what we've done when it comes to cyber security is we have forgotten a lot about human nature. We're at a point now where, actually, I would say cyber criminals understand psychology better than we as an industry do. And so many attacks in the last couple of years have really been classic con-artist-type attacks that prey on psychology or social norms, the way we behave in society. And we have not been able to defend against them. One of the reasons we've not been able to defend against them is this key lack of understanding about how people's brains work. I think when you work in technology and when you are very technologically focused, we can fall into a trap. So, for example, the cyber security industry has tended to split people into two camps, and has tended to think of people as either people who get technology or people who don't. And the people who get technology are deemed to be rational and sensible; they understand the rules, they behave in a rational manner, and they would never click on a stupid link in an email. And then we have the people who don't understand technology. 
We have the users, and they're the ones that we need to worry about, because they're irrational and they don't behave how we tell them to. So they're the Homer Simpsons and we are the Spocks. And this approach, thinking that people can be split in that way, is fundamentally flawed. If we look at psychology, we can understand something called hot states. A hot state is when you are tempted by something, when you're curious about something, when you feel greedy, when you feel lust, when you feel tired, when you feel overworked, all of these kinds of emotional states. And what happens when you're in a hot state is that you're more vulnerable, your defences are down, you don't think in a rational way, and so you're much more open to being taken advantage of. What we fail to understand in cyber security is that everybody goes into those hot states. Everybody's brain, all day, every day, is a Homer and a Spock, and they are battling it out as to who gets control over your behaviour. So when you wake up in the morning, you're tired. You know you should get into work on time, but you keep pressing the snooze button, so you end up rolling out of bed, running down the street, eating your toast, putting your make-up on on the bus. Yes, I'm speaking from personal experience. You end up making yourself more of a risk to yourself because you were tired. You've finished work, you're hungry, you're stressed. You know you should prepare a nice salad, but there's a box of donuts there. You're tempted; you can't resist the donuts. You get an email and it's from one of your friends and it says, click on this link to see those wild photos from our party last week. You didn't go to a party last week with your friends, but you'd quite like to see some wild photos of your friends, maybe in compromising positions. You click on the link; you're curious. You go on to LinkedIn, you get a request from someone. You don't know them, but you'd quite like to know them. 
You are tempted, they're pretty, they're attractive, so you accept the request. This is a hot state, and this is what we've overlooked in cyber security. Something else we've overlooked, and that is often overlooked in technology, is how we communicate. I am going to play you a clip. It is one sentence in the English language, and it has been transformed by a computer to sound like gibberish. I want you to have a listen, and if you can hear what the computer says, wave your hand at me. Anyone? You're all laughing because you got it, right? Any takers? No. Okay. I'll play it again. Have a listen, close your eyes, really see if you can hear what the computer says. I love this. Any takers? Anyone? Not even a guess? No. Okay. I'm going to stop being mean. I'm going to play you the translation, so you're going to hear in very clear English what the computer says. If anybody can't make it out, wave at me and I'll repeat the sentence. The Constitution Centre is at the next stop. Did everyone hear that? Anyone? Anyone not hear it? The Constitution Centre is at the next stop. The Constitution Centre is at the next stop. I'm going to play you the original again. Now you know, now you've had the translation, I want you to see if you can hear the original clip. Did everyone get it that time? So, the way that our brains are wired, when you have pre-existing information, you can make sense of new information. But if you don't have that baseline there, if you don't already speak the language, then it just sounds like gibberish to you. So sometimes as a technologist, you might be in conversation with someone who doesn't work in your field, who doesn't work in technology, and they will be giving you the blank look that you were all giving me after that first clip. And you'll be thinking, why? Am I boring them? What's going on here? What's going on is what's known psychologically as the fluency heuristic. 
And the fluency heuristic is that if you don't speak in the simplest, clearest terms to people, then they will switch off. As soon as you start using jargon, people think, whatever they're saying, this is not for me, and their brain just won't take it in. So when we are trying to talk about something technological, something complicated, something like security, we need to make sure we translate our language into a language that the listener will understand. Often that listener is a user, or is known as a user. And in cyber security there is a bit of a tendency, as probably in most technology fields, of talking about the user like they're the problem. And I've got a classic quote here from Twitter. This was an InfoSec rockstar who said that information security is hard because the user knows nothing, has a disease, and that disease is being an idiot. And this is the kind of thing we hear a lot when it comes to cyber security. It is often said that people are the weakest link and you can't patch stupid. Which, A, isn't very nice, but also it's very unhelpful. And the reason that it's very unhelpful, in trying to spread messages and trying to get people on board and more secure, is what's known as the Pygmalion effect. The Pygmalion effect is essentially the idea of a self-fulfilling prophecy: if you talk to people like they're stupid, they will prove you right. They will act in a more stupid way. If you talk to people like they're smart, and if you expect more of people, then they will also prove you right. And this goes back to psychological research in the 1960s, when two psychologists in America went into a school and gave the school a new IQ test, telling the teachers that this test was really going to find out who were the clever ones and who were the stupid ones in the class. They did the test, they got the results back, and they randomly split the results into two groups. 
But they said to the teacher, we've done our analysis: this half are smart, this half not so much. And then they observed what happened. What happened, of course, is that the teacher spoke to the ones that they expected to do well in a more respectful way, gave them more time, gave them more resources; and the ones that they expected to do poorly, the ones who had apparently done poorly in the IQ test, the teacher spoke to as if they weren't very smart, didn't give them time, didn't give them attention. So then, of course, when it came to the proper end-of-year exams, the half that the teacher had been told would be smart did better than expected in any of their previous tests. The other half did worse than predicted in any of their other tests. So how we talk to people, and what we expect of them, really has an impact on what we get back. And why does this matter? Well, when it comes to security, when it comes to technology in general, if it doesn't work for people, then it doesn't work. What are we doing it for? Why are we bothering if people can't engage with it? And of course, we talk about users as if they are one thing. We talk about people as if they are one thing (this is going back to the Homer and Spock thing a bit). But of course, everybody is different. And so when it comes to communicating cyber security, we need to understand that. We need to understand that everybody learns in a different way. Whatever it is you're trying to teach, whatever it is you're trying to learn. I do a lot of cyber security training, so I think about this in those terms. But whatever it is that you are trying to teach or impart, you need to understand that everybody learns differently. Some people learn by doing, some people learn by having a conversation and by listening, other people learn by observing, and other people learn by trying to solve problems, actually getting hands-on. 
So trying to combine all of those different methods is the most effective way to try and impart knowledge. When it comes to cyber security, I often think of Richard Feynman. Who's heard of Feynman? Quite a few people, I imagine. Good. So Richard Feynman, very famous physicist. Also very famous for being able to take a really complicated subject and distil it down into something that everybody could understand. So an incredibly smart guy, and when it comes to learning, a really good person to look at. And I use something called the Feynman Technique whenever I am trying to learn something new or when I'm trying to teach others. So it's something that you can think about. For example, if you're leading a team of devs and you want them to engage more with cyber security, then something like the Feynman Technique could be really helpful. The Feynman Technique is the idea that you understand something best when you try to teach it to somebody else. And trying to teach somebody else, or talk about a subject, is the perfect way to find out what you don't understand about it. So what I do with a class, for example, is I'll split the class into two, give them all a notebook, and give them half an hour to go away and learn about a security concept. It might be SQL injection and cross-site scripting, or ransomware. I'll get them to go away and learn about it themselves, and then they pair up with someone who's learned about the other subject and they teach it to them. And in doing that, they might find that they don't understand a particular aspect, so they go away and learn some more. And what you'll find is that in the days and the weeks and the months after that, what they will remember is not what they were told by the other person, not what I told them, but what they went away and learned for themselves. An issue with cyber security, and the human side of cyber security, is that people are very tired. So when it comes to... 
I talked about how we've seen this huge rise in cyber security stories in the news, and everybody's talking more about cyber security. One of the problems with that is that people feel this thing known as hack saturation. People feel absolutely worn down by how fundamentally insecure the internet is. It's very tiring, and it's very tiring being given a list of things not to do. And this is often what we do with cyber security: we tell people, don't do this. And the problem with telling people don't do this is that they don't understand why they shouldn't do it. So, for example, if you've got a team of devs and you're trying to teach them about cyber security and you're trying to get them to work in a more secure way, then you may be saying to them, when they're building, say, a web form, don't allow unsanitised data to be inputted. You might say that to them and they'll be like, okay. What will be far more effective is if you get them to learn why not. So there is a tool called WebGoat. Anyone can go on it. It's free. It covers the OWASP Top 10. So instead of telling people don't allow unsanitised data inputs, you could tell them: have a look at this and learn about cross-site scripting. They can do a quick tutorial. They can understand how cross-site scripting works from an attacker's point of view. And when you understand how attackers work, when you understand the techniques of a criminal hacker, that is the best way of being able to defend against it. Because then you really understand why you shouldn't do certain things. Then it really sticks. The problem with showing people how attackers work and the techniques of attackers is that you soon realise it's actually really quite easy to attack websites, and the internet is fundamentally quite insecure. And this can be a scary thing to confront for anybody: the average user, or anybody who works in technology but hasn't had much of a security focus. So what you then may be confronted with is fear. 
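To make the "why" concrete: the cross-site scripting lessons in WebGoat come down to what happens when user input is echoed into a page without escaping. Here is a minimal sketch in Python of the defence being taught; the `render_comment` helper is hypothetical, purely for illustration.

```python
import html

def render_comment(comment: str) -> str:
    """Embed user-supplied text in HTML, escaping it first.

    Without html.escape(), a comment containing <script> tags would
    execute in every visitor's browser: stored cross-site scripting.
    """
    return "<p>" + html.escape(comment) + "</p>"

malicious = '<script>alert(1)</script>'
print(render_comment(malicious))
# The angle brackets come out as &lt; and &gt;: inert text, not executable markup.
```

Running the escaped version through a tutorial like WebGoat, and then removing the escaping to watch the payload fire, is exactly the attacker's-eye view described above.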
And this is something I come across a lot, because I do a lot of cyber security training. There is a point in my training when I'm going through the threats and I'm talking about examples of cyber security incidents, and people are realising, okay, this is real and this does apply to me. And I really see, like, the whites of their eyes. And so you have to understand a little bit about the psychology of fear, because if you show someone something scary, or you tell them something scary, and you don't do it in the right way, then you can actually make things worse. So this goes back to research again, psychology research from the 1960s, where a psychologist called Leventhal was charged with looking at why the tetanus vaccine had not been taken up by many people in America. There was a vaccine for tetanus. It was very effective, and yet people weren't getting it, and so people were dying from tetanus. So this psychologist, Leventhal, thought, okay, I am going to scare people into getting the vaccine. He got a bunch of students together. He brought them to an auditorium much like this one, and he talked to them all about the horribleness of tetanus. He showed them awful, scary pictures. He made them feel absolutely terrified about getting tetanus. At the end of the day he asked them all: are you scared about getting tetanus? 100% yes. Everybody was scared about getting tetanus. Are you going to go away and get the vaccine? 100% yes. Everybody was going to go and get the vaccine. They went back and checked some months later and discovered that 1% of that group had gone to get the vaccine. So scaring people doesn't work. He got a bunch more students back, and he split the group into two. With half of them he repeated the same experiment. With the other half, he sat down one on one with them at the end of the day and gave them a map of the campus. Everybody was scared. Everybody was going to get the vaccine. When he presented the students with a map, he showed them: this is where we are now. 
This is where the health centre is. This is where the library is. This is where the students' union is. He asked them where they lived and how they would get to the health centre. He asked them when they would go. He really talked them through the responses. What they found a few months later was that the group that had just been scared stuck at 1% who got the vaccine. The group that had that roadmap of what to do, where to go and how to get to the health centre went up, not to 100% of course, but to 30%. So that is the power of talking to people about something scary and actually getting them to change their behaviours. And this led to what's known in psychology as the extended parallel process model. Which is: when you talk to someone about something scary, you talk about why it's scary, you talk about why it applies to the person you're talking to, but then, most importantly, you talk to them about the response, and you help them understand how they can respond to the threat. You empower them with their response. If you don't do that, then they will simply engage with the emotion and go into denial or avoidance. They will say, no, hackers wouldn't want my data, or, I'm so terrified I'm never going to use the internet again. And I have met people in both of those camps. So I've tried to give you today a quick overview of cyber security, particularly from the human side of things. I have talked about how we can make technology more secure through our communications, by talking to people more clearly, by creating a space where people can ask questions, and how you can use a roadmap to turn fear into curiosity. I have talked about the Feynman Technique, using stories, using pictures in accelerated learning, and trying to empower and reward people. And I have tried to show you how, even when you are very technologically focused, it's important to translate your messages for a non-technical audience. 
And if we work on some of those things, then I think what we'll find is that we're much further on when it comes to cyber security. I have been Dr Jessica Barker. If you have any questions, we have some time now. Or feel free to contact me in any of the ways you can see. Thank you for your time and attention. There are a few questions, and some are still coming in, so let me pull some up. I think the general theme of the questions is around the scale of technical knowledge that users have. Especially, I guess, with some of the audience here, who would consider themselves at least to be experienced in cyber security, or thinking about cyber security, down to junior people in their teams. So one of the first questions was: if we write messages for non-technical users, does that risk putting more technical users off, who will then ignore them? That's a good question. I think knowing your audience is really important, and it's very hard to write one message, one communication, that suits all. So I would say tailor your messages as much as you can. For example, when I do training, I try to deliver different training sessions to different groups in an organisation. I will have a different approach, I will talk about different things, I will use different language when I'm talking to the average user compared to when I'm talking to the more technical teams, compared to when I'm talking to the executives. So if you can tailor your messages, then I think that's better. But if you just have to go with one approach, then I would go for the lowest common denominator. Okay. Another question: instead of focusing on the effects of cyber security issues, is there a place for focusing more on the cause, so the educational system in general? How can we be teaching this, I guess, at a younger age, all the way through? 
Yeah, so, if I understand the question right, you're asking how we can embed cyber security learning from a young age? That's a really good question and something I really strongly believe in. I know in the UK there's been a lot of effort to look at the school curriculum, so they're introducing a lot of material into the school curriculum at a really young age to teach cyber security. The problem that I foresee with that is that nobody is teaching the teachers. I do quite a lot of work going into schools; we do this at Redacted Firm. We go into schools, we talk to the kids about cyber security, and it is usually the teachers that don't get what we're talking about. It's usually the teachers that have the most questions, and they feel very adrift. So if you're looking to do something active, I would say get in touch with your local school. They're crying out for expertise, and I'm sure they would love you to go in and talk about cyber security. Okay, thank you. Another question. Do you think developers, or maybe people in general, could learn about security from previously disclosed vulnerabilities, or maybe from vulnerability disclosure platforms? Somebody shamelessly posted a link on Slack to hackerone.com, which I think they're involved in. But are those sorts of platforms useful for teaching, maybe for everyone here to go back to their more junior devs and get them to use? Yeah, I think going back and looking at previous vulnerabilities is a good way, because we see the same sorts of approaches with a lot of vulnerabilities again and again. So going and seeing what techniques have been used in previous vulnerabilities is a good way. And you can always find write-ups of how certain malware has worked. Twitter's a great place when it comes to cyber security, so trying to find people who do those kinds of technical write-ups is something I'd highly recommend. 
Two more questions have just come in. One is, I guess, about social engineering, so I'll read the question. You mentioned LinkedIn. LinkedIn accounts from employees of the person's company have been used to attempt social engineering, as well as those of their vendors and their customers. How can you encourage employees to limit or control their profiles without seeming like you're trying to prevent them from getting another job? Something like LinkedIn is the main way of finding people. Yeah, this hits on something that I see time and again. What I find most effective is going back to the idea I mentioned with WebGoat: it's demonstrating why. We do this a lot at Redacted Firm. We go into organisations. They will give us a list of people in the organisation; we'll have their first and last names. We'll have a few days to do some OSINT, Open Source Intelligence, research into the list of names. And then we go in and we present to the people in the room: this is what we've found out about you. And so we will show them, you know, we know that so-and-so goes to salsa class on a Tuesday at six. We know that Susie went on a holiday in 1996 as part of her high school trip to Scotland. We know all of this stuff about all of you. And then we show them how an attacker could use it. So we will show them a fake spearphishing email that we'll craft, with a link. And then we'll say, okay, Susie, you would believe that this email came from that friend you went to Scotland with in 1996. There's a link here saying, you know, click on this and you'll see photos from our high school trip. Susie would probably click on it, and this is what would happen if that was a spearphishing email. We'll do a demo then, from the attacker and the victim side, of what you can achieve with that kind of attack. Doing that is incredibly powerful, because people get to experience what it's like from an attacker's point of view. And it stops just sounding like scaremongering. 
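There is a small defensive counterpart to that spearphishing demo worth sketching: attackers often register lookalike domains one character away from a trusted one, and a mail filter can flag those. The following is a minimal illustration, not production code; the domain names and the `suspicious_sender` helper are hypothetical, and real filters use far richer signals.

```python
def edit_distance(a: str, b: str) -> int:
    # Classic dynamic-programming Levenshtein distance.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def suspicious_sender(sender_domain: str, trusted: str) -> bool:
    """Flag domains that are close to, but not equal to, a trusted domain."""
    return sender_domain != trusted and edit_distance(sender_domain, trusted) <= 1

print(suspicious_sender("examp1e.com", "example.com"))  # lookalike: the l became a 1
print(suspicious_sender("example.com", "example.com"))  # exact match, not flagged
```

The point is the same as the OSINT demo: once you have seen how cheap a convincing lookalike is to construct, the advice "check the sender's address carefully" stops being abstract.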
You know, if you can actually say, this is how it works and this is how you could be targeted, it really makes it real for people. They have that experience of being hacked without, hopefully, actually being hacked. I guess this is a sort of linked question as well. Somebody else says: a lot of people will just want to use one simple password for everything, because they'll remember that password. Maybe even a complex password, but just one thing they can remember. How can you teach them about things like 1Password, or password managers in general? How do you get past that social barrier of them not really wanting to care enough? Yeah, I think there's a couple of things. I mean, passwords are a nightmare. To some extent, people don't want to care enough; people don't understand why passwords are such a big issue. Also, psychologically, we can only really remember about five, and most of us have something like 26 or 27 internet accounts. So passwords are a bit of a nightmare. What we've done, again, is the demo side of things. We worked with an organisation a while ago. They'd been trying to push their in-house password manager for months and got an incredibly low take-up. So what we did as part of some training was a password reuse demo, not even a password cracking demo: we showed how easily passwords that have been made public can be used to break into other accounts. And it was incredibly powerful. We gave that demo to about 300 people, and within a week or so of our sessions, 60% of that audience had made inquiries about getting the password manager. So show people the threat. Cyber security sounds so intangible that people don't get it and don't get why passwords matter. But if you really show them what an attacker can do, then they're far more likely to take it seriously. 
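The core of a password reuse demo can be sketched in a few lines: hash a candidate password and check it against a corpus of hashes already exposed in breaches, which is essentially what services like Have I Been Pwned do at much larger scale. This is an illustrative sketch only; the tiny `BREACHED_SHA1` set stands in for a real breached-password corpus.

```python
import hashlib

# A tiny stand-in for a real breached-password corpus (hypothetical data).
BREACHED_SHA1 = {
    hashlib.sha1(pw.encode()).hexdigest()
    for pw in ["password1", "letmein", "qwerty123"]
}

def is_breached(password: str) -> bool:
    """True if the password's SHA-1 hash appears in the breached set."""
    return hashlib.sha1(password.encode()).hexdigest() in BREACHED_SHA1

print(is_breached("letmein"))      # reused, already public, trivially attacked
print(is_breached("x9!Lk#22qPz"))  # not in this corpus
```

Shown live, this makes the abstract warning "don't reuse passwords" tangible: anything in a public breach dump is effectively known to every attacker already.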
And would you encourage as well, I guess, with the developers here who can control the website side of it, having things like OAuth, so you can log in with Google instead of a password? So that's, I guess, a key takeaway as well: stop users from having to manage those passwords themselves. Exactly. The more you can take the burden off users, the better. One last question; the Slack's filling up. What are your opinions on white hat hacking as an approach to this? Sure. Yeah, I think white hat hacking is really fundamental, because unless you know what an attacker can do, you can't defend yourself. So it's absolutely something that I would encourage people to do. It's something that we do in our company. And it really is that thing of taking an attacker's-eye view, whether it's your website, your physical office space, or what you're sharing online over social media. If you take an attacker's-eye view on it, then you're much better prepared to defend against it. Brilliant. Thank you very much. Thank you.