I'm very excited to hear this one as well: Heuristic Park, and why we can, or can't, fake it until we make it. Our presenter, Iskimo, is going to share on these topics: why do we believe in fake news, what are news silos, and more. This lecture discusses the psychological reasons as seen from the perspective of a social engineer. We were talking beforehand, and he wanted to point out that if you're thinking of social engineering just in terms of one area where it might come up at a security-focused con like this, just in terms of manipulation to, say, bypass a security protocol, you're missing a lot of the apparatus of what he'll be talking about. This talk explores further and focuses on the ways we respond, neurologically hardwired, to details, and what some of the consequences are. So thank you very much. Take it away.

Thank you. So first and foremost, I might need to apologize up front, because I might look a little fatigued and off balance, and that's not because it's day three and it's 11 a.m. at a hacker camp, because it is. Kudos to you for being here at this time, and the only advantage we have is that it's not too hot yet in this tent. With me, it's actually medical: I'm suffering from some post-COVID things. So if you see me wobble around a little bit, that's medical and it's okay. I'm fine, but it looks a bit funny sometimes. That said, I can fully adhere to what you said: wear a mask, protect each other, be careful, because COVID is not over.

So, a small introduction, and this is very scary for me. In my spare time, I do some magic for children, little card tricks, making stuff disappear and reappear. And the most scary thing you can do is perform for magicians, because they know all your shit, and they will call you out if it's not perfect. And I have the same feeling here, because it's all hackers.
You guys know about social engineering, and probably, because you're here, you know a little bit more, or at least you're interested in social engineering. So I'll have to do my best today, and I'll try. This talk is meant to be a little longer than the 15 minutes that I got, so I might skip some things. But first, by a raise of hands: how many people understand Dutch quite well? Okay, so there are some slides in there, examples, that are in Dutch. I'll be showing them, not taking too much time on them, but they're funny to see. So that's good.

A small introduction: I'm a social engineer. I have been for a lot of years. I've been pentesting companies professionally for 15 of them. After that, my face got on too many wanted posters and I had to stop doing that, because if you show up and people say, hey, that's the guy from last time, that is problematic, also for the rest of your team still conducting the test. So I decided to go and teach, and that's what I do at this moment. Furthermore, I'm a researcher. This stuff is very interesting, and it stays very interesting, so you read all about it and you delve into it very deeply. And I do stuff like this: public speaking, attending conferences. And of course, we have this wonderful Angry Nerds podcast. It's in Dutch. Find it, it's funny.

Right. I have been sweating on this one for a long time, because I needed to form a definition of social engineering, and I started writing: it's psychological techniques to get your password. No, that's not it. It's making you do things that you actually don't want to, or cannot, or... no, that's not it either. And it became a very, very long and confusing definition, like most job descriptions: very long and confusing, and half of it is not even true. So I came up with this. Actually, it's two words. There's the word social, which is any interaction you have with other people. You can initiate it or undergo it.
And you have engineering, which is: you have a plan, you design something, and from that design you go and build a house, a bridge, anything. So if you combine those two and you say, okay, I'm going to have a social interaction with someone, by design, to get to a certain goal, that is what social engineering is. It's nothing more, it's nothing less. This is a very clean definition, and I could come up with it just by translating those two words.

So what is it? It is being clever with the truth. You can do that with words, just by plainly lying. We'll get to that later on. You could indirectly lie by showing up in a nice suit because you have an interview or something, or you could fiddle with the truth. We'll get to that later as well. Or you could just set the scene. You see it in Bitcoin. It's great, Bitcoin is great: all these fancy offices and these great companies that do all kinds of wonderful stuff. Of course, Bitcoin is not a very good idea, at least that's my opinion, but it gets hyped a lot. Sorry for the guys holding Bitcoin. Yeah, it's not a good time for Bitcoin.

So who does it? And this is actually a small nerd joke; some of you will get it. Criminals, lawyers, salespeople, of course, but parents also. We have a four-year-old and we have to social engineer the hell out of her to get her to eat her veggies and go to sleep in time. But children can also social engineer very well, especially ours. She knows exactly how to look, how to smile, to get candy or to stay up later. How many of you have kids? Respect. So this is very hard. They come up with all kinds of crazy stuff to not have to go to sleep: I need to pee, I need a drink. That's also social engineering. You just adjust your normal behavior to make what you're doing seem logical. So actually anyone, everyone, social engineers. I am not at this moment, because this is what I actually look like, but at work I wear other stuff. Of course, you don't go to work in the couch outfit, let's say it like that.
So why do we do it? This is still an introduction that you're probably familiar with. You could go for influencing people. And there is a subtle but very important difference between influence and manipulation, because I think manipulation is the bad kind and influence is good. You can also influence people to get them on a certain track which is actually in their own interest. An example that I sometimes use: when you go to the hospital and there's a doctor, and he comes up to you with his hands in his pockets and says, oh yeah, you have cancer, that is weird, right? So these doctors, they get trained to bring this news in a special way. You have to serve it up so the patient won't black out. If I go and tell you, hey, you have cancer, I can be sure that anything I say after that will not land. You just don't hear it. And it's in your interest and in the interest of the doctor that you keep listening. So he'll probably invite you in: okay, come, sit down, have some coffee. I looked at your test results, and this is what I found. There are treatments, but it is severe, so we have to do this and this and this. And he won't use the word cancer until it's absolutely necessary, just to keep you listening. Doctors know this; that's what they learn. So social engineering, influencing, also has a good side. But when it gets to manipulation, that's bad. You don't want that.

So there are some techniques, and I love listening to radio commercials. I'm the only one, I guess, and it can be really funny, because in the Netherlands we have .nl, and it rhymes with snel, the Dutch word for fast. So they make all these rhymes, and it's not just because it rhymes, and not because they think they have a unique little slogan: it implies urgency. Very subtly, but it implies urgency. Oh, we have only a few left. That also implies urgency, and scarcity, so we want it more.
So these are techniques that you see in marketing, but you see them in a lot of other places too. The last one is one I learned very late in life: you get a free phone when you renew your subscription. The phone is not free. It's not. You pay for it several times over through the higher subscription rates that you accept. It's not a free deal. But these are tricks; they tell you it's a free phone, right? So what's interesting is, I would love to see a judge look at a case like that: look, you said it's a free phone, it's not free, I want a free phone. And there is actually a guy who does that with all kinds of commercials, and he wins those cases as well. It's very funny.

So we can play on emotions, and this is where it gets interesting. We know that fear is a big one. Insecurities: lots of people are insecure, and if you go and stress them, and later on we'll see why, they will cooperate sooner. Then there is greed. When I tell you that I can double your money in two days, of course you want that, who wouldn't? But it's also a cue for you to start thinking, because when it sounds too good to be true, it is. And then there is confidence. Of course we can go the other way around and get people really confident about something, and then take them along for a ride, and they fail miserably because we are very good at playing them. Those are the confidence games, right? You see them on TV all the time.

And then the last one, and that's the one I'm talking about today: there are cognitive biases and there are heuristics. These are properties of your brain that we try to exploit, because what could be better for a social engineer than to find a flaw in your brain and play on that flaw, while you have no defense against it? I mean, you will go for it even though you know you are being scammed. It's like how it's impossible not to be startled. Think about it. You can't not be startled.
And we tested this with a guy, a very highly educated, very, very smart guy, and we told him: okay, we have a haunted house here. We built it and there are a few rooms. So we'll tell you everything in advance: when you go in, after two seconds you feel a little bump, there is something on the floor. The door knob is slimy. There are fake spider webs and there is stuff, and there is wind blowing and evil sounds, and you hear this and you hear that. And after exactly seven seconds, a guy in a black suit on the floor will grab your ankles. Seven seconds. So just go in there and count: one, two, three. And it was impossible for this guy not to be startled. Even when we counted down through an intercom, we sat there and said: five, four, three, two. It's still impossible, because this takes place in another part of your brain. And if you can tap into that as a social engineer, you'll always win. So this is where this talk gets interesting. And of course there is identity fraud. This is the first recorded case of identity fraud; I just wanted to show you that.

Okay. Who is very fast at thinking? Because we see patterns all the time, right? What's on the question mark? Who knows? Just by a raise of hands. Who can see it? Okay, any answers? 25? Okay. Anything other than 25? It's not true. It's not 25. And this is because of the fast part of the brain. There are two systems, actually. Daniel Kahneman, a very famous psychologist and writer, talks about these two systems. We have the fast and the slow system. The slow system is the reasonable part. You think: okay, look, what does it say? What is this? And the fast part says: oh, it's 25, and you're done with it. So we tap into that, the fast part. It is not 25, because the answer is right there: one is five, so five is one. That's what it says. This is not a number sequence. It looks like one, but it is not.
And only if you tap into the slower part of your brain and say: look, what... oh yeah, it's right there, one is five, so five is one. Then you get the right answer. This is a perfect example of what we call heuristics. Heuristics are shortcuts in your brain. Why do we have these? (Someone in the audience objects that it doesn't follow.) It does, it says so. Oh, of course, mathematically this is incorrect, true. But for this example it works. Okay, I should have said 1 = x or something, so I could move either way with it. You're right. Very, very sharp.

Okay, so why do we do this? The brain has an energy saver, and lucky for us, because if your brain were on all the time, in thinking mode, it would consume about a quarter of your energy. And for something that weighs only 2% of your whole body, using a quarter of your energy is a lot. So your brain taught itself to go into saving mode. Saving mode is very easy, because we have all these heuristics. We see a sequence, we say: oh, you can see that, that's 25. I don't have to go and calculate, I don't have to go and check. I see it, it's right there. I see a pattern. This is what the brain does. So the brain has what they call neuroplasticity: it forms new paths between your neurons, and when you use them often enough, they become common sense. Later on, we'll get to the guys that deny COVID and stuff like that. That has a reason; no one is that stupid, right? I hope.

Okay, so there are a few of these heuristics, and some of them are very funny and some of them are not. Actually, they're all not, but some you can laugh about. The expert heuristic: if there is an expert, and we know that he or she is one, then we believe them. If the doctor tells me something is wrong, I believe him. If my garage tells me there is something wrong with my car... wrong example, maybe. But doctors you normally trust. So there's the consistency heuristic.
It's all about seeing something often enough: you see it here, you see it there, you see it again. There are several sources and they all say the same, so it should be true, right? That's the consistency heuristic. Then there's the bandwagon heuristic, and that's: everybody does it. Who of you has WhatsApp on their phone? At a hacker camp? Are you kidding? Okay. Who has Signal? And who of you has trouble convincing their family and friends to leave WhatsApp? Almost everyone, right? And what's the argument you always get? Yeah, they all have it, I'll miss out, because if I don't have WhatsApp, I can't reach all my friends. So I have this discussion with my dad. He says: yeah, all my friends are on WhatsApp. I say: you have only two friends left, the rest is dead. He doesn't like that argument, but it's true. So there's no reason for him to be on Signal, and I couldn't win that argument as a social engineer. That's very frustrating. So I found another way to win it, even though his argument was valid in his eyes. I said: okay, so you don't want any pictures of your grandchild. Now he has Signal. That's very good. That's the bandwagon heuristic.

Persuasive intent actually works in reverse. I have this very strongly: if I see a salesman, someone with a plastic smile and 'hey, hello', I recognize all these techniques, and the hairs in the back of my neck just stand up, and I'm like: okay, fuck off. That's the persuasive intent heuristic. And for the expectancy violation heuristic, I have a slide to illustrate it. It's very funny, but it's in Dutch.

So this is the expert heuristic. In the old days, this was the expert. He stood on the market square, and this was the doctor, right? He said: oh yeah, you're sick, we'll just let some blood out, because your blood is probably sick. We'll let some out and then you'll be fine. And take this, it's only four pieces of gold, but it helps against everything.
Sounds a bit like the guy that comes over to your office and says: look, I have the problem to all your network solutions. All your network problems, right? No, I said it right: I have the problem to all your network solutions. But this guy has a nice silk tie, and he comes over to you and says: look, if you buy this, you have no further problems. This is a super modern firewall, and it's smart, and it has blockchain. Recognize this? These guys. But it's the same as that guy.

So I have some experts here. Any opinions? Who is the real expert? We have this guy from a Dutch commercial a long time ago, he's from ACA. Expert? Okay. Then person B, the guy from Twitter. Expert? Okay. C, that's an actor in a white coat pretending to be a dentist. No. Remember the guy at D? Remember him? He was hilarious. That was great. I mean, war is never funny, but this guy was. He should have had a red nose. "There are no American tanks in Iraq," and you saw them pass by behind him. It was hilarious. Then E, the Dutch weather guy. Expert? Yeah, maybe, okay. Let's continue. Jomanda, remember Jomanda? You're all healed... if only. And then we have, who's that, G? I don't think anyone at a hacker congress would have heard of this woman. Expert? Well, maybe an expert social engineer, we don't know. But the true expert is the weather guy. He has two master's and five bachelor's degrees in all kinds of climate studies. He knows a lot about his work. And the fact that he comes on TV and very casually tells you that the sun is going to shine tomorrow doesn't mean he is not very, very, very smart, because he is.

Okay, and then context is also very important. I know a lot about PCs, computers, cyber, cyber. You all know a lot of cyber too. But don't let me do that, because it will not play out well. So then there's the consistency heuristic. I told you about this already. It is about fake news.
And when you read it over and over and over again, it might become true for you. The danger is that we form bubbles. The algorithms of Facebook, Twitter and all the others are meant for only one thing: to keep you on their site, to keep you watching, reading, looking. And they can do that in only one way, and that's by showing you more of what you already like. So they choose for you what news you get to see. And you don't get to see any contradictory news, because you don't like that and you might just close your browser. So they show you more of the same. Only they don't check whether what they show you is true.

Then the persuasive intent; we talked about this. I like the illustration on this one, and this is for the Dutch guys here. I'll just give you a second to read them all. Expectancy violation has to do with... you have an expectation. In this case, it's the city, and you might think that guys working for the city, doing all kinds of official business, know what they're doing, right? Well, they don't, at least not always. These are examples of really, really gross spelling mistakes, very funny ones as well. I mean, this one here should mean 'applies to the entire street', but the way they spelled it, it means 'money for the entire street'. So maybe they won the Postcode Loterij today, you know, the truck comes by. That could be true, but why put up a sign? This is weird. So I like the expectancy violation bias. It's very nice.

Sometimes people make mistakes, and this is a pretty rough story. We have the Dutch Ministry of Defence, and I can speak about this because it was a long time ago and it has been fixed in the meantime. This is the website of Miroslav, Mindev, and the domain is mindev.nl. But that is just one little letter away from the Ministry of Defence's domain, mindef.nl.
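As a technical aside: the kind of one-letter lookalike domain just described can be flagged automatically with a simple edit-distance check. This is a minimal sketch, assuming for illustration that the official domain is mindef.nl and the lookalike is mindev.nl; the hand-made `OFFICIAL` list is made up, and real typosquat tooling (e.g. dnstwist) does far more than this.

```python
# Toy lookalike-domain check: flag domains within one edit of an official one.

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

# Illustrative list of "official" domains (an assumption for this sketch).
OFFICIAL = ["mindef.nl", "rijksoverheid.nl"]

def lookalikes(domain: str, max_dist: int = 1):
    """Return official domains within max_dist edits of `domain` (but not equal)."""
    return [o for o in OFFICIAL if 0 < edit_distance(domain, o) <= max_dist]

print(lookalikes("mindev.nl"))  # -> ['mindef.nl']: one letter away from the ministry
```

The point of the sketch is the asymmetry the speaker describes: to a human, mindev.nl and mindef.nl read the same at a glance, but a one-line distance check catches the slip instantly.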
So the guy that ran this domain got all kinds of emails meant for the Dutch Ministry of Defence, because people made a little spelling mistake. And those emails were not funny. So when people make mistakes, even tiny ones, it can have large consequences. And if we can get people to make these mistakes based on the heuristics we already know, we just invite you to make the mistake, like the little sum earlier. It was not 25, but it's very easy to go for it, and then you're in trouble, right?

So, truth. What is true? This is a very hot discussion at this moment. Some things seem true, some things feel true. It has to do with experience, and with what you can see and what you can hear and what you can touch. But is it? When I do a card trick, or a trick where some ropes turn out to be the same length, or they don't, people see it happen. But what you see is not what's happening, and what's happening is not what you see. That's the definition of an illusion. That said, truth is very, very fluid indeed.

So in the U.S., they have this very cool oath when you go before a judge. You know it, you hear it on TV all the time, right? Do you solemnly swear to tell the truth, the whole truth, and nothing but the truth? I think this is genius. Because what you say is: okay, do you swear to tell the truth? That covers lies of commission, outright lies: you just tell something that's not true. The next one, the whole truth, covers lies of omission, stuff you leave out. And that's also very easy to do. Yeah, I went to Bolivia on a business trip; I don't have to tell people that I have a mistress there. It was actually a business trip. Sorry, dear. Not true, by the way. And nothing but the truth covers lies of influence, and they are the tricky ones. Because how many times have you seen this in courtroom dramas: a suspect does not really answer the question, but says, look, I'm a respectable guy, I would never do something like that.
I'm a Bible salesman, and I'm volunteering at the school, and I'm doing this, I'm doing that... He didn't answer the question. He's trying to talk his way around it. These are lies of influence: just by getting yourself into what we call the halo effect, being a true angel. Become an angel, by the way. They need angels. But getting yourself to that level of 'look, I would never...', you know, it's also lying, but it's very, very subtle and it's very hard to catch.

And this is... how do you say that politely? Sometimes it's by design, and sometimes it's just plain stupid. Ambtenarees: it's the language that civil servants tend to use. I see a lot of memos in my line of work, and I have to take a lot of time to decipher what they really say. And for the Dutch speakers: these, and I have permission to show these, are actual WhatsApp messages of students working with the Dutch government, trainees, talking to each other about what they're going to do next. And if you read their WhatsApps: holy shit, that's not going to be funny if this guy really gets to be a civil servant, or this guy. That is bad. But we do that to ourselves, because this is part of the Ambtenarenwet, the rules and regulations that dictate what a civil servant should and should not do. Look how that is written. Could you expect any normal person to know exactly what this says, with all the ins and outs? You need a lawyer for that. You can't do that. And that causes a lot of mistakes as well.

And then this is a favorite of mine: it has been decided. Lijdende vorm, that's the Dutch term, and I don't really know the English translation because I'm not a grammar student. Passive tense? Thanks. Passive tense. So you say: well, it has been decided. So what has been decided? By whom? When? What were the parameters? What were you guys talking about anyway? What meeting was that? Who decided? Was he even competent to do that? Was he in charge?
Anyway, no: it has been decided, we talked about it. So be careful when you see this in a memo. Always try to get things very clear. My colleagues hate me for it, because I always return their memos: sorry, I don't understand. Just be clear and short. If you can't put it on the back of a coaster, I won't read it. They hate me for that. This is true.

Then there's critical thinking. This is a problem we have a lot these days. But I saw it on the internet, so it must be true. I saw it a lot of times on the internet. Oh, that's the consistency heuristic, right? But if you're in a certain bubble and things get filtered out, you don't know what gets left out of your bubble. That's a very dangerous thing. Anyone can learn to be a critical thinker, but no one seems to be one, because it's inconvenient. You need your system two, the slow system, for that, and that takes a lot of energy, which we don't always have; not the time, not the energy. So that's a problem. There is a drug to help you, so you could take this if any of this comes your way, and it is a grain of salt. So think about that when you hear some waffle and it comes over you like a thunderstorm. Take this grain of salt. Take a step back. Go into your system two. And if you're too fatigued or pressed for time or whatever, just don't do it. Say: look, I have to look at this more closely, I'll get back to you. Don't let them suck you into it.

News silos are very dangerous. This is where the consistency heuristic lives. I told you about the Facebooks and Twitters and Instagrams. They want you to stay on their page, so they will always tell you what you want to hear, not what you should hear. And they tell all your friends too, and they select friends for you to see in that same bubble. That's very dangerous. We tried this with several accounts. We made new accounts, and we posted just one remark on Twitter from each account. One was for Trump and the other was against Trump.
And we left the accounts alone. After a while, we looked at their separate timelines. That was horrifying: the differences that you see, and the 'truth', in quotation marks, that they try to tell you. The one that was quite positive about Trump got a lot of: this is true, and this is true, and Trump is a great guy. You would almost believe it. And the other one was exactly the opposite. They can't both be true, right? This guy says it very well; if you want to take a picture of the screen, this is the one to take. He says: in these bubbles, you don't get to decide what gets in or out, and you don't know what gets left out. That is a very, very strong remark. Always be aware of this. Thanks.

So you have an opinion, right? That's great. These guys, if you know Dutch politics, are the most left-wing guy and the most right-wing guy, and they try to make you believe that there is such a thing as left and right, which there obviously is not. I mean, you can be all for the climate, we have to save the climate, and I think so too. But that doesn't mean you have to be against all the other things we also need, and only save the climate and only give money to poor people and be very, very, very left. That's ridiculous. You have to keep thinking about each separate problem and try to get to the bottom of it. A lot of people don't get that. But I've looked into this, and Daniel Ofman has made a diagram, and I thought I could use that to illustrate what I mean. So you take a core quality. Let's say we have mobility: you want to get from here to there, and maybe you work a long way from home, so you need a car, right? But a car comes with problems. If it's too much, we have pollution, we have traffic jams, we have all these things. So we need to mitigate. We need to find some ways, and that could be anything.
We could make bigger or better roads, or more roads, or fewer cars, or try to spread traffic over a period of time; there are a lot of things we could talk about to mitigate the problems that arise from mobility. But if you go too far the other way, you get to be Amsterdam: ban all combustion engines. Amsterdam sucks, by the way. If you have a car, don't go there. So that's the allergy. Don't go there. And what we see is: when you have a debate about things, best do it in the green zone. This is what we actually want: we recognize the problems, we have a challenge, let's talk about this. What you see in debate, in politics, is guys going nuts about the extremes, because it sounds very strong, right? Ban all combustion engines. Yeah, good for you. But it's not the discussion you should want to see. It should be in the green zone.

And then there is, of course, the Dunning-Kruger effect. Heard about it? I love the Dunning-Kruger effect. The theory is that the less you know about something, the more you feel like you know about it. So when I bought a computer in 1983, I learned some BASIC, and I thought I was the biggest expert in the street. And in fact, I was, because I was the only computer user in the street. But you get a lot of confidence from that, and of course you do. Then, when you grow into the profession, you think: okay, now I know a lot about it. And then you come here and you think: I don't know shit. That's probably when you are somewhere here on the slope of enlightenment, but everyone seems to start at the peak of Mount Stupid: your confidence is really high, but your knowledge is not. And I love this diagram. It's my favorite slide of the entire deck.

So then there is influence by proxy, and Centric hates me for leaving this in. They have asked me several times now to get their logo the fuck out of there. But we all know what influence by proxy is by now, right? It's Rian van Rijbroek and Gerard Sanderink.
She is whispering in his ear, isolating him from the rest of the world, because this guy is not stupid. He's a millionaire, he knows what he's doing. The only reason he believes in her is that he doesn't get any other information. He's in a bubble. So that's a problem. And influence by proxy: he is the boss of all these companies, but she runs them. It's very dangerous. This I'll leave for now.

This is why I believe in brevity. When I get a very long memo, and it's very elaborate, and there is a lot of passive tense (thanks), I just send it back: I don't get what you're saying here. It's too long to read, it's too complicated for me, I don't get it. And I'll just play stupid: explain it to me like I'm five. And you get a lot of discussion about that: okay, but you're not five years old, you went to university. Yes, I know, but explain it like I'm five, because it prevents you from lying and from hiding things. And it's actually very hard to get to the point. That's why this phrase is very nice: I'm sorry I had to write such a long letter, but I didn't have time to write a short one. Because you have to think about that. Get to the point.

Okay, phishing I'll leave out, because everyone here knows about phishing, right? This is a very nice tweet; these are some random shots that I collected here and there. The professor fooling you: he is a security philosopher. Actually, it's me. And he says: well, criminals have a new way to make you pay a lot of money for anything, and for only $3,500, I'll tell you all about it. Of course, this is funny, right? But not for the reason you think. It's not because I might be the criminal and get the $3,500 just by saying: okay, I'll tell you the secret. That's not what it is. Because in the boardrooms, we get swindled a lot. And if you get a consultant for $3,500 and he says: look, that 1.6 million is not well spent, I wouldn't do that, then that is money well spent, right? They save you from a lot of trouble.
So nothing is really what it seems. This seems to be a bad idea, but it's not, not in every case. So just give me the $3,500 and it's good. Here are some countermeasures; most of them I already touched on. Ask for clarity and details. And just tell them it's too long. Too long, didn't read. It's too complicated, make it simple. Make simple statements: I want to know what you're actually saying. Go to your system two to think about it. Don't let yourself get pressed for time.

Your brain consists of three parts. (Ten minutes. Thanks.) We have the frontal lobe, the prefrontal cortex. This is where your thinking happens; this is what most of us do here. Then there's the middle part, which is mostly medical: there are all kinds of substances there, hormones, drugs, all kinds of stuff going on. And then there is the reptilian brain, and this is the link to the Jurassic Park dinosaur at the front, which is actually the boss of them all. This is where fear, flight, fight, all those responses come from. So if you tap into this part, all the other brain parts will be silent, or at least distorted enough not to function very well. If you tap into the emotional bit, hormones, that's why men do stupid things. You see a nice lady and think: oh, that's nice. And this part stops working. That's why, ladies, that's why we do that. So if you want them to piss off, tap into the other one, because the hierarchy runs from here to here. This is the boss. Scare them. Propose; that's very scary.

So, call in external expertise; that's a very good countermeasure. And discuss scope creep. We have seen that in several projects, also in Amsterdam: the new subway system, way over time, way over budget, but it couldn't go back. That's a problem. Discuss it, and discuss the exit terms, because you can't build half a subway system and say: okay, we'll just stop, get some landfill, close it up. Shame. So you have to discuss those things.
Our World in Data, I just want to mention it. It's a very good resource for statistics. If you want to know whether something is actually true, say, poverty is rising: well, these people have kept records of poverty over the last 20 years. So go look. Is it actually true? Our World in Data is very good. Yes. So reclaim your brain: get rid of bad apps from your phone. I told you the talk is actually bigger, so I tell people to get rid of WhatsApp and things like that. Browser extensions, that's also in the longer version. And ask questions. This is very important. Try to be the boss of the conversation and try to pinpoint all the heuristics they're appealing to. And you know, good marketeers, they know all this stuff already. It's up to you to also know it. Task segregation, we also know. Communicate, this is important as well. If I social engineer someone, let's say I want to get into this complex here and I don't have a ticket, and I try it one way and make a lot of fuss and they send me away, I'll try the other entrance. And if they don't know about me there, it might work. But if those guys get on the radio and say, look, there's a weird guy here trying to get in, everyone be on the lookout, then my chances are very slim after that. So these are some sources. HersenHack, a Dutch book by Margriet Sitskoorn, a neuropsychologist, very, very good book. Read it. Factfulness by Hans Rosling: he's no longer alive, but he left us a great book. And Richard Nisbett with Mindware, also very highly recommended. And listen to the podcast, because it's fun. No, you don't have to. Right. Cyber, thank you. And if there are any questions, I'll be happy to answer. Great, if you have a question, please line up at the microphone in the middle. And first we'll check with Signal. So we, okay, none from the internet right now. Why don't you go ahead and start with the first question? First, thank you for the talk. It was very interesting and I learned a lot.
I was wondering about people who are in extreme bubbles but who know the theory about bubbles. They know how you get served the information you want to hear, but they think they're the only ones with the truth and that everyone else is the one being lied to in a bubble. Like, for example, the flat earth people, things like that, the most ridiculous claims, but still, they know all the theories being discussed, yet they can't seem to apply them to themselves. They only use them to discredit everyone else. Can you tell us if you have some tips? Yeah, of course, these are conspiracy theories, and this works exactly that way. There are bubbles, and people get served all the stuff they want to hear. And sometimes some little piece of proof seeps in, like: we have a picture of the Earth, right? It's not flat. But they just deny that. See, you're all out to get us. I see this the entire day, everybody is telling me so, so you must be wrong. That's what's happening. And if your bubble gets large enough, then, you know, these neural pathways, the pathways you are creating by constantly seeing the same thing over and over again, these pathways are very hard to break. So this is what happens. What's most curious to me is: they know that bubbles exist. They know that you can be in a bubble. So why can't they mirror that onto their own situation? Why? Because if I hear a new theory like this and I didn't know about it, I'd think, oh, maybe this applies to me. But how can you know that it exists and then not apply it to yourself? That's what I'm wondering about. That's called confirmation bias. You are naturally inclined to always look at what supports your belief. And things that don't support your belief don't feel good, because nobody likes to be wrong.
So I have some pieces of information, and because of the bubble, I have a lot on one side and only a little on the other, and that little bit just doesn't feel good. So I'll try to find some reasons to deny it or ignore it or whatever. That's confirmation bias: you're always looking for confirmation of what you already think is the truth. That's a natural tendency. It's one of the heuristics. And we just have time for one last question, if it's somewhat brief and concise, like he was advising. I'll try. Thank you for the talk. Can you talk a bit about how you got started with social engineering? Excuse me? Can you talk a bit about how you started with social engineering? Oh, that's a very long story. If you like, I'll be over there somewhere, or we'll find a nice spot, drink some water (do drink water), so just find me here and we can talk some more if you like. And given his expertise, you should engage him to see what it's like to be social with a social engineer. And with that, let's thank him again for an amazing time. That's horrifying, I can tell you.