My name is Kathy Ullman. I'm with the University at Buffalo. And I've got some stuff to share with you that I hope you will find useful and interesting. Forgive me, I normally would be walking around and trying to engage all of you a little more, but I've been told I'm basically bound to my technology here by the microphone. Do not feel like you have to wait till the end to ask a question; I'm very happy to answer questions in between. And with that, we'll go ahead and get started. So here's what we're going to talk about today. I'm going to do an introduction, and then we're going to talk about fear and loathing, the cost of fear, the history of fear, the consequences of fear, changing the paradigm, moving forward, and some final thoughts. So this is a little about me. I love sloths. I'm advertising as much right on my badge. This is actually a sloth that I adopted through the zoo in my area in Buffalo. Her name was Minnie. She had a gimpy claw, and she was very sweet. She lived to be about 32. So I always start with me as a sloth, because I like sloths, and I can relate to them. But this is really a lot more about me, and my Twitter handle. I've been at the University at Buffalo almost 20 years now. I started in a support role and moved into security about 10 years ago. I'm on staff with BSides Rochester; I help run that. I'm also involved with our UB GenCyber camp, and I'm involved in a whole bunch of other volunteer activities, even here at DEF CON. I've been a speaker at a couple other places. I've got a bunch of certifications, and I've got four degrees, most of which have nothing to do with any of this. So let's start with some fear and loathing, in Las Vegas, because that's where we are. So let's take a little step backwards, shall we, to 1989. Notice, we have some fear language here: your system could be infected right now. Hmm, interesting. Fast forward to 2007. See anything similar? Yep: find out what's lurking inside your system.
Same kind of fear language. Here's what we've thought fear could do for us: we could motivate a positive change in behavior and bolster that change in behavior in some way, right? We think we can get people to do what we want them to do if we scare the crap out of them. But there's a cost to that fear. If we're spending more, we must be more secure. I mean, look at this. In 2017, we spent over $86 billion on security products. In 2018, we spent $114 billion. That's up a significant chunk, about a third, right? So we must be getting more secure. Check it out. That fear thing, it's not working. So I don't know how many of you are familiar with Chris Roberts and any of his work? At least I see a few nods, a few folks who seem to know who he is. Okay, so Chris gets on this rant about blinky boxes and how we need to stop buying all the blinky boxes. And for those of you who aren't familiar, the idea is that companies buy blinky boxes, and a blinky box is just whatever technology is using the latest and greatest, you know, throwaway terminology. So whether it's AI or ML or next-gen, it's supposed to be the easy button of technology: buy this thing and you'll no longer have any problems. It will be wonderful. The problem is most companies don't bother to do the basics. So at the end of the day, you have a blinky box that is pretty and blinks a lot, but it doesn't really do anything useful for you. Here's what fear actually does. It makes us buy blinky boxes. It causes us to be very defensive. The problem with fear is that it's a very difficult thing to manage. If you're not using enough fear, you wind up with an attitude of complacency; people just don't care. If you give them too much fear, if you make them so terrified they don't know what else to do, now they do nothing. They're just paralyzed. They're controlled by their own fear. It can create this negative association, and it can create an overreaction to what's going on. So how did we get here?
How did we get to the land of blinky boxes? This is really what precipitated this particular talk: I thought, you know, here we are in 2019 and we're still buying blinky boxes. It's still not working. How did we get to this point? So I'm gonna talk a little bit about that. We've always used fear by design, not just in technology but everywhere. You'll notice the one image has to do with concentration camps. We were afraid. The government used fear to control people. It's what they do. So on the left, or at least on my left, the concentration camps (or not concentration camps, I apologize, the internment camps of the Japanese) made us fearful; we were afraid of the Japanese in World War II. And then the right image is fear of the Russians. The Russians are coming, uh oh, and there was all this propaganda that came out. So governments have been doing this for years, and it sort of makes sense that technology should follow. But in the very early days of computing, security as we think of it today didn't exist in the same way. We didn't have this terminology. There wasn't information security or cybersecurity. There was security, generally speaking, and it came more in the form of government risk assessment, policy, and controls, and there were these rainbow-colored books. The one that most people are familiar with is the Orange Book, and the Orange Book was this Department of Defense document that basically was this compliance thing that they were supposed to follow. So this was the very early days of any kind of computer-type security, and there's been security around longer than that, but this is where we're gonna start. So fast forward a bit, and we get the beginnings of the internet. For those of you who may not be aware, and I'm guessing many of you are aware, the DOD starts with DARPA's project, we wind up with something called ARPANET, which leads to something called BITNET.
Now with ARPANET, we had four universities that were connected, with the idea that they could back up information and not be a single point of failure for research. That was really the point of ARPANET. But ultimately, we wound up with several different ways of doing it. One of them was BITNET, which involved Yale and some of the State University of New York schools, including ultimately the University at Buffalo and the State University of New York at Fredonia, in the little tiny town in which I grew up, and where I first started playing around with computing. I was a little, little kid, and I can remember, in the early days of BITNET: how many of you remember, because I can't imagine you haven't, the dial-up noise every time you dialed up? A whole bunch of hands, right? You had all this noise, and many of you probably hacked the whole thing so that you wouldn't have charges for your phone lines, right? Well, I grew up in this bizarre universe where we didn't dial anything up, because we had a direct copper line from my house to the college. The only time I heard that noise growing up was when my father was testing a coupler. So for me, it was a weird place to be when I was like five, six, seven. I was playing Wumpus, and I was playing games like Dungeon and Adventure. But this all started with my playing around with BITNET. And not too long after that, we wind up in this personal computer boom, and companies are starting to use the internet. So then we start to have some early security developments. The first thing that people become afraid of is viruses. You have this Brain virus, which some of you may be familiar with, which was the catalyst for John McAfee to go ahead and try to help people using a BBS. He essentially provided an antidote, if you will, for this virus. What this particular thing actually was, again, for those of you who don't know: it was meant to be a copy-protection tool.
But a whole bunch of people complained bitterly to the company that created it. It was wiping their files. This is very not cool. So McAfee figured out how to do this antidote, and not too long after that, a year later, he starts his company. Not too long after that, within a year or two, you have Symantec and Sophos debut. But here's the kicker: by '89, there were more antivirus vendors than there were actual viruses. But fear is still perpetuating everything. In fact, the government is so afraid that in an article for a Toronto newspaper, government employees were quoted as saying that computer viruses were potentially devastating weapons and the high-technology equivalent of germ warfare. No fear language there at all, right? Like, none at all. Now, they were right to be worried, because in 1988, the Morris worm hits. And while this doesn't seem like a big deal today, a tenth of the computers on the internet get shut down by this. And in response, the government says, uh-oh, we need to stand up something to take a look at these viruses and start doing research. So you get the first CERT started at Carnegie Mellon to do this research; that's when it gets established. Again, all reactions to fear. Once we start branching out and people's home machines are all on the internet, now there's not only this fear of viruses, but there's this fear of something external attacking them internally. The thing is that malware removal was extremely expensive at that point. So you can see $1.5 billion worldwide, which right now doesn't seem like a lot. But think about it: back then, to a company that maybe only has a few machines, this is a lot of money even for their own entity. So by comparison, the idea of spending five, ten bucks per machine, or hiring one or two people to be in charge of, you know, virus removal and cleaning up machines, this was nothing.
So they were totally willing to spend that kind of money and go ahead and, you know, hire these people. Let's fast forward a little bit further, and this is where security winds up. Because we're now afraid of the outside entity, we start to have these books appear about internet security. So this is when we start really seeing this terminology come into play. Firewalls and internet security: you'll see these are two of the early books that were referenced on a regular basis, '91 and '94 respectively. And then the whole world changed, because with Mosaic's initial release in 1993, now everyone's on the internet and everybody's surfing. Not that there's a ton of content yet, but everybody's surfing. You know, because virus protection is just a matter of trust, according to John McAfee, we shouldn't be afraid at all. So what about those blinky boxes? Not this kind of blinky box, this kind of blinky box. So where did they come in? Well, the 1990s is when network firewalls become commercially available. The first one is the DEC SEAL, created by Marcus Ranum. And basically, it's a gateway: door's open, door's closed, very basic, and it was considered virtually fail-safe protection. Okay, now we have firewalls. We're good, right? No problems at all. And then Mitnick basically hacks the San Diego Supercomputer Center through a sequence prediction attack using a spoofed address, and now all bets are off. So we've gone from, okay, we can afford virus protection, that's cool; then, we're gonna block everything at the border, okay, that's cool. Then, all of a sudden, even having all those tools in place, what are we gonna do? So today, we've got the money for all the things, right? We're buying all these blinky boxes. How's it working? Gee, check this out. I don't know, consumer PII records exposed, up 126% compared to the previous year? I don't think it's working so well. We keep doing the same thing. No. So ultimately, we're in a mess, right?
We're doing the same things. Fear has sort of created where we are. It's created the industry that we're in, and it's leaving explosions in its wake. It's caused some consequences, so let's talk about those. One of them is this notion of FUD. Who's heard of FUD before? Yeah, pretty much everybody. How many of you know that FUD is not just the term fear, uncertainty, and doubt, and actually had a specific history tied to IBM? Like maybe a couple people. So the origin of FUD is not sort of what we think of in terms of just marketing and playing to fear. It was actively going after another company, a rival company, by saying things about that company that were not true, but also not relevant. So in this particular case, IBM had an engineer working for them named Gene Amdahl, and he left IBM and created a series of computers that had the processor on the outside. It didn't need a fan, because the way the case was designed, it had plenty of airflow. So IBM pointed at them and said, don't buy that stuff. You can't buy that because it's gonna overheat. It doesn't have a fan. Not relevant. Doesn't matter. But that's the actual history of FUD. This is this marketing idea, and this is IBM, and this was a long time ago, but check this out. This slide is from an internal deck that was, by the way, on the internet, publicly available, despite the fact that if you look in the fine print here, it actually says not for distribution. This is from 2008. We're still doing the same thing. They're still doing the same thing, and this is FUD about the company EMC. Same idea: I'm not surprised they told you they can't do this, because otherwise they can't sell to you. I mean, it's the same tactics; it hasn't changed. Then there's the image problem that I think a lot of us run into. There are a couple of quotes you'll see in here from some folks at Duo, who I respect tremendously. Please understand that I am not for or against Duo, but I think some of what their folks say has a lot of merit.
And in this particular case, the idea is that mainstream media has taken this sort of fictional idea of what security and hacking is, and, you know, they have created what our battleground looks like. Certainly most of us, if not all of us, have seen Hackers, and we know what that looks like, but people are influenced heavily by what they see in the media. Anybody seen marketing that looks like this? Anybody actually hack like this? I mean, seriously? I can't even imagine how you would do that. Yeah, casual Friday, right, exactly. Yeah, you might have the gas mask, right, exactly. So this is important. I try to educate as many people as I can about this idea that hacking is really about understanding how something works, as opposed to how it's supposed to work, how the company says it ought to work. I mean, this is really what hacking is, right? It doesn't matter if it's computing or something else. Hacks happen in like every field under the sun. So why is it that the media has chosen to focus on this? Well, because that fictional character has been created. So what's the result of that? Well, we also wind up with some fear in the dev world. There's shame and embarrassment. If you create code that ultimately leads to something like a zero-day, heaven help you, and are you still gonna have a job? I mean, that's terrifying. And given the fact that you're usually expected to create large amounts of code in a very short time, you don't necessarily have a lot of time to do QA afterwards. And security's not foundational in CS programs at all. If you get to take any sort of, like, security stuff, it's always after the fact. You're gonna take your CS 115, 116 classes. They might be like intro to programming in some language. They might mention security in passing, but it's not foundational in any way.
So ultimately, what happens is, as a developer, you're gonna do a whole bunch of development, and then you're gonna have to spend extra time going and doing code review, which is time-consuming and problematic if your employer's pushing to get things out the door. So anybody see a pattern here? I mean, here's the problem, right? Consistency is usually a good thing, except in our case, it's not. Security is a business afterthought. It's after the fact. We're buying these blinky boxes, these easy buttons, or at least companies typically are. We're working against and despite our users instead of with them, because we've been taught that that attitude is okay. And there's minimal, if any, security training for folks coming out of CS programs. And even the folks who are teaching themselves, who aren't necessarily doing traditional school, it's not built into the untraditional methods either. We're doing ourselves a disservice. And what is that definition of doing things over and over and somehow expecting different results? We thought fear was gonna do this for us, that it was going to help us. It clearly hasn't. And ultimately, it's led to the same results. We've gotta be insane. So what happens when fear leads to anger? Anger leads to hate, and hate is leading to suffering, right? I mean, people fear what they don't understand, and they can be angry about that. So this is a very interesting article about a study that was put out by Thycotic. And it's literally titled "Everyone Hates Cybersecurity Professionals." What do you think? Do you think that that's kind of a general feeling? Do most people hate security? A lot of self-loathing, too. A lot of self-loathing? Yep, absolutely. And it has a lot to do with people not really understanding what we do, why we're there, who we are. So I thought these statistics were very interesting. 66% of the people surveyed said we were doom-mongers and a necessary evil, right? 38% of the surveyed folks said, oh, we're policemen. That's our job.
That's why we're there. We are massively misunderstood. 74% are like, wow, this is bad juju; everything security does is terrible. 50% think we're just there to keep the lights on. And 67% are like, ah, we're just a cost center. We're reactive. That's the only reason we're there. We're not an asset. We don't provide anything. No value. So it's no wonder that at the end of the day, this is how infosec ultimately gets portrayed. Thank you. So what I would argue is we need to hack this paradigm. How many of you have ever seen this image before? A few of you. Okay, so for those of you who have not seen this, or even those who have and maybe don't know the history behind it: this is a very well-known example from psychology in which, if you look at this image one way, you see a rabbit, and if you look at it another way, you see a duck. What I'm proposing is a subtle shift in how we look at things and how we approach things. It's not about rewriting the past. It's not about completely rewriting everything we know, because who the hell has time for that shit, right? I mean, who has time for that? So this is kind of what I'm talking about. We need to overcome fear. There was a very interesting paper at the Symposium on Usable Privacy and Security at USENIX that talks about the things that we can do to overcome fear. We need to be honest, but still communicate risk in a direct and discerning way. Sometimes less is more, but we still need to communicate it. We need to empower people. We need to not just say no. We need to give them tools; otherwise they're gonna be paralyzed and think they can't do anything. And we need to be the advocate for everybody out there, because we're the ones who really understand what's going on. We need to actively encourage best practices. That leads me to this idea: us versus them. We can't continue an us-versus-them mentality. And when I say us versus them, I mean this in many, many ways.
I mean this as us in IT versus non-IT. Us in the security, hacking, infosec world, whatever you refer to yourself as, versus everybody else. It's not us versus them. We're all in this together, and we need to let them know we're all in this together. Which requires patience, even if you tear your hair out and go screaming into the next room when you're done having those conversations. You need to provide honest responses and follow through. When I was in support, one of the reasons why I think I was fairly successful was that when I told somebody I would let them know what happened or explain something to them, I followed through. And it garnered trust, and trust is huge. So we need to help illuminate things for these folks who are in the dark about this stuff. It's hard. We need to provide communication at a level folks can understand and relate to. And what that ultimately means is we need to frame the conversation for whoever we're communicating with. That does not mean the same message to every single person at every level in an organization; that will never, ever work. You always will have your CISOs, and the best way to talk to them is three minutes in an elevator. Be thinking ahead. You better know what they're about. You better know what's important to them, because that's basically all you're gonna get. When you're talking to an end user, you're not gonna give them this big long explanation. You're gonna give them essentially what they need to know so that they feel empowered, so they don't feel like, this thing is happening to me and I don't know what to do about it. And when I say that, I mean anybody, not just within a company. This is what we don't want to do. This is incredibly ineffective. So what we're gonna do in terms of shifting that paradigm is we're gonna replace fear with a healthy skepticism. So instead of that paralyzing, oh-my-God-I-can't-do-anything fear, we want folks to learn to be skeptical about what they're seeing.
Healthy skepticism is about questioning, not just accepting things at face value, but it's also not about being paralyzed and not knowing what to do. So in some cases, you're gonna play devil's advocate and ask those questions. You're gonna tell them, hey, so you get this email, and it says (it's my favorite, of course) that you've been left $4 million by somebody from another country. We laugh at that, but there are still people who read that and think, my saving grace, this prince in Saudi Arabia just sent me a million dollars. If we look at people who see that and think it's legit, and we turn our noses up at them because we know better, how is that helpful? I mean, it's tempting, and I get that. Healthy skepticism is the basis of all accurate observation. And what Arthur Conan Doyle talks about here is this idea of the difference between observing and merely seeing things, okay? How many of you have been up and down one of the escalators in any of these hotels since you got here? Pretty much everybody, right? You saw the escalators; you rode the escalators. How many of you could tell me how many steps are in any one of those escalators? Probably nobody, because you saw, but you didn't observe. And observing is taking in those extra little details and paying attention to the things that allow us to do critical thinking. And this is what Conan Doyle is getting at; he actually has a whole bit about this in one of his Sherlock Holmes stories. So outside of infosec, we're gonna encourage people to question material on websites. We're gonna encourage them to question the legitimacy of email. It's probably gonna create a little more traffic for you in terms of having to respond, but in the long run, it's worth it. People need to have some awareness of what these online risks are, and we can't just be training folks once a year. And that doesn't mean we have to get them in front of us every three months.
It means all kinds of different things, everything from a lunch-and-learn, to online training, to maybe sending out an article that you find that's kind of interesting, to literally running into people in an elevator and saying, hey, did you see that crazy thing that came in email that everybody got? I was able to provide some awareness in a place where I didn't think it would ever happen: Facebook, of all places. So this is gonna sound kind of crazy, but a woman who works at our institution posted something on Facebook, and she says, I have this email, it had an attachment, and not only did I have to log into my email, but I ultimately had to put a password in to open this attachment. It's so redundant. She actually put this on Facebook, and she's complaining bitterly about it. Now, where most of us tend to go is, oh, good grief, she just doesn't understand, move along. Instead I thought, here's a learning opportunity, and I said to her on Facebook, tell me more; why do you think this is redundant? And she explained that having to both log in and put in this password for the particular document seemed redundant to her. So then I gave her the, like, two-minute version of the history of email, and how email was never meant to be a secure mechanism of communication, and that logging into her email is, in and of itself, the mechanism for authenticating to her email, but it has nothing to do with that attachment, which was sent through email in an insecure fashion, which means anybody could see it in transit. And she was like, oh, illumination. Does it mean that it's, you know, any less of a hassle for her in the end? No, but she's less frustrated, because she's not questioning, why the hell do I have to do this? From inside our own neck of the woods, we should be speaking out against these fear-based marketing materials.
We should be skeptical of products that are ultimately still using fear, because fear isn't effective. We should be speaking out against this image of hackers as bad people, because we're not bad people. We just understand how stuff works in a way that most people don't. And we really need to stop with all the scary lingo. I mean, you know, okay, advanced persistent threat: yeah, we all kind of know what that is, that a bad actor got in and they're still in, and that's fine. But all this terminology does is scare people. It doesn't make it better. All the lingo in the universe doesn't fix the problem. So what we have to do instead is this idea of nuanced learning. Again, it's just a very slight paradigm shift. So when you do a phishing campaign, you're not going to catch users. And this is tough, because I'll tell you, most of us, this is how we think about it. When you're self-phishing, or you're pen testing, and you're thinking about, you know, going out and doing a phishing campaign, you're thinking, I'm gonna catch people clicking on stuff so I can educate them and teach them not to click on stuff. The minute you think that way, you're already going in the wrong direction. You wanna think of yourselves as a partner to them, and you wanna do things like this: let's say 60% of your users clicked on something and 40% of your users didn't. Instead of giving people an email that says, okay, 60% of you clicked on stuff, that's not so great, we need to be better, you need to say, 40% of you didn't click on stuff, that's awesome, how much better can we get? Because we will always, always, always do better with influence that has a positive spin than a negative spin. There's a ton of research that's been done. There's a wonderful person named Jessica Barker, Dr.
Jessica Barker, who has done research on some of these methods, and they've discovered that any sort of positive reinforcement is far more effective than negative reinforcement. So if we're not using lingo that they don't understand, and we're providing ultimately this positive messaging, we are more likely to change behavior than if we're using negative messaging. So again: focus on who didn't click the link instead of who did. And of course, from an operational sense, I won't belabor this, because I know this is ultimately stuff most of you are aware of. Where is our data? Who's using it? Where are the assets that could potentially be storing it? We need to remove the easy ways in. We should be monitoring logs and fixing all the simple things that we can. And there are lots of ways to do that, but I'll tell you, in the end, don't let the goal of perfection become the enemy of the good. Don't look at all of that and think, oh my God, if I can't do it all, I can't do any of it. It's so easy to get wrapped up in that way of thinking. We have to go back to basics with education, too. We need to see that this stuff gets integrated into CS education, both formally and informally. If you're working with or have any sort of connection to a university, talk to them about this. Tell them why this is a problem. If you're friends with or know somebody who's doing free online classes, reach out to the people you know who are offering this kind of stuff and make sure they're including it as well. It's definitely a foundational issue. Really, developers should be able to self-evaluate their designs. They should be able to communicate about security issues, and they should recognize when they need further expertise. And they come out of programs without a clue about most of this. So how do we move forward? This is kind of what we need to do at our level and in our generation. And I've been doing this stuff a long time, and I can tell you, we're here.
We've been here a long time. Are we likely to significantly change ourselves? No, probably not, 'cause effecting change is really hard. These are the things I hear all the time: oh, infosec is a disaster. We're broken. There's nothing we can do. We haven't had a eureka moment to talk about why in the world this is happening and how we can fix it, right? It's because we're trying to change people. And some of the research that Jessica has done talks about something called social proof, which speaks to this whole idea that if people don't know how to act, they assume the behavior of other people. So what happens if you tell folks, hey, 60% of you clicked on stuff, that's bad? What are you reinforcing? Bad behavior. So if we want social proof to ultimately be on our side, we need to turn that language around. It's that slight paradigm shift. We also need to worry about optimism bias. Who here has heard it said about our end users that ultimately they're the weakest link? Most of you, right? I mean, it's something we hear all the time. What are we doing by constantly telling each other that, and telling our users and our company that we need to provide phishing simulations because our users are our weakest link? What we've done is we've created a bias, right? We've basically created a situation where that's exactly what these people are now going to become, because we've told them they are our weakest link, and now they are. That's not the intent, but it's exactly what happens. So this is not a good idea. We're trying to change people, and this is particularly hard. We are creatures of habit. Think of all the times any of you have ever tried to change anything about your daily lifestyle, whether it's your eating habits or smoking or drinking, whether you want to drink more or drink less, right? Changing habits is hard. And that's really what we're trying to do: we're trying to effect change. So these quotes are from a book called Switch.
And the idea behind Switch is making change from the bottom up instead of the top down. Obviously, if you have support from your upper management and you can get top-down support, you're gonna have a much easier time getting changes made to your environment. But even if you don't have that support, you can still effect change. They're going to be small changes, and they're going to snowball. Things are going to get better, but that's not the same thing as saying change is easy. So I would suggest to you a different kind of change: one of participation. Again, we're all in this boat together. We're all in it together. Move away from the silos. If you don't know your systems people, get to know them. If you don't know your end users, get to know them. If you don't know anything about your company and what it does, get to know it. All of these things matter. And a lot of us tend to be completely oblivious to this; at least from my personal experience coming up through all of this, when I was working in the private sector (not so much in education), I was oblivious to what the goals of the company were and what the people I was supporting were trying to do throughout their day. I just wanted to fix their crap and get on with life. But that really wasn't as helpful as it could have been. So we need to see something more like a neighborhood watch, where we're helping other folks, they're helping each other, and everybody gets a partnership. You get a partnership, and you get a partnership, and you get a partnership. Yes. And we need to partner with the next generation, because, I don't know, I'm getting old, folks. I've been doing this a while, and it's hard to change us, right? But boy, that next generation: they're young, they're eager, they're already computer literate in most cases. Why are we not paying more attention to them and teaching them about our failures? Because these are our failures, folks. I don't want to belabor that; I want to focus on moving forward.
But we need to make sure that they don't perpetuate that same consistency that we've been seeing. So how do we do that? How do we partner with folks? BSides is a great opportunity. Like I said, I help run the BSides in Rochester, New York. We partner with students all the time, both as volunteers and just as folks who come and attend. And it's awesome, because they're so eager. CoderDojos, which I don't personally work with, but I know several folks who do. A variety of cyber camps: I'm involved with our GenCyber camp at the University at Buffalo, and I spoke in front of the one at the Rochester Institute of Technology; they actually had two different cyber camps. There's something called Odyssey of the Mind, which my husband was involved in when he was much younger. There's Hak4Kidz; they're awesome. So, you know, get involved with Hak4Kidz and any kind of mentoring opportunities you can find. It doesn't have to be young people, but boy, they are eager. Let's help them. So, with that, some final thoughts. You need to become a judgment-free zone, which is really hard, because we've spent a long time looking down our noses, going, ugh, how do they not understand this? Well, you know what? I worked with rocket scientists at one point in my career, and those people are brilliant. Absolutely brilliant. And half of them didn't know how to turn their computers on. That did not mean they were not brilliant. It meant their expertise was not where my expertise was. And you know what? That's okay. I learned some cool things about rocket science that I didn't know, and in exchange, I taught them stuff. I didn't judge them for what they didn't know, and they didn't judge me for what I didn't know. We collaborated. We had a partnership. So: educate folks, don't adjudicate them. And trade that knowledge. Any questions? Okay. Well, thank you all very much for coming.
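As a concrete footnote to the phishing-campaign reframing discussed in the talk (lead with who didn't click, not who did), here is one way that positively framed report could be generated. This is a minimal sketch; the function name, wording, and numbers are illustrative assumptions, not anything presented by the speaker:

```python
def phishing_report(total_users: int, clicked: int) -> str:
    """Build a positively framed phishing-simulation summary.

    Instead of leading with the click rate ("60% of you clicked,
    do better"), lead with the people who did the right thing, so
    social proof reinforces the good behavior rather than the bad.
    """
    if total_users <= 0 or not 0 <= clicked <= total_users:
        raise ValueError("need 0 <= clicked <= total_users and total_users > 0")
    # Report the non-click percentage, rounded to a whole number.
    safe_pct = round(100 * (total_users - clicked) / total_users)
    return (f"{safe_pct}% of you didn't click the link. That's awesome. "
            "How much better can we get?")
```

For example, with 1,000 users and 600 clicks, `phishing_report(1000, 600)` opens with "40% of you didn't click the link" rather than with the 60% who did.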