Okay, everybody count off? Good. Thanks. Anybody? Is there an empty seat next to you? Raise your hand. Looks pretty full. Anyone know if this place has a fire code or not? Don't start fires. If you start a fire, you have to share. Some seats over there. People with hands, people who have their hands up, have a seat next to them. So the smart money goes that way. The smart money is not in Vegas. All right. So hi there. I'm Bruce Schneier. Are there any questions? That was the easiest talk I ever did. If there are any questions, I'm happy to take them. Yes. There are actually mics here. There's one there, and one I can't see behind the camera guy there. We'll find out, right? Yes, they work. That one works. First off, Bruce, thanks for coming out. Again, always a pleasure to see you here. The question is, it's an old question, and I'm wondering if maybe you have any new insight into an answer on this. With cryptography becoming more a part of the collective consciousness, especially with people who are less technically savvy, there has been an argument for a long time trying to explain to people that encryption is not security. It's very common for people who are not technically savvy to say, oh, we'll just encrypt the shit and then we're secure, which obviously is total bullshit. Do you have any insight on how to better explain to those people why that's fundamentally flawed? It's interesting. I think you're right that a lot of people think of crypto as a panacea, when in fact it is just a tool, and a very powerful tool for a bunch of reasons, but it doesn't automatically make security. Any data has to be used. One of the things that the Snowden documents have really brought forward, which I think is a good thing we're talking about, is metadata: data that has to be unencrypted for the system to operate. This cell phone is a great surveillance device, because the metadata, where this phone is, has to be in the clear. Otherwise it can't ring.
Thinking about that, I should turn the ringer off. So there are a lot of things encryption can't do. Encryption can protect data at rest, but if you are, I'm going to make this up, Target Corporation, and you have a database of credit card numbers that you're using, it can't really be encrypted, or at least if it's encrypted, the key has to be there. I talk about encryption as a tool, not as security, just like your door lock is a really important tool, but it doesn't magically make your house secure. What encryption does, and I think this is really important in the context of NSA surveillance, is force the listeners to target. What we know about the NSA is that they might have a bigger budget than anyone else on the planet, but they're not made of magic. They are limited by the same laws of physics and math and economics as everybody else. And if data is unencrypted, if they can tap a transatlantic internet cable, they can get everything. But if stuff is encrypted, they have to target who they're listening to. I mean, if the NSA wants into your computer, they are in, period, right, done. And if they're not, one of two things is true. One, it's illegal, and they're following the law. Two, you're not high enough on the priorities list. So what encryption does is force them to go through their priorities list. They can hack into your computer, no problem, but they can't hack into everybody's computer. So encryption is just a tool, but it's actually a really powerful tool, because it denies a lot of the bulk access and forces the listeners to do targeted access. And there's a lot of security benefit in that. All right. Are you first in line? Yes, I believe I am. Okay, I just wanted to see where that mic is. I wanted to get your opinion on the back door that Obama wants. Which one does he want? You know, so... Okay, true. So I'm not sure Obama personally has an opinion here.
Why the bad guys are the only ones that are going to be able to... I'm not sure Obama personally has an opinion here. It's interesting. This is the same back door that the FBI has been wanting since the mid-'90s. In the mid-'90s, we called it the crypto war. Now we call that the first crypto war. If there's a number three, I'm done. It's you guys. I only do two crypto wars per lifetime. So it's interesting. You're referring to PGP? No, in general. FBI Director Comey gave a really interesting talk, really a Q&A, at the Aspen Security Forum. Actually, I recommend listening to these talks. These are very high-level, mostly government discussions about cybersecurity and national security. Really interesting stuff. He was interviewed, I think, by Wolf Blitzer, who actually asked a great question, saying, what do you say? This is kind of personal, but why don't you like the term lone wolf terrorist? That was kind of funny. Anyway, he was talking about the going dark problem and the need for back doors. And this is the scenario he is worried about. And he's very explicit. It is an ISIS scenario. ISIS is a new kind of adversary in the government's eyes because of the way it uses social media. Unlike Al Qaeda, which was like your normal terrorist organization, which would recruit terrorists to go to Afghanistan, get trained, and come back, ISIS does it with Twitter. And this freaks the government out. So this story, and they swear up and down this happens, is that ISIS is really good at social media, at Twitter and YouTube and various other websites. They get people to talk to them who are in the U.S., like you guys, except a little less socially adept and maybe kind of a little crazier. They find these marginal people and they talk to them. And the FBI can monitor this and go, FBI, rah rah. But then they say, go use this secure app. And then this radicalized American does. They talk more securely. And the FBI can't listen. And then, you know, dot dot dot, explosion.
So this is the scenario that the FBI is worried about. Very explicitly. And they've used this story again and again. And they say this is real. This is happening. Okay. Now, it's sort of interesting. If this is true, I mean, let's take it as read that it's true. Oh, the other phrase they use is actually a new phrase I recommend. They talk about the time from flash to bang. Flash is when they find the guy; bang is, you know, when the explosion happens. And that time is decreasing. So the FBI has to be able to monitor. So they are pissed off that things like iMessage and other apps cannot be monitored even if they get a warrant. And this really bugs them. I have a warrant, dammit. Why can't I listen? I can get the metadata, but I can't listen. So if you think about that as a scenario, and assume that it's true, it is not a scenario that any kind of mandatory back door solves. Because the problem isn't that the main security apps are encrypted. The problem is that there exists one security app that is encrypted. Because the ISIS handler can say, you know, go download Signal, go download Mujahideen Secrets, go download this random file encryption app I've just uploaded to GitHub 10 minutes ago. So the problem is not what he thinks it is. The problem is general purpose computers. The problem is an international market in software. So I think the back door is a really bad idea for a whole bunch of reasons. I've written papers about this. But what I've come to realize in the past few weeks is it's not going to solve the problem the FBI claims it has. And I think we need to start talking about that. Because otherwise we're going to get some really bad policy. So, the question there. Good morning. This will probably go less in the direction of, for instance, crypto. My question is somewhat twofold.
I'm going to focus more on the first one. In the course of day-to-day interactions, both with security people and with less security-minded folks, I've come to the conclusion that operational security is very difficult to instill. From your experience, is there an easier approach to getting the understanding of OPSEC through to lay people? So I think OPSEC is pretty much impossible. I mean, even General Petraeus got screwed up with his OPSEC, dealing with his mistress. And if the director of the CIA can't get OPSEC right, we're all done. We see that in the hackers that hacked Sony. We see people screwing up OPSEC again and again. I'm not sure there is a solution. Because good OPSEC, I mean, good OPSEC is really and truly annoying. Right? Good OPSEC, even mediocre OPSEC, means leaving your cell phone at home, and who's going to do that? It means not using email for certain things. I've come to the belief that we're not going to be able to train people in good OPSEC. That the best security is going to be pretty good security that's ubiquitous. And we saw some of this in some more of the Snowden documents. A lot of people read the really recent article that came out of Germany on XKeyscore. An OK article, actually, but a really great dump of documents on how XKeyscore works. This is one of the NSA's very flexible databases for monitoring the internet. And you can read their documents, and they talk about how they can find people using encryption and roll up, basically roll up, the networks. They can't read the traffic, but they know who's talking to whom. Remember metadata: encryption doesn't solve everything. And I'm reading this, and it's clear you can do this with PGP as well. So if you want to find out who's using encryption, it's easy if you monitor enough of the internet.
And what that tells me is that someone would be better off not using the great encryption program they wrote, or the really powerful one they've just downloaded, but the average one that everyone else is using. That you actually are better off using an iPhone with iMessage. Even though, you know, I'm pretty sure the FBI can get at it individually, you can hide using it, because we're all using it. You don't stand out. So I think there is a lot of power in that. You had a second part. Make it quick. You actually basically answered it. Excellent. It was about making the use of OPSEC less obvious. Right. I think you make OPSEC invisible. Yeah. Good security works. I mean, think of SSL. Good security works if people don't even know it's there. The encryption from the handset to the base station: great if it was better, but it's working because nobody knows it's there. So, in thinking maybe not as thoroughly or deeply as I should about, like, cyber terrorist threats and bad actors that want to do corporations or, you know, infrastructure harm, these sorts of things, or just the public, right? It seems like all the ingredients are there for people to do really bad things. And there are a lot of holes and security flaws. What keeps there from being enough, you know, motivated bad actors? What keeps them at bay? I think fundamentally people are good, right? Society works because most of us are honest. I mean, you kind of laugh, but none of you have jumped up and attacked the person sitting next to you, right? You laugh. But if this was a room full of chimpanzees, that would have happened. We are the only species that can get away with this: a room full of, what, a lot of strangers sitting quietly listening to me. So, I mean, this sounds weird, but I think a lot of what keeps the really bad things from happening is that most people don't want to do really bad things. If that wasn't true, society wouldn't work.
So I think you're right that all the pieces, a lot of pieces, are there. There are a couple of things. Terrorism is harder than you think. I mean, yes, technically it can be easy, but the whole operation is actually harder, which is why you don't see a lot of terrorist attacks. And what you do see tend to be these lone wolves that wake up one morning and say, I'm going to do something bad, where, you know, there's no conspiracy to detect. There are no mistakes they can make over the course of the planning. That flash to bang time is so short. So I really do think that's why. Something interesting I ought to mention, as long as we're on a close topic: there's a new tactic we're seeing more of. We've seen it for a few years. Now we're seeing it, I think, a lot, and we're going to see a lot more of it. This notion of institutional doxing. You can go into a company, take all of their stuff, and publish it. It's freaking people out, right? This is what happened to Sony. This is what happened to Hacking Team. The guy who did that is in this room, thank you very much. It's what might have happened to Ashley Madison, which is a little more awkward for some people. Right? If you remember, a few years ago it was HBGary Federal. And I think this is a really interesting tactic, because it empowers individuals against very powerful organizations. And it is the first, I think the first, real counter-argument I've heard to the increasing practice of hiring a sociopathic CEO. That indeed, if you are worried about everything your CEO says becoming public in three to five years, you might not want to hire a jerk. I expect to see more of this. You know, I know people are noticing that WikiLeaks is publishing Saudi diplomatic cables. Man, that is a corrupt country. But this is, again, someone hacked in and is just dumping all this stuff. So that's an interesting example of a new bad thing that's being enabled by technology, that we're seeing more and more of.
But in general, I do worry about the bad things happening, but I think it's less common than we think, because most people don't do them. It's the systemic stuff that bothers me. You know, the Internet of Things, and being able to hack cars and planes and heart monitors and, you know, other stuff. And the interconnection of those. I mean, I think we're going to see more unanticipated vulnerabilities. Because remember, complexity is the worst enemy of security. And it's not just any complexity; it is nonlinear, tightly coupled complexity. And that's really what the net gives us. So we've got to be real careful there. Yes? I got around to reading Practical Cryptography years ago. I had occasion to go back and look at it recently, at the hash section. I noticed at that time you guys had assessed that our ability to analyze hash functions was a good 10 to 20 years behind our ability to analyze the other primitives. And so I was wondering if you think that gap has closed in the last decade? I think we're much better at understanding hash functions now. We're still implementing bad ones, but that's more legacy. But I do think we are. It's very hard, you know, mathematically it is hard, because your assumptions are squirrely. And I'm not going to bore everybody with it. I think I revised that in the revised edition of that book, which is Cryptography Engineering. But I do think we understand encryption primitives better than hash function primitives. Even though, interestingly, you can make one from the other. Which is why, if people remember, when there was the hash function contest that NIST ran, what, five years ago, I built a hash function on top of a symmetric algorithm. Because I felt I understood that better than doing a hash function natively like SHA was. I'm not even convinced the NSA understood hash functions very well.
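Since building a hash function on top of a block cipher is easy to miss in passing, here is a minimal sketch of the classic Davies-Meyer construction, which turns a block cipher into a hash compression function. The block cipher below is a made-up toy permutation, not a real cipher and not how Skein actually works; only the construction itself, the chaining value fed in as plaintext and the message block fed in as the key, is the point.

```python
BLOCK = 16  # work on 128-bit (16-byte) blocks

def toy_block_cipher(key: bytes, plaintext: bytes) -> bytes:
    """Stand-in keyed permutation. NOT cryptographically secure; it only
    has the right shape (key + plaintext block -> ciphertext block)."""
    state = bytearray(plaintext)
    for r in range(4):                     # a few simple mixing rounds
        for i in range(BLOCK):
            state[i] = (state[i] + key[(i + r) % BLOCK]) % 256
            state[i] ^= state[(i + 1) % BLOCK]
    return bytes(state)

def davies_meyer_hash(message: bytes) -> bytes:
    """Davies-Meyer: H_i = E(m_i, H_{i-1}) XOR H_{i-1}, with simple
    one-bit-then-zeros padding and the message length appended."""
    padded = message + b"\x80"
    padded += b"\x00" * ((-(len(padded) + 8)) % BLOCK)
    padded += len(message).to_bytes(8, "big")
    h = bytes(BLOCK)                       # all-zero initial chaining value
    for off in range(0, len(padded), BLOCK):
        m_i = padded[off:off + BLOCK]
        e = toy_block_cipher(m_i, h)       # message block used as the key
        h = bytes(a ^ b for a, b in zip(e, h))
    return h
```

The feed-forward XOR is what makes the construction one-way even though the underlying cipher is invertible; swap in a real block cipher and this is essentially the pattern a design like Skein-over-Threefish follows.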
You know, with SHA they had a vulnerability they fixed in SHA-1, which still was a little dicey, and it's been updated. Because a hash function is not something they were using in military applications much until recently, they didn't have the rich history like they had with encryption. The code in Rijndael is not the code that's in AES. And I use Twofish because I trust you more than I trust the feds. Do you think that AES is actually a trustworthy cipher? I think AES is. I trust AES. You know, it is Rijndael. There are a bunch of tweaks in the parameters, but they're totally above board and kosher, and everyone's happy with them. Rijndael is weird because you can actually describe the algorithm in an equation that fits on one page. It's kind of small type, but it fits on one page, which kind of freaks people out a little bit. But I do trust it. I think it is secure. I do not think there is a backdoor or anything snuck in. I truly don't. NIST did a great job with the AES process. With their SHA-3 process as well. I really do, you know, NIST unfortunately got tarred with the Dual_EC_DRBG random number generator. And they're trying to sort of rebuild their trust. But they've done a fantastic job with crypto primitives by and large. So, you know, I like AES. I mean, thanks for using Twofish. I like it too. And, you know, I kind of wish we'd won, because that would've been cool. But no, I use AES without reservation. And don't worry about it. As disturbing as the current crypto war is, something that actually scares me a lot are stories like Lavabit, or even the larger tech companies getting national security letters. I'm wondering what we can do as tech companies, or just the InfoSec community in general, to defend against governments secretly ordering companies to put backdoors into their products. So this is actually, I think, the thing that should freak us out the most.
And to me, this is the biggest revelation of Snowden and all the stories around it. And Lavabit especially. You know, it's not that we believed that encryption was perfect and nobody could break it. But we did believe that the tech rose and fell on its own merits. And the idea that the government can go into a company and say, you have to break your encryption and then lie to your customers about it, is terrifying. The law can subvert technology. And we cannot, as a community, as a society, truly believe anything is secure as long as that's true. I mean, I just talked about, you know, iMessage, and, you know, so we don't know. And I blogged about this a couple of days ago. It didn't get enough play. It was kind of the last paragraph of a post. Maybe no one reads that far. There is a persistent rumor going around right now that Apple is in the FISA court fighting an order to backdoor iMessage and FaceTime. And Nicholas Weaver, I don't know if he's here this week, has written about how they could do that, I mean, how they could modify their protocols to make that happen. And we don't know. That is fundamentally terrifying. And I don't know how to fix that. We have to fix that through the legal system. There's no tech fix. I mean, there are kinds of things you can do. I think that if we thought about it, we could rewrite the Apple protocols such that if they did have to put a backdoor in, we would notice. If they did have to make a change, we would notice that a change was made. We would say, why did you make a change? They would say, bullshit answer. We would know something was up. So maybe there's something in making your algorithms not backdoor proof, but backdoor evident. So maybe we should think more about that. But this is a hard one. And I don't have a good answer. And it is one that I think really should disturb us. More open source is going to be good here, because more sunlight, harder to subvert.
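One minimal sketch of what "backdoor evident" could mean in practice, assuming nothing about Apple's actual protocols: trust-on-first-use key pinning, where a client remembers the first public-key fingerprint it sees for a peer, so a later silent key substitution (one plausible way a mandated backdoor could be introduced into a key-distribution service) becomes loudly visible instead of invisible. The class and method names here are hypothetical, not any real API.

```python
import hashlib

class KeyPinStore:
    """Trust-on-first-use pinning: remember the first key fingerprint
    seen for each peer and flag any later change."""

    def __init__(self):
        self.pins = {}  # peer id -> hex fingerprint of pinned key

    @staticmethod
    def fingerprint(public_key: bytes) -> str:
        # A fingerprint is just a hash of the public key bytes.
        return hashlib.sha256(public_key).hexdigest()

    def check(self, peer: str, public_key: bytes) -> bool:
        """True if the key is new (pin it) or matches the pin;
        False if the peer's key has silently changed."""
        fp = self.fingerprint(public_key)
        if peer not in self.pins:
            self.pins[peer] = fp       # first use: pin it
            return True
        return self.pins[peer] == fp   # a swapped key is now detectable

store = KeyPinStore()
assert store.check("alice", b"alice-key-v1")   # first sighting: pinned
assert store.check("alice", b"alice-key-v1")   # same key later: fine
assert not store.check("alice", b"other-key")  # silent substitution: caught
```

This doesn't prevent a backdoor; it just forces any key substitution to produce an observable change, which is exactly the "why did you make a change?" conversation described above.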
But as long as the government can issue secret orders in secret courts based on secret laws, we have a problem. And it is not a tech problem. It's a legal problem. I think it's on that side now. Hi. We seem to be in a situation where the software industry can release software that runs on billions of devices and is completely insecure and badly written. And there's no consequence whatsoever to those companies for the problems that they create. Just recently, what comes to mind is the MMS hack on Android. Can you just discuss generally what you think about this from a legal perspective, and software companies being held liable or accountable for the bad software that they write? So I've always been a fan of liabilities. I wrote the first thing about it maybe in, like, '02 or something, maybe even earlier. And so here's the basic argument. Right now, as you say, there are no costs to writing bad software. When you read a software license, it says pretty much explicitly, if this software maims your children, and we knew that it would do that but didn't tell you because it would hurt sales, we're not liable. And those shrink-wrap licenses come with even security software. Read the license, it'll say no claims about security are made, even though there are. So liability changes that, right? It adds a cost to not designing software properly. It adds a cost to insecurity. It adds a cost to non-reliability. And that has real value. I mean, if you think about it, we are already paying these costs. We're paying it in losses, we're paying it in after-market security devices, we're paying it in sort of the entire industry that has sprung up around dealing with the fact that the software sucks. But with a liability regime, we would pay anyway, right, the cost would be passed on to us, of course, but at least we'd be getting more secure software out of it.
So I see a collective action problem. I see even market failure here, right? The market is not rewarding good security; the cost of insecurity is too low, the value of security is too low, and liability changes that. It is a lever we can use to rebalance the cost-benefit ratio. And I think it's a powerful one. It is not a panacea, there are lots of ways liabilities go wrong, but liabilities do a lot. They really provide value. And I think they would here, in software. So I want to see liabilities. I mean, we know why the Android patch isn't being promulgated: because they designed their system so that even though Google produced the patch, it won't go down to the phones. Because the phone manufacturers don't care very much, and you don't have that tight connection between phone and OS like you have in the iOS world. So the patch doesn't go downstream. If suddenly the phone manufacturers were liable, I assure you the patch mechanism would work better. And that's a lever we have as a society, and we should use it. I think it's a better one than regulation here, because it's one that's dynamic and tends to seek its own level. But that's why you'd use it, and I'm a big fan of it. Actually, thinking about this, hang on. Everybody smile. There's more of you than fits on the screen. It's not going to work. All right, hang on. People at the edges, you don't have to smile. All right, thanks. Who is next? It was you, right? Bruce, it seems like surveys show that fewer and fewer Americans are concerned about the privacy of their information. Often you hear things like, I'm not hiding anything, I don't have anything to hide, so I'm not worried. And it seems like people my age and younger don't have much of an understanding of Edward Snowden or the relevance of what he released. What would you say to those perspectives? So, I don't know if people know, I had a book published in March called Data and Goliath. And it talks about surveillance, government and corporate.
And I spent a lot of time, I spent a whole chapter, on that question, on privacy and why it's important. And it's not true that people don't care about privacy. And it's not true that young people don't. All surveys show that they do. They're very concerned about it. And you know this is true. You remember being a teenager. You cared about privacy a lot: from your parents, from your teachers, from your friends. You didn't care about the government, because who cares. But you were concerned about the privacy in your world. And you know, kids, teenagers who are fluent in the net, are very fluent in how to maintain their privacy. They might not do a good job, but they try a lot. I argue that privacy is fundamental to human dignity, to individuality, to who we are; that without privacy we become conformist. We don't speak out. And I think there's a really interesting argument about social change. We're in an extraordinary year where gay marriage is legal in all 50 states. And that issue went from impossible to inevitable with no intervening middle ground. It's amazing. But what it means is, and you can take legalization of pot, you can take a lot of issues this way: at first, something is illegal and, let's say, immoral. It goes from illegal and immoral, to some cool kids are doing it, to illegal and we're not sure. And then suddenly it becomes legal. But in order to get from here to here, you've got to pass a point where the thing is illegal and people do it anyway. Because you've got to do it and say, you know, that gay sex wasn't that bad. That was kind of okay. Or, you know, I tried pot and the world didn't end. And it might take 40 years and a couple of generations. But then you get to the point where it's legal. Interracial marriage. I mean, any of these issues. But if you have surveillance here, if you can stop people from trying the thing and saying, you know, that's not that bad, maybe we're wrong...
You never get to the point where the majority of us believe we're wrong. So I think surveillance, broad government surveillance, will really have a stifling influence on social progress. Because it won't let experiments happen. Now, it's not an argument you can make to anybody, right? But I think it is probably the most important one. But really, anyone who says, I have nothing to hide, you know they're lying, right? I mean, there aren't cameras in Scott McNealy's house because he has nothing to hide. And so I think you really have to point out that those arguments aren't true, and that privacy isn't about something to hide. It is about maintaining your sense of self in a public world, right? I get to determine what I tell you people and what I don't tell you people. And that is empowering. And if I lose that, I am fundamentally a prisoner of society. So attaching privacy to something to hide, to secrets, is just wrong. It's about human dignity and it's about liberty. And I do, I spent a whole chapter on that, and I do it better in the chapter. So I offer that up. Yes. Most people seem to be more worried about back doors and forced government back doors, but I'm sort of more worried about a Sneakers, no more secrets, Marty, type of deal. What is your opinion on quantum computing and current encryption, and also quantum encryption and its rebuttal to quantum computing? All right. So quantum encryption has nothing to do with quantum computing. They're completely separate. And let's do quantum computing first. Quantum computing is going to become real, probably not in our lifetime, but it will become real. I mean, it's a technology, it's advancing. I think we can factor, like, 24 now, but it'll get better. It has the potential to change crypto, but not destroy it. It will break all of the common public key cryptography algorithms, the ones based on factoring and discrete log problems.
So RSA, Diffie-Hellman, and those, it'll break those in polynomial time and be very nasty. But we do have public key algorithms that do work. Even the original Merkle scheme, the first public key algorithm, not the knapsack, his early one, that had that additional work factor of a square instead of an exponential. That still works. There are coding theory algorithms that still work. They're less efficient, but they still work. We'll still have public key cryptography. As for symmetric cryptography, in theory the best quantum computing does is halve your key length. It reduces your brute force search by a factor of a square root. So double your key length and you're done. And NIST is actually hosting conferences on post-quantum cryptography. Go download the papers. People are thinking about this. How can we build secure systems that are resilient to a, quantum cryptography, sorry, quantum computing, theoretical world? So that's the breaking, that's quantum computing. Quantum crypto is really quantum key exchange. Invented in the '80s, research continues. I think there's a product now. And it is a clever way to exchange keys using quantum properties. Really neat, great science, great physics, something we as a group have probably absolutely no need for. I can exchange keys just fine, thank you. The math is working. So I think it is kind of pointless from a practical point of view. Great science, I love reading the papers, but I would never buy such a device, because I would use one of the math systems and they work just fine. So that's sort of my quick quantum primer. But it's great science and I love the research. And eventually, yes, we'll be able to factor numbers very quickly, which will be cool. Yes. Not sure if you caught the previous talk by Eijah on Demonsaw; I wanted to get your opinion on sort of this idea of simplifying some of the complexity associated with encryption for personal use. So I definitely want things simplified.
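As an aside, the "double your key length and you're done" arithmetic from the quantum answer above is simple enough to write out: Grover's algorithm gives a square-root speedup on brute-force key search, nothing more, so an n-bit symmetric key costs a quantum attacker roughly 2**(n/2) operations.

```python
# Back-of-the-envelope version of the symmetric-key argument: Grover's
# algorithm reduces brute-force search over 2**n keys to roughly
# 2**(n/2) quantum operations, i.e. a square-root speedup.

def classical_search_ops(key_bits: int) -> int:
    """Worst-case classical brute-force work for a key of this length."""
    return 2 ** key_bits

def grover_search_ops(key_bits: int) -> int:
    """Approximate quantum brute-force work under Grover's algorithm."""
    return 2 ** (key_bits // 2)  # square root of the keyspace

# AES-128 against a quantum adversary: only ~2**64 effective work.
assert grover_search_ops(128) == 2 ** 64
# Doubling to a 256-bit key restores ~2**128 work even against Grover.
assert grover_search_ops(256) == classical_search_ops(128)
```

Which is why "use a 256-bit key" is the whole symmetric-side answer, while the public-key side needs genuinely different (post-quantum) algorithms.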
This is the answer about the OPSEC, the same thing. The more you make it invisible, the more you make it transparent, easy to use, no work. Even sacrificing some security, I think we do better. I'm really liking Moxie Marlinspike's Signal right now on my iPhone. It's a great program. And it just has a really clean interface. It works. All the key exchange happens in the background. It's well designed. I can actually confirm there's no man in the middle. I don't have to, but I can. And the fact that I can is enough of a deterrent for anybody trying it. So I really like simple stuff. Simple stuff that's easy to use, because I want everyone to use it. There's a value in it being ubiquitous. So expert-only encryption has much less effectiveness. One last comment to the quantum guy. One of the things we know from the NSA documents is they have a budget line for quantum crypto. It's not very large. Which means they're doing research, but they don't have the quantum computer in Utah. I'm pretty sure that it's not something they can do. Yes? First of all, Bruce, you're my security crush. Mind if I take a picture with you after the show? I don't. But you guys all have weird pie plates on your chest. I'm just saying. You look like some embarrassing cult. It's Flavor Flav-ville. Bruce, so with the explosion of software defined networking and enterprises looking to use it and deploy it quickly, do you have specific concerns around the security of such bleeding edge technology and this virtualization of routers, switches, firewalls, et cetera? Do you have some thoughts on that? You know, I don't have any specific concerns. Just the general ones: more complexity, more things to be insecure, and another layer of organization means someone else to serve a warrant to. So those are my concerns. I mean, there's huge value in this.
And I'm a big fan of security outsourcing for organizations, because it's very hard to do right, and the more you can consolidate the expertise, I think the better you'll do. So I tend to like those, but there are legal risks. You know, we've been seeing cases where the FBI can serve a warrant on Facebook for your stuff, bypassing you. That they can do that. And that does cause problems. But in general, I think the value of outsourcing, the value of software-defined networking and the like, are great. And there are security risks, but on balance, I tend to like that technology. So as a balance, no major concerns over shared control planes? I mean, that's it. I mean, you got it all. Those are the things to be concerned about. But are they major concerns? They're like regular-sized concerns. All right. Thanks. Yes. First of all, thank you, Bruce, for everything you do. With the pie plate also, I'm saying. My question is, even if they wanted to, would policymakers be able to stay current with the pace of technology? You know, it's interesting. I've come to the belief that the United States is ungovernable. And that's one of the reasons. Technology is moving so fast that the people who understand it can run rings around policymakers. Whether it's writing laws where, you know, five years later in retrospect, you realize, whoa, they understood that and put that sentence in, and we didn't realize it. This is hard. I like seeing laws that are technologically invariant. Instead of writing laws to keep up with technology, write laws that don't have to. And there are examples. I mean, you know, laws about assault and murder don't really care about the weapon. I could write a law about privacy for communications that doesn't care if it's email or voice or voice over IP or something. I can do that. I think that's better. I'm not sure it'll happen.
I mean, there's so much co-option of the legal and policy process by people who stand to make and lose a lot of money. Right now, the cybersecurity bill that's probably going to get signed has all sorts of amendments and riders, and what it actually does isn't what they say it does. And that's an easy one. You start doing something like healthcare or climate change? Forget it. So I'm not optimistic about lawmakers staying current with technology. I think we're going to have to go through some bad times before we figure out how to create law in a society where tech moves this fast.

Now, there's an argument to be made that the modern constitutional democracy is the best form of government mid-18th-century technology could invent. Right? Communications were hard, so we had to pick one of us to go all the way over there and make laws in our name. That made sense in 1782. It makes a lot less sense now. And there are a lot of ways that systems designed when nation-states started becoming a thing are sort of breaking now, because things are different. Things are moving too fast. The communications are different. It's all different. I think we have to redesign democracy. This, of course, is a ridiculous thing that will never happen. But I think we kind of need to. I don't know. That wasn't an optimistic answer, was it? Yeah, sorry. Yes?

Hey, Bruce. A few months ago, there was news about Chris Roberts being detained at the airport after he posted a tweet about the security of United Airlines. Is he here? Does anyone know if he's here? Okay. Yes? And then your blog post said that maybe the FBI knew that Chris Roberts worked in the field of avionics; that's why he was detained. And then, recently, Wall Street went down, and there was this news that Anonymous had posted a warning about it, even though Wall Street claims it was a minor IT issue. So what do you think?
Is the emphasis on offensive security? Right. The issue was similar with Chris Roberts as well as Wall Street.

So, we didn't know in the Chris Roberts case that he actually was being at least somewhat watched by the FBI, that he'd talked to them before. And the Chris Roberts case is actually very complicated. I stopped commenting when I realized it was way more complicated than I understood. For people who don't know, this is the case of him being on a plane and saying something about getting into the avionics bus via the USB port in his seat. Which, you know, would be crazy if you could, and it wouldn't surprise me if Airbus forgot about that. It really seems like every time you take physical security people and give them an IT security problem, they completely forget that they should talk to an IT security person. Anyone follow the hack on the Brink's safe? It's completely embarrassing. It's like they never even opened an IT security book: "Oh yeah, we can do this. No problem."

I don't know how much is proactive, but it does seem like the FBI is monitoring more groups and more individuals. We see them monitoring the Occupy movement or Black Lives Matter, real social-change movements that might not be as mainstream as they could be. So there's a lot of that going on; how much in those specific cases, I don't know. The Wall Street case, I have no idea. I mean, there's always a lot of bragging that might not be true.

Yeah, but they posted it the day before Wall Street went down.

Yeah, I don't know. I don't know the details, and it's hard to speculate. I think there is more monitoring. This is the point of fusion centers; this is the point of information sharing between the NSA and others. And I think a lot of it's going on. Yes?

So, do you trust elliptic-curve-based cipher suites, non-elliptic-curve, or neither?

So, I've been skeptical of elliptic curves for a bunch of years. I am in the minority here.
Most cryptographers do trust them, so I'm giving you a minority view. There's a lot we don't know about elliptic-curve mathematics, and there certainly could be classes of elliptic curves that have hidden trapdoors we don't know about. The NSA uses elliptic-curve crypto, so I can tell you that, in general or at least in some instances, the NSA believes it is secure. But I also know that they have in some cases tried to influence curve selection. Now, for good or for bad, I can't tell you. So I worry about elliptic curves where the curve-selection process isn't transparent. Now, if you want to use elliptic curves in your system, Dan Bernstein, a guy we all trust, or at least I do, ran a public process; they're called the Bernstein curves, and they're available to us. If the NSA said, "Here, here's some great curves," I would say, you know... So that's my feeling on that. I think the math is sound, but I think there are gotchas we don't fully understand.

I'm getting the "we'll get off stage soon" signal. Do you have a yes-or-no question?

No. Thank you so much. My question is: I now find myself in a position where I'm working with a government agency. They use the same software as everyone else; they've got the same problems as everyone else. And I'm convinced that the actions of stockpiling zero-days and weakening crypto are just as harmful. What can I do to convince or show these people that arms of our government are doing things that hurt us?

This is hard, and a lot of us are trying to do this. I mean, really, just keep saying it. This is the political process; it's not the tech process; it's not clean. I have to leave. I'm doing a book signing at the bookstore at four, so come by and say hi. Not all of you at once. I will be around; I'm going to go outside there.
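The elliptic-curve arithmetic at issue in that exchange can be made concrete with a toy sketch. This uses a tiny textbook curve, y² = x³ + 2x + 2 over the integers mod 17, with generator (5, 1); the curve, generator, and scalars are teaching-example values only, not anything deployable, which is exactly why vetted, transparently chosen curves like Bernstein's Curve25519 matter in practice:

```python
# Toy elliptic-curve group law over a tiny prime field. Illustrative only:
# real systems use large, carefully vetted curves and constant-time code.
P_MOD = 17            # field prime
A, B = 2, 2           # curve: y^2 = x^3 + 2x + 2 (mod 17)
INF = None            # point at infinity (the group identity)

def add(p, q):
    """Add two points on the curve using the standard chord-and-tangent rule."""
    if p is INF:
        return q
    if q is INF:
        return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return INF                                        # p + (-p) = identity
    if p == q:                                            # tangent (doubling)
        s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:                                                 # chord (distinct points)
        s = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (s * s - x1 - x2) % P_MOD
    return (x3, (s * (x1 - x3) - y1) % P_MOD)

def mul(k, p):
    """Scalar multiplication k*p by double-and-add."""
    r = INF
    while k:
        if k & 1:
            r = add(r, p)
        p = add(p, p)
        k >>= 1
    return r

G = (5, 1)            # generator; this curve's group has prime order 19
```

The group has prime order 19, so `mul(19, G)` returns the identity, and two toy Diffie-Hellman parties agree on a shared point because `mul(a, mul(b, G)) == mul(b, mul(a, G))`. The hidden-trapdoor worry in the answer above is about how parameters like `P_MOD`, `A`, and `B` get chosen for real curves, which is why a transparent selection process matters.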