Next up is Nate Cardozo. He's going to go over some history up to the current state of cryptography and the law. So let's re-welcome Nate Cardozo. Thanks. Thanks for coming. Thanks for almost filling the room. It's pretty full. I am Nate Cardozo, a senior staff attorney at the Electronic Frontier Foundation in San Francisco. I was on the EFF panel just now and answered a couple of questions, but there may be more time at the end of this one. I am a lawyer. I am not your lawyer. Unless you know that to be false, because I probably am some of your lawyers in this room. I've been working on crypto policy at EFF for the last couple of years, and it's been an extraordinarily busy time. In this talk I'm going to cover a little of the past, where we've come from in terms of crypto law (and of course what is old is new again), and where we're going. I'll talk about the legal challenges that face people who design, implement, and use crypto around the world. I'm a US lawyer, so my focus will be on the US, but I'll touch on some dumb things other countries are doing besides the United States. We're doing dumb things here, but other countries are doing dumb things as well. And I'll talk about the future, what we're likely to see. So, on Wednesday, or maybe Thursday, at Black Hat, Jennifer Granick said end-to-end encryption is legal. Period. All right, that's the state of the law. Questions? I mean, she's right. End-to-end encryption in the United States is legal. Period. But there are still some places to go and some things to talk about. The story I'm about to tell you isn't really particularly true, but I'm going to tell it anyway. From around 1784, when Joseph Bramah invented a particularly good lever lock, until the second half of the 19th century, there was such a thing as perfect security. You're looking at it. That was it. That safe was unbreakable.
The lock couldn't be defeated, and with the advent of overlapping cast steel rather than forged steel, the overlaps of course hiding the rivets, you couldn't just break it open. You couldn't pick the lock, couldn't break it open, couldn't drill it, couldn't bash it in. You could drop it from a very tall building. So the solution, of course, is to just build one big enough that you can't lift it. And that's exactly what they did. Of course, lock pickers have been around as long as locks, and in 1851 a locksmith named Hobbs picked Bramah's unpickable lock. Took him 51 hours, but he did it. Right around the same time that Hobbs figured out how to pick this thing, TNT was invented, and that made it much, much easier to break into this thing. But of course, as I said, none of this is true. Safes were broken into all the time in those intervening 67 years between Bramah and Hobbs, but not by picking them and not by blasting them open. As you all know, even with a perfect spec, there's no such thing as perfect security. Vulns are in the implementation. Well, sometimes they're also in the spec, but more commonly they're in the implementation. So you overlapped your cast plates, but you left the hinges exposed, and someone just knocked the hinges off the door. One of the co-founders of the Electronic Frontier Foundation, John Gilmore, in something like 1993 (I cannot get an exact date for this statement) said the Net interprets censorship as damage and routes around it. That statement is more true now than it was twenty-something years ago. In '93 there was no Tor, there were no VPNs, there were no anonymizing proxies, at least none to speak of. We barely even had the first inklings of transport layer security when Gilmore said this. But there were words, lots of them, and images and code and politics. You name it, it was online.
And for more than two decades the internet has provided us with a truly global platform for expression. Today anyone can write an opposition-party blog, post photographs of their cats (which, if you follow me on Twitter, you see), organize a street protest, contribute to open source crypto on GitHub, send 419 spam, search for extraterrestrial life, mine bitcoin, swap selfies, use PGP. But in the '90s we had the first crypto wars. What I've put on the screen is actually an anachronism: it's a Perl implementation of RSA, so it's not exactly the right time period, but you get the picture. The first crypto wars were an attempt by the US government to regulate that. You couldn't put it on the internet. If you were here for the last panel, there was a question about ITAR versus EAR. This was considered a munition, in the same category as hand grenades or tanks, and you had to get the same permit to put that code online as you did to export a tank. The fear was that this would become this. This is, of course, the Enigma machine, or rather a set of its code wheels. Raise your hand if you're familiar with Enigma. Okay, most people. Enigma was not invented as a military technology. Enigma was invented to protect the European banking system. It became famous after some modifications by the Nazis, when it went into service in World War II. And for a time it defeated all Allied cryptanalytic attacks. For a time it allowed perfect security. It took a set of stolen code wheels, the invention of the computer, and the most brilliant cryptographic minds of their time, both in Poland and in the United Kingdom, to crack this thing. But getting back to RSA: the US government's fear was that if we didn't regulate this, it would allow our adversaries that, perfect security.
We'd end up in a situation where a cipher designed to facilitate banking, as both RSA and Enigma were, would instead be used by the Soviets for their nefarious plans for world domination, because that's what they did. Who here is old enough to remember this? Okay. A case in point: I'm certainly old enough to remember this befuddling option. Do you want Netscape Navigator for the US, or do you want it for the rest of the world? Do you want the version that only supports 40-bit RC4, or the full 128-bit-capable version? The strong version, of course, was only available if you lived in the United States or Canada, because of the inclusion of encryption in ITAR. But, you know, this was the '90s. There were no GeoIP blocks. There were no verification mechanisms, and all you had to do was check the box that said you were in the United States to get the strong version. That was it. The US implementation was that bad, and it was completely ineffectual. It didn't keep strong crypto out of anybody's hands. And it led to things like this. Right? People put algorithms on t-shirts. You couldn't publish this on the internet, but you could print it on a shirt and wear it through the airport. It led to this. Theo de Raadt, of course, lives in Canada, so he wasn't subject to any of it. It led to this. Who here recognizes this? Awesome. Crypto moved literally offshore for a time. This is Sealand, the principality of. And it led to this. This is the Clipper chip. I'm going to talk a little bit about this later. Of course, I'm a lawyer, my colleagues are lawyers, my boss is a lawyer, and if all you have is a hammer, everything looks like a nail. If all you have is a JD, everything looks like a lawsuit. In the late '90s, a grad student at the University of California, Berkeley walked into the EFF office. I don't think literally, but we can picture that.
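The gap between the export-grade 40-bit version and the domestic 128-bit version is easy to see with back-of-the-envelope arithmetic. This sketch assumes a purely hypothetical trial rate of a billion keys per second, chosen for illustration, not taken from any benchmark:

```python
# Back-of-the-envelope comparison of the export-grade 40-bit RC4 keyspace
# versus the domestic 128-bit keyspace. The trial rate is a hypothetical
# figure chosen for illustration, not a measured benchmark.

TRIALS_PER_SECOND = 10**9  # assume a billion key trials per second

def worst_case_seconds(key_bits: int) -> float:
    """Seconds to exhaust the full keyspace at the assumed rate."""
    return (2 ** key_bits) / TRIALS_PER_SECOND

SECONDS_PER_YEAR = 60 * 60 * 24 * 365

# 40-bit: about 2^40 (roughly 1.1e12) keys -- about 18 minutes at this rate.
print(f"40-bit:  {worst_case_seconds(40):.0f} seconds")

# 128-bit: about 2^128 keys -- on the order of 1e22 years, far beyond feasible.
print(f"128-bit: {worst_case_seconds(128) / SECONDS_PER_YEAR:.2e} years")
```

At that rate the 40-bit keyspace falls in minutes while the 128-bit keyspace outlasts the age of the universe many times over, which is why the checkbox mattered.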
His deal was that he had invented something and he wanted to publish a paper about it. His name was Dan Bernstein. He's now a professor at Chicago and Eindhoven, and one of the best cryptographic minds of our generation. He wanted to publish a paper about Snuffle. He didn't even want to publish the algorithm, although he also wanted to do that. So we went to court for him. My boss, Cindy Cohn, who is now the executive director of the Electronic Frontier Foundation and at the time was a partner at a small firm down the peninsula in the San Francisco Bay Area, represented Dan and got crypto declared not a munition. Code is speech, and we won. And crypto is now legal and exportable. The Ninth Circuit in Bernstein said the availability and use of secure encryption may reclaim some portion of the privacy we have lost. And that's still true. The UN Special Rapporteur for freedom of expression just last year agreed, and stated in his report that encryption protects not only security and privacy, not only free speech, but the right to hold opinions without interference. And that's exactly right. We depend on crypto in order to read, in order to write, in order to speak, all around the world and in the United States. This was a button produced back in the first crypto wars against the Clipper chip, that really terrible little chip I showed you a minute or two ago. The Clipper chip was an NSA-developed chipset for secure voice communications. The thought was that it would be installed in all of our regular old telephone handsets and we would be able to make secure calls using it. It used something called the Skipjack encryption algorithm, and it included a back door with key escrow in something called the LEAF, the Law Enforcement Access Field. Matt Blaze, at Penn, among others, showed that the system was broken.
Thankfully, the Clipper chip was defeated, and key escrow, or at least requirements for key escrow, appeared to be dead. And the internet was a safer place for it. This is the cute little golden key. EFF had a campaign around the golden key, asking webmasters to put it on their home pages, because that's what we had back then, in support of strong crypto without key escrow. This was from 1996. I was 15 at the time, and my homepage had this key on it. The Clipper chip failed mostly because it sucked, but also because of the actions of cryptographers like Matt Blaze and his partners, who were able to show policymakers just how insecure it was. And thank your lucky PKIs for ECCN 5D002. This is the encryption exception to the export controls. This giant block of text, which I'm sure you can all read and have already entirely digested, is what makes strong crypto legal and exportable today. And we thought we had solved the field. We won. Our friends in Mountain View and Cupertino are free to ship products that actually protect their users' security to the best of their ability. People like Moxie Marlinspike and Adam Langley are free to publish free and open source tools for all of us to use. As Jennifer Granick said, end-to-end encryption is legal, period. But thanks to Comey, more work remains. And we're back to exactly where we started. Everything that is old is new again. In 1997, the director of the FBI at the time, Louis Freeh, said: we are in favor of strong encryption, robust encryption; the country needs it, the industry needs it; we just want to make sure we have a trap door and key so that we can get into anything that you look at. That sounds shockingly familiar to what the director of the FBI today, James Comey, is saying almost 20 years later. 2015 and 2016 brought us a new set of challenges.
iOS 8 and Android M brought us full disk encryption by default. WhatsApp joined iMessage in actual end-to-end encryption for more than a billion people around the world. And that's to say nothing of Signal or GPG or Tor, or Pond if you're crazy enough to use Pond. And we're back in the crypto wars. We're calling it the next crypto wars, or Crypto Wars 2.0. You heard what I said Louis Freeh said about crypto in 1997. In 2015, Jim Comey, the current director of the FBI, called crypto only a business model. The government has, of course, been downplaying companies' support, calling it just a marketing pitch and not a technical feature. This completely disregards the facts of strong crypto. IDG and Lookout did a survey in 2013, before iOS 8 and Android M, that found that of the something like 4 million phones lost and stolen in the United States, fully a quarter of those lost or stolen devices resulted in identity theft. One million Americans were victims of identity theft because we didn't have full disk encryption on our phones. And that's where Director Comey wants us to go back to. I think that's crazy. Because what could possibly go wrong with backdooring our crypto? To this day, most actual technical proposals for weakening encryption are something like this. They're something like key escrow. Or maybe double key escrow, where you escrow the key once to a private key held by the manufacturer and then again to a private key held by the government, so that you need both to unlock it. Something like that. That's not really a good idea. Luckily, a whole bunch of academics wrote a really good paper telling us why it's not a good idea. This is Keys Under Doormats, published in 2015.
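The double-escrow idea above can be sketched in a few lines. This is a toy illustration of the split-between-two-parties concept only: in a real proposal each share would be encrypted to the respective agent's public key, whereas here the shares just sit in variables.

```python
# A toy sketch of the "double key escrow" idea described above: the session
# key is split into two shares, so that recovering it requires cooperation
# from BOTH escrow agents (e.g. manufacturer and government). In a real
# design each share would be encrypted to that agent's public key; here the
# shares are simply held in variables for illustration.
import secrets

KEY_BYTES = 16  # a 128-bit session key

def escrow_split(session_key: bytes) -> tuple[bytes, bytes]:
    """XOR secret splitting: either share alone reveals nothing."""
    share_a = secrets.token_bytes(len(session_key))               # manufacturer's share
    share_b = bytes(k ^ a for k, a in zip(session_key, share_a))  # government's share
    return share_a, share_b

def escrow_combine(share_a: bytes, share_b: bytes) -> bytes:
    """Both shares together reconstruct the original session key."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

key = secrets.token_bytes(KEY_BYTES)
a, b = escrow_split(key)
assert escrow_combine(a, b) == key  # both shares together recover the key
```

Note that the split itself is information-theoretically sound; the objections in Keys Under Doormats are about everything around it, namely who holds the shares, how they are stored, and how often they are used.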
You all should read it. For the technical people, read the whole thing; for lawyers like me, read the executive summary. It's very good. The people who wrote this paper are some of the best cryptographic, computer security, and security engineering minds alive today. And they write that we find it would pose far more grave security risks, imperil innovation, and raise thorny issues for human rights and international relations if we were to give Jim Comey what he's asking for. Keys Under Doormats identifies three major classes of problems with lawful intercept, or lawful access, capability. First, lawful access would necessarily abandon advances in crypto such as perfect forward secrecy. That's crazy. We barely know how to build secure devices and secure systems. We're bad at it. We don't know what we're doing. And to abandon the state of the art and roll back to the bad old days, when a lost or stolen phone had a 25% chance of resulting in identity theft, strikes me as a really bad idea. It strikes them as a really bad idea too, and they're smarter than I am. Second, it would necessarily increase system complexity. The Keys Under Doormats metaphor kind of breaks down here, but you see what they're getting at. There is no such thing as a back door that only the good guys can walk through. Remember the safe: you had an unpickable lock, you had cast plates that overlapped, but you left the damn hinges exposed for someone to knock off so that they could open the door. That's the problem. The problem here isn't necessarily with the protocol, but with the massive increase in complexity that any sort of lawful access system is necessarily going to result in. And finally, doing something like this is going to concentrate attackers' focus onto one or two incredible points of failure. And by definition, the key material is going to have to be kept online.
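Perfect forward secrecy, the first casualty the paper identifies, is easy to see in miniature. In the sketch below each session runs a fresh ephemeral Diffie-Hellman exchange and then discards both secrets, so there is no long-term key left to escrow or seize. The group parameters are deliberately tiny and insecure, for illustration only; real protocols use vetted 2048-bit-plus groups or elliptic curves.

```python
# A minimal sketch of perfect forward secrecy, the property a lawful-access
# mandate would force protocols to abandon. Each session uses fresh ephemeral
# Diffie-Hellman secrets that are thrown away afterward, so a later key
# seizure cannot decrypt past traffic. Toy parameters only; NOT secure.
import secrets

# Small, INSECURE demonstration group (real groups are ~2048 bits or an EC).
P = 0xFFFFFFFFFFFFFFC5  # the largest 64-bit prime, for illustration only
G = 5

def session_key() -> int:
    """One ephemeral DH exchange; both secrets are discarded on return."""
    a = secrets.randbelow(P - 2) + 1   # Alice's ephemeral secret
    b = secrets.randbelow(P - 2) + 1   # Bob's ephemeral secret
    A, B = pow(G, a, P), pow(G, b, P)  # public values exchanged on the wire
    shared_alice = pow(B, a, P)
    shared_bob = pow(A, b, P)
    assert shared_alice == shared_bob  # both sides derive the same key
    return shared_alice                # a and b go out of scope: nothing to seize

# Two sessions yield independent keys; no single escrowed key unlocks both.
k1, k2 = session_key(), session_key()
assert k1 != k2
```

Any escrow scheme has to replace those throwaway secrets with a long-lived, online key, which is exactly the concentrated point of failure the paper warns about.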
Because, as Jim Comey and the District Attorney of Manhattan, Cyrus Vance, have repeated over and over, they're going to use these capabilities a lot. They're not going to be okay with having the keys kept in secure offline storage. They want push-button access to our communications. And so Comey has actually heard us. He's heard us and he's come around. He's not pushing for back doors anymore. He said last year that we're not seeking a back door approach; we want to use the front door. Which, of course, is the same damn thing. The Washington Post put it a little more weirdly, I'll say. They said, and this is a quote, a back door can and will be exploited by bad guys too; however, with all their wizardry, perhaps Apple and Google could invent some sort of secure golden key. That's what the Washington Post called it. Of course, you know, sufficiently advanced technology is indistinguishable from magic. This stuff is magic to people like Jim Comey, to people like the editorial board of the Washington Post. They don't know how it works. It's obviously magic, right? So if the wizards at Mountain View or Cupertino can design this, then just nerd harder and invent the golden key. Or they'll beat us up and take our lunch money. Like, come on. But that's not the way the world works. Okay, the slide you're about to see is false. NSLs are not magic. Only friendship is magic. There is no legal tool in place, at least in the United States, that is currently sufficient to require a provider or developer to maintain or create the ability to provide plaintext on demand. That is a much more verbose way of saying what Jennifer Granick said at Black Hat earlier this week: end-to-end encryption is legal, period. There's a perception in our community that NSLs are magic. I'm here to hopefully help you rid yourself of that perception.
National security letters and other types of national security process are terrifying. They're scary. They operate with almost no oversight. National security letters get issued without even a judge's signature. But they're not magic. With an NSL, you can get subscriber information and maybe a little bit of transactional information. You can't get content, you can't get a back door, you can't force someone to write code. Jennifer and Riana at Black Hat the other day gave a great talk about technical assistance orders. Technical assistance orders may be a little bit more magic, but we don't know. I'll talk a little bit about that later today. But there are things that might be magic around the world. Many countries are looking at or considering legislation that would mandate back doors, have already mandated access to plaintext, or otherwise endanger encryption. The Investigatory Powers Bill just passed the House of Commons and is up in the House of Lords in the United Kingdom. Section 189(4)(c) of the IPB says that operators may be obligated to remove electronic protection at the sole discretion of the Home Secretary. What does that mean? Well, to Theresa May, who was the Home Secretary at the time the IPB was introduced and is now the Prime Minister of the United Kingdom, it means that the Home Secretary will have the capability to order providers to strip end-to-end encryption in the UK, if at the Home Secretary's discretion it's practicable. Note that the Home Secretary is not a cryptographer. The second major problem with this statute is that it would grant the UK the power to issue a national security notice.
That's another secret instrument, even more vaguely drawn than removing electronic protection, that would require operators (and operators is construed very broadly, to include things that aren't UK entities, like Google and Facebook and Apple) to carry out conduct, including the provision of services or facilities, which the British government considers necessary in the interests of national security. They don't have a First Amendment in the UK. They don't have the arguments that won the day in the Apple-FBI litigation. And this scares the living hell out of us. In Australia, the Australian Department of Defence (that's not a typo; that's just how they spell it down there) has already issued a regulation, the DSGL, the Defence and Strategic Goods List, that prohibits intangible supply of encryption technology. This is terrifying to us. Many ordinary teaching and research activities may well be subject to unclear export controls under it. We don't know how Australian courts are going to interpret it, but it is certainly plausible, on a plain reading of the law in Australia, that it is now illegal to teach encryption to students who aren't Australian citizens in Australia. That's horrifying. Other countries are doing crazy things as well. China passed an anti-terror law last year, the final version of which says, in the best translation I could find, that companies shall provide technical interfaces, decryption, and other technical support. End-to-end crypto is not legal in China, period, to mangle Jennifer Granick's phrase from earlier. Okay, now I'll turn back to the U.S. Thanks, Obama. In October of last year, the president said we will not, for now, call for legislation requiring companies to decode messages for law enforcement. Okay, there's a problem there. Can you spot it? I bolded it for you.
A month later, the National Security Council issued a secret decision memo, thankfully leaked to Bloomberg, who published it, that said they were going to identify laws that needed to be changed to deal with "going dark." So "for now" lasted a month. Around the same time, we saw people like the Director of National Intelligence start thinking about what would be necessary to change the political climate in the United States in order to get those laws changed. And of course in March 2016, at South by Southwest, the president sat down to talk about crypto. And that's what we got. We went from "not now" to "if we don't, we're fetishizing our phones." Bob Litt, the general counsel at the Office of the Director of National Intelligence, one of the chief lawyers in the security apparatus of the United States, said that the encryption debate could turn in the event of a terrorist attack or criminal event where strong encryption can be shown to have hindered law enforcement. And that's what we got in San Bernardino in December. I'm not even going to ask for a show of hands; I hope you're all familiar with what happened in San Bernardino and its aftermath. What was this case really about? The FBI wants the ability, and I'm actually paraphrasing what Jim Comey said before a hearing in the United States House of Representatives under oath, to mandate that companies turn our devices into tools of surveillance. It wasn't about this one phone. If the question in San Bernardino had been limited to "should the FBI be able to unlock a single terrorist's phone," I'm comfortable with saying yes. The answer to that question is: yeah, the FBI should. But that's not what the case was about.
We saw from the leaked National Security Council memo, and from statements by people like Bob Litt, that they were just waiting for a terrorist attack or criminal event to turn the public tide. That's what the Apple-FBI case was about. It was about whether the FBI, or the Department of Justice, or the U.S. government can compel a company to change its practices. The only reason, I would submit to you, that the FBI pursued the case in the way that they did was to set a legal precedent that would give them the ability to demand that U.S. tech companies stop providing end-to-end encryption or secure device storage. And they saw it as a win-win. The FBI thought that even if they lost the court fight in San Bernardino, they'd be able to take that loss to Congress and ask for a fix. There was also a very similar case in Brooklyn, in front of a magistrate judge named Judge Orenstein. That case was about an iOS 7 device, I think, so it was probably unlockable. And they got into it as well. But the FBI's ask, in both of those cases, in both San Bernardino and Brooklyn, was ill-considered for three reasons. First, legally: what the FBI was asking for represented a fundamental shift in the way the All Writs Act is interpreted. I have, in other contexts, gone a lot deeper into the All Writs Act and what it is; I'm not going to for this audience, you'd find it super boring. But in any case, it has never been used to compel an American company to sabotage its own products. The All Writs Act was passed in 1789, and it was certainly available to police back in the Joseph Bramah days, the days of that first unbreakable safe. Brinks and Wells Fargo were never compelled to create a master key to their safes. That is something American courts have never done. Second, technically, the ask was flawed.
As I said earlier, we don't know how to build secure systems, and the fact that the FBI was considering mandating that Apple undermine the security of an already not perfectly secure device was crazy. Crazy not just to the left-wing radicals at EFF, but to the several dozen companies that submitted amicus briefs in support of Apple's position in San Bernardino. And finally, the FBI's ask was flawed for policy reasons. There's no way an FBI back door would stay an FBI back door. The Russians, the Chinese, the Brazilians, the Turks, the French, the Germans, you name it, are going to want the same access. And the only reason Apple has been able to say no to the Chinese, to the Russians, to the Brazilians is that they don't give it to the FBI. Once that changes, the calculus all around the world changes very quickly. And that would be crippling not just to tech, but to American business generally. There are other litigations happening around the country, at least we think there are. Wiretap Act litigation may be ongoing. In March, Matt Apuzzo wrote a story in the New York Times about an order directed at WhatsApp, a United States federal court order directing WhatsApp to do something. We're not exactly sure what. We don't know which court it's in front of. We don't know if the litigation is ongoing. If I had to guess, I'd say it probably isn't ongoing right now. But who knows? There may be FISA court orders. The FISA court, the Foreign Intelligence Surveillance Court, sits in the basement of a federal courthouse in Washington, D.C., and meets literally in a Faraday cage to issue its secret orders. Litigation before the FISA court, generally speaking, is one-sided. The government stands in front of the judge alone and is unchallenged. That is changing a little bit: there's now an amicus provision, and an order directed at a company might be contested.
So far as we know, only one provider has ever contested a FISA court order, and that was Yahoo in 2007 and 2008. We didn't learn about that, of course, until 2013 or 2014. 2014, I think. But we're in the middle of a FOIA case to get access to any decryption orders that might exist at the FISA court. One of the nice things that happened last year, a minor win for us: the USA Freedom Act was passed by Congress and signed into law by the president. One section of USA Freedom says the government has to declassify significant FISA court opinions. Of course, it doesn't really define what "significant" is, and it's not clear whether it's retroactive, so we sued. And we're suing to get a hold of that. Oh, remember when I said they were just waiting for a big terrorist or criminal something to update the law? That happened in San Bernardino. And we got the Burr-Feinstein bill. Luckily the Burr-Feinstein bill seems to be dead right now. But it would have required providers of just about everything to decrypt on demand. It carried civil and criminal penalties. It would have applied not just to communications, not just to storage, but also to licensing, which means it would have included app stores. If Burr-Feinstein had passed in its original form, it would have required Apple and Google to censor the App Store and the Play Store to make sure nothing in them had crypto. And not just end-to-end encryption but full disk encryption would have been included. Actually, if you read the Burr-Feinstein legislation literally, if you take it to its extreme, it would have outlawed general-purpose computing. That's just a hint of how out of touch the drafters of this legislation are. Okay, 2016. What are we looking at? There could be a key escrow mandate. That's certainly something that China and India seem comfortable with.
I don't think it's going to happen in the States, for a couple of reasons, all of which are enumerated in the Keys Under Doormats paper. The Burr-Feinstein bill may be redrafted and reintroduced. It definitely won't pass in its current form, because, as I said, read literally it outlaws general-purpose computing, and even Congress isn't that dumb. I mean, maybe they might be. But a law that says "we don't care how you do it, just make plaintext available" is certainly plausible to me. That's the route the UK seems to be taking. The Investigatory Powers Bill is, as I said earlier, in front of the House of Lords. If the House of Lords passes it, it will become the Investigatory Powers Act, and that will be the end of end-to-end encryption in the UK. I gave a talk at Real World Crypto in January and made a lot of predictions, very few of which came true. I didn't anticipate the All Writs Act litigation at all. But I made a prediction that the government is going to focus on defaults, not primitives. And I think that's right. I think that's still right. The government knows. They're not stupid. They know there's no way of keeping strong crypto out of the hands of people who are really determined to get it. But there is a way of keeping strong crypto out of the hands of everyone who just walks into the Apple Store and buys an iPhone: they can force companies to change the defaults. We've seen a couple of states try it. In California and New York, a pair of nearly identical bills were introduced at the beginning of the year that would have made it a crime to sell a smartphone with secure device storage by default. They didn't even really try to make it impossible to install full disk encryption. California tried a little bit, but not very hard.
They really just care about the defaults. They know they're not going to get the terrorists. They know they're not going to get organized crime. They know they're not going to get the pedophiles. They're going to get us. They're going to get ordinary Americans. And that's what these two bills were about. What's likely in 2016? Informal pressure. One of the things I do, along with my colleagues at EFF, is represent developers and sometimes small companies who get a visit from their three-letter-agency friends. So I kind of know what this looks like. The FBI will request a meeting. They'll come down and sit in your office and say, you know, it would be really great if you gave us a back door into your stuff. And if you don't, blood will be on your hands. And they'll show you pictures of terrorists using your product. That happens. So they don't necessarily need to force you. They can just pressure you real hard. I don't think any ban we could possibly see in the United States is going to hit free and open source software. I don't think it's possible. We have the First Amendment here. They're not dumb enough to try. Well, Dianne Feinstein is dumb enough to try, but I don't think that's actually going to pass. Two slides ago I said it's about defaults, not primitives. I don't think we're going to see bans on primitives. I don't think we're going to see algorithms targeted. I think we're going to see defaults targeted. We might see a CALEA-like mandate. CALEA is the Communications Assistance for Law Enforcement Act, passed in 1994, which requires telephone companies, plain landline and mobile, to have wiretap capabilities. This is a relatively decent possibility.
A mandate like that would apply to providers doing business in the United States: as a condition of selling something, they must maintain the ability to turn over plaintext. That's only going to be tough on Apple and Google, maybe the App Store, and it's not going to touch GitHub or your pet free and open source crypto project. And of course countries around the world may continue to do dumb things. Kazakhstan appears to want everyone to install the government's certificate into their trust store so it can man-in-the-middle all of their SSL. They unpublished that requirement, so it's not clear how serious they were, but they were certainly thinking about it. China already has. It's not going to work any better this time than it did the last. Right? Last time, all you needed to do was put the code on a t-shirt and walk through an airport. Information doesn't give a crap about borders. These aren't centrifuges. These aren't Scud missiles. These aren't nerve gas precursors. You can't stop code at the border. We live in a world with strong cryptography, and there's nothing the US government, or any other government around the world, can possibly do to change that fact. We have Tor. We have GPG. We have Signal. And we're beginning to have real, accessible tools to evade censorship. WhatsApp is used every month by 1.1 billion people around the world, with strong crypto. That's amazing. So what's to be done? What if you're a developer staring down the barrel of an order, or a request, or a demand, or an NSL, or the NSA comes and sits in your office and says blood will be on your hands? Email info@eff.org and we'll help. What if you're just a regular person wanting to fight back a little against the surveillance state? We've got a site for that. In seven languages, we will show you how to install Signal or WhatsApp on your phone.
We'll show you how to turn on full-disk encryption on any device you might have that supports it. We'll help you with threat modeling. SSD, our Surveillance Self-Defense guide, is awesome, and you should definitely go there. And what if you just have some questions? Ask them. That's it. Do we have a couple of minutes? Yes, we have a couple of minutes. There's a mic at the front; line up if you want to ask a question or two.

Audience: Hi. How do you feel about the Democratic platform? Did you read the tiny little section, and I mean tiny, on cybersecurity?

Cardozo: Unfortunately, EFF is a 501(c)(3) nonprofit and we can't get involved in election politics.

Audience: Yeah, because they use that weird language about the false choice between privacy and security.

Cardozo: Well, I can tell you what I think about privacy and security. You can't have one without the other. There is no tension between privacy and security. We need them both.

Audience: Hey, pretty good. I keep hearing that there's no such thing as perfect security. Would you say Bitcoin has perfect security, barring, you know, the unlikely quantum thing?

Cardozo: I have no idea. Luckily EFF is big enough at this point that I can trust other people to think about cryptocurrency, and I can think about crypto without currency. So send an email to info@eff.org.

Audience: Okay, thanks. So the normal political argument for weakening crypto is that we need to catch the terrorists, plus, if you have nothing to hide, why should you care? But we have an increasing number of terrorist events, at least hitting Western media. If each terrorist event convinces a certain portion of the public to care less about privacy, where does that put us? Are we doomed?

Cardozo: I sure hope not. If I were a pessimist there'd be no reason for me to get up every day and go into work at EFF. I have to be an optimist on this.
As I said earlier, there's nothing anyone can do to keep strong crypto out of the hands of someone determined to use it. As for the "I have nothing to hide, why should I care" argument, that's something we hear a lot at EFF, from policymakers and from regular people. My response is that it's not about you. It's literally not about you; it's about everybody else. I don't have anything to say, and yet I benefit from freedom of speech, because other people's speech benefits me. The robust exchange of ideas benefits me. Privacy is the same. I benefit from you having privacy, because privacy is a prerequisite for social change. Privacy is a prerequisite for democracy. We couldn't have had a civil rights movement or a gay rights movement in the United States without privacy; you can't organize in public. If you're an LGBT teen in Saudi Arabia, you need privacy. And I benefit from privacy being available around the world.

Moderator: I'm sorry, that's all the time we have. If you want to, point somewhere else where you can take more questions, if you have the time.

Cardozo: I'm going to be in the contest area. I have a two-and-a-half-hour booth shift, so if you want to meet me at the EFF booth in like five minutes, that's where I'm headed, and I can continue answering your questions there.

Moderator: And while you go over to the contest area to talk to him, you can, at the same time, get a mohawk and donate to the EFF. I haven't seen enough mohawks this year. Thanks, everybody.