Specifically, programmers and hardware manufacturers are getting more and more pressure from law enforcement to put backdoors in their products, and Kurt here is general counsel at EFF. In 2007 he was named one of the attorneys of the year by California Lawyer magazine. He will tell us more about this problem with an overview of today's issues. Thanks.

All right. Thank you. Thank you, everybody. Good afternoon. Welcome to CCC. I'm glad to see so many new people here today. What we're going to talk about today is the fight for encryption in 2016. Let's see, we're going to get the slides up here. There we go. So, yeah, my name is Kurt Opsahl. I'm the Deputy Executive Director and General Counsel of the Electronic Frontier Foundation, which is a non-profit organization dedicated to defending your rights online. Thank you. It sounds like there are a few people familiar with us in the audience.

So, let's start out with a bit of an overview of how things have gone over the last year in the fight for encryption. There is some good news: first of all, strong encryption, both end-to-end communications encryption and device encryption, remains legal in most jurisdictions today. And there are more deployments than ever; these things are rolling out all over the world. We'll talk about that a little bit more. The bad news is that governments are still at it. They're still trying to find ways to weaken encryption, to get plaintext access to encrypted materials, and they're trying to do this by pressuring companies as well as pushing forward some laws; a couple of bad laws have passed. And then probably the worst aspect is some of the ways of getting around encryption: blocking technologies, trying to make it so people can't use strong encryption by blocking it at the network level, and placing malware on devices to get around encryption by attacking the endpoints.
In some cases, governments have resorted to arresting individuals associated with encrypted communication tools in order to really put pressure on those companies. And before we get into the nuts and bolts of things, I just wanted to take a moment to provide an overview of the encryption debate and how it has been going.

As an initial matter, why do we love encryption? I think many people here are already convinced, but just to go over some of the reasons: encryption protects our data and our infrastructure. It helps ensure both privacy and security. But at the same time, you have governments who are concerned that they are, quote, "going dark." This is their euphemism for having less ability to access materials. They say it hinders their ability to conduct law enforcement and national security efforts.

Over the last year, we've had some good shifts in the debate. One of the ones I wanted to highlight was moving from a discussion of privacy versus security to a discussion of security versus security; that is to say, recognizing that encryption provides security in and of itself. This is particularly good news because oftentimes, when a debate is framed as privacy versus security, or privacy versus safety, it is privacy that loses out. That framing was a rhetorical device used by people who wanted to weaken encryption and get more access to encrypted communications. But people have come forward, talked to legislators, talked to policy makers, and pushed a shift towards understanding encryption as a benefit to security, one that can be weighed against the government's concerns about reducing its access. And there's also been a recognition among some policy makers that weakening encryption does have severe consequences. This is not a cost-free improvement to law enforcement's abilities.
And there has been at least some recognition of a core tenet of my organization, that code is speech: there are First Amendment free expression implications that come from regulating people's ability to use cryptography, and there are also enhancements to freedom of expression that come from having that encryption, because encryption and anonymity help enable freedom of expression. Thank you.

So before we begin, I want to take another brief segue to put some perspective on things. One of the things the government talks about is that they're "going dark," that this is an unprecedented inability of government to get into places it can be locked out of. And I wanted to go back a little bit in time, to the 1700s, when a locksmith named Joseph Bramah created an effectively uncrackable lock. He put the lock out there with what we'd now call an early bug bounty program on it: 200 guineas to anybody who could pick his lock. He had it hanging up in his shop so people could come along and give it a whirl. And it took 67 years for that to happen, and 51 hours of work for the lock picker to actually make the breakthrough. During those decades, there was something that provided pretty good security, that made it very difficult for government, even with a warrant, to get past the lock. Now, you could have implementation flaws; this is kind of the same thing with crypto today. The safe built around this lock might have weak metal, or hinges in the wrong place. But nevertheless, like crypto today, it provided strong security, and society survived through those 67 years. So this is not quite as unprecedented as they would like you to believe.

All right, so let's turn first of all to the big thing that happened this last year: a very big public showdown between Apple and the FBI. The FBI for many years has been seeking access to smartphones. They recognize that smartphones are an incredible window into people's lives.
And they wanted access to that window. They wanted to look and see what people have been doing, and they were trying to do this through the courts, with court orders. There were two key cases that helped frame this debate. One was a case in Brooklyn, New York, and the other, brought a little bit later, was in San Bernardino, California.

The first case, the Brooklyn case, was a relatively routine case. They were trying to get into the phone of an alleged meth dealer, sort of a small-time local dealer. They weren't able to get into the phone, and they wanted the evidence there. They had a lot of additional evidence to be able to convict the guy, but they wanted a little bit more. So they submitted an application for an order to get Apple to help them get into the phone, for which they relied upon something called the All Writs Act; we'll discuss that more in just a second. It was not unusual for the FBI to go to the court to ask for this kind of access. It was a little bit unusual to ask for a third party, an unrelated party, to assist in getting access to the phone. And the court did something very unusual: it asked for more briefing. Oftentimes these arguments are made by the government directly to the court without anybody else weighing in. But in this case, the court said, well, I don't know if this argument really works, and I would like some additional briefing on that. So Apple filed a brief, and EFF and the ACLU filed a brief, and we attempted to explain to the court why we didn't believe that the All Writs Act provided the authority the government sought.

The All Writs Act is kind of a catch-all law. It is actually one of the oldest laws in the US, originating in 1789. You have the language up here. It's a little bit convoluted, but it covers all writs that are "necessary or appropriate," basically, to allow the court to do its job.
So if the court had the power to do something, then it could issue a writ in order to enforce that power. When it was written, obviously, nobody was thinking about things like smartphones. It wasn't thinking along any of these lines. It was just a basic tool, and it's pretty much the fallback position: if you have nothing else, you can always go to the All Writs Act and see if that will fly.

While that was pending, a new case came up in February of 2016, the California iPhone case. This was out in San Bernardino, California, where there had been a horrific terrorist attack. Two attackers, one of them an employee of the San Bernardino County health department, went to his office holiday party and opened fire, killing 14 people before fleeing and eventually dying in a shootout with police. It was a devastating attack and really made a lot of news. A couple of months afterward, the FBI decided they wanted access to an iPhone that actually belonged to the county, San Bernardino County, but had been in use by one of the attackers. He had left it in a car, a black Lexus, so the case was actually styled In re Search Warrant of a Black Lexus. They submitted an application to the court, and that very same day the court turned around and issued the order to Apple. The court signed off on the government's proposed order, without any modifications, and issued it the same day. And it was a fairly lengthy order, so this may suggest that not a whole lot of deep thought went into whether this was proper.

Under that government-requested order, Apple would have to bypass the auto-erase function, where after a certain number of failed attempts to unlock the phone, it erases itself. The government wanted to be able to submit passcodes electronically, and they wanted no delay between passcode attempts.
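To make concrete what those protections do, here is a minimal sketch in Python of an escalating attempt delay plus an auto-erase counter. This is purely illustrative; the attempt limit and delay schedule are my assumptions, not Apple's actual firmware logic.

```python
# Illustrative sketch (NOT Apple's implementation) of the two protections
# the order asked Apple to disable: an escalating delay between passcode
# attempts, and auto-erase after too many failures.

MAX_ATTEMPTS = 10  # assumed limit before the device wipes itself
# Assumed delay schedule in seconds, indexed by number of prior failures.
DELAYS = [0, 0, 0, 0, 60, 300, 900, 3600, 3600, 3600]


class PasscodeGuard:
    def __init__(self, correct_passcode: str):
        self._passcode = correct_passcode
        self._failures = 0
        self._wiped = False

    def delay_before_next_attempt(self) -> int:
        """Seconds the device would wait before accepting another try."""
        return DELAYS[min(self._failures, len(DELAYS) - 1)]

    def try_unlock(self, guess: str) -> bool:
        if self._wiped:
            raise RuntimeError("device erased: encryption keys destroyed")
        if guess == self._passcode:
            self._failures = 0  # successful unlock resets the counter
            return True
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self._wiped = True  # auto-erase: discard the data keys
        return False
```

With a ten-attempt limit and hour-long delays, even a four-digit passcode with only 10,000 possibilities is out of reach; strip these protections out, as the order demanded, and electronically submitted guesses make brute force trivial.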
So basically, they were asking to remove the features that were designed to protect against brute-force attacks, so that they could brute-force the phone. Apple called this "GovtOS": they were being asked to make a new operating system, to be used just on this phone, to enable government access. And the court left open one thing: well, Apple, if you think this is unreasonably burdensome, you can challenge it.

So indeed, Apple did challenge it. They asked the court to reconsider its order, and somewhat unusually, Tim Cook, the CEO of Apple, wrote a big public letter about it. He considered this something that, first of all, they didn't have; but more importantly, they considered it too dangerous to create. Apple filed its brief, and there were many amicus briefs filed in this case, something like 40 or so. Mostly they came from civil liberties organizations and industry groups, mostly on the side of Apple, though there were a few in support of the government's position. One I wanted to highlight in particular came from the San Bernardino County District Attorney, the chief prosecutor of the local area. He said that they ought to be able to get access to the phone because it might contain a "lying dormant cyber pathogen," and he wanted to make sure that we had access to that. The logic of that was a little bit unclear, because if it was really that dangerous, maybe we shouldn't get access to it. But nevertheless, he thought that was a reason to get into the phone.

FBI Director Comey started out by saying that this was just about trying to get into one phone, that it wasn't about setting a precedent. But later, under questioning before the US Congress, he admitted that it was about precedent, and that they wanted to set this precedent so they could access more phones. And then he asked the question: if there are warrant-proof spaces, what does that mean? And what is the cost of that?
And this is a reminder of the time when we had, for 67 years, the uncrackable lock. We may have had warrant-proof spaces before, and we may have them again. It also fails to recognize that there's something fundamentally different about access to smartphones, because of how much of your life is on the phone. If somebody has access to that, they have more than just a little bit of evidence; they have a window into your soul. And protecting that is more important than ever.

So this became a major controversy. It became international news. There was a poll that went out during this time period, and it was about 50-50 on whether Apple should provide access or deny it. Now, that may just say that people were divided, but what's particularly notable is that the government was presenting this as a case where they needed the information about a terrorist attack, with all of the pressures associated with a national security case, saying this was vital for national security, and still they weren't even able to get a majority on their side. I think this was a lot less support than the government was expecting when they brought this case. And both civil society and industry came together to help support Apple, understanding that this would be a precedent, and not just a precedent about accessing a phone, but a precedent about the government ordering you to make a new version of your software with security weaknesses in it.

So this was coming to a head, with a hearing scheduled in March of 2016. And then we heard from the Brooklyn judge. Now, that briefing had been going on since October of the previous year, and in the ordinary course of things, judges will take time to carefully consider; it might take a while.
But not very long after the news really hit about the California case, the Brooklyn judge issued his lengthy and fairly detailed opinion, concluding that Apple did not have to unlock the specific device, and that the All Writs Act did not provide the authority the FBI was seeking.

Then, moving toward the hearing, all of a sudden new news came out. The day before the hearing, the FBI said: well, we're exploring a way to get into the phone; we need a little bit of time to check this out; can we get a delay in the hearing? This came out the day before the hearing. A lot of people I know who were going to go down there had already departed for Southern California. I was actually just about to head off to the airport myself when the news came in, and I was able to save myself the trip. This was a very surprising last-minute development. And then a week later, the FBI reported that, yes, they had gotten access to the phone, and the hearing was canceled.

We got a few details about this: it was an exploit that cost well over a million dollars. This is calculated because Director Comey said that it cost more than his entire salary for the ten years that he was going to be FBI Director; people did a little math and figured out that that would be over a million dollars. And it was a hack that apparently works on the iPhone 5C and older devices. A key factor there is that the 5C doesn't have the Secure Enclave, and doesn't have the Touch ID feature, which requires the Secure Enclave. So the exploit was apparently something that would be defeated by the Secure Enclave, though we have very little detail. The FBI withdrew the case after the exploit worked, and there was no ruling by the judge on whether their power under the All Writs Act extended this far. And shortly thereafter, there was news in Brooklyn as well: the government had appealed the Brooklyn judge's order.
But then they withdrew that appeal, saying that they had somehow obtained the passcode. According to news reports, apparently the suspect remembered his code and provided it.

What this means is that right now we don't have binding precedent on the question of whether the government has this power. There is the one decision out of Brooklyn that remains on the books, but that was a decision issued by the lowest level of judge, a magistrate judge; it is not binding on any other judge. If the government had appealed and lost, and the case went up the chain, then it would have become more and more of a binding precedent. But that is not so. We're still waiting for the next shoe to drop, to bring these arguments out again and see if we can get some precedent.

Also, the government didn't disclose to Apple how it got access. Apple was seeking that information, and had actually suggested that if the case had continued, they would use it as a vehicle to try to obtain that information. This also brought up, in some people's minds, the Vulnerabilities Equities Process. This is a process that came to light through a Freedom of Information Act open-government request, and it is a process the executive branch is supposed to go through when it's deciding what to do with a vulnerability. If the government has a vulnerability, it weighs the equities of disclosure to the vendor versus exploiting that vulnerability: when should they disclose, and how do they balance the security harm from the availability of this vulnerability against the advantages of being able to exploit it? This would seem like something that fits perfectly within the Vulnerabilities Equities Process, and they should have used it here. But no. As it turns out, the FBI didn't buy the vulnerability itself; they bought a black-box exploit. So, in their view, they didn't have anything to disclose, and didn't need to go through the Vulnerabilities Equities Process.
So what did Apple do to respond? These are some of the goals Apple put forward; this comes from a presentation they gave this summer at the Black Hat security conference. They want to continue to use the Secure Enclave, tighten it up to limit the number of passcode attempts, take brute-forcing out of the picture, and make offline attacks difficult. Within the Secure Enclave there is a true random number generator, a hardware random number generator. They try to make it so that Apple doesn't know the number it produces, and then it gets entangled with the device's unique ID and the user's passcode. This means Apple has very little information it could give out that would help crack the phone.

They also put forward a bug bounty program, with $200,000 at the top, which is still a lot less than what the market apparently pays, but a very important step forward. Apple had been one of the last major companies without a bug bounty program, so I'm very glad they finally came around.

So what is the government going to do now? They don't want to have to rely on buying hacks. That is not to say they're opposed to it; in fact, there are many instances in which governments have either created or purchased exploits. Some governments around the world have bought from places like Hacking Team, and NSO Group sold an exploit to the UAE that was used against the phone of an opposition activist. These things are continuing to go on. Then there is Rule 41, a newly amended rule of US criminal procedure, which makes it easier for judges to issue orders allowing the government to use NITs, network investigative techniques, which is another euphemism for, basically, malware on people's endpoints. So governments are certainly willing to do that, but they would prefer to have the companies just provide the easy access.
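The key-entanglement idea can be sketched briefly. In this illustrative Python fragment (my own construction, not Apple's actual key hierarchy; the KDF choice and parameters are assumptions), a secret from the hardware random number generator is mixed with the passcode through a slow key-derivation function, so the data key can only ever be derived on the device holding that secret.

```python
import hashlib
import os

# Sketch of "entangling" a per-device hardware secret with the user's
# passcode. Illustrative only: the real Secure Enclave key hierarchy is
# more involved, and these parameters are assumptions.

DEVICE_UID = os.urandom(32)  # stands in for the fused, unreadable hardware key


def derive_data_key(passcode: str, iterations: int = 200_000) -> bytes:
    # The hardware secret acts as the salt: without it, captured
    # ciphertext cannot be attacked offline, and the vendor holds
    # nothing useful to hand over.
    return hashlib.pbkdf2_hmac(
        "sha256", passcode.encode(), DEVICE_UID, iterations
    )
```

Because the device secret never leaves the chip, every passcode guess has to run on the device itself, where attempt limits and delays apply.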
Not a backdoor, of course, but something like a "secure golden key." So what could go wrong? If you have the keys, and I guess these are the brass keys for a TSA lock, you can get into anything they open. In fact, if you have this photograph and a 3D printer, you probably could make these keys yourself. That's the problem in a nutshell: if you give access through a special method, you've got to make sure that special method doesn't get into the wrong hands.

So, turning to the politics of it. Beginning slightly over a year ago, there was an effort to push President Obama to take a stance in favor of strong crypto. There was a petition up at savecrypto.org with over 100,000 signatures, and his initial response was that, for now, the administration would not call for legislation. That's not a very strong response, but at least it's not the opposite. Then, later in 2016, Obama said we shouldn't have an "absolutist view" on this. What he meant is that people are saying you either have strong security, or you have a backdoor and weakened security; you can't have both a backdoor and strong security. And that, supposedly, is an absolutist view. I think this is very symptomatic of politicians looking at this as all about finding compromises: even if the technology doesn't permit a compromise, they treat it as a political question, not a mathematical question or a technology question, and insist that we need to find a middle ground. I think that is actually dangerous thinking. If it's absolutist to say that we need strong security, well, you might call me an absolutist, but I think it's more than that.

And then, on November 8th, we got a new president coming in. So how's that going to be? Well, Trump is not yet in office, but we can look at a few things to get an idea.
First of all, on the Apple iPhone controversy, Trump had a few statements. In the beginning, he was saying: who do they think they are? We have to open up this phone. Then, as the debate continued, he noted that he used both an iPhone and a Samsung, and said, well, we should boycott Apple if they don't hand over the information. Yet it's a question of how serious that really was. He still tweets from an iPhone. This is a picture of him doing a Reddit AMA after his boycott call, and that looks pretty much like a Mac there. So we can't really tell how serious this was.

We also have some additional clues from the nominations Trump is putting forward for key positions in his new government. The proposed new Attorney General, Jeff Sessions, has long been in favor of law enforcement access to phones; he felt that Tim Cook, the CEO of Apple, didn't really understand how serious this was. And the proposed new CIA chief, Mike Pompeo, wants to remove barriers to surveillance, and has been pretty suspicious of anybody who uses strong encryption: it could be a red flag just to use it, which is a pretty dangerous line of thought.

There were also some efforts in the legislative world, such as the Burr-Feinstein bill, from Senator Burr and Senator Feinstein. It was actually called the Compliance with Court Orders Act. They were trying to key off a rhetorical point being made during the Apple iPhone controversy: we're just asking people to comply with court orders; this can't be that unreasonable. But it was actually that unreasonable. It would have required providers to decrypt things on demand, on pain of severe penalties. It applied to communications and to storage. It even applied to app stores, so that if you were Apple or Google running an app store, all the apps for sale in that store would have to have weakened or backdoored crypto, and you would have to enforce that.
And it was more than just end-to-end and full-disk encryption; pretty much as drafted, it would have outlawed computers as we know them. It was a fairly terrible bill, but fortunately it didn't get a whole lot of traction.

The rest of Congress decided to do a little bit of looking into the issue, having committees examine it and issue reports. The House Homeland Security Committee made a big step forward by recognizing that it's more of a security versus security debate. They rejected the legislative fixes, most importantly the Burr-Feinstein bill we just talked about. The House Judiciary Committee recognized that there would be severe problems with weakening encryption. They still called for cooperation between technologists and law enforcement agencies. And, a little bit dangerously, they suggested that one solution would be compelled decryption by the users: rather than going to the companies and asking for a backdoor, have laws that would insist that users decrypt their material under criminal penalties. That is a dangerous idea for some other reasons, but at least in terms of keeping technology without backdoors available, the committee on the whole was headed in the right direction. And that's where it stands at the moment.

So I want to turn now to the United Kingdom and the Investigatory Powers Act. It was for a long time the Investigatory Powers Bill; it has now passed and been signed off. It's often called the Snoopers' Charter, because it is a broad expansion of surveillance power. It allows access to communications data by all sorts of agencies: the police, GCHQ, the Ministry of Defence. They would have access to internet connection records; internet service providers would have to store metadata about the communications you make, the websites you visit, what time you do it, all sorts of information, and that would be stored for up to 12 months.
But then the European Court of Justice said: nope, not going to do that. This is a very important ruling. The European Court of Justice felt that this went too far, that general and indiscriminate retention of communications data is illegal, and allowed only for targeted interception of traffic, justified when it is necessary to combat serious crime. So this was a very important pushback on the Snoopers' Charter. Now, for purposes of our talk today, it does not affect the portions that require backdoors; we'll go over those in just a second. And another important caveat is that soon the UK will be leaving the European Union, and maybe pulling out of the jurisdiction of the European Court of Justice. So this ruling may not be as powerful as it might have been, but it sets the stage for additional challenges, hopefully both to the data retention features and to the encryption features in the future.

So what does the Act say about encryption? Well, it says some pretty complicated things that don't really mention encryption by name. This is a quote from the Code of Practice which accompanied the legislation. It talks about things like a "technical capability notice," and says that you might have to provide a technical capability. It is interesting that it requires you to notify the government of new products and services in advance of their launch. So apparently you need to go get approval from the UK government before you launch anything that might have new encryption technology in it.

But what is this thing, this technical capability notice? The statute defines it a bit. It is something issued by the Secretary of State, better known as the Home Secretary, after the Home Secretary has looked at it thoroughly and considered whether it is practicable to comply with and whether it is proportionate, taking into account the technical feasibility and likely cost of complying.
These all sound like pretty good things for someone to weigh, but I'm not sure the Home Secretary is really the best-positioned person to weigh all those factors, and they may end up leaning towards allowing backdoors, towards letting these technical orders go out. The notices also come with an automatic gag order, so that somebody who receives one is not supposed to talk about it with anybody, which makes it hard to organize and fight back against them. And they can be given to persons outside the United Kingdom. So, in their view, anybody in the world could get one of these technical capability notices and be required to, well, required to do what? They might have obligations relating to, oh, what's this, the removal of electronic protection that the operator has applied. It's a backdoor. They've disguised it with a lot of wording, but in the end it's a pretty dangerous provision, both for people trying to do business in the UK and for those who might not even be doing business in the UK but might receive one of these under that authority and have to wonder: am I under their jurisdiction? Do I have any business there? So it's a pretty dangerous thing.

Elsewhere in the EU, things have been moving on a bit of a slower track. The EU justice ministers have been discussing the issue; the Justice and Home Affairs Council discussed it thoroughly and looked at different views. They spoke of the importance of a balance between individual rights to privacy and the needs of law enforcement agencies. So it's still under discussion, still under consideration, but it hasn't moved forward. And we had a really good report out of ENISA, the EU cybersecurity agency, the European Union Agency for Network and Information Security. They issued a report earlier this month which rejected backdoors: they saw the problems as outweighing the benefits.
And they recognized that it is very difficult to restrict technical innovation through legislation; even if you have the best possible platonic ideal of legislation, it's still only going to fit the technology as it was envisioned the day it was passed, and it will become more and more outdated over time. So it's a difficult approach to move forward on.

Elsewhere around the world: in April, compliance began with the Australian Defence and Strategic Goods List. This has a provision which prohibits the intangible supply of encryption technologies, and it has gotten a lot of people very worried that that expansive definition will cover not just actual military technologies, but might encompass such things as giving a talk at a computer conference.

In India, on the plus side, they had proposed a terrible encryption provision that would have required companies to retain plaintext for a period of time; they dropped that requirement and plan. But they have proposed something else which is potentially dangerous. They're asking the various phone manufacturers to add India's national biometric authentication system to their phones. This is a system widely used in India for authenticating people to receive government services, and they want to integrate it into the phones. This could open up security holes, having some government code on the phone. Apparently Google, Samsung, and Microsoft did meet with India about it, but Apple declined to go.

In Egypt, they started to block access to the Signal messenger, and this is actually going to continue to be somewhat interesting. After Egypt blocked that access, Signal released an update. The update uses something called domain fronting, which disguises Signal traffic to look like it's going to www.google.com. And this makes it much more difficult to block. I mean, you can still block it, but you'd also have to block all of Google. And this really ups the stakes for censorship.
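Domain fronting exploits the gap between what a network observer can see (the DNS query and the TLS SNI field) and what only the fronting provider sees after decryption (the HTTP Host header). A conceptual Python sketch of that split, with an assumed hostname standing in for the real backend:

```python
# Conceptual sketch of domain fronting. Hostnames are illustrative;
# "signal.example.org" is an assumed stand-in, not Signal's actual
# backend. The censor sees only the outer connection to the front
# domain; the real destination is named inside the encrypted channel.

FRONT_DOMAIN = "www.google.com"        # what the network observer sees
HIDDEN_SERVICE = "signal.example.org"  # assumed name for the real backend


def build_fronted_request(path: str) -> dict:
    """Return the visible vs. hidden parts of a domain-fronted request."""
    return {
        # DNS lookup and TLS SNI both name the front domain...
        "dns_query": FRONT_DOMAIN,
        "tls_sni": FRONT_DOMAIN,
        # ...while the Host header, readable only after TLS decryption
        # at the provider's edge, routes to the hidden service.
        "http_host_header": HIDDEN_SERVICE,
        "path": path,
    }
```

A censor filtering on DNS or SNI sees only traffic to www.google.com; to cut off the hidden service, they would have to block Google wholesale.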
Censors can't just target the one system; they would have to take out something which is used daily by millions of people. And that makes it harder for governments to block technology.

Also, some good news: the Netherlands came out very strongly in favor of encryption, counterbalancing some of the efforts to push back on it. The United Nations issued a report this year recognizing that encryption and anonymity are necessary for freedom of expression, and that encryption saves lives; without encryption, lives may be endangered.

This year has also been tremendous for the rollout of encryption technologies. WhatsApp has added end-to-end encryption by default to all of its chats and calls; that's over a billion monthly active users who get encryption without having to do much of anything. Facebook has added end-to-end encryption to Messenger, but not by default, so that's a half step; it needs to go further, because encryption by default is really the gold standard. Likewise, Google's Allo included encryption in its incognito mode, but again as something you have to purposefully select, so it's again a half step. And Signal's downloads have gone through the roof: they reported a 400% increase in daily downloads since November 8th.

And we've made tremendous progress in encryption on the web. The Let's Encrypt project is providing certificates to over 21 million websites. It is by some measures the largest certificate authority in the world, and it is free. This is a tremendous success. More than half of page loads in Firefox and Chrome now use HTTPS. You can see a chart there, I think from Firefox, showing HTTPS crossing the 50% mark over the course of the year on various operating systems. Android is the laggard, so hopefully Android can pick up steam. But nevertheless, it is a good positive trend.
And if you look at a different measure, based on time, two thirds of people's time is spent on secure websites. So what do we see looking forward in 2017? Well, we'll probably see more technical assistance laws. One of the things that policymakers have learned from the first crypto wars is that it's dangerous to actually propose a specific solution. When they came out with the Clipper chip in the 90s, it was quickly attacked, revealed to be vulnerable, and then disregarded as a good idea. So they've moved to a different model: rather than provide a target which could be attacked, they say the technology companies need to nerd harder and figure out how to give us the assistance so that we can get access. And they create laws, similar to what the Investigatory Powers Bill has tried to do, requiring technical assistance without any specifics of how that will be accomplished; it's just up to the companies to figure it out. There will also be a lot more public pressure, where there have been pushes for compromise, saying, well, you really don't want a bill like these bills that will require you to weaken encryption, so you should just go ahead and weaken it ahead of time to forestall the bills, which would be worse. They will also apply pressure whenever there is any big controversy, trying to highlight that encryption may have made it difficult for law enforcement. These pressures will continue to exist. And in some countries, where they are very upset about how people have been using these technologies, they'll continue to have blockages, like in Brazil, where they've blocked WhatsApp three times over the course of the year and arrested some of the executives, saying you have to give us the information, even though they know that it's technically impossible for them to give that information. These pressures will continue to exist. 
And we'll be seeing more attacks on the endpoint. Law enforcement is going to have to continue to work in a world where there is strong encryption, and the way around that is to get to the endpoint. So we'll see more and more use of malware, and more and more importance in people looking not just at making sure they're using encrypted tools, but that they're following good security advice to avoid being attacked and phished as best they can. Another important prediction: I think that free and open source software is here to stay, and that a lot of these laws and policy efforts are less effective against open source projects. There often aren't companies to put pressure upon, and if you attempt to legislate a requirement for a backdoor, it's going to be ineffective: even if somebody decided they had to put the backdoor into the open source code, when somebody compiles it, they could always comment out that section. So it's pretty ineffective to go after them. The real challenge for some of these software projects is in deployment: getting them out into the hands of billions of people, making them usable, making them part of people's daily lives. And an important thing for how we should move forward is that policymakers can be reached. We've seen that when some policymakers have taken the time to get experts' views, conduct hearings, and investigate the issue, then we start to see framings more like: it's really security versus security; weakening encryption can harm security; there are important interests at play here. And this is a positive step forward. It's fighting against a strong lobby. Law enforcement agencies are a very powerful lobby, and legislatures take them very seriously, but technologists' views can make a difference. So what can you do? Well, if you're a coder, include encryption, on by default, in any products that you have, wherever it needs to be. 
And also, work on usability; making it accessible for billions is a key point. For websites, encrypt all the things. Start using Certbot. Certbot is a program that works with Let's Encrypt and makes it easy to set up a cert on a website. Use Let's Encrypt. There's really no excuse anymore to have a website that doesn't have HTTPS. And for individuals, well, you can use encryption in your daily lives. We saw before that the nominated CIA director was suggesting that the use of strong encryption might be a red flag. Well, if everybody is using encryption all the time, it becomes less of a red flag. So try to incorporate encryption as much as possible, to make it less of a red flag that someone is using that technology. And then keep active. Pay attention to what's going on. Help defend encryption by talking to policymakers, signing petitions, paying attention, and being a participant. Thank you very much. Feel free to queue at the microphones over there if there are any questions. We've got already something on microphone one. Yeah, it doesn't work that well. Microphone one. Yeah, thanks, Kurt, for your great talk. It was really interesting to see how EFF and your colleagues are battling bad ideas to stifle encryption. I have one question that addresses the argument that if encryption is illegal, only the bad guys use encryption. Do you think this argument, which basically means it makes no sense to pass laws against encryption because those who want to break the law won't respect that law either, has gotten enough traction, for example, among lawmakers in Washington? Or have you heard of any conclusive counterargument to that line? Well, what I've seen is that, well, you raise a good point. First of all, there's the tautology: if encryption is illegal, then indeed anyone who uses it would be a criminal, because they'd be a criminal by virtue of using encryption. 
But I think that one of the things policymakers are really trying to do is get to the most widely deployed encryption. So they may recognize that there will be open source projects that people will be able to download, encryption made outside of their jurisdiction, that they won't be able to stop, and that the bad guys will be able to find and use those technologies. But they still want to make it a lot easier to get access to the things which are widely deployed, where there are billions of users. And I think one of the things you can infer from that is that it's not just about targeted decryption going after a known bad guy; they want to be able to have ready access to mass communications. And it ties in with some of the attempts to sort of predict who would be bad by looking at information before anything happens, which raises its own civil liberties concerns. Microphone 4? Your assumption is that we are living in a democracy, so the struggle between you guys and the government is going to be a healthy one. But the transition from a democracy to something authoritarian, like in Turkey, seems to be able to happen really fast. Do you have any plan B, anything for the case that something like that might happen? I mean, apart from Second Amendment rights and that kind of thing, do you have any plans for the case that a group of reactionary politicians slashes the rights of a democratic society? Yeah, I mean, this is one of the reasons why you want to have encryption widely available while you can: so that if things later move into an authoritarian mode, those tools are already widely deployed. For those who are living in authoritarian regimes, encryption can very directly help save their lives by protecting their information from being tracked and observed by authorities that might want to put opposition figures in jail for the mere act of opposing the government. 
The challenges there are dealing with things like the fact that using encryption might be seen as a red flag: if you get stopped by the police, they're going to want to get onto your phone, and they may use strong measures to try and get your passcode. So even if the device has the best encryption in the world, if they're going to beat you with a rubber hose until you give up the password, that isn't going to help. These are very challenging things, but I think the best thing you can do ahead of time is make it so that everybody is using encryption as much as possible, so that it becomes less suspicious that someone is using it, and have it tied into widely used products that governments would feel bad about blocking. That's why it's nice that there's encryption in things like Facebook Messenger and WhatsApp. And after a country has already turned authoritarian, it is those outside that country who are providing technologies in who should try and make sure that those technologies are effective and secure. We've got a question from IRC: what is your view on so-called warrant-proof devices? I'm not actually sure what's meant by this, but also whether such warrant-proof devices will remain legal in the future. Well, this is what Comey was referring to; he didn't want to have a world where there was something that was warrant-proof. And I'm in favor of having full disk encryption on phones, where the government can try, under their own powers, to get in. That's what happened, but they shouldn't be able to compel the provider to change its code. And framing it as whether something is warrant-proof is a rhetorical device that the government is using to try and set this up as a discussion about the rule of law, or whether warrants should be effective. But it's missing the larger policy issues. 
And so I guess, in some sense, when they talk about something as being warrant-proof, that may be a side effect of what happens when you have strong encryption, but it is not really getting to the heart of the policy debate. Hi. On the subject of rhetoric, it seems like in the past year or so, maybe more like two years, we've heard a lot about strong encryption versus weak encryption. And it seems like it's going to be more and more a tool used by those in power to tell us: well, the bad guys, they're the only people who need strong encryption. You, the common folk, the good people, you only need the normal encryption. Why would you go for the bad one? I think you've touched on that subject, but maybe you could tell us a bit more about it. Yeah. So, I mean, strong encryption is really what we need. And we've actually seen the terrible effects of this when there was a misguided attempt to have weak and strong encryption in the 90s, when there was export-grade encryption and domestic-grade encryption in the United States. Netscape Navigator had a weakened international version with only 56-bit keys. And that turned out to be an unwise policy move. They said, well, this will be good enough for the average person. And I think there are a couple of things to think about there. One is that we've seen it happen, and it failed. A second is that if you're trying to protect yourself now, you have to protect yourself from a wide variety of threats, and you're going to need strong encryption against those threats. It may be an authoritarian regime; it might be a computer criminal. The value of strong encryption is there for all these threats, and deliberately weak encryption, every time it has come out, has turned out to be far more of a disaster than the government has predicted. Thanks. Microphone 1. Hello. 
So we've seen that governments try again and again to pass legislation that weakens encryption. What would need to happen so that they can't keep trying to pass such legislation? And are we moving in that direction? Well, I think there's not much that's going to stop them from trying, because they're facing pressures from law enforcement. So I guess we'd have to remove that pressure, but that seems very difficult. I think the key is to try and convince policymakers, if they're going to pass anything at all, to pass something that encourages the development and use of encryption. I'm very heartened by the Dutch government's response, where they were strongly in favor of encryption, and by the EU's network and information security community also coming out strongly in favor of encryption. So get policymakers, ahead of time, to look at this as something that is beneficial to have, so there's less incentive to go and push for weakened encryption. I'm sorry to burst your bubble, but the Dutch passed a law last week that allows the police to hack into any hackable device and allows the government to buy backdoor software from companies. All right. Well, I tried to keep this talk as up to date as possible, but thank you for that information. And I think that is something in line with going after the endpoints: even if you do have strong encryption along the way, attacks on the endpoint are a common government solution to try and get around that difficulty. We've got one more question from IRC, which goes in a similar vein to this one. What can we do to stop our politicians from not just wrong actions, but symbolic politics, as in the wake of the Berlin attacks? Again, they were asking for more video surveillance, which clearly does not prevent attacks like this. What can we do on a political level to stop our politicians from trying to enact laws that are basically orthogonal to the problem? 
Yeah, this is a very common thing. Whenever there is an incident around the world, especially something like a terrorist attack, legislators feel a very strong desire to do something about it, and that something may not be directly related to the problem, but they will be able to go back to their constituents, the people who vote for them, and say, well, I did something. Part of the answer is educating the voters so that they are less fooled by this behavior, and getting people active in calling their representatives and telling them that they want to have strong encryption, that they don't want these kinds of measures. The other thing that can sometimes be effective is that legislators don't like to look stupid. So if they are doing something which is a technologically bad response to a given problem, and you can show how it is ineffective, that can sometimes help legislators understand that it was a bad move. This argument is often portrayed as between citizens and governments, and I'd like to propose another argument and ask what you think of it: foreign nation states are actually a far bigger threat than crime, and therefore states need encryption more than anyone else, and therefore they should get on the side of being pro-encryption because they need it too. Absolutely. I think that sometimes that is actually an argument that can work with legislators, where they're not so worried about the citizens directly, but they're interested in the balance of power between nation states. This has come up in some of the backdoor discussions: if you provide a backdoor to one government, let's say you think this is a great democratic government that only uses power wisely, what do you do when the other governments ask for that? Do you give them the same access? 
Some legislators will understand that there actually is a very important national security component to having widely available strong encryption. An example that comes to mind: there have been a lot of emails released from the Democratic National Committee in the United States, and maybe now, when they think back on it, they'd say maybe we should have encrypted that information, maybe we should have put up stronger resistance. It's very difficult to fight against a nation-state attacker, but at least you can make it a difficult job for them. Hi, thanks for your talk. I was quite curious to hear that there is actually an act in the United States that governs the use of vulnerabilities. Are there similar acts throughout the rest of the world? I think it would be amazing if we could enact a sea change like that elsewhere. So I should be clear: the Vulnerabilities Equities Process is not a legislative act; it is something that came from the executive. It was not commanded by the legislature, but rather done on its own authority by the executive branch, in part to mollify some of the critics who have said that they should be reporting more vulnerabilities. So it is something that one could put into a legislative process, to require governments to go through that balancing and make sure they do it, but I'm not aware of any legislation that has yet proposed that. Thank you. Thank you. All right. Thank you, everybody. Pleasure to be here.