that EFF has done more of or longer or more consistently than making sure that you have the right to secure your communications when you go online. We've been standing up for your right to be secure in your communications since, well, honestly, before there was a World Wide Web. I've kind of lost count. I'm not sure if we're in crypto wars part three or part four; it kind of depends on how you count. But however you count it, as more and more people are living and sharing and operating online, especially during this pandemic, we need real security more than ever. So it's puzzling, as well as, I would say, pretty infuriating, to have the same dumb conversation about cryptography in 2020 that we did when I started working on this issue in 1993. So EFF helped free encryption from government control in the first place, and we're gonna stand up and do it again. But let's talk about where we are today, and to do so we have an all-star panel. First, from EFF's board of directors, we have Bruce Schneier, author of Applied Cryptography, renowned cryptographer, public interest technologist of the first order, as well as an author and a teacher, and I'm not just saying that because he's on EFF's board. And also with us is Andrew Crocker, senior staff attorney at EFF, and Erica Portnoy, senior staff technologist at EFF. And between them, Erica and Andrew, with help from our legislative team and our activism teams, are really spearheading EFF's work to protect cryptography right now. So I wanna get right to it. We're gathering questions and comments on Twitch. So put them there and the team will feed them to us. We'll get to them. I wanna start with some introductory questions, but if you've got things to say, make your voice heard; that's what the internet's for. And we'll try to answer your questions and talk about where we are and how we got there. But I feel like our house is on fire right now, that there is something moving right now that's incredibly important and incredibly dangerous. So rather than start with the history, let's start with where we are right now. Andrew, can you tell us what's going on in Congress right now and why we should be concerned? Sure, so we have not one but two terrible encryption bills in Congress right now. It's kind of, you know, terrible, and then super-extra-strength, only-with-a-prescription terrible. The first really terrible one is the EARN IT Act, which I'm sure many people watching have heard of. This was introduced back in March and then recently passed out of the Senate committee just last week, I think it was. What EARN IT would do, or originally what EARN IT would have done, was to create a commission spearheaded by the Attorney General himself, Bill Barr, to come up with best practices for fighting the very serious problem of child sexual abuse material online. And the idea of EARN IT, the reason it's called EARN IT, is that services would have to follow these best practices in order to earn immunity from lawsuits under Section 230, a law that's near and dear to our hearts, of course. Never mind, of course, that federal authorities could already hold internet services liable for distributing this kind of material and that they have a duty to report it when they find out about it. Instead, these best practices probably would have gone much further with Bill Barr at the helm.
He's been a very prominent critic of encryption, referring to us as absolutists and, in fact, throwing a pot shot at my dear colleague Erica after a comment she made in the New York Times last summer. So we were very concerned about this original version of EARN IT, that these best practices would require services that offer end-to-end encryption to backdoor their services. And we mounted, I think, a very successful campaign against the EARN IT Act. We had a massive, massive letter-writing campaign and AMAs on Reddit and so forth, which resulted in some changes to EARN IT just last week, as I mentioned. So there were two major changes that were introduced in the Senate committee. One is that the earn-it part is no longer there; there's nothing to earn. In fact, services would just lose their immunity under Section 230. They could be held liable not just for knowingly distributing this CSAM, as it's called, child sexual abuse material, but also under any new state law; they could be held criminally or civilly liable for any sort of behavior that the state deems to be in violation of the law, not just under federal law, but under laws that could be drafted by any of the 50 states. And so it's very easy to imagine that one of these jurisdictions could decide that encryption is a problem, and that offering end-to-end encryption, for example, is in some way facilitating the distribution of this material, and a service like Facebook Messenger or any others that offer encryption could be hauled into court in any of these jurisdictions and be forced to justify their use of encryption. There was an amendment offered by Senator Leahy, which passed along with this new version of EARN IT, which would prevent services from being held liable because of their use of end-to-end encryption. But there are a lot of worrisome loopholes there. One is that we've heard a lot about client-side scanning, and my colleague Erica can talk more about that if you like, the idea being that you could be forced to have the content that you share over end-to-end encrypted messaging scanned by the service on the device, before it leaves your device and is encrypted. That seems to be outside of the immunity that Senator Leahy's amendment introduces. And also we have just this open-ended risk that any state can come up with a new way of holding someone liable, not because of their use of encryption, but because of a side effect of the use of encryption. And so we're really worried about the liability that that could create in 50-plus jurisdictions. It's all the states plus the District of Columbia, Puerto Rico and so forth. So that's terrible bill number one. Terrible bill number two, which I'll spend a little less time on, is the Lawful Access to Encrypted Data Act, or LAED, I guess some people are calling it. And this is just full-frontal, encryption is bad, we're going to outlaw it in every way. It would allow the government to impose a freestanding requirement on device manufacturers, operating system creators, and operators of end-to-end encrypted messaging services to backdoor their products. Whenever they were served with what is called an assistance capability directive, I've got to get that right, they could be forced to render any encrypted data in an understandable form. And the only defense they have is that it is technically impossible to make any change to their service.
So not just that they have engineered their service in a way that makes it impossible to decrypt the data, but that they cannot re-engineer their service in any way to make it accessible in plain text. And it allows the government to introduce secret evidence in support of its assistance capability directives. It's really just every bad idea crammed into one. No fig leaves; this just puts crosshairs on encryption. So we're worried about both of those. As I mentioned, EARN IT has already advanced out of committee, so that's the one that's moving, and we're doing everything we can to stop it in its tracks. And EFF has an action item. One of the great birthday gifts you can give us is to go ahead, sign, tell your member of Congress, tell people that this isn't gonna stand, that trying to outlaw math is a pretty bad idea in any situation, but it's especially bad here because it's gonna affect you. So Erica, why can't we just nerd our way out of this? I mean, you know, why don't you brilliant technologists just nerd harder and get us out of this? And would that work if we tried to do that? What would be the gain and what would be the loss? Yeah, I mean, the short answer is that it absolutely would not work, by any means. The answer to why really lies in the technical details, without going too deeply into it. Basically, there's no single thing called a backdoor in the field of cryptography. So when we talk about building a backdoor, what we're really talking about is an entire field of different approaches that try to weaken the encryption in some way that the particular researcher in question hopes won't make the system too infeasible or too weak to deploy in practice. So of course it follows, from an information-theoretic perspective, that you're gonna necessarily have some loosening of constraints and information leakage when compared to unadulterated encryption, which is to say that any sort of backdoor is necessarily going to weaken encryption in the first place. And if we were to go into the details of why exactly that is for all the different approaches, it's kind of a long story. But there's a typical pattern that comes up over and over again, because it's not like nobody has tried to do this in practice before. People are constantly trying to find that holy grail of strong encryption on the one hand and access for law enforcement on the other hand. But when they do, it usually goes something like this. Someone says, why can't we just fix that? Why can't we just nerd harder? And so they'll take months and they'll try something. They'll maybe implement one version of it, read a bit more theory, do some calculations, repeat this all a few dozen times. Then, after all of that, the realization hits. Even if you do massive batch queries for optimization, shard your database into little bits, and accept hours of latency, it's still too expensive for most computations to get results from computations over encrypted data and still preserve sufficient levels of privacy when you're operating at any meaningful scale. And if you're not trying to do some sort of approach that would be more computationally expensive, then you are immediately going to be giving up on something that people consider to be one of the core components of encryption in the first place. Because when we talk about encryption, we're not really just talking about specifically AES or RSA or anything like that.
We're talking about encryption as a system where no one but you and your intended recipients can access or otherwise infer the contents of your data. And once you start picking apart any of the pieces of the encryption system, you immediately begin to lose that promise that the provider is making. If we were okay trusting our providers with whatever they feel like doing and whatever weird things their code is doing on the backend, we wouldn't need encryption in the first place. The point of encryption is to remove that need for trust and yet still be able to have secure communications. So even if you hear someone, like a professional, say something like, this might be possible, what they usually mean in practice is, here's an architecture that might be technically feasible to build, but there's no way it could ever be run at scale and still be sufficiently privacy-preserving. And of course, that's not even starting to go into the international governance aspect of things, where nothing is going to stop someone with criminal intent from just using an international product that's not required to have the backdoor in the first place. So you start to get into the policy perspective as well. And if you look at the practical teeth of any of the technical solutions that have been proposed so far, and also consider it from a general theoretic perspective, it's really just not something that's feasible anywhere outside of dreams. Outside of dreams, nightmares really, for me, but I totally understand. So Bruce, how did we get here? Like, why does the government hate cryptography so much? And what's the history here that has led to this being such a problem that just doesn't go away? I think there's a security versus security debate that's been going on since the early 90s. I mean, on the one hand, there's security in the fact that we have secure communications and devices. And there's security in the fact that the police can investigate crimes. And that's something that people have wrestled with. And if you are the police, what you think about is solving crimes, and that is your perspective. So you're gonna choose a solution that has less secure communications and more security for law enforcement. And that's what happened in the early 90s. We had the Clipper Chip. I pulled out my book, it's about 25 years old, and I write about it. This is 1993, Cindy was there, we were there. And then it was secure telephones. You look at crypto war number, I think it's two or three, which is the iPhone versus the FBI, and that's encrypted storage. And now we're talking about messaging platforms. And so you're looking at what's more important, right? The ability to break into these things, which has value in solving crimes even though it creates insecurity for a lot of other reasons, or do you want these things secure, which harms crime solving but gives security in other ways? Both then in the 90s and certainly today, you have to decide that we must adopt a defense-dominant strategy. As long as these devices are in the hands of every single lawmaker and world leader and judge and police officer and nuclear power plant operator, CEO, voting official, it's really important that we secure these communications and this storage, right? And that has to win. And if the FBI and other law enforcement need to learn other forensic techniques because they can't get at the phone, that's an overall security win for us.
But that's always been the battle, security versus security. And I think it's the myopic view of someone like the FBI or the Justice Department that only has them look at the crime prevention aspects and not the broader aspects of the problem, which, you know, all of our legislators have to look at. Yeah. So Andrew, I know you have been tracking a lot whether it's really true that the FBI has a hard time solving crimes because of encryption. And I think that they tell a story about how hard it would be to solve crimes. I actually think you prevent crimes with encryption, because if they can't get your data in the first place, they can't do it. But how trustworthy is the government when they say that this is a real problem for them and that bad guys are getting away? Well, without weighing in on how trustworthy they are, I would say that they've had a very hard time coming up with reliable figures and anything more than anecdotes. So, you know, Bruce mentioned the San Bernardino case. We were all told in public that this was a ticking time bomb, that the FBI needed to get into the phone seized from one of the shooters in that case to find out if there were any other co-conspirators or things like that. And the only way in, of course, was to compel Apple to assist, to create a custom version of iOS to get into it. And then on the eve of the hearing, I had in fact flown down to go to this hearing, I was in the airport, we found out that the FBI, lo and behold, had another resource at its disposal and, with the help of an outside party, had gotten into the phone. That led to an Inspector General report from the DOJ that found that the FBI, even as then FBI Director James Comey had been testifying in Congress saying that this was a necessity, had had within its components access to this information and the ability to get in. Even after that, we had then FBI Director Christopher Wray testifying in Congress about the number of phones that are inaccessible to them every year. He had a figure of almost 8,000 that year, I think it was 2017. Seemed like a big number. No data forthcoming on it. The Washington Post published a big expose. It turned out that had been miscounted. It was something more like a third of that 8,000, and it's not clear what inaccessible meant. Does that mean inaccessible for all time? Inaccessible in a short period of time? What had been tried on those phones? We just don't know. My colleagues and I submitted a Freedom of Information Act request about that after the fact and really have failed to learn anything. Of course, the government, not at all ashamed about this, has moved on to new talking points about CSAM, as I mentioned before, and never really addressed the misstatements and inaccuracies in the wake of San Bernardino. So we can draw our own conclusions about how trustworthy that is. Erica, I hear a lot about client-side scanning. Can you talk just a little bit about it? People hear about it, and I think they hear it sometimes from our friends, as a solution here. Is that our way forward? Yeah, this is another one of those really tempting solutions that I was talking about earlier. So the idea of client-side scanning is that instead of sending the information in plain text, you will have your messaging system be completely encrypted.
But instead of doing any scanning on the server side, where the provider has access to the data, the scanning will all be done on the client side, that is, your phone or your computer or whatever you're using to send the message in the first place, either before you send the message over the wire and it gets encrypted, or after receiving the message once it's already been decrypted. So you don't touch the encryption; you kind of put it around the edges of that. Unfortunately, it's just another case where the problem all happens in the specifics of how you go about implementing something like that. Where to even start with the problems here? Well, I think the first place to start is that the current state of the field is pretty, let's say, in progress. You have a couple of different implementations of scanning in the first place, and there are maybe one or two that I know of that have actually been implemented on the client side. But when you're doing that, you have to change around some things. So for example, some of the versions of it just put a machine learning model completely onto the client. But then you have questions about how to update the model, and where are you getting training data for that? But, you know, that's even without talking about all the details here, and that's not even going into the homomorphic encryption versions, which are, as a whole, completely not ready for prime-time use. Once you start talking about doing the scanning at all, if you assume that the provider, when they know about something, is required to report it, and they find a hit for that on the client side, then they'll be required to report it. Well, are you assuming that this system is 100% accurate, or would there sometimes be false positives? And if you assume that there will sometimes be false positives, then you're saying that anytime one of these potentially mistaken searches finds a hit on the client, it'll automatically be required to send the contents of that message to the provider in plain text, thereby completely bypassing the encryption in the first place and making it as if you didn't even bother sending it encrypted. Basically, what's the point of sending it encrypted if you have this separate system over here that says, okay, sometimes we're just gonna not encrypt it, for fun. Not for fun, obviously, but, I mean, maybe a machine learning algorithm's concept of fun, where it finds the wrong thing in the first place. And anyone who says that there's a 100% accurate machine learning system has never written a machine learning system in the first place. And so that's just a couple of the problems there so far. Another giant problem with this is that once you build a system for this, it pretty much takes as an input a set of images or text. It's not a CSAM-specific system by any means. You could put any images that you want to have censored in there in the first place. You've just built and implemented and deployed a general-purpose censorship system which can decrypt and report on any messages that someone sends, obviously not something that we want to be putting into our devices, because it would be extremely easy to move from there to talking about potential terrorist content or misinformation. And of course, as we know, whoever controls the meanings of those things is whoever is in power, and the ones in power aren't necessarily the ones that we want to be giving even more power.
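To make the mechanics Erica describes concrete, here is a minimal sketch in Python. The end-to-end encryption uses the real PyNaCl library; the BLOCKLIST_HASHES, matches_blocklist, and report_to_provider names are hypothetical stand-ins for whatever matching and reporting a provider might be required to do. The point is architectural: the cipher itself is never touched, but any hit, including a false positive, ships the plaintext to the provider before encryption, outside the end-to-end channel.

```python
# Minimal sketch: client-side scanning sits outside the encryption but still
# defeats its promise. Names marked "hypothetical" are illustrative only.
import hashlib
from nacl.public import PrivateKey, Box  # real PyNaCl end-to-end primitives

# Hypothetical provider-supplied fingerprints of content to be flagged.
BLOCKLIST_HASHES = {hashlib.sha256(b"banned image bytes").hexdigest()}

def matches_blocklist(plaintext: bytes) -> bool:
    # Stand-in for a perceptual-hash or ML matcher; real ones have false positives.
    return hashlib.sha256(plaintext).hexdigest() in BLOCKLIST_HASHES

def report_to_provider(plaintext: bytes) -> None:
    # Stand-in for the mandated report: the plaintext leaves the E2E channel entirely.
    print("reported to provider:", plaintext)

def send_message(plaintext: bytes, my_key: PrivateKey, their_public_key) -> bytes:
    # The scan runs on the device, before encryption ever happens.
    if matches_blocklist(plaintext):
        report_to_provider(plaintext)  # one false positive is enough to leak the message
    # The encryption below is unchanged, but it no longer guarantees that only
    # the intended recipient can read what was sent.
    return bytes(Box(my_key, their_public_key).encrypt(plaintext))

# Example: Alice sends to Bob; only Bob can decrypt the ciphertext itself.
alice, bob = PrivateKey.generate(), PrivateKey.generate()
ciphertext = send_message(b"hello", alice, bob.public_key)
assert Box(bob, alice.public_key).decrypt(ciphertext) == b"hello"
```

Nothing in the sketch weakens the cipher or the key exchange; swapping in a different blocklist is all it takes to turn the same machinery into the general-purpose censorship system described above.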
If you're considering terrorist content to be, for example, saying Black Lives Matter, it's not that far-fetched that you can imagine an administration that says anytime you're talking about activism or protesting, that automatically counts as the sort of content that would need to be decrypted. And then once that stuff starts being decrypted automatically, what do you even have your end-to-end encryption system for? Client-side scanning is something that, if you're really in the weeds of all the technical details of all the different ways you could build a backdoor terribly, sounds a little more tempting, but we must not be tempted by it. It's not any better than any of the other solutions. That's right, that's right. A dictator's dream that we should not build. So our audience has a question for Bruce. Bruce, do you think Beyond Fear is still an accurate perspective on the relationship between privacy and security? I think it's not bad. I think I did a better job in this book, which is like a decade later. So I would go here first before I would go back to Beyond Fear. You're assuming I remember what I wrote back then. And really, I think the way to think of all of these lawful access provisions is, we're trying to put a vulnerability in your device, in your system, that the police can use. So this is a vulnerability. It's something against your interest. Maybe you're a citizen, you're a dissident, you're a criminal; it's something you don't want that is gonna work when the police want it. So now I have to build a tech system that only operates when someone with the correct morals and the correct legal piece of paper nearby operates it. And that's actually not something technology can do. I can't build tech that works differently in the presence of a certain legal piece of paper. It's tech, it just works. So now I have to build a technical wrapper to try to make sure that only the good guys have access to it. And that just doesn't work. So I can build a secure system, or I can build an insecure system and try to make sure the insecurity is only in the right hands. And that's why it isn't a matter of nerding harder. It's a fundamentally different way of nerding. And again, this gets back to: if we believe defense has to dominate our strategy, then we build these things securely. It's more important that our critical systems, national, corporate, personal, international, are secure than it is to have this one sliver of a tool that law enforcement can use. Yeah, I often say that if the police went around your neighborhood and said, look, we believe there's a thief who's breaking in, so why don't you leave your front door open so we can make sure that you're not the thief and we can catch you if you are the thief, most people would think that that was not right. But that seems to be the tenor of what law enforcement would like us to do with our digital houses. And I think that we should look askance at them in the same way, but obviously I'm a little biased. Andrew, we had a question about the Wyden bill. What do we think of the senator's bill that was meant to counter the EARN IT Act? Yeah, so to my knowledge, we haven't taken a strong position on it. My understanding is that what the bill would do is relatively well-intentioned. Those of us who are against EARN IT sort of point to all of the things that could be done to stop the very real problem of CSAM that that bill does not do. The Wyden bill, however, would provide a lot more funding to the FBI and to the others that fight it.
And we've seen the FBI, in the pursuit of this very real problem, overreach and engage in very problematic surveillance that trenches on Fourth Amendment rights, such as using so-called lawful hacking techniques, using informants at Best Buy to scan people's computers for illegal material, and so forth. So we're a little bit suspicious of writing a blank check for fighting this sort of content. Great, thank you. Bruce, you wanted to talk a little bit about how we tie this to what's going on in Hong Kong right now. So Hong Kong's interesting, right? So there's now a new law there that gives the state broad security powers of censorship, of surveillance, and of compelling companies to turn over evidence of what in China is banned political speech and other crimes. And I think this really goes directly to why strong encryption is important. If these companies have backdoored systems, they will have to provide that backdoor access to the Chinese government, just as they provide it to the US government. They can't play favorites. The tech is there, they're capable of doing it, they'll have to provide it. And only if we have secure systems can we ensure that dissidents around the world can speak freely. And I actually pulled a quote from James Baker. He used to be the FBI General Counsel. He wrote this in 2019. And he said, it is time for government authorities, including law enforcement, to embrace encryption because it is one of the few mechanisms that the United States and its allies can use to more effectively protect themselves from existential cybersecurity threats, particularly from China. So here we are a year and a half later and he's right. Encryption is how we can help defend Hong Kong dissidents. And backdoors are how we abandon them. Thanks. Somebody asks, do we think that situations like a global pandemic make it easier to just smuggle in legislation that limits our privacy? What do you think, gang? I certainly feel a little like that right now. I keep going back to Andrew, but Andrew, I know you're in the thick of this, but so is Erica, so either of you. Oh, you're all too polite. You got muted, how did that happen? Sorry, so Erica, you'll have to speak for Andrew. Yeah, sure. Well, let me put on my not-a-lawyer hat here. Oh, he's unmuted, great, perfect. That's my fault, I muted to stop my son from photobombing the live stream. He can't have strong encryption too? He certainly does. I mean, that's sort of my answer, that whether it's being snuck in or not, they couldn't have picked a worse time. We're all stuck in our homes or out in the streets protesting. We're using Zoom, we're relying on our phones, and it couldn't be more out of touch with what people need. I think Bruce was touching on this a minute ago. We should all be able to count on strong encryption, and whether it's using the pandemic as a cover or not, it's exactly the wrong way to go. Yeah, we always talk about necessary and proportionate when we talk about surveillance. And we're living in a world where a lot more things seem necessary and the proportionality is different. So you can easily see things being snuck in that aren't related, because it's an exceptional time. And I do worry about that. They have also been talking about this act for months. It's not like the EARN IT Act just popped up since the start of the pandemic. You can see the gears of this turning maybe even as far back as last summer, when some meetings started to be convened.
So I don't think that the timing of it was specifically done with the pandemic in mind, but we certainly do have to keep our eye on it. Yeah, I think that that's right. I think that the train, we managed to slow it down a little bit, but it was already moving before this. That's why we have to be strong and consistent. I mean, one of the things that happens when you're fighting against a bill that a senator, a powerful senator, is pushing is that they have a lot more tools and control over the timing of things than we do. And so we always have to be on guard. I always say, people ask, what is EFF doing? And I say, well, EFF's on patrol, right? Like, we don't know when things are gonna break, and so we try to stay here and be ready whenever things go. Encryption's a game of whack-a-mole. You have to hit every one down. Yeah. Let's see. We have a couple of questions about what's the balance between law enforcement's needs and our needs. And I think we've answered it a little bit, but do folks have other things to say about that? Yeah, I think we need law enforcement to have good digital forensics. There's a lot more to crime solving than getting the encrypted data. We're living in a surveillance society. There's a huge amount of surveillance data. And I think law enforcement is just very poorly educated, from the FBI especially, down to the local level. So going to the phone is sort of what they think of as the solution. And if they had more sophisticated investigative tools, we could have all the security we wanted for our data and conversations, and they would still be able to have all the crime-solving capability they needed. I actually don't think there's a conflict here if they actually had the tech that they needed. I think that there's truth to that. I also think that they have a lot of tech already. I mean, I would not put it past them; just because you have nine of the cards doesn't mean you don't also want the 10th. And I think Peter Swire was right that we're living in a golden age of government surveillance. The number of ways that law enforcement has access to evidence that they did not used to have, evidence that didn't even used to exist, much less that they didn't have, is pretty tremendous. The list is pretty long. So I'm in favor of getting good tools in the hands of government if they're not tools that undermine our rights. But I also just wouldn't assume that just because the government wants more, that means that they're missing something, as opposed to wanting that 10th card, not just the nine before it. But again, I have 20 years of this bias built up. Let's see, do we have any other questions coming in? Someone asked about how cryptography can be used in the context of contact tracing. And Erica, I don't know if you've been involved, I think you have been involved in some of these issues. I think that it's a bit of a red herring, frankly, and I'm wondering if that's true, if I'm right about that. You know, I really haven't been looking at the contact tracing stuff much. Some of our other technologists are more focused on those issues. That's because I initially saw some comments from epidemiologists and contact tracers who were talking about the tried-and-true methods that have worked over years and years of use, and that putting a tracking chip in people's phones to try to detect when they talk to each other, finding a technical solution for this, is not going to be the way that we know is going to get the best results.
So I haven't really looked into it much more than that, other than to basically say, let's listen to what the epidemiologists have to say as to whether we need to build contact tracing tools in the first place, much less whether or not we should be putting cryptography on top of them. We do have some resources on our website where some of our technologists, Bennett Cyphers and others, have dug into the actual specs of the Google-Apple API and put it through its paces. So maybe one of the moderators could throw that into the chat. Thanks. So this one's for Andrew, and it's from our board member, Brian Behlendorf. He says, if EARN IT passes, or something similar, do we have a legal challenge and will we do it? We'll always try, right? That's what we as the lawyers on this team like to do. I think I maybe would be a bad lawyer if I gave away the store and said exactly what we would do. But some colleagues and I wrote an analysis of the original version of the EARN IT Act where we talked about ways in which it violated both the First and Fourth Amendments. Obviously, it's been amended since then, so we would have to do a new analysis. With the new EARN IT, where it sort of puts it all to the states, we would have to see what the states do, what kind of backward laws they pass that would hold services liable for just deploying encryption. If that is in fact the strategy they take, I could certainly see First and Fourth Amendment attacks on those as well. The general idea being, if you, the government, dangle immunity or other benefits in exchange for censoring content on your platform, that's potentially an unconstitutional condition, that you're encouraging censorship. Oh, there we go, you're back. Oh, okay, and then the Fourth Amendment component is, if the government is encouraging private actors to scan through their services to look for illegal content, that's sort of dragooning those private actors and making them agents of the government engaging in warrantless surveillance. And that's well established as an unconstitutional practice. So those are two ways in which we would look at bringing a challenge, of course. Great. Let's see. We had a question about whether we think 256-bit AES is still good enough encryption. Yes. I like one-word answers, they're good. And we also had a question about Apple and its commitment to privacy. I would say that we don't like to comment on specific companies, but I will say that, you know, Apple certainly took a risk and stood with its users in standing up for strong encryption on its devices, and having harangued Apple and all of the other tech companies to offer strong encryption and stand with their users when they were attacked, we were happy to stand with Apple when they did the right thing. I don't know if others have things to say about that. I mean, you know, with all of these tech companies, well, many of them, privacy is a big and complicated question. And so we'll stand with them when they stand with their users, and we'll attack them when they don't. And so, you know, it's hard to say across the board what we think, because what we think may depend on what they're doing as a whole, especially when you've got a big company like Apple that's in a bunch of different spaces. But certainly Apple has stood with its users in the FBI case, and we're happy to stand with any tech company that's gonna stand up for your rights. We would like them to encrypt their cloud backups.
Yes, we would like them to encrypt their cloud backups. And have stronger protections for differential privacy. Yep. But these are relatively small deltas compared to some other things. But honestly, you know, if they want the list, we'll provide it. Erica, I mean Andrew, did you have something? I was gonna mention, on the COVID API in particular, which I'm not sure was the focus of the question, one of the things that we've called for for any contact tracing app is auditability and open-sourcing of the protocol or the app. So we don't have to trust that Apple is living up to its privacy commitments; interested outsiders can get at it and make sure that they truly are. Yep. We have a couple of interesting questions here, but they're just not for this panel, about what about AI and machine learning and what about surveillance cameras. There are plenty of other people at EFF who have plenty of thoughts about this, but if I'm not asking your question, it's because, well, Bruce could probably talk about anything, but I'm trying to talk about the crypto-related things that I know Erica and Andrew have expertise on. We have one question about whether there is a possibility to preserve strong encryption in the face of potential quantum computing decryption. Maybe Bruce or Erica could talk about that. So I'll do that quickly. Yes, actually, I did an essay on this. It was titled Cryptography After the Aliens Land. Search that and my name and you'll find it. The short answer is that quantum computing is not the end of cryptography. For symmetric keys, you double the key length, that's easy, and we already have a process for quantum-resistant public key algorithms. Even if those all fall, you're not losing security, right? I mean, you can have an entire key infrastructure without public key at all and it works just fine. So it'll be different. It won't be a disaster. We are well ahead in cryptography compared to where the technology is in quantum computing. So the short answer is don't worry too much. There are a lot of other things to worry about first. The only thing I'd add there is to say that Grover's algorithm is just that 2x improvement. It's not the magical science fiction version of quantum computing where it tries all the answers at once. No, that's not actually how it works. It's a whole different system of mathematics and things like that. Really, don't worry about it. We may need to change some of the ways we implement some things, but we're already constantly doing that. We deprecated SHA-1. We can deprecate SHA-256 if we ever need to. Great. All right, well, we have time for kind of a last word from everybody, and I did have, I wanna just remind everybody, we do have an action going. Does anybody know what Sonia's gonna say? I think it was the action alert against the EARN IT Act. That was gonna be my last word, so I hope that's what she was gonna say. Call your senators. I think you should start, because she's not here. Yeah, a little happier. So let's start with Bruce and we'll go backwards around. So, you know you were, I think, offline for a bunch of that? No? Oh, I said such beautiful things. So, last words from everybody. I will cede mine to you to say it again. No, no, we'll go around. I'll do it at the end. So this is an issue that isn't going away, simply because if you're law enforcement, you want more and more authority. You wanna see everybody's hands and make sure you know where everybody is and what they're saying.
And this is an issue that has to be solved above law enforcement. The need for security, for national security, trumps the need for law enforcement. And we have to accept that and implement that. And it's getting to the point where this is dangerous. It's not about data. It's about things. It's about life and property. Good security in our communications and our devices matters. 5G is not about you watching Netflix faster. It's about things talking to things behind your back. If you don't have strong encryption there, those things can hurt people. So this matters more than ever now. Great. Andrew? So I agree with Bruce that this is not going away, and the theme of the history lesson of this session is that this is the same argument over and over again. So, not to be a downer, but I think we can expect that to continue. And I think, as a result, we should embrace what a report last year from the Carnegie Endowment called the absolutist position, which is EFF's position: no backdoors, never, not one, not ever. And we should all embrace that; that is the absolutist position and we will continue to hold it. We should resist any kind of false compromise, for the reasons that Erica talked about. Nerding harder is not a compromise. Erica? On a very similar note, this one's for the cryptographers and technologists out there who might be tempted to try to come up with a system that lets both sides be happy. Like, don't try to do it, because even if at the end you explain all the nuances and problems with it, some government official is gonna look at it and say, this is perfect, let's implement it now. And even though you had eight paragraphs on why this actually won't work in practice, they'll say, look, if you don't agree with this, you're just being absolutist. So don't help them make their arguments. When we say it's impossible, we mean it's impossible in practice, and help us spread that message. I think that's a really important note. We have seen efforts by law enforcement to get well-meaning techs to try to solve this problem. And what they really want is just a talking point, in front of Congress or whoever they're talking to, that it can be solved. And we've seen some very well-meaning people, I think, being really misused in the political debate because they were game enough to try. And so don't be that person, don't fall for that. They're not interested in your technical prowess. They're interested in a talking point, and that's not the right place to be. But I wanna finish with what I tried to say when I was frozen, thank you, tech. Stopping EARN IT, killing this bill, killing the EARN IT Act and all of its sons and daughters, is the most important thing on EFF's agenda right now. It's front and center. I know that sounded a little darker than I meant it to. But we need to kill this bill once and for all. It is time for law enforcement to use all of the other tools that they have at their disposal and leave us our security. We need to have our security. As Bruce points out, this isn't a game. This is how we're organizing. This is how we're deciding who's gonna be our next governors. It's how we're gonna protect people against human rights abuses all around the world. It's not a game, and law enforcement has plenty of tools at their disposal. They need to leave us our security. So if you do one thing to honor us, well, you should do two things.
One is you should join EFF, so that the next time Andrew or Erica or I stand up in front of people, we can say we represent 100,000 people who care about digital security and give us money to do this, not the 30,000 we have now. 30,000 is great, I love all you people, but we gotta go bigger if we're gonna continue to do this work. And then the second thing you should do is let your member of Congress know that the EARN IT Act is a bad idea, and that you understand math and you understand standing up for strong encryption. So thanks, everybody. Thank you for this glorious panel, and thanks to the EFF staff, Hannah for running this, and everyone else. We're just kicking this off. It's gonna get more fun from here. This was fun. It's gonna get funner. So stick with us. Thanks. Bye, all.