Hello. Oh, can you? Wow. All right. Thank you all so much for being here. I know we're all feeling really well rested after last night's party, but don't worry either way: this is not a technical talk, though I'm sure you're going to learn something anyway. Can everyone hear me okay? Awesome.

Okay. So before I start, I just want to say two things. First, I want to offer a huge thanks to the NorthSec organizing team and all the volunteers here. They really make it look easy, but there's a tremendous amount of work that goes into pulling off something like this, so thank you. Second, just at the outset, I wanted to flag that while I have many fancy-sounding titles and I've worked with some of the best lawyers in Canada, I myself am not yet a lawyer, and I am not here to give you legal advice. So while I'm excited to hear your thoughts and questions after the talk, if something I say makes you think you might have a specific legal problem or a specific legal question, come see me afterwards and I'll refer you to a member of the bar.

Cool. So with that out of the way, this talk is called Law, Metaphor and the Encrypted Machine. At the core of what I want to share with you today is the argument that the language we use to describe complex technical problems, the words and stories we use to understand what technology is and what it does, profoundly shapes the law. And in the case of encryption, this debate over metaphor is arguably an important part of a much larger battle, one that's going to profoundly shape our lives. So I hope today that by making how the law thinks about this technology more transparent, we'll get closer to answering questions like: could the government force your company to build a backdoor into its software? Could a cop or a judge force you to decrypt your hard drive or hand over your password? And should they be able to do either of those things?

There are really four parts to this talk. First, I'm going to spend a little time explaining the legal problems we face around encryption technology. For some of you, these issues are going to be obvious; they're in some way part of the work you're already doing. For others, maybe you've never really thought about your work as having a moral or a political dimension, but I'm going to make the case that you probably should. Next, I want to talk a little about how the law works: how laws change, how we interpret them. I'll argue that metaphor, reasoning by analogy, is critical to how the law thinks. It's impossible to separate law from language, from the built-in biases of human reasoning, and from all the cognitive shortcuts we use to understand what's going on. That tendency is even more clear when we're talking about technology. The fact that we're only starting to understand the implications of metaphor in the encryption debate is a real problem. Third, I'm going to show you a few different models for how lawyers and judges think about encryption, and we'll work through what they might mean for us and for our rights. Finally, I want to talk a little about why this matters, and about what role we as technologists or designers or lawyers or citizens or troublemakers might want to play in this debate.

To keep things really simple today, I just want to focus on data at rest: a computer or a phone using full-disk encryption.
As you all probably know, encryption is the process of using cryptographic algorithms to make the substance of a message, the plaintext, incomprehensible by turning it into ciphertext using a key. Decryption is the process of using the secret key to return the ciphertext to the original plaintext. Again, as I'm sure most of you here already know, there's a huge inherent mathematical advantage in encrypting versus trying to break encryption. Bruce Schneier writes that it would take the most powerful supercomputer in the world a 46-digit number multiplied by the age of the universe to break a single AES-256 key by brute force. I'm a law student, so I haven't checked the math on that, but I'm going to trust Bruce. So while the government can and does attempt to exploit other vulnerabilities in networks and computers and people, there's just no realistic incentive to try to break encryption through brute force. To quote Edward Snowden: encryption works. Properly implemented strong crypto systems are one of the few things that you can rely on.

The problem, of course, is that everyone gets to rely on it. Encryption provides protection for government officials and banks and software companies and human rights activists, just as it does for child pornographers and terrorists and criminals. This poses huge problems for the law, whether in criminal investigations, commercial disputes, intelligence gathering, or national security. And at the outer edge, what encryption does is create spaces the government can't reach, potentially spaces the government can never reach. So, in the words of Phillip Rogaway, cryptography rearranges power: it configures who can do what, from what. This makes cryptography an inherently political tool.

So what happens when encryption is standing in the way of a police or anti-terrorism investigation? Where there might be evidence on an encrypted phone, like in the Apple and FBI case, or on an encrypted computer, like in the case of Lauri Love, the UK activist who has recently had a series of run-ins with the law? The obvious solution to these kinds of problems would seem to be to put a system in place that lets police officers or courts force individuals either to hand over the unencrypted contents of an encrypted device or to give up the secret that would let government authorities do it for themselves. This is compelled decryption. And that's essentially what's been done in the United Kingdom under a law called the Regulation of Investigatory Powers Act. RIPA gives all kinds of government officials, judges, border agents, the military, certain kinds of police, the power to compel decryption. Where someone refuses an order to decrypt, they can be put in prison, in some cases for up to five years. Obviously, this creates some pretty perverse incentives, not to mention the privacy and civil liberties issues. There are a lot of cases where it's preferable for someone to refuse a court order and face some jail time rather than hand over evidence of what could be a much more serious crime.

By contrast, in the US, one of the leading cases on the issue is a case called Boucher. We're all familiar with people pleading the Fifth on television, right? Yeah, okay. So that's your right against self-incrimination: the idea that the government can't force you to be a witness against yourself.
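Since so much of what follows turns on what a key actually is and does, it's worth seeing the round trip I described a moment ago spelled out in code. This is just a rough sketch in Python, using one common open-source library (the cryptography package, my illustrative choice; any modern cipher tells the same story), and the brute-force arithmetic uses generous, assumed numbers rather than Schneier's exact figures:

```python
# A rough sketch of the plaintext -> ciphertext -> plaintext round trip,
# using the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()            # the secret key
cipher = Fernet(key)

plaintext = b"attack at dawn"
ciphertext = cipher.encrypt(plaintext)  # incomprehensible without the key
recovered = cipher.decrypt(ciphertext)  # decryption reverses it, same key
assert recovered == plaintext

# Back-of-the-envelope check on the brute-force claim for a 256-bit key.
# The supercomputer speed here is an assumption, deliberately generous.
keyspace = 2 ** 256                     # possible 256-bit keys
guesses_per_second = 10 ** 18           # hypothetical machine
age_of_universe_s = 4.35e17             # ~13.8 billion years, in seconds
universes = keyspace / (guesses_per_second * age_of_universe_s)
print(f"{universes:.2e}")               # ~1e41 ages of the universe, on these numbers
```

That last number is really the whole story: nobody is brute-forcing a modern key. If the government wants the plaintext, it needs the key, or the cooperation of the person who holds it. Which brings us back to Boucher.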
In Boucher itself, the court ended up using the Fifth Amendment and accepting the argument that by forcing someone to hand over their key or to decrypt a machine, you're forcing them to self-incriminate. The court decided it on the basis that the process of surrendering your key is like testimony; we'll come back to that idea. And because of that, citizens in the US are protected in most cases from being forced to decrypt their devices under the Fifth Amendment. In Canada, our Supreme Court hasn't actually addressed this issue yet, but it's unlikely that a cop or a court here would have the power to make you decrypt a personal device either. This is because we have similar protections against self-incrimination in Canada through Section 11 of the Charter of Rights and Freedoms. Because compelled decryption requires the active participation of the person accused of a crime, it engages your constitutional rights. This right exists, at least to some degree, because we believe you ought to have a choice about whether or not you cooperate with authorities. So while our Supreme Court hasn't yet dealt with this question, there has been a pretty good case here in Quebec called R. v. Boudreau-Fontaine. In that case, our Court of Appeal decided that an order requiring a man to enter his laptop password as part of an investigation violated not only his right to silence, but also his right to be presumed innocent, his right not to be conscripted against himself, and his protections against self-incrimination. So the short story is that from a constitutional perspective, at least, forcing someone to hand over their encryption keys poses some serious problems. And even where that option is available, like in the UK, it's not really a viable endgame solution.

These protections, both the technical ones and the constitutional ones, are a source of endless frustration for law enforcement and intelligence. James Comey, the director of the FBI and one of the most vocal opponents of strong encryption, recently said at a Senate committee, where he was speaking about terrorism: there's no doubt that the use of encryption is part of terrorist tradecraft now, because they understand the problems we have getting court orders. Of course, terrorists use all kinds of things that the US government doesn't seem to have a huge problem with, like guns and cars. Guns killed over 13,000 people in the US last year. Even more people died in motor vehicle accidents. But James Comey isn't worried about cars and guns. It seems conspicuous to me to single out just one technology, one that in most cases is keeping us all a lot safer, when, for a little more freedom or convenience or efficiency, we're willing to accept so much more risk in other dimensions of our lives.

But this is all part of a larger narrative that the FBI and others call going dark. That is, they say, as more and more data becomes encrypted, law enforcement and intelligence are losing access to the material they need to do their jobs, at a critical cost to national security. Comey and other backers of this narrative have argued that courts should have the ability to compel decryption. And failing that, they've argued for the creation of backdoors in encryption technology. They want to encourage or, in some cases, force people like you to design your products in ways that would give government access to encrypted data without its owner's consent.
That was a big part of what was at stake in the Apple and FBI case. But there are at least two very convincing counterarguments against the proposal for encryption backdoors. The first is that many of the world's leading computer scientists have said, over and over again, that building backdoors fundamentally breaks the technology. They say it's a problem of putting keys under doormats: there's no way we know of to design a system that only the good guys can access, one that can only be used lawfully and not by our adversaries. A backdoor can potentially be exploited by anyone, and so it weakens the technology and its security for everyone. And of course, even if the government could force companies domestically to build systems it can access, the bad guys would just use gray-market software that hasn't been backdoored.

There's a second argument, too, one that strikes against the demand for encryption backdoors and against the idea that the government should be able to force you to decrypt your own devices. Rogaway, the cryptographer I mentioned earlier, calls the going dark narrative a brilliant discourse of fear: fear of crime, fear of losing our parents' protection, fear of the dark. Because this story, as spooky as it may be, isn't really an honest one. Thanks to Snowden, we know that nearly every part of our lives can in some way be monitored or captured by government surveillance, without any meaningful civilian oversight, without any transparency about how our information is collected, without any idea how long it's kept, who it's shared with, or what it can be used for. In reality, even with the widespread use of encryption, nothing is truly going dark. As Peter Swire has shown, even where law enforcement loses access to specific information in specific contexts, those losses are more than offset by massive gains in the form of new material that was never before available to law enforcement. We're actually living in what he calls a golden age of surveillance.

And look, I'm speaking to you even if you're one of the people, and I know you're in the audience, thinking: I've got a top-secret security clearance and I've seen things so threatening and so horrible that we need this kind of surveillance power, we need the ability to see inside the encrypted machine. Because even if you're okay with giving government that power, and I'm not, but maybe you are, I think people really underestimate how easily that power can be abused. Tools that we might be okay with using to hunt down terrorists or child predators quickly end up being used to go after copyright violations or parking tickets. This is what we call mission creep, or function creep. It's hard to take away these toys once government has them, and it's hard to say no to perfect enforcement once you have the technical capacity for it. Here's one of my favorite examples of that: an IMSI catcher was used to hunt down a guy who stole some chicken wings. But more than that, governments change. What's legal today might be illegal tomorrow. Every one of us has been a criminal, at least in some small way, at some point in our lives. And people working in security should be particularly worried, because judges and politicians are really bad at understanding what you do, which means that your work is especially easy to criminalize.
As Biella Coleman mentioned yesterday in her keynote, people using perfectly legal technology like Tor, a technology that, beyond its other practical uses, is literally keeping human rights activists alive, are routinely branded as suspects and criminals by government. Here in Canada, a 19-year-old straight-A student just pled guilty to a criminal charge because he embarrassed the government by showing that the Canada Revenue Agency was vulnerable to Heartbleed. This kid might have been reckless, but he was no criminal, and he had no intention to cause harm. He was doing security research, like all of you do, and now he's under house arrest and he's going to have a criminal record for the rest of his life.

I did a study last year where we looked at 30 different attempts from around the world, over the last 25 years, to write a bill of rights for the internet. We studied all of these projects that were trying to figure out how we ought to protect and entrench our freedoms in this new digital space, and people listed the kinds of rights you would expect, like freedom of expression and access to the network and privacy. But something that surprised me was that almost a third of the documents we looked at actually spelled out the right to use encryption specifically, as a right as fundamental as freedom of association. Then again, maybe it's not that surprising after all. The government's inability to compel disclosure of an encryption key, or to force a tech company to steal it for them, is the only thing standing in the way of the government's ability to secure total informational awareness. So what I'm trying to get at here is that terrorists and child predators aren't the only threats in the world worth protecting against. And if we're concerned, like I am, about government surveillance, encryption is a kind of final frontier, a possibility for space beyond the government's reach. Keeping it that way, I think, is a right worth fighting for. Because access to perfect information invariably leads to the possibility of perfect enforcement. And in the words of Yochai Benkler, imperfection is a core dimension of freedom.

So how does the law think about encryption? Well, to explain that, I'm going to talk about how the law thinks generally. Law is like code in lots of ways, but unlike code, it is not binary. Every word has multiple possible meanings. It's context-dependent and uncertain and shifting and interpretive. Nevertheless, it has its own internal logic. The first thing you need to know is that the law relies heavily on categories. Change in the law is often about shifting categories, shifting the boundaries of what's included and what's excluded. I'll give you an example. One of the most important categories in the law is the category of personhood, because persons have rights that non-persons don't have. Personhood, though, isn't some fixed or static or objective concept. After all, for a good chunk of the 20th century, the legal category of personhood let people shrug their shoulders and say: look, we'd love to give women the right to vote, but the law says only persons can vote, and the case law says that women aren't persons. So sorry. Slavery and segregation were both justified on the basis of the really sick assertion that black people were not persons, or maybe only three-fifths of a person.
The category of personhood is why anti-abortion activists are so fixated on the question of when a fetus becomes a person, and the fact that over time the law has decided that corporations are persons has had its own profound legal and political implications.

The second thing you need to know about the law is that it tends to think in metaphor. If you're a jurist or a lawyer, or maybe you just dated one once, you're probably going to recognize these two names on the screen: Fuller and Cardozo. Lon Fuller, the iconic legal philosopher, said that metaphor is the traditional device of persuasion: eliminate metaphor from the law and you've reduced its power to convince and convert. Justice Cardozo, one of the most famous American judges in history, warned that metaphors in the law are to be narrowly watched, for starting as devices to liberate thought, they often end by enslaving it. And the history of technology law is very much a history of metaphor. For example, the whole framework of telecommunications law was actually built on a legal model that was meant for railroads, all resting on a big abstraction about interstate commerce in the Constitution. So basically everything new is folded into the law by reference to what came before it.

At the same time, the law is notoriously bad at understanding new technology, and legal change is always going to lag behind technical change. The law is slow. Some people argue that this is actually part of the design of our legal system: we need to know what we're dealing with, where interests lie, what the risks are. We also know that the rate of technological change in a variety of sectors tends to track Moore's law very closely, at least for now. That means we can predict a doubling of technical capacity every 18 months or two years. By contrast, it routinely takes that long for a single case to get to trial in Canada, years more for it to reach the Supreme Court and become binding precedent, and just as long for Parliament to review an issue or amend a law. We also want to avoid legislating in the hypothetical, for all kinds of good practical reasons. With actual problems on the table, poverty and climate change and war, it's pretty difficult to justify spending time and resources on, say, legislating penalties for negligent time travelers or drafting traffic regulations for jetpacks. And speaking of time travelers and jetpacks, we're notoriously bad at predicting the future anyway. So when the law strives for proactivity, it often falls short.

All of these factors mean that the law has a degree of intrinsic reactivity to it, and that claim has been borne out again and again throughout history. Copyright law emerged from Gutenberg's machine, the printing press, and it took hundreds of years before there was any comprehensive law that addressed it. Nearly as soon as those laws were in place, we got cinema and photocopiers and tape recorders and VCRs and fax machines, and eventually the internet, and arguably copyright law has not yet caught up to that. So analogy and metaphor are essential because they help the law play catch-up. They let it fold in new technologies as they arise. But metaphors aren't value-neutral, and they shape precedent. And metaphors are everywhere in technology.
Take, for example, probably the most infamous failure of metaphor in technology: Ted Stevens' quote from the net neutrality debate that the internet is a series of tubes. But that comment isn't actually as out of left field as we might imagine, because that debate was rife with all kinds of metaphors, particularly about public infrastructure. Net neutrality was all about the information superhighway and fast lanes and slow lanes, and even traffic itself has a metaphorical dimension. And if the internet is like a highway, what exactly is traveling on the highway? What is data? In the copyright debate of the mid-2000s, we saw a lot of metaphors like this one. I'm sure the person who came up with this thought they were being really clever, but what they're really arguing is that intellectual property is exactly like physical property. That's a metaphor that record companies and the music industry really love, but it feels absurd to pretty much everyone else. We hear all kinds of other metaphors about data, particularly big data, whatever that means. Data as a natural resource is one that's been very popular in some circles, so you see language like data as an asset or a reserve, data as the new oil, raw data, data mining. Maciej Cegłowski, thinking in the context of privacy and surveillance, proposed a different metaphor that I really like: data as nuclear waste. He invites us to consider it, quote, not as a pristine resource but as a waste product, a bunch of radioactive, toxic sludge that we don't know how to handle. Again, as Biella explained in her keynote, there's an element of metaphor and category in how we treat digital activism too. Maybe you saw the CNN commentators arguing about whether a DDoS attack is like a sit-in or like a terrorist attack. And of course, there's the cloud, this ephemeral, distant, borderless thing that moves data away from physicality, away from tubes and highways, into the ether.

So the closer we look at the language we use to describe our digital lives, the more we realize that almost every word we use carries its own metaphorical baggage. I particularly like the nautical ones: surfing and torrents and leaks and pirates. But like highways and tubes, we have all kinds of structural imagery: portals, paywalls, firewalls, chat rooms. Even the idea of cyberspace has a dimension of physicality, of jurisdiction. Some metaphors, like the cloud, also have a natural dimension; web and bug and mouse and virus and shell and worm fit into that category too. There's actually a woman named Sue Thomas who wrote a whole book on this. And as someone mentioned to me the other day, a new one is cyberpathogen. Look, some of these are potentially coincidence or background noise, and not all of them are going to have profound legal implications. But some of them do. And where the technology is complex and novel, and where the metaphor obfuscates or misleads, we can end up with really serious consequences for the law.

So there are these three authors, Sonny, Suri and Lee, who coined the term metaphor vacuum, which is of course itself a metaphor. They give broadcasting as an example of a metaphor vacuum. Broadcasting had originally been conceived of as a point-to-point technology; people were essentially using it as a wireless telegraph.
But as people started to use it for broadcasting and mass communication, the law really struggled, particularly in copyright, to understand what it was. As a side note, the original meaning of broadcasting is itself a metaphor: the act or process of scattering seeds across a field.

So anyway, from my perspective, what we have is a kind of metaphor vacuum around encryption technology. Much of the way courts and politicians respond to the encryption problem seems to hinge on what they understand encryption to be. That is, the legal outcomes we arrive at tend to depend on the model we use to imagine the encrypted machine in a narrative sense. So I want to briefly canvass some of those metaphors, because they have implications both for whether a court can compel decryption and for the broader going dark debate.

The first thing you need to understand is that at the heart of a lot of these legal arguments is the idea that acts of encryption or decryption, because they normally use a passphrase stored in your mind, are testimonial, at least in the US and Canada. That is, to hand over your key is something like being forced to testify against yourself, to be a witness against yourself. And where there's testimony, the courts have accepted that you can be protected from compelled decryption on the basis that your rights against self-incrimination are engaged, under the Fifth Amendment in the US or Section 11 in Canada. The US case Boucher I mentioned earlier is interesting for that reason. In that case, it's not the contents of the encrypted drive that are protected; it's that the very act of turning over a document or providing the password to an encrypted hard drive implicitly communicates incriminating facts, including the fact that you can open the machine at all. For that reason, it's protected. The danger with this model is that you can end up with incoherent results when you focus too much on what form the key takes rather than on what rights are actually at stake. For example, there have been at least two cases I can think of in the US where individuals were forced to unlock an iPhone with a fingerprint, because it didn't have a passphrase. The logic is that fingerprints are biometric data, and therefore non-testimonial, and therefore you can be forced to decrypt the device. It's not totally clear whether this is the law in Canada, but it's bad law anywhere. Where your fingerprint is used instead of a passphrase, nothing has changed about the potential content stored on the device, nothing has changed about your expectation of privacy, nothing has changed about the mathematical process taking place, yet your rights have been profoundly affected.

So if decryption is speech, at least sometimes, what is the encrypted machine? The analogy that law enforcement and national security agencies really like is the idea of the computer as a locked box or a locked room. James Comey, our favorite FBI director, is always going on about locked closets, and he argued that Apple was selling cars with trunks that can never be opened and apartments that can never be entered. What you gain by this analogy is that it treats digital and physical evidence basically identically. It minimizes the testimonial aspect of disclosing a private key as much as possible; there's no speech involved in opening your car trunk, right?
It also rejects the idea that there's some qualitative difference in the kind or scope of information stored on digital devices compared to what might otherwise be physically discovered. Once we make that metaphor more transparent, it's clear that the going dark language about back doors and front doors is all part of the same framework. The idea of a combination safe might be a little bit better. It's been used in a couple of different court cases, because the combination is stored in your mind, so there's some testimonial dimension to it. But even by trying to introduce that aspect, you're still treating the device like a container, which falls short of the technical reality. Encrypting a hard drive doesn't create a barrier between the data and the outside world; it means instead that the information exists only as ciphertext, scrambled. There's no intelligible data hidden inside an encrypted machine. The data is transformed, not enclosed. I'll make that point concrete in a moment. In this light, all of these metaphors, box and safe and car trunk and closet, simply don't reflect the operation of the technology. And unlike a locked room or a combination safe, encryption can't be cracked or broken into or dynamited. It's practically unbreakable.

In an amicus brief a few years ago, the ACLU and the EFF proposed a model that gets around this. They argued that asking an individual to decrypt their machine is not like opening a box full of data; it is instead an order to explain or transform or translate it. They argued that a court can't make you transform information that's incomprehensible into information that's going to put you in jail, because that would go completely against the heart of what the Fifth Amendment is supposed to be about. To illustrate that principle, they used the metaphor of translation for decryption and an indecipherable book for the encrypted machine. The idea of translating an indecipherable book is a lot closer to the actual mathematical process involved in decryption, and it more clearly illustrates the testimonial aspect of disclosing a private key. At the same time, in that metaphor the idea of testimony, of speech, of translation, is doing a lot of work, and so I worry about the kinds of problems we start to run into if the key is something non-testimonial, like a fingerprint. The ACLU has also recently proposed, in another brief, the model of a puzzle rather than a book or a library. I'm still thinking about it, but my instinct is that it might help fix the problem around testimony. The puzzle metaphor might also help developers in Apple v. FBI-like cases, who don't have their own self-incrimination right to invoke because they're not the ones under criminal investigation, but where a court is still trying to make them break their own software to get at someone else. In those situations, the legal question is often going to be more like: what are the limits of what a court can make you do by order, by injunction? The idea of forcing someone to solve a puzzle reminds us that the real moral issue here is compulsion, the idea of being forced to undertake something that incriminates yourself or someone else, and not literal speech.
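And here's that concrete illustration I promised of data being transformed rather than enclosed. Again, this is only a minimal sketch using the same Python library as before, my choice for illustration, not anything drawn from the case law:

```python
# "Transformed, not enclosed": ciphertext is not a box with the plaintext
# hidden inside it; it IS the data, rewritten into an unintelligible form.
from cryptography.fernet import Fernet

key = Fernet.generate_key()
cipher = Fernet(key)

secret = b"deeply personal things live on our devices"
scrambled = cipher.encrypt(secret)

# Searching the ciphertext for the plaintext finds nothing, because
# nothing intelligible is "inside" it to find.
assert secret not in scrambled

# There's no lock to pick and no wall to dynamite. Without the key the
# bytes are just noise; with it, the transformation simply runs in reverse.
assert cipher.decrypt(scrambled) == secret
```

No box, no barrier, nothing to pry open, which is exactly why the container metaphors keep leading courts astray.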
Finally, I want to refocus on the content itself, because in closing I want to make sure we don't lose sight of the values we began this conversation with in the first place. What kind of data is it that we keep on our devices? What kind of data do we encrypt? Is it our assets? Is it our nuclear waste?

Our Supreme Court has actually said a bit about this, when it talked about the heightened expectation of privacy we have in our personal computers, in a case called R. v. Vu. They said we can't imagine our devices as simple containers like cabinets or cupboards. They really reject James Comey's closets and his locked boxes. But they don't reject them because they're technically inaccurate; they reject them because of what our devices can reveal about us. These machines are, as Vannevar Bush argued as early as the 1940s, an enlarged intimate supplement to human memory. They are deeply personal, and imagining compelled decryption as a kind of intrusion into the human mind, or into our most intimate physical places, makes a certain kind of sense, especially as we start to blur the distinction between what is our machine and what is our self. The courts have spent a lot of energy trying to figure out what the device is, what the process of encryption is. But I think in choosing metaphors we need to be careful not to lose sight of what's actually at stake: space beyond the reach of government power, space for experimentation, for secrets, for radical intimacy.

Now, Jeffrey Kiok and others, mostly legal scholars, have argued that we need to be hesitant to analogize or employ metaphors at all, that they lead to bad law and bad politics. These critics argue that courts should engage in fact-specific inquiries, informed by technological experts, about the precise nature of the technology at hand in precise cases. And they have a point. But I'm skeptical about the extent to which the law can actually escape this kind of thinking, the extent to which we can move past the cognitive shortcuts we use to understand the world around us. They're right that metaphors can obfuscate and distort. But metaphors can also illuminate: they can make the political implications of our choices more transparent, and they can reshape the boundaries of power. So we need to be careful about the choices we make and the words we use.

Of course, these aren't the only legal problems related to encryption technology. I skimmed through a bit of the case law again this morning, and it's an area of the law that's changing in a lot of different ways, really fast. We don't know yet, for example, how these problems are going to play out at the border. There's a case in Nova Scotia involving a man from Quebec named Alain Philippon, who goes to trial in August, and I think a lot of people in the tech community are watching it really closely. He refused to give border officials his BlackBerry password, so he was charged under the Customs Act with hindering a customs officer. That charge carries a maximum penalty of a $25,000 fine and a year in jail. We can talk a little more about this in the questions if you want. There's also R. v. Fearon, a 2014 Supreme Court case which decided that police are allowed to search your cell phone incident to arrest. The court put a few safeguards in place there, but they don't really talk about encryption, and that's worrying. They make a really strange comment suggesting that whether or not you have a password doesn't have a huge impact on your reasonable expectation of privacy. So we're going to have to see how that plays out.
I've also seen one or two cases where, even when the court recognized that it couldn't actually force the accused to surrender their encryption key, it inferred from the fact that they were using encryption that they were up to no good. Obviously, those kinds of assumptions make bad law and bad politics, so that's a myth we're going to have to work to debunk.

Anyway, what does this mean for you? I think as technologists, engineers, designers, project managers, people who are actually building these systems, systems that shape the rights people have and the risks they face, you've got some real responsibility. You have to think about the values you're embedding in the technology you make, and about what's at stake, right? Maybe it's some big company's money. Fine. But maybe it's someone's privacy or their reputation or their job or their immigration status or their physical well-being. For all you know, your users are the next corporate whistleblower or the next Ed Snowden or the next Dr. King. Your choices can be the difference between safety and imprisonment for people.

So keep in mind what we know about encryption and the law: it's unsettled, and law enforcement is bent on building in exceptions and weakening constitutional protections. Staying on top of how the law changes is going to be an important part of how you design meaningfully secure systems. Remember that you can't exercise your users' right against self-incrimination for them. When the government comes knocking at your office, you can be forced to hand over information even if your users wouldn't want you to. That's why the gold standard is a system that just doesn't let you betray your users, by design. Apple has actually been a really good example of this. If you literally can't decrypt your users' data without their participation, you make sure they have the choice about whether or not to exercise their constitutional rights. Obviously, it's probably not possible to design systems like this in all circumstances, but that doesn't mean you shouldn't be trying to embed these values wherever you can in your work. I think it's about respecting your users. Even where the stakes aren't that high, hold each other to a high standard. Think about integrity in your work, because it's not just Alice and Bob using your software. It's real people who will face real risks and consequences.

I've been really grateful to talk to many of you since the beginning of the conference yesterday, and I know there are a lot of people in this room who take these issues seriously, who care about privacy and who care about freedom. So the challenge is this: how are you going to make these values a bigger part of your work? We're counting on you. Thanks.