So, good morning or good afternoon. My name is Christopher Soghoian. I'm the principal technologist for the Speech, Privacy and Technology Project at the ACLU. I started last September. I'm the first-ever technologist the ACLU has had who's focused specifically on surveillance and privacy. I finished a PhD last year, focused specifically on the role the internet and phone companies play in spying on their customers for the government. It's an extremely timely topic. The ACLU has been very busy in the last year on surveillance issues. Shortly after the Snowden revelations, we were one of the first organizations to file suit against the National Security Agency. Although we were... Thank you very much. Although we are not the last; several other great organizations have also sued the NSA, and hopefully those suits will keep coming. So, today I'm going to be telling a story, a story of how law enforcement and the government have responded to technological change. This will be a story in, I guess, three acts, and it really delves into the relationship between the companies and the government, and the different kinds of relationships, because not all companies are the same. Some are friendlier than others to the government. So, the first crypto wars. Those of you who are a little bit older may remember there was a time when you couldn't export strong cryptography from the United States. In the mid-90s, then-FBI director Louis Freeh went before Congress on numerous occasions and warned Congress about the threat of encryption. "The widespread use of robust, non-key-recovery encryption ultimately will devastate our ability to fight crime and to prevent terrorism," Freeh said at a congressional hearing in 1997. He added that uncrackable encryption would allow drug lords, spies, terrorists and even violent gangs to communicate about their crimes and their conspiracies with impunity.
In the mid-90s, encryption was a technology that the government sought to demonize. They sought to control the spread of encryption and ultimately to pressure companies to modify their products. So, Freeh also said, "The only acceptable answer is socially responsible encryption products that permit timely law enforcement and national security access and decryption pursuant to a court order or otherwise authorized by law." The socially responsible crypto that the FBI backed in the mid-90s looked like this. This is called the Clipper chip. Thankfully, the Clipper chip failed. Professor Matt Blaze found significant security vulnerabilities in the Clipper chip, which meant it actually wasn't even good at protecting people from anyone other than the NSA. And so ultimately the first wave of the crypto wars failed. Congress and the executive branch ultimately did away with the crypto export control rules. In 1996, President Clinton signed an executive order reclassifying cryptography, and in the years that followed the rules were further relaxed. Ultimately, companies like PGP were allowed to export their technology around the world. Web browser vendors like Microsoft and Netscape were allowed to export full 128-bit crypto to anyone except people in Cuba, Iran and a couple of other countries. And so really, the FBI's initial attempts to control crypto failed (and the FBI and the NSA were sort of collaborating there). Their strategy had been: let's stop everyone other than Americans from getting this stuff. If we make it difficult for them to get the technology, they won't use it, and then we'll easily be able to monitor their communications and get their data. But even after the crypto export control rules were weakened and you could download PGP no matter which country you were in, it didn't actually lead to the widespread use of PGP. Hands up, everyone who uses PGP on a daily basis.
And for this audience, that's not really that good. I'll confess, I only use it with a handful of colleagues and journalists. Most people who contact me don't know how to use it. And the reason is that PGP is really difficult to use. There's a major, important study by Alma Whitten, who's actually now at Google, from over ten years ago ("Why Johnny Can't Encrypt"), pointing out the usability failures of PGP. It turns out that when a tool is ridiculously difficult to use, people either don't use it or they use it wrong. They think they're encrypting when they're not encrypting, which is actually worse, because then they will say things they might not have said if they knew their emails were going over the network in the clear. And so the widespread availability of encryption really didn't frustrate the FBI in the way they thought it would. Terrorists and pedophiles and drug dealers didn't suddenly rush out and start using PGP, because it turns out that terrorists and pedophiles and drug dealers are like the rest of us. They're lazy, and they're not experts at difficult-to-use, obscure technology. And so PGP wasn't the threat the FBI thought it would be. HTTPS, the lock icon we see in our browsers, is easier to use because it doesn't really require anything from the user's side, but even that wasn't widely deployed. Where SSL was widely used was in e-commerce and online banking. If you were sending your credit card number over the web, your communications would be encrypted. But if you were sending emails, social networking messages, or private photos, or backing up files, very few of these things would be protected with SSL. And so again, the government had a good time. They didn't have to worry too hard. Although the technologies existed, no one was using them, or at least they weren't using them for the things the FBI cared about. This is a slide that the Guardian published this week. It's from the latest deck that Snowden provided them.
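The reason HTTPS asks nothing of the user is that the client does all the work: the handshake, the certificate check, the hostname check. A minimal sketch of that client-side policy, using Python's standard ssl module (no network access involved; this just shows the defaults a well-behaved HTTPS client starts from):

```python
import ssl

# What a browser (or any careful HTTPS client) sets up before it ever
# talks to a server: a TLS context that refuses unverified peers.
ctx = ssl.create_default_context()

# Certificate verification and hostname checking are on by default.
# Disabling either keeps the encryption but lets an active man in the
# middle present his own certificate, which against that attacker is
# no better than sending traffic in the clear.
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
print(ctx.check_hostname)                    # True
```

The point is that all of this is invisible to the user, which is exactly why HTTPS could be deployed by default in a way PGP never was.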
This is a deck about XKeyscore, which is the program, or the intelligence platform, that allows them to monitor vast amounts of communications and then search through them later. Now, this deck is from 2007 or 2008, so it's a few years old, but what you can see clearly is that outside of law enforcement, in the intelligence space, these folks appreciated that communications were going over the network in the clear. Whether it was Yahoo or Facebook or Twitter or your emails, they were easily available for the government to grab with the assistance of their friends at the backbone internet providers. And so things were good for a while. It didn't really matter that your browser could do strong crypto. It didn't really matter that you could download tools from a website and configure them and then have a key-signing party, because no one was doing that. But that didn't stop the FBI from worrying, because down the road they saw that things were going to get bad. And it wasn't going to be because people could download tools; it was going to be because companies were going to start building crypto into their products by default. This is Valerie Caproni. She was, until I think a year ago, the general counsel of the FBI, the top FBI lawyer. She has testified before Congress on numerous occasions, and in 2011, at a congressional hearing, she warned Congress about what the FBI was calling the "going dark" problem. Going dark is the FBI's term for what happens when everyone uses encrypted communications. The FBI coined this term and has spent lots of money researching this issue, because they're worried about a day in which all of the communications that users are sending are going to be off limits to the FBI. This is from 2011, quote: "The FBI and other government agencies are facing a potential widening gap between our legal authority to intercept communications pursuant to a court order and our practical ability to actually intercept those communications."
The FBI says that they can get a court order, but when they actually try to get the communications, either the company doesn't have the capability, because they haven't built wiretapping systems into their networks, or the company cannot provide the data. She added, "Encryption is a problem. It is a problem we see for certain providers." And so what she was describing there was the fact that over a couple of years, starting in 2010, companies in Silicon Valley started rolling out SSL encryption by default. In a Washington Post story just last year, a former FBI official described this to the Post: officials say that the challenge was exacerbated in 2010 when Google began end-to-end encryption, which made it more difficult for the FBI to intercept email by serving a court order on the ISP whose pipes would carry the encrypted traffic. In 2010, Google was the first of the big free webmail providers to turn on SSL by default. Google had always offered SSL as an option, but it was an option buried several layers deep in the configuration screens. I think it was the last of 13 options, after the vacation auto-reply, after Unicode. There was nothing less important in the Google configuration screen than SSL. And so, of course, no one used it. When the option was hidden and disabled by default, no one's emails were secure between the user and Google. But in January of 2010, Google flipped the switch and enabled SSL by default. And in the years that followed, several other Silicon Valley companies did the same. First it was Twitter. Then Microsoft, when they renamed Hotmail to Outlook, turned on SSL at the same time. Facebook started rolling it out last year, and I think just this week announced that all Facebook communications will be SSL encrypted from the user to Facebook's servers.
In addition to that, several companies have started rolling out perfect forward secrecy, a property of ephemeral key exchange that protects past sessions even if a government agency later goes to a company and demands its private keys. They're upping their key sizes. These Silicon Valley companies are making passive interception much more difficult. Now, of course, that doesn't mean the government can't get things from Google. Your communications between your computer and Google's servers are encrypted, but once the files actually arrive at Google, whether it's your emails or your private photographs or your instant messages, they're sitting there in plaintext. This is Vint Cerf. He's Google's, I think, Chief Internet Evangelist. He's also sort of the father of TCP/IP. I was on a panel with him in 2011 in Nairobi, and we started talking about Google's lack of stored-data encryption. And he said, quote, "We couldn't run our system if everything in it were encrypted, because then we wouldn't know which ads to show you. This is a system that was designed around a particular business model." So this is a very honest statement from a Google executive. And I don't begrudge Google, right? They offer a fantastic, easy-to-use service, and they don't charge people for it. And neither does Twitter, and neither does Facebook. These companies each offer one and only one consumer product. There's no way to pay for Facebook. There's no way to pay for Twitter. There's no way to upgrade your Gmail account to a corporate account, a Google Apps account; they have the accounts for users and then the accounts for businesses. And when the only accounts they offer consumers are free ones supported by ads, it makes sense why they're not encrypting your data in the cloud with a key that only you have, because it would be very difficult to monetize that.
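The idea behind perfect forward secrecy can be sketched with a toy Diffie-Hellman exchange. This is illustrative only: the prime below is far too small for real use, and real deployments (TLS with ephemeral DH or ECDH) use much larger groups or elliptic curves. The key point is that both sides generate fresh secrets per session and throw them away afterward, so a later demand for the server's long-term key recovers nothing about recorded traffic:

```python
import secrets

# Toy group parameters: a Mersenne prime and a small generator.
# Nowhere near large enough for real cryptographic use.
P = 2**127 - 1
G = 5

def ephemeral_keypair():
    """Fresh per-session secret, discarded when the session ends."""
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

a_priv, a_pub = ephemeral_keypair()  # client's ephemeral pair
b_priv, b_pub = ephemeral_keypair()  # server's ephemeral pair

# Both sides derive the same session secret from the exchanged public
# halves. An eavesdropper who records the handshake and later seizes
# the server's long-term key still can't recover this value, because
# the ephemeral private keys no longer exist.
assert pow(b_pub, a_priv, P) == pow(a_pub, b_priv, P)
```

Every new session repeats this with fresh randomness, so each session's secret is independent of every other's. That independence is what "forward secrecy" names.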
Now, the companies could, and maybe will at some point, switch to a business model where you give them money and they give you a secure service, but that isn't the business they're in right now. And so what this means is that the companies can and do receive requests from law enforcement agencies and intelligence agencies. Even before the PRISM revelations, we knew that Google gets thousands of requests a year from law enforcement and intelligence agencies. This isn't a surprise. But what we've seen in the last few years is a transition, a migration away from telecommunications companies to Silicon Valley companies. In years past, your private messages and your metadata would be accessible through a backbone provider, through a telephone company, through one of the Ma Bells, right? And like it or not, the telephone companies have been providing wiretapping assistance to the U.S. government for more than 100 years. The first wiretaps were around 1895 in New York City. For 100 years, these companies have been providing interception assistance to the U.S. government. And it's a relationship that everyone is sort of comfortable with. Everyone, and by that I mean the companies and the government. And these companies don't just provide targeted access. They don't just provide access to an individual user's data. They provide, when the government asks, access to all users' data. The assistance of the phone companies is what enables dragnet surveillance. When the government wants to search through every email or search through every phone record, that is only possible because the phone companies provide access to everything. If you take the internet companies, the Silicon Valley companies, at their word, they only provide targeted access. If the government goes to Google with a court order with my name on it, Google will hand over my data. But if you take Google at their word, they will not provide access to everyone's information.
And so what's happened over the last few years is that consumers have started to migrate their data from the old telecommunications carriers to Silicon Valley companies. In many ways, the telcos haven't had people's email for a while; no one's really using a Verizon or Comcast email address anymore. But when those email messages were going over the network in the clear, it meant the government could still go to the backbone providers. It meant they could still go to the AT&Ts and the Verizons of the world, even if you were using Yahoo or Google or Hotmail. But as these Silicon Valley companies have enabled encryption, you can no longer spy on someone's emails, you can no longer collect bulk information, with the assistance of Verizon or AT&T. And I think a great example of this is what Apple did with iOS 5. In one day, they just flipped a switch, and suddenly the new iMessage service was rolled out to users. And if you were an iOS user sending a text message to another iOS user, your messages would go through Apple's servers instead of the phone companies'. And overnight, millions and then billions of messages started flowing through Apple's servers. And those were messages that the government cannot get with the assistance of Verizon or AT&T or Sprint. Now, there was a document leaked to Declan McCullagh of CNET suggesting that the government can never get messages sent through iMessage. I actually don't think that's the case. I think that Apple provides access on a targeted basis, but I don't think they are providing wholesale access in the way the phone companies do. And I think what's happened here is that there's a difference in culture between the companies. It's not that Google is trying to make the government go dark. It's that Google has 350 people doing nothing but security and only security. It's that Apple has a dedicated security team.
It's that Facebook has a dedicated security team. And before you can launch a product at one of these Silicon Valley firms, particularly if it's storing sensitive user data, you have to have crypto. There's no way to secure your users' data against hackers without crypto. And so at these companies it's corporate policy to encrypt data, not because they want the government to go dark, but because that's what the security teams at these companies demand. And realistically, the phone companies don't have a tradition of security. Your voicemail isn't secure. You're not getting OS updates to your smartphone if you're using Android, which, by the way, is something we filed a complaint about with the Federal Trade Commission earlier this year. The phone companies just aren't interested in security. And so what's happening is that consumers are giving their data to companies that finally invest some resources in security, and that's making it tougher for the government. So what is the solution? How does the government respond to a world in which they can only get selective data from companies, and in some cases cannot get data at all because the companies are using end-to-end crypto? The answer is backdoors. The answer is compelled access: forcing companies to modify their products and provide the government with a way of getting data. Starting in around 2010, we started seeing leaks to the press suggesting that the FBI and others in the law enforcement community were floating these ideas. They were floating legislative proposals expanding CALEA, the Communications Assistance for Law Enforcement Act, a law mandating interception capabilities in communications networks, and extending it to internet companies, to websites and apps and other providers.
We saw these trial balloons floated in 2010, and then ultimately there was a congressional hearing in the spring of 2011 where our friend Valerie Caproni from the FBI testified, quote, "No one should be promising their customers that they will thumb their nose at a U.S. court order. They can promise strong encryption. They just need to figure out how they can provide us plain text, too." Right? And this is what the FBI wants. They want the power to go to a company secretly and force the company to quietly insert a backdoor into its own product. As recently as April of this year, it looked like proposals were coming. It looked like there was a multi-agency working group in Washington, and they were getting ready to drop a bill that would empower the Department of Justice to fine Silicon Valley companies that refused to provide the assistance demanded of them. And then something happened. CALEA II, which is the D.C. nickname for this backdoor proposal, is, for now, dead. It is dead in the water. No politician wants to touch that kind of surveillance legislation for now. So thank you very much, Edward Snowden. All right. So if they can't force Google to put a backdoor in Android, and if they can't force Apple to put a backdoor in their software, what are they going to do? How is the government going to get your communications? What about when they want to listen in on a conversation you're having in your living room, where you're not even using your device? Right? Are they supposed to break in in the middle of the night and install a microphone like they did in the 1970s? No. Right? They want other ways to access data. Particularly as consumers have started using services like Skype, and we'll talk about Skype later, but services like Skype that have some form of encryption, governments have been having problems. And remember, the government isn't one big beast.
The FBI or NSA may have tools to access certain applications, but that doesn't mean they share those toys with local law enforcement agencies. The NSA doesn't share its secret backdoors with the likes of local cops in Arizona or Nevada. Those folks have to do things the hard way. It's also important to note that not all governments are the same. Google has an office, in fact its main office, in California, and Microsoft's headquarters is near Seattle. Google and Microsoft have to take orders from the U.S. government. When there's a valid court order, the companies have to provide access to the U.S. government. But Google doesn't have an office in Iran. Microsoft doesn't have an office in Libya. And so if those governments want to get their citizens' communications, now that Google and Microsoft and others are starting to use SSL, those other governments are really going dark. In the countries where Google and Microsoft don't have offices and don't respond to requests, those governments are having a really tough time because of the use of services like Skype, like Twitter, like Facebook. They used to be able to get access through their local, in many cases nationalized, telephone company. And now they're going dark. And so those governments are turning to hacking tools, too. What we're seeing is the emergence of a private sector helping governments. Of the companies that have gotten the most press, the first is a company called Gamma. They make a software suite called FinFisher. FinFisher has gotten a lot of press in the last couple of years: it started with a dump of documents by WikiLeaks, and then the excellent work of the Citizen Lab in Canada has exposed the use of their software. They have a really cheesy sales video online that I recommend you look at.
So this is the target using iTunes, and then getting a malicious update through an iTunes man-in-the-middle attack, and then the police officer sitting at the remote operating center can spy on the calls and text messages and emails of the user. This is the president, or I think the CEO, of Gamma. His name is Martin Münch. You may not know Martin's name, but you probably know Martin's work. Before he was in the government surveillance business, Martin worked on a Linux distribution called BackTrack, which is very popular with this community. So Martin pivoted from providing open-source security tools to providing closed-source government interception tools. This is my favorite photo of Martin. He's a German guy who, without any shame, sells this software to governments around the world. One of the things his software can do is remotely activate webcams without the target's knowledge. You can see that he's concerned about this capability, because if you zoom in on his laptop you can see he has a little Post-it note over his webcam. He clearly knows what his own software can do. So, because of the work of the folks at Citizen Lab, we know that Gamma's software has been exported to Mexico and Ethiopia. It's been used by seriously oppressive regimes in the Middle East and in Southeast Asia. Now, the company says that it's used for lawful interception and the targeting of terrorists and pedophiles and criminals. But from what we know, it's been frequently used to target journalists and human rights activists and dissidents. And so Gamma is one of these companies providing off-the-shelf tools to governments. The police don't have the resources to develop this stuff in-house, and so they just buy this off-the-shelf spyware from companies like Gamma. Over the last couple of years the newspapers have covered this; the Times and Bloomberg have described the spread of this stuff. And the sale of this technology is really unregulated.
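The iTunes vector worked because the update channel delivered code without authenticating it, so a man in the middle could substitute his own payload. The defense is for the client to verify every update before applying it. Here is a minimal sketch in Python, with a pinned SHA-256 digest standing in for the public-key code signing a real updater would use; the function name and payloads are hypothetical, purely for illustration:

```python
import hashlib
import hmac

def apply_update(blob: bytes, expected_sha256_hex: str) -> bool:
    """Refuse any update whose digest doesn't match the pinned value."""
    digest = hashlib.sha256(blob).hexdigest()
    # Constant-time comparison, so the check itself leaks nothing.
    if not hmac.compare_digest(digest, expected_sha256_hex):
        return False  # substituted or tampered payload: do not install
    # ... unpack and install the verified payload here ...
    return True

genuine = b"vendor update v1.2"
pinned = hashlib.sha256(genuine).hexdigest()

print(apply_update(genuine, pinned))               # True
print(apply_update(b"injected implant", pinned))   # False
```

With a check like this in place (and the pinned value or signing key shipped with the software rather than fetched over the same channel), an attacker who controls the network can block updates but can no longer substitute them.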
Basically any government except for the ones on international blacklists can buy it. The other big company is called Hacking Team. They're an Italian company. They make something called the Remote Control System, otherwise known as DaVinci. They have a sales video too, one that appears to be targeted at 13-year-old boys. Their marketing material says: defeat encryption, total control over your targets, log everything you need, thousands of encrypted communications per day, get them in the clear. And this software really is sold to law enforcement agencies who are trying to deal with things like Skype. Right? If you're the government of Turkmenistan and there are journalists in your country who are using Skype to communicate, how do you get the contents of their calls when you need them? The phone company in town can't help you. You go to Gamma or Hacking Team, and they provide you these tools. This is again from Hacking Team's literature. They say that they can get encrypted voice, location, audio and video spying, web browsing activities, relationships. They get anything that is on a computer, without the knowledge of the target. Hacking Team in recent years has expanded into the U.S. market, I believe. In the spring of this year, they hired this man. His name is Eric. He used to be a spokesperson for Verizon, and now he is the U.S. counsel for Hacking Team. They have an office in Annapolis, Maryland, just an hour outside of D.C. We don't know whether Hacking Team has successfully sold any products in the domestic U.S. law enforcement market, but they are showing up at conferences in Washington, D.C. that are only open to law enforcement and intelligence agencies. They also went to a conference in Chicago this April, held by the Association of Law Enforcement Intelligence Units. Not only did Hacking Team give a talk at this conference, a conference targeting local cops from around the country...
Not only did they give a talk at the conference, but they also sponsored the coffee break in the afternoon. And so if Hacking Team hasn't sold a product to a local law enforcement agency yet, it's not because they haven't been trying. They've been showing up at these conferences for several years. They are actively targeting the law enforcement market. And I think if they haven't succeeded already, they will succeed soon and get a sale in some small town. Now, Hacking Team's and Gamma's software is the kind of stuff that local cops and governments without too much money use. This is, you know, a couple hundred thousand or maybe a million dollars. It's the kind of thing you buy with a DHS grant. This is not what you use if you are a sophisticated law enforcement agency with big bucks. And the feds have the big bucks. Federal law enforcement agencies in the United States have enough money to use bespoke, custom malware. They don't need to use the same stuff that the Egyptian and Turkmen governments are using. They can use their own custom spyware, and they can buy zero-days if they need them. Again, our friend Valerie Caproni: "There will always be very sophisticated criminals that are virtually impossible to intercept through targeted means. The government understands that it must develop individually tailored solutions for those sorts of targets." And when Valerie says "individually tailored solutions," what she means is hacking. She didn't use the word hacking when she spoke to Congress, but what she means is hacking and malware. In 2009 or so, I think, EFF filed one of their many Freedom of Information Act requests to look into the FBI's claims that they were going dark. And a couple of years later, they got hundreds of pages of documents, most of them heavily redacted. This is one page that I found. I read a lot of the documents that groups like EFF produce and documents that the ACLU obtains, and this was one page in the several hundred that the EFF obtained.
And most of it was redacted, as you can see, but there was one line that stuck out to me: the Remote Operations Unit. That sounded really interesting. I didn't really know what the Remote Operations Unit was, but it was in a document about going dark, a document sort of describing each unit checking in and reporting its progress. So I thought, okay, let me see what else I can find about the Remote Operations Unit. And so I spent the last six months researching this unit, mainly using open source intelligence, basically just Googling and using LinkedIn. And what I found is that the FBI is in the hacking business, too. I found the materials for a law enforcement conference that happened in April of this year. This was a training seminar for prosecutors from around the country. And in the list of attendees and speakers at this conference, I found the information for this guy, Eric Chang, who's the unit chief of the Remote Operations Unit. And so I searched a bit more, and I found his ZoomInfo page. ZoomInfo is a data mining company that collects information from elsewhere on the web. And Eric Chang's ZoomInfo page mentioned that he was the unit chief of the Remote Operations Unit and said that the unit provides "lawful computer collection capabilities in support of FBI investigations." Well, that sounded interesting. So then I turned to LinkedIn and started researching the Remote Operations Unit. What I found is that there are a couple of contracting companies, a couple of contractors, who supply people to the ROU. And contractors, like everyone else, want to keep their resumes up to date in case they get a new job. And so they list things in their resumes, maybe things they shouldn't be listing, revealing what they did at their old job.
So I've not included the names of the individual low-level contractors, but I will be quoting from the LinkedIn pages of several of them, because I think what they describe is fascinating. So this is a deployment operations analyst at a company called James Bimen Associates, a small boutique contracting company in Northern Virginia. This person "performed testing on software used as a critical function for counterterrorism and counterintelligence cases." Okay, that sounds interesting. He "worked with FBI case agents with our surveillance imagery software that is currently installed on criminal subject machines in the field." Oh. Okay. That's even more interesting. They "test case-specific implants against various OSs and platforms." Good to know. So if you're using Windows or Mac or whatever, they have a tool for you. And then they "create documentation for the various technologies and methods that we use to gain access to subject machines." All right, so it's clear. It's clear from this profile what the Remote Operations Unit is doing. I also found another person, a remote operations deployment analyst, also at James Bimen Associates. Her profile was fascinating. I thought this was good: she "created policies, guidance and training materials to protect the deployment operations tools from being discovered by adversaries." Those are us. We're the adversaries. So Bimen Associates is one of two companies that provide hackers to the FBI. It's my belief and understanding that these contracting companies actually supply the people who sit at the keyboard and launch the tools. You know, there hasn't been a debate in Congress about the FBI getting into the hacking business. There hasn't been any legislation giving them this power. This just sort of happened out of nowhere. And had it not been for the sloppy actions of a few contractors eagerly updating their LinkedIn profiles, we would never have known about this.
The president of James Bimen Associates is a guy named Jerry Menshoff. He used to work at Booz Allen Hamilton, which is also the place where Edward Snowden used to work. And so this is the president of the company, and his LinkedIn profile was pretty bare, but it did describe one of his interests: he's a member of the Metasploit Framework users group. I thought you guys would get a chuckle out of that. So I gave Jerry a phone call a few weeks ago and asked him some questions. Of course I told him who I was, that I work for the ACLU, and he wasn't very nice. He didn't want to answer any of my questions. So I gave some of this information to the Wall Street Journal, and last night they published a story on this unit. The nice part of giving these documents to newspapers is that once they have a bit of information, they can go and report it and get other stuff, too. And so the Wall Street Journal reporter was able to find former law enforcement officials who were willing to talk on background about this practice. One former law enforcement official she spoke to said that, quote, the bureau can remotely activate the microphones in phones, Android phones, and laptops without the user knowing. That's pretty interesting. But she also reported that the FBI is loath to use these tools when investigating hackers, out of fear that the suspect will discover and publicize the technique. So I guess that means you're all safe from FBI malware. So the FBI has this team of agents who are doing nothing but delivering malware to the computers of surveillance targets. We only have a couple of cases where these tools have come to light. There was a case in Texas this summer where the FBI sought a search warrant allowing them to target a computer and remotely enable the webcam, collect location data, and collect emails. In this case they went to what is probably the most pro-privacy judge in the country, in Texas, and he said no.
But sort of on a technicality: he said they should get a wiretap order, and they only wanted to get a warrant. What's clear is that if you have this capability, if you build a team that does nothing but develop malware, the first time you attempt to use that team you don't go to the most pro-privacy judge in the country. And so presumably they've had this team for a while, and they regularly use it to deploy malware. So on one hand we have the FBI basically being in the hacking business, and then yesterday I noticed that the FBI's official Twitter account issued a warning saying pirated software may contain malware, be aware. And so I guess we only have to worry about the malware made by other people, not the FBI's malware. All right. So the government is using hacking tools. The government is trying to penetrate people's computers. They have tried, and have up until now been unsuccessful in their attempts, to obtain legislation allowing them to force tech companies to put back doors in their products. What are they going to do in the future? Because hacking doesn't scale. You can break into one person's computer. You can break into a thousand people's computers. But you cannot break into a billion computers without getting caught. You can do it temporarily, but you will get caught, and the government doesn't want its tools to get out. So what are they going to do in the future? What are they going to do when Silicon Valley companies actually start delivering end-to-end crypto? Not Google, not Facebook, but companies who actually sell services to users. Well, Microsoft, you know, owns one of those companies. For some time, Skype was advertising itself as a service that didn't have back doors. They were advertising themselves as a service that couldn't provide access to law enforcement agencies, but we learned last month that the government was able to go to Skype, before Microsoft bought them, and convince them to modify their products and provide access to the government.
Quote from the Guardian story: Skype was served with a directive to comply by the Attorney General. Now, we don't know what kind of directive this was. We don't know if they went to court, if Skype said no and fought it, or if they did this because they could negotiate some better deal. We really know very little about the ins and outs of how companies can be compelled under existing law. But even so, Skype stopped bragging about their security several years ago. And by the time Microsoft bought them, all their claims of being wiretap-proof had disappeared. Skype was no longer a service, if it ever really was, that advertised itself as the way to securely talk to your friends and family. Instead, Skype was a service that you use to talk to your friends and family for free. But Skype is not the only company offering VoIP or video services. There are now companies that are selling secure services to users. One of them is a company called Silent Circle, co-founded by Phil Zimmermann, the guy behind PGP. And they charge 10 or 20 bucks a month for encrypted VoIP and text messages and video. Now, I'm not telling you to go out and use this company's services, but they've clearly said in their marketing materials: we have no government-mandated backdoors. You know, I've spoken to the CEO of the company, and he said, if the government comes to us and tries to force us to put a backdoor in our product, we'll close up and move to a different country. I mean, this is a company whose product you only use for the security. You're not using Silent Circle because it's crystal-clear audio or because it's cheap and easy to use. You're using them because they're secure. Likewise SpiderOak, which is a competitor to Dropbox: you only use SpiderOak, and you only pay for this service, because they provide encrypted backups with a key known only to the user. And, again, SpiderOak makes clear statements to users.
We've created a system that makes it impossible for us to reveal your data to anyone. That's it. The only reason you use these companies is to protect your data, and this is the only reason they're in business. And so the question right now is, and I don't have the answer to this, the question is: can the government force these companies to modify their products? Because if SpiderOak were forced to have a back door and it became known, they'd go bankrupt. The only reason you're using them is for the security. And so there's this law that I mentioned before in passing called CALEA, and it's normally thought of as a bad law. It's the Communications Assistance for Law Enforcement Act. CALEA is the law that forces telecommunications companies to put law enforcement interception interfaces in their networks. The reason that AT&T has very easy-to-use, fast wiretapping capabilities is because CALEA forced them to buy a bunch of equipment. But CALEA has a provision in it that most folks don't know about. And I'm going to read it to you. It's very short. A telecommunications carrier shall not be responsible for decrypting, or ensuring the government's ability to decrypt, any communication encrypted by a subscriber or customer, unless the encryption was provided by the carrier and the carrier possesses the information necessary to decrypt the communication. This little feature in CALEA, I think, and I'm not a lawyer, is the thing that is standing between these companies and the government. This section of CALEA protects the right of companies to offer the general public end-to-end encrypted services with keys known only to the user. And it is my belief that when the next crypto wars come, if and when they do come, this section of the law will be the thing that the government targets. I think that down the road we are going to see consumers using services that offer end-to-end crypto.
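The "key known only to the user" design behind services like SpiderOak can be sketched in a few lines: the client derives an encryption key from the user's passphrase and uploads only ciphertext, so the provider holds nothing it could decrypt or be compelled to decrypt. This is a toy illustration, not any vendor's actual protocol; the HMAC-counter keystream stands in for a vetted AEAD cipher such as AES-GCM, and all names here are hypothetical:

```python
import hashlib
import hmac
import os
import struct

def derive_key(passphrase: bytes, salt: bytes) -> bytes:
    # Key derivation happens on the client; the server never
    # sees the passphrase or the derived key.
    return hashlib.pbkdf2_hmac("sha256", passphrase, salt, 200_000)

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # HMAC-SHA256 used as a PRF in counter mode -- a toy stream
    # cipher for illustration only; real services use vetted
    # AEAD constructions (AES-GCM, ChaCha20-Poly1305).
    out = b""
    counter = 0
    while len(out) < length:
        block = nonce + struct.pack(">Q", counter)
        out += hmac.new(key, block, hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(16)
    ct = bytes(p ^ k for p, k in
               zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce + ct  # nonce is public; the key is not

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    return bytes(c ^ k for c, k in zip(ct, keystream(key, nonce, len(ct))))

# Client side: only the salt and the ciphertext are uploaded.
salt = os.urandom(16)
key = derive_key(b"correct horse battery staple", salt)
backup = encrypt(key, b"my private files")
# The provider stores `backup` and `salt`, but without the
# user's passphrase it cannot recover the plaintext.
assert decrypt(key, backup) == b"my private files"
```

The point of the sketch is the CALEA distinction above: here the encryption is not "provided by the carrier" in any decryptable sense, because the carrier never possesses the key material.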
I think we will see people paying for these services. And I do think that the government is going to target these services, because without access to them it cannot engage in dragnet surveillance. Thank you very much.