My name is Moxie Marlinspike. I'm from the Institute for Disruptive Studies. I don't really like introductions, so instead today I'm going to introduce two friends of mine, Shane and Sarah, probably two of my favorite people in the world. Exactly one year ago today, they were hiking in Iraqi Kurdistan and were kidnapped by the Iranian military. For the past year they've been held in prison in Iran; Sarah has been in solitary confinement. And there doesn't seem to be any sign of them being released. It's a pretty disempowering situation. There's nothing that I or anyone I know can do. But I think about them all the time, and I just want other people to know that people are stuck in prison in Iran, and it's been exactly one year as of today. Okay, so I would like to take some time to talk about privacy. What I'd like to do is start by looking into the past, talking about the threats that we saw, the things that we thought were important, the projects that we thought were worth working on. Then I want to talk a little bit about how I feel trends have changed, and then look into the future and talk about things that I think might be important moving forward, things that I'm interested in working on and that maybe other people might be interested in working on too. So, looking into the past: the technology narrative of the 90s was largely dominated by this thing (a clicker that doesn't work, so I'm just not going to use it): the web browser. When Netscape first introduced Netscape Navigator, it was almost revolutionary, and a lot of people moved to capitalize on that. In particular, one of the major players that wanted to protect its interests was Microsoft. When they introduced Internet Explorer, the narrative changed from the idea of a browser to this browser war between Netscape and Internet Explorer. And we all know how the browser wars turned out.
But at the same time there was another war happening, and it was somewhat more subtle, but it was just as fierce and perhaps more important. It was a war over this thing: the little padlock in the bottom left-hand corner of the web browser, and more importantly, the technology behind it. On one side of this war were the cypherpunks. These are people who wanted to see this information and technology spread widely; they wanted to proliferate it throughout the world so that many people could use it. On the other side of this war were the eavesdroppers, people who wanted to prevent the spread of this information and stifle the use of this technology. And so the lines were drawn. On the cypherpunk side you had people like Matt Blaze, Philip Zimmermann, Ian Goldberg, David Chaum, and Timothy May, the heroes of my teenage years. And the eavesdroppers thought that these people were dangerous. In fact, their ideas scared the fuck out of them. They were talking about the move from a world where the eavesdroppers had ultimate control and ultimate access to all information, to a world where they would have no control and no access to any information. In fact, they thought this was so dangerous that they classified these ideas as weapons. If you wrote a little bit of crypto code and sent it to your friend in Canada, that was tantamount to exporting munitions, and you could be tried and prosecuted as such. At the same time, however, they realized that this privacy thing might be important to some people and that this idea might catch on. So they decided to come up with their own solution, which they called key escrow, and which was best embodied by this thing: the Clipper chip. What they wanted to do was make this chip and embed it into every piece of consumer communications electronics, every telephone, every fax machine, every personal computer. And what it would do is some cryptography.
But it was a closed system: you wouldn't be able to get access to the internals of this chip, and you would be able to use it to start secure sessions. The only trick was that the government would have the equivalent of a master key, which they could then use to decrypt anything that they thought might be interesting. The eavesdroppers' problem here was that cryptography is not a banana, which is to say that it's difficult to treat information as objects. If I have a banana and I share it with my friend, there's still only one banana in the world. If they then share it with a friend of theirs, there's still only one banana in the world. Information works differently: every time I share information, I'm copying it, and there's the chance of an exponential explosion. This fundamental dilemma was made worse by the cypherpunk mantra: cypherpunks write code. The idea was that a lot of good work had been done in academia and research circles developing public key cryptography and other encryption systems outside the government realm, but not a lot had been done to actually put it into practice. What the cypherpunks wanted was actual software that people could download and use right now to communicate securely. And so they kind of went nuts. Some people moved to Anguilla, an island in the Caribbean that had very favorable laws in terms of exporting cryptography, and started writing crypto code and trying to ship it throughout the world. There were more creative strategies too. In 1995, Philip Zimmermann published a book in conjunction with MIT Press called PGP: Source Code and Internals. The deal was that the book was just the entire PGP source code printed in a machine-readable font. Digital representations of cryptography were weapons, but if you printed it in a book, that was speech. So they printed this whole thing in a book, did a very small print run, and then just shipped it to every country in the world where they wanted to see this.
And then people there just scanned it back in, because it was a machine-readable font, and now PGP had been distributed completely legally all over the world. This kind of stuff continued, the strategies got more and more rabid, and cryptography became more and more ubiquitous, until 2000, when suddenly the Clinton administration repealed all of the significant laws limiting the export of cryptography. And so it sort of seemed like the game was over, that the world was won. If you go back and look at the cypherpunk predictions about what would happen once cryptography was ubiquitous, the first prediction they made was simply that it would become ubiquitous, that it would inevitably spread throughout the world. And this turned out to be their most prescient prediction. This was really one of the first times we saw that information really does want to be free. But if you look at their predictions about what would happen once it was ubiquitous, they were somewhat less prescient: that anonymous digital cash would flourish, that intellectual property would disappear, that surveillance would become impossible, that governments would then be unable to continue collecting taxes, and that governments would fall. If we flash forward 10 years from when these predictions were made, cryptography is the thing that allows you to securely transmit your credit card number to Amazon.com so you can buy a copy of Sarah Palin's book, Going Rogue. Sure, some of these ideas have been eroded somewhat, but surveillance is probably at an all-time high, while privacy is probably at an all-time low. So what happened? It seemed like we were waging this war, it seemed like we won the war, and now here we are in this strange situation. Well, part of my thesis here is that in many ways the cypherpunks were preparing for a future, and the future that they anticipated was fascism. But what we got was social democracy.
And that's not necessarily better, it's just different. Let me give you an example. How many people in this room would feel good about a law which required everyone to carry a government-mandated tracking device with them at all times? Yeah, not even one person. So that's fascism, right? That's the fascist future. Now let me ask another question. How many people here carry a mobile phone? I'm guessing actually 100% of the people in this room. And so that's social democracy, right? So what is the difference between a government-mandated tracking device and a mobile phone? A mobile phone is just a tracking device that reports your real-time position to a few telecommunications companies, which are required by law to turn that information over to the government. So logistically they're the same, you know? So what is the difference? You can turn it off, but you don't. Psychological choice. And you pay for one. Yeah. Choice, right? Choice is the big difference. You choose to carry a cell phone, and you wouldn't choose to carry this government-mandated tracking device. So let's talk about that. Never in my wildest dreams did I think that I would have a cell phone. It's a mobile tracking device. It's a mobile bug, and it operates over an insecure protocol. Why would I want one of these things? And yet I have one, and I carry it with me all the time, every day. Well, if we look at the way that people tend to communicate and coordinate in groups, often there are informal mechanisms and channels that people use to communicate, make plans, and stay in touch. And if I introduce a more codified communications channel, there's a well-known problem, the network effect: I invent this thing, maybe like the GSM cellular network, and I start using it, but it's difficult to use, because the value of that network is in the number of nodes connected to it. And if I'm the only one using it, then it's really not worth very much.
If, however, I somehow manage to get everyone to start using this thing, then it becomes very useful and very valuable. But there's an interesting side effect, which is that the old informal methods of communication and coordination are destroyed. Technology actually changes the fabric of society. There are many sort of trivial or trite examples of this in the mobile phone world, where we see that mobile phones have changed the way that people make plans. It used to be that people made plans: they'd say, I'll meet you on the street corner at this time and we'll go to this thing. Now they say, I'll call you when I'm getting off work. And so if you don't have that piece of technology, you can't participate in the way that society is communicating and coordinating. So what actually ends up happening is that if I now decide I don't want to participate in this codified communications channel, I'm once again a victim of the network effect, because what I'm trying to do is be part of a network that has been destroyed and no longer exists. I'm once again the only one using it, part of a network that has very little value. So yes, I made a choice to have a cell phone, but what kind of choice did I make? I think this is the way that things tend to go now: the choices start out very simple. Do I have a piece of consumer electronics in my pocket or not? And over time the scope of that choice slowly expands until it becomes a choice to participate in society or not. On some level today, to choose not to have a cell phone means, in some sense, to choose not to participate in society. And if we start looking at this pattern of small choices becoming big choices, you start to see it everywhere. One of my favorite recent examples is a Firefox extension called Adblock Plus. I'm sure that many people are familiar with this.
The idea is that it's supposed to help you block ads on the web. The way it works is it allows you to specify a set of regular expressions that match URLs, the advertising URLs for ad banners and stuff like that. The problem is that while it's quite effective, these URLs are constantly changing, so you need to keep changing your regular expressions as well, and it can be kind of a pain to keep up with that. So they've done a clever thing with a subscription model, where you can subscribe to a list of regular expressions that someone else is maintaining. That way you only need one person, or a group of people, who are on the ball looking at these regular expressions, and everyone else just benefits from that research. Now, there are a number of popular subscriptions, and they're not all for ads. There are a few popular subscriptions for blocking trackers, the web bugs that track your movements around the web as you browse along. And so I'm subscribed to one of these tracker lists. And, of course, one of the trackers that I'm most interested in blocking is Google Analytics, because, you know, there's no problem with Google Analytics at all. And one day, something interesting happened: Google Analytics disappeared from the list of trackers in my subscription. If you think about the old world, the way that things used to work, you'd imagine that some Google executive tracked down the person maintaining this list and showed up with a briefcase full of cash, and there was some shady backroom deal where hands were shaken and Google Analytics was removed from the list. As far as I can tell, that's not what happened; something much more subtle and much more insidious actually happened. The way that Google Analytics works is through JavaScript.
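The filter-subscription mechanism works roughly like this (a minimal sketch in Python; the patterns here are invented for illustration, not Adblock Plus's actual filter syntax or any real subscription list):

```python
import re

# A toy "subscription": a list of regular expressions that match
# advertising and tracker URLs, maintained by someone else.
subscription = [
    r"https?://ads\..*",                          # ad-serving subdomains
    r".*/banners?/.*",                            # banner paths
    r"https?://www\.google-analytics\.com/.*",    # a tracker entry
]

compiled = [re.compile(p) for p in subscription]

def should_block(url: str) -> bool:
    """Return True if any subscribed pattern matches the URL."""
    return any(p.match(url) for p in compiled)
```

The point of the subscription model is that only the list maintainer has to update these patterns as the URLs change; everyone subscribed just benefits from that work automatically.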
What happens is that website operators who wish to use Google Analytics just import a little bit of JavaScript into their HTML file, and when the page loads, the JavaScript tracks you. Now, what Google did was start including small, generically useful JavaScript functions in this Google Analytics JavaScript file. What they were essentially saying was: hey, you guys are importing this JavaScript file anyway, just to use Google Analytics on your website. We're going to throw in a few generically useful functions, and we've done this right, we've figured this out, and you can use them as long as you're importing this thing; you can just use them for the core functionality of your website. So now, if you block the Google Analytics JavaScript, you don't just break Google Analytics, you actually break the core functionality of the website, because the JavaScript functions that the website depends on for its functionality no longer exist. So again, what they've done is expand the scope of the choice that you have to make. It used to be a very simple, small choice: do I want to be tracked by Google or not? Simple enough, you can either block the JavaScript or not block the JavaScript. Now the choice becomes larger: do I want to visit this website or not? And that's a much more difficult choice to make. So why is this significant? Okay, this guy's name is John Poindexter. He is, incidentally, the guy who was found to be most responsible for the Iran-Contra scandal. He was convicted of lying to Congress, but never went to jail. And in 2002, he started a government program called Total Information Awareness. He made a speech when he announced the program where he said that data must be made available in large-scale repositories with enhanced semantic content for easy analysis.
Essentially, what he wanted to do was have the government siphon off all email traffic, all web traffic, all credit card history, everybody's medical records, and throw it into one big sink. Just put it in one big pile, don't worry about analyzing it or processing it in real time, and then develop the technology to really efficiently mine this data, to pull out the interesting statistics, relationships, and profiles that they were interested in at any point in the future. So you just collect this big sink of data, and then at any point in the future you can go back and pull out anything you want from it. This was the totalitarian future. This was the cypherpunk nightmare they had been worried about, what they had been thinking about and preparing for all this time. And people freaked out. This was a significant story in the news. People were up in arms. In fact, even Congress was like, what are you guys doing? And eventually the program was shut down. Well, okay, so why was it shut down? First of all, these people are clearly from the old world. They really don't know what they're doing. This was their actual logo. This isn't like The Onion made a logo in parody of this; this was the logo they came up with. They have the pyramid with the eye of God and a light beam shining down on the planet. That little bit of Latin under there means knowledge is power. I mean, come on. If you're going to have some scary government program, you want a friendly logo. Don't call it Total Information Awareness; call it the Kitten Surveillance Society or something. Really, you want something that's colorful, almost cartoonish, something that seems childish and harmless. Something like this. Because if you go back and look at what Total Information Awareness was trying to do, Google has done all of it.
In fact, they have exceeded the original scope of what TIA dreamed of collecting and processing. And one thing we know they've really excelled at, and actually how they've made their money, is being able to really efficiently mine the data they collect and pull out the statistics and relationships in everything they have. Now, clearly, their intent is different. They are not John Poindexter; they're trying to sell advertising. But make no mistake about it: they are in the surveillance business. That is how they make money. They surveil people and use that to profit. And so the effect is the same, right? Now, there's this question: who knows more about the citizens of their own country, Kim Jong-il or Google? I think it's Google. I think it's pretty clearly Google. So once again, there's this question: why are people so concerned about the surveillance practices of Kim Jong-il or the John Poindexters of the world, and not as concerned about people like Google? Again, I think it comes back to this question of choice. You choose to use Google, and you don't choose to be surveilled by John Poindexter or Kim Jong-il. But once again, I think the scope of this choice is expanding, and it's going to become harder and harder to make that choice, until, once again, it's a choice between participating in society or not. Already, if you were to say, well, I don't want to participate in Google's data collection, so I'm not going to email anybody who has a Gmail address, that would be pretty hard to do. You would be, in some sense, removed from the social narrative, cut out of part of the conversation that is essential to the way society works today. So once again, I think this choice is going to expand to the scope of society itself.
So I would say that trends have changed: now we're dealing with a situation where technology alters the actual fabric of society, where information, as a result, accumulates in distinct places, and where the eavesdroppers simply move to those distinct places. The past was really direct: we saw the eavesdroppers trying to embed surveillance equipment into every consumer communications device. The present is much more subtle: instead of doing that, they just move to the few distinct places where information tends to accumulate. Places like Room 641A, in AT&T's Folsom Street facility, where the NSA has been operating a fiber optic splitter for who knows how long. The past was direct: you saw programs like Total Information Awareness directly trying to take your data. The present is a lot more subtle: it starts by soliciting, rather than demanding, your data, and then the eavesdroppers just move to the points where the data collects. So when I'm thinking about the future, the first thing I want to think about is these choices that aren't really choices, and I want to deal with those as problems. I want to acknowledge that the choices are expanding and that in some sense they are becoming demands. So, some projects along those lines. At first I started by thinking, okay, so what's up with Google? The main problem is that they have an awful lot of data about you. They record everything. They never throw anything away. They have your TCP headers. They have your IP address. They issue you a cookie. They know who you are. They know where you live. They know who your friends are. They know about your health, your political leanings, your love life. They know a lot not just about what you're doing, but about what you're thinking. They've also done a really good job of controlling this debate by defining the terms.
They say things like, well, we care about privacy, so we anonymize your information after nine months. Well, what they mean by anonymize is drop the last octet of your IP address. That's not anonymity, right? But they've done a very good job of defining that as anonymity, so that they can just start throwing that word around. They also did this brilliant thing with the Google Dashboard, where they say, oh, you know, we're putting privacy under your control. First of all, they only show you some of the information that they are most obviously capable of collecting about you; they don't show you any of the correlational stuff that they could easily derive about you. And the most diabolical thing about it is that to get privacy, you have to be tracked, because to control your privacy using the Google Dashboard, you have to stay logged in all the time and maintain a cookie. So they've turned the tables on you. And they have warned us, right? Eric Schmidt said this famous thing: if there's something you don't want anyone to know, maybe you shouldn't be doing it in the first place. So they've warned us. And, lastly, we now know that the Aurora attacks on Google were at least partially about intercept. One thing we learned from those attacks is that the government is running intercept systems on Google's networks, and not only that, but other eavesdroppers were trying to gain access to those intercept systems. So what we're seeing is that as more and more data accumulates in these places, it becomes more and more valuable, and eavesdroppers move to those places, and then even eavesdroppers without legal backing try to move to those places. I think we're going to continue to see that problem as these data stores become more and more valuable over time. So one project that I started working on is called GoogleSharing.
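To make the "drop the last octet" scheme concrete, here is a minimal sketch (Python; the addresses are illustrative):

```python
# "Anonymizing" an IPv4 address by dropping the last octet, as described
# above. The result still narrows a user down to at most 256 addresses,
# i.e. a single /24 network, often one neighborhood or office.
def anonymize(ip: str) -> str:
    octets = ip.split(".")
    octets[-1] = "0"
    return ".".join(octets)

# All 256 addresses in a /24 block map to the same "anonymized" value...
candidates = [f"203.0.113.{i}" for i in range(256)]
assert {anonymize(ip) for ip in candidates} == {"203.0.113.0"}
# ...so the "anonymized" address still identifies that network exactly.
```

Which is the point: 255 out of 256 candidates eliminated is a strange definition of anonymity.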
The basic premise of GoogleSharing is that this choice we're given is a false choice, and we shouldn't accept it; we should just reject it. What we should say is: it's not really possible for us to stop interacting with Google, so instead we want something that allows us to continue to participate and still maintain our privacy. The way it works is as a two-part system: a Firefox add-on and a custom proxy server. The add-on sits in your web browser and watches your web requests. All of your non-Google traffic goes directly to the internet, totally unmolested, not modified or impacted at all. If it sees a request to a Google service that does not require a login, things like Google Search, Google News, Google Maps, Google Groups, Google Images, or Google Shopping, but not things like Google Mail or Google Checkout, then it shunts that traffic off to the GoogleSharing proxy server. The GoogleSharing proxy server maintains a collection of identities. Each identity is a unique HTTP header set, sort of like the fingerprint of your web browser, along with a cookie that was issued by Google. These are maintained in a pool, and every time a request comes in, one of these identities is randomly chosen from the pool, the identifying information from the request is stripped off, and the information from the identity is tacked on. The proxied request goes out to Google, Google responds to the proxy, and the response is proxied back to you. The upshot is that Google can't track you, because these cookies are constantly moving around and your traffic does not come from your IP address. Additionally, we encrypt the first link, between your web browser and the proxy, using SSL. That means you actually get SSL protection even for services that Google does not provide SSL access to.
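The identity rotation just described can be sketched like this (a simplified illustration; the pool contents, header names, and cookie values are invented, not GoogleSharing's actual implementation):

```python
import random

# A toy pool of identities: each is a browser-fingerprint header set
# plus a Google-issued cookie. (Values are made up for illustration.)
IDENTITY_POOL = [
    {"User-Agent": "Mozilla/5.0 (X11; Linux x86_64)", "Cookie": "PREF=ID=aaaa1111"},
    {"User-Agent": "Mozilla/5.0 (Windows NT 6.1)",    "Cookie": "PREF=ID=bbbb2222"},
    {"User-Agent": "Mozilla/5.0 (Macintosh)",         "Cookie": "PREF=ID=cccc3333"},
]

# Headers that could identify the real client and must be stripped.
IDENTIFYING_HEADERS = {"Cookie", "User-Agent", "X-Forwarded-For", "Referer"}

def rewrite_request(headers: dict) -> dict:
    """Strip the client's identifying headers and substitute a randomly
    chosen identity from the shared pool."""
    scrubbed = {k: v for k, v in headers.items() if k not in IDENTIFYING_HEADERS}
    scrubbed.update(random.choice(IDENTITY_POOL))
    return scrubbed
```

Because each request draws a fresh identity from a pool shared by all users, no single cookie or header fingerprint accumulates a coherent history for any one person.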
Google now has SSL-protected search, but none of the other stuff, like Maps or Shopping or Groups or Images, is SSL protected. But you get that with GoogleSharing. And how does it look? It's indistinguishable from using Google directly. You can use all of their services, Google Maps, Google News, whatever you like, totally transparently; the only difference is that in the bottom right-hand corner there's a little status indicator telling you that GoogleSharing is enabled. Additionally, Google has sort of given us a win here by allowing us to make SSL-protected searches. So now what we can do is have the client prefetch cookies from the GoogleSharing proxy server and then make an SSL connection end-to-end, directly to Google. Now the GoogleSharing proxy server is proxying the data and sharing the cookies and identifying information around, but it cannot actually see the requests, because they're SSL protected all the way to Google. So you don't even have to trust us not to examine your requests. Anyway, this project is available online. It's been active for about six months now, we have about 80,000 users, and you can get the add-on from googlesharing.net. Another project that I think is interesting is this thing called FaceCloak. It was developed by Professor Urs Hengartner at the University of Waterloo. His basic premise is that in using Facebook, what you're trying to do is share information with your friends, or friends of friends; you're not actually trying to share information with Facebook. There's no reason to give them the information. So he developed an interesting proof-of-concept Firefox add-on that sits in your web browser, and any time you type anything into Facebook, you can prefix it with two equals signs. When you do that, it will just transparently encrypt it before sending it up to Facebook. And then he has these mechanisms that allow you to really easily share keys with your friends.
That way, if your friends are also running the add-on, it will transparently decrypt everything before displaying it. The net result is that everything works just the same, and everything appears as normal to everybody in your friend group, but Facebook never gets the data. It's an interesting project and an interesting theory that I would like to see more of. You could possibly apply it to things like Twitter. Twitter is one part broadcast and one part conversation, and for the conversation bit there's no reason for Twitter to actually have the data. My second thesis here is that the crypto war was largely about data freedom. At the time, you were talking about trying to get information out into the world, and other people were trying to prevent you from getting information out into the world. And so it was very easy to extrapolate a future of data control from that. In those moments it was easy to think, wow, we're going to be dealing with this forever. And so a lot of projects were born out of that reality. A lot of the anonymity and privacy projects that we have today are things like darknets, data havens, hidden services. Does anybody here use those things? One person? Two people. Not many. I think that's because that's not really the future we got. We got this other future that we've been talking about, something more subtle, more complicated, this social democracy. And so one thing that I've noticed is that privacy advocates, people who are working on these privacy projects, are really in love with the Other. These are people like Iranian dissidents or Chinese dissidents in faraway parts of the world. The interesting thing to me is that if you look closely at Iranian dissidents or Chinese dissidents, they really have very little in common with what privacy advocates are doing. And yet privacy advocates are still sort of obsessed with these struggles.
I would suggest that it seems like that's because these are the few places in the world that are still speaking that language of data control, that language of information freedom, that all of our projects were born out of. So they just happen to dovetail, even though they don't really connect with the lives of the people working on the projects. And I would say that even those places are beginning to realize that the strategy of data control is not entirely effective. Iran recently announced that they were launching a national email service. Unlimited storage. Nice web interface. I think even they realize that the Google strategy is more effective than their strategy of deep-packet inspection and trying to lock everything down. I also think that the loss of the crypto war was less about giving up and more about changing strategies. In particular, the strategy of key escrow has eventually become a strategy of key disclosure. We've seen laws like RIPA in the United Kingdom and other parts of Europe that essentially say: okay, we're going to let you do cryptography. We're not going to try to regulate that, because that's a really hard problem, and we have failed to do it in the past. Instead, anybody can use whatever cryptography they want. But if at any point in the future we want to see what this encrypted traffic is, we come to you and we ask for your key, and if you don't give it to us, you go to jail. So this is the problem of key disclosure, and I think it's a real problem for the secure protocols that we're using today. So again, if I'm looking into the future, the first thing I want to do is deal with the choices that aren't really choices. The second thing I want to do is worry a little bit less about information freedom. And the third thing is that I want to worry a lot more about forward security and this key disclosure problem.
What happens when you show up at customs? What happens when you're living in Europe and someone comes knocking on your door? Nikita Borisov, Ian Goldberg, and Eric Brewer wrote a pretty nice paper in 2004 called Off-the-Record Communication, or, Why Not To Use PGP. In this paper they make a pretty simple observation. They say: okay, everyone's familiar with the PGP model. You have an email you want to send to Bob, you encrypt it with Bob's public key, and you send it to Bob. The next time you want to send an email to Bob, you encrypt it with Bob's public key and you send it to Bob. You could do this for 20 years, and when Bob's key is compromised, all previous traffic is compromised as well. Someone could easily just record all traffic, which is totally not unrealistic today, and then at any point try to compromise Bob's private key and go back and decrypt all of the previous traffic. So the first thing they note is that one key compromise affects all previous correspondence. The second thing that seems weird is that the secrecy of what I write is a function of your security practices. I feel like I'm somewhat paranoid, I have reasonable security practices, but I don't know about the people I'm communicating with; I would like for the secrecy of what I write to somehow be a function of my security practices. And the third thing they note is that the PGP model gives you authenticity but not deniability. If I sign my email, hey Bob, today I was thinking that Eve is a real jerk, and at some point this email is compromised and discovered, there is no way for me to deny that I wrote it. The nice thing is that Bob knows that I wrote it, but the problem is that there's no way for me to deny to everyone else that I wrote it. You have this deniability problem. So the OTR model works a little bit differently. Every time you wish to communicate, you do an ephemeral key exchange. You have a public/private key pair just like normal, except it is only used for signing ephemeral key exchanges.
It is never used to actually encrypt data. Then every message you exchange also includes one half of a new key exchange, so each time you complete a message exchange you're also doing a new key exchange. The key material you have is constantly rolling forward, and the old key material is discarded. So if at some point in the future somebody comes and tries to get something off of your computer, there's nothing for them to get. The old key material is gone; there's nothing that can be used to decrypt previous traffic. It gives you forward security. Additionally, messages are authenticated with message authentication codes (MACs). The key for the MAC is derived from the session key, and two parties have that session key: Alice and Bob, me and Bob. So signatures are undeniable, because there is only one possible author, but MACs are deniable, because they have two possible authors. Both me and Bob know the key that could be used to authenticate a message. If Bob receives a message with a MAC on it, he knows that he didn't write it, so it must have come from me. But he can't take that and show it to the world and say "Moxie wrote this," because it's just as likely that he wrote it. Additionally, since the session key is constantly rolling forward, the old MAC keys are constantly rolling forward as well. And every time they roll forward, you can broadcast the old MAC keys in the clear, and now anybody could just as likely have created an old message. So you get authenticity, but you also get deniability. These are two properties that I think are going to become more and more important as we roll forward. Some projects I've been working on along those lines: one is called Whisper Systems, and the idea is to try to bring forward-secure protocols to mobile devices. So these are those two spaces, right?
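One round of the rolling key exchange and the MAC deniability just described can be sketched as follows. This is a minimal sketch, not OTR's actual wire protocol: the Diffie-Hellman group is a toy 32-bit one, and the "mac" key-derivation label is illustrative.

```python
import hashlib
import hmac
import secrets

# Toy DH parameters -- a real implementation would use a standard large
# group (e.g. from RFC 3526); this 32-bit prime is for illustration only.
P = 4294967291  # largest 32-bit prime
G = 5

def dh_keypair():
    """Generate one half of an ephemeral key exchange."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def session_key(my_priv, their_pub):
    """Each side derives the same rolling session key from the exchange."""
    shared = pow(their_pub, my_priv, P)
    return hashlib.sha256(shared.to_bytes(8, "big")).digest()

# One round of the ratchet: each side contributes a fresh ephemeral half.
a_priv, a_pub = dh_keypair()
b_priv, b_pub = dh_keypair()
k_alice = session_key(a_priv, b_pub)
k_bob = session_key(b_priv, a_pub)
assert k_alice == k_bob  # same session key on both ends

# The MAC key comes from the shared session key, so it has two holders:
# Bob gets authenticity (he knows he didn't author the tag himself),
# but he can't prove authorship to anyone else -- he holds the same key.
mac_key = hashlib.sha256(k_alice + b"mac").digest()  # illustrative label
tag = hmac.new(mac_key, b"Eve is a real jerk", hashlib.sha256).digest()

mac_key_bob = hashlib.sha256(k_bob + b"mac").digest()
tag_bob = hmac.new(mac_key_bob, b"Eve is a real jerk", hashlib.sha256).digest()
assert hmac.compare_digest(tag, tag_bob)  # Bob verifies -- and could have authored

# Ratchet forward: the old private halves are deleted, so nothing on disk
# can decrypt traffic an eavesdropper recorded during this round.
del a_priv, b_priv
```

The next message exchange would carry fresh `dh_keypair()` halves, repeating the round above so the key material keeps rolling forward.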
Mobile devices are this place of a choice that isn't really a choice, and forward-secure protocols are this thing that's becoming increasingly important with this new strategy of key disclosure. So the apps: one is called RedPhone, and basically it's an encrypted voice application for mobile devices. It works over VoIP, and there's this problem, right: doesn't VoIP suck? It tends to, and it tends to really suck in the mobile environment. So looking at this, we wondered: okay, what is so bad about VoIP in the mobile environment? And we realized that the problems almost always come down to the signaling layer. The way it usually works is there's some Asterisk server out on the internet, and you do signaling through this thing. You have to maintain a TCP connection and then use SIP to notify the other client that you want to call them, and all this stuff, right? And that's a big problem, because in the mobile environment your connection status is usually pretty flaky: you're moving between networks, and maybe you still have a connection, or maybe you think you do but you don't. It also doesn't allow your device to go to sleep, because you have to maintain this UDP or TCP connection, so your device can't ever really power down, which is bad for your battery. Well, what we realized is that the mobile environment already has an entire signaling infrastructure built in. The telecoms made it for us, and they use it to signal mobile devices, so potentially we could just leverage that. So now, instead of some Asterisk server allowing you to communicate with other devices using SIP, we just use SMS, which is a signaling piece that's already built into the mobile environment.
The nice things about this are: you don't need to maintain a constant network connection to some server, so your phone can go to sleep; you don't need the equivalent of a Skype ID or a SIP account or something like that, because you can do addressing based on your normal phone number, since we're using SMS for the signaling; and third, you don't need to run a VoIP server or set up an account or anything like that. You just install this one piece of software, and now you're ready to call anybody whose number you know. So then the question is: okay, how do we provide security? Normally VoIP is just a simple RTP stream of voice data between two devices. Instead, we have a ZRTP stream. ZRTP is a protocol that was developed by Phil Zimmermann, and it's actually a pretty nice protocol. The way it works is that you do an ephemeral key exchange, and then from that key material you derive what's known as a short authentication string. Once the call is set up, at the bottom of the in-call screen you display the short authentication string, in this case the two words "flatfoot Eskimo." Now, if there were a man-in-the-middle attack, the key material between the two devices would be different: you'd have one key on one side of the man in the middle and one key on the other side, so these two words would be different on the two phones trying to talk to each other. So what happens is you set up a call and you just read these two words to each other: "flatfoot Eskimo." If they're the same on both sides, then you know the call is authentic. So you don't need certificates, certificate authorities, fingerprints, digital signatures, web of trust, none of that stuff. You just read the two words to each other, and you know you have an authentic call. It's also kind of fun: the two words always end up being somehow like refrigerator-magnet poetry or
something like that; the two words always end up being, you know, some kind of profound haiku. "Flatfoot Eskimo." Yeah, that's how I'm feeling today. So it's actually kind of fun to read the things to each other. The other app we have right now is called TextSecure, and it's an encrypted text messaging application using a protocol that's a derivative of OTR, this thing that gives you nice forward security and deniability properties. It works just like the normal stock SMS app for Android; we've cloned it feature for feature. So the idea is you can install this and fully replace the default messaging app and use it to message anybody you like, and at the same time, if someone else is running this, you get an encrypted session. The way the session works, once again, is that every message you exchange includes one half of a new key exchange, so the key material you're using is constantly rolling forward. If someone were to record all of your SMS traffic and later try to compromise your phone to get something that would decrypt all of that, they can't, because those keys are gone. So anyway, these projects are a small hope of mine that we can come up with technical solutions to reduce the scope of the choices that we need to make. The Whisper Systems projects are actually at a table in the vendor area, and we're doing demos and stuff if you want to stop by and check it out. Otherwise, you can download all of the stuff for free: the Android apps and GoogleSharing are online. Feel free to contact me, and thank you for listening to me talk.
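The short-authentication-string check that the ZRTP handshake relies on can be sketched roughly like this. The eight-entry word list and the key material here are hypothetical stand-ins; real ZRTP draws from the 256-entry PGP word lists and derives the SAS from the negotiated key material.

```python
import hashlib

# Tiny stand-in word list -- real ZRTP uses the 256-entry PGP word lists.
WORDS = ["flatfoot", "eskimo", "tumbleweed", "stairway",
         "village", "pioneer", "glucose", "waffle"]

def short_auth_string(key_material: bytes) -> str:
    """Derive two human-readable words from the ephemeral key exchange."""
    digest = hashlib.sha256(key_material).digest()
    return f"{WORDS[digest[0] % len(WORDS)]} {WORDS[digest[1] % len(WORDS)]}"

# Both ends of an honest call hold the same key, so the words match.
alice_key = bob_key = b"hypothetical-negotiated-key"
print(short_auth_string(alice_key))

# A man in the middle necessarily holds a different key on each leg of the
# call, so (with overwhelming probability on a real 256-word list) the two
# phones would display different words and the callers would notice.
mitm_leg = b"key-on-the-far-side-of-the-mitm"
print(short_auth_string(mitm_leg))
```

Reading the two words aloud over the voice channel itself is what replaces certificates and fingerprints: a man in the middle would have to fake the words in each caller's own voice, live.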