Thank you all very much. As a member in good standing of the Society of After-Dinner and Conference Speakers of England and Wales, I am contractually required, as the final speaker before the drinks are served, to have a joke about being the final speaker before the drinks are served. This is that joke. Thank you. We live in a world made out of computers, and by that I don't mean the kind of Rollerball, curvilinear, Internet of Things, promotional-video world made out of computers. I mean that we are literally, presently, living in a world made of computers: our buildings are computers that we keep our bodies in. When you remove the computers from modern buildings, they rapidly become uninhabitable, whether that's because they have such a high specification of insulation that without computer-controlled ventilation they fill up with black mold and you have to scrape them down to the foundation and start over again, or because they are one of those willowy starchitect supertowers going up in all the big cities of the world that use computers to dynamically allocate their load stresses against seismic and wind forces, and when you take the computers out of those buildings, they fall over. The most salient fact about those buildings isn't what color they are painted or how much the condos cost; it's what software they are running. And it's not just our buildings. Increasingly, we ride around in cars that are computers that we put our bodies into, that go 80 miles an hour, unless you are on the 101, in which case it's 10 miles an hour. 
And I don't mean that these are Google self-driving cars. I mean that if you go to DEF CON or CCC or any other big security conference, you will see every year, monotonously, one after another, like some kind of horror movie where Freddy Krueger's hand just keeps coming out of the grave, presentations from people who figured out how to get into the car's informatic systems using such innocuous things as the Bluetooth interface for sound, and can use the informatics to take over the steering and the brakes. The most salient fact about your car is the computer in it. The Boeing 747 that I'm flying home to London in tomorrow night, that is a flying Sun Solaris workstation in a very fancy aluminum case connected to some tragically badly secured SCADA controllers. And it's not just that we keep our bodies inside of computers; increasingly, we keep computers inside of our bodies. I mean, there's a reason that when Dick Cheney had his defibrillator implanted, he had the wireless interface turned off, even though that means all of his firmware upgrades now involve a scalpel: because not long after Cheney made that decision, Barnaby Jack, the security researcher, gave a presentation showing that from 30 feet away he could rewrite the firmware on implanted defibrillators using their wireless interfaces, cause them to seek out other implanted defibrillators to reprogram them, and then deliver lethal shocks on his timing. So you probably know people walking around with computers inside of their bodies already, computers that are the most salient fact about their ongoing ability to breathe. But if you're like me and you grew up with Walkmans, or if you're a little younger and you grew up with MP3 players, you have logged enough punishing earbud hours that there will come a day, if you're not killed by a self-driving car, when you will have a hearing aid, and it's vanishingly unlikely that that hearing aid is going to be a beige retro hipster analog transistorized hearing aid. 
It's going to be a computer that lives in your body, and depending on how it's configured it will either tell you what's out there to be heard, or it might tell someone else what you're hearing, or it might stop you selectively from hearing some of the things that people say, or it might even make you hear things that aren't there. And I'm a science fiction writer by trade, and so I sometimes think that when I talk this way people think I'm being metaphorical about some futuristic world in which you will have computers in your body. But this is really our contemporary reality. Like many of you, I'm a very frequent traveler. I'm changing the climate. Ask me how. And when you're a frequent traveler, your situational awareness is all about baseboards and electrical outlets, because your whole life is recharging your batteries. And so one day I had arrived at an airport lounge, and I was feeling very smug because I had seized the lone electrical outlet and I was charging my laptop before the flight, and a man came up to me and he said, do you mind if I use that electrical outlet? And I was like, this is not on, right? So I looked at him over my glasses and I said, I'm charging my laptop before the flight. And he rolled up his pants leg and he showed me that attached to the stump of his amputated leg was a robot leg. And he said, I need to charge my leg before the flight. And I said, the electrical outlet is all yours. So in a world where we keep our bodies inside of computers, and where we fill our bodies with computers, there is nothing more important than information security, than computer security. And we have in that regard a great deal of luck, because although there are many subtle and difficult things about information security, we have a nearly unassailable bedrock upon which to build our information security practices. And that's cryptography. 
And cryptography has this amazing characteristic to it that most people don't grasp until they begin to practice it, which is that it's all public. We tend to assume that if you're going to keep something secret, you also keep how you're keeping it a secret, a secret. And in fact, that was the practice until World War II, when all of our messages that were sensitive began to fly around on public media, on airwaves that anyone who had a receiver could receive. And once it became the norm that your adversary could see your messages in transit, we had to start developing security methodologies that could withstand knowledge by our adversaries of what we were doing. And in this regard, we were actually recapitulating some of the history of technology and science as we know it today. After all, before we had science, we had a thing that looked a lot like science called alchemy. And the major difference between alchemy and science is that alchemists understood that once one of them had discovered how to turn lead into gold and had flooded the market with gold, there would be no money in pursuing alchemy, right? The first one takes all. And so alchemists were really committed to not telling other alchemists what they thought they knew. And as a result, they fell prey to the ongoing tragedy that is the boundless capacity of human beings to kid themselves about what they think they know. Alchemists would design experiments, run the experiments, and then blindly assert that their experiments were successful, which is why every alchemist discovered for himself, in the hardest way possible, that drinking mercury is a bad idea. And the 500-year period during which alchemists didn't tell each other what they thought they knew, we call that period the Dark Ages. 
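That idea, that the method can be completely public while only the key stays secret, is known as Kerckhoffs's principle, and it can be sketched in a few lines of Python. This is my own illustration, not something from the talk; the messages and keys here are made up:

```python
import hashlib
import hmac
import secrets

# The algorithm (HMAC-SHA256) is completely public; only the key is secret.
key = secrets.token_bytes(32)
msg = b"attack at dawn"

# Alice tags the message with the shared secret key.
tag = hmac.new(key, msg, hashlib.sha256).digest()

# Bob, who holds the same key, can verify the tag.
assert hmac.compare_digest(tag, hmac.new(key, msg, hashlib.sha256).digest())

# An adversary who knows the algorithm but not the key cannot forge a tag.
forged = hmac.new(secrets.token_bytes(32), msg, hashlib.sha256).digest()
assert not hmac.compare_digest(tag, forged)
```

The point of the sketch is that publishing the algorithm costs the defenders nothing: everything an attacker lacks is concentrated in the key.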
And we call the period when alchemists started telling each other what they thought they knew, when they started publishing, when they started submitting themselves to the punishing and often humiliating practice of adversarial peer review, when your friends point out the mistakes you made and your enemies tell you how stupid you are to have made them, we call that moment the Enlightenment, and we call what came out of that science. And thanks to the efforts of early cryptographers and code breakers, notably the folks at Bletchley Park, where Alan Turing worked with the Polish mathematicians living in exile who had fled Nazi-occupied Poland, and the scientists at the Institute for Advanced Study in Princeton, who included the Hungarian mathematicians in exile, particularly Johnny von Neumann, we developed this practice of cryptography that involves telling other people how our crypto works, so that they can winkle out the problems that we've made, our blind spots, and help us shore them up and make our crypto better. Now, there's another legacy of Turing and von Neumann, of course. It's not just crypto, it's computers. Because the way that they invented modern crypto was by inventing modern computers. First, von Neumann not only invented the von Neumann architecture, he had time left over, so he invented the non-von Neumann architecture too. And what they developed for us was the general purpose computing architecture, which is a single machine that can run all the programs that we can express in symbolic logic. So no longer did we need to build a special computer to calculate our ballistics tables and another computer to tabulate our elections; now we could run a single computer that could execute all the instructions that we could write down that made sense, which was an amazing, miraculous thing. A thing we're still in the first years of understanding and making sense of, I believe, even though it's half a century later. 
But one of the things that we were very slow to realize, once we got Turing completeness, once we got universality, was that it was kind of nice to have machines that only did a few things, instead of machines that did everything. Every now and again, you really want a computer that can't run all the programs; you just want a computer that can run some of the programs. Like, you might decide to make yourself a little social network for some reason. I guess people still keep making social networks. And you may think that the way that you will help your audience express their awesome individuality is by giving them glittery unicorn GIFs, it's pronounced GIF, across the top of their pages, and then you can give them a little toy scripting environment, like five instructions that they can use to make the unicorns dance. And that seems like a great idea until the next DEF CON or CCC or other big security conference, when some researcher shows up and says, yeah, you had five instructions, so I kind of added them together and figured out how to gang them up until I had a fully Turing-complete computing environment in your glitter-GIF unicorn language, and I wrote a virus and it's now infected all the pages in your social network. Sometimes we want printers that can just execute the code to convert vectors to bitmaps and instruct a little actuator to drop some ink on some pressed vegetable matter, and not printers that can run all the instructions in the world, because it turns out that when malware gets into your tragically badly secured computers and onto your printers, and they're all tragically badly secured, once it's there, it can start crawling your network for machines that are unpatched against known vulnerabilities, then infect them, and then act as a bridge to exfiltrate data out of your corporate network. Again, one of the horrific things you can see every year if you go to a big security conference. 
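That "gang a few instructions up into a whole computer" trick is not an exaggeration: a machine with a single instruction, subtract-and-branch-if-not-positive (SUBLEQ), is known to be Turing complete. Here is a minimal sketch in Python; the interpreter is the standard SUBLEQ scheme, while the memory layout and the little addition program are my own illustration:

```python
def subleq(mem, pc=0):
    """Run a SUBLEQ machine: mem[b] -= mem[a]; jump to c if the result <= 0.
    A negative jump target halts the machine."""
    while pc >= 0:
        a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
        mem[b] -= mem[a]
        pc = c if mem[b] <= 0 else pc + 3
    return mem

# Program: add mem[A] into mem[B] using only SUBLEQ, then halt.
A, B, Z = 9, 10, 11          # data cells placed after the three instructions
prog = [
    A, Z, 3,    # Z -= mem[A]            -> Z becomes -7
    Z, B, 6,    # mem[B] -= Z            -> B becomes 5 + 7
    Z, Z, -1,   # Z -= Z (clear Z), then jump to -1, i.e. halt
    7, 5, 0,    # mem[A] = 7, mem[B] = 5, Z = 0
]
result = subleq(prog)
print(result[B])  # -> 12
```

With one instruction you get addition; with enough memory and patience you get everything, which is exactly why "just five instructions" offers no safety at all.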
You know, Turing completeness is so ubiquitous, we find it so hard to get away from, that it turns out that collectible card games like Magic: The Gathering are Turing complete, and with the right deck and enough time, and in this case the time needed to run a very sophisticated program might exceed the heat death of the universe, but with the right deck and enough time, you can run all the programs. You can run the firmware for your Lockheed Martin jet and you can run Pac-Man on your Magic: The Gathering technology. So the amazing thing about crypto and general purpose computing, these two legacies, is the principle of end-to-end cryptography. Because end-to-end cryptography means that if you can write a cipher that you believe to be secure, because you've subjected it to rigorous peer scrutiny, you've indulged in the only security methodology we have, which is to ask other people what mistakes you might have made, and you know what computer you're encrypting on, it doesn't matter if you don't trust the points between you and the other side. You can send messages to the people that you trust and not worry about the people in between that you don't trust. They can't read your messages on the wire. So long as you can see the code and you know about the end points, you don't have to trust the middle. Ultimately, though, if you can't see the code that you're using to encrypt, if you don't know about the machines that you're running on, you can't ever really have that certainty. Someone can tell you that they've implemented a cipher that is well understood, but you can't know whether or not they've added a backdoor or made a mistake unless you can audit it. You need the adversarial peer review, or you're back in alchemy land. 
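The "trust the ends, not the middle" idea can be sketched with the simplest cipher there is, a one-time pad. This toy stands in for a real, peer-reviewed cipher; it is only information-theoretically secure when the key is truly random, as long as the message, and never reused, and the names here are my own illustration:

```python
import secrets

def xor_bytes(key: bytes, data: bytes) -> bytes:
    # One-time-pad XOR: encryption and decryption are the same operation.
    return bytes(k ^ d for k, d in zip(key, data))

# Alice and Bob share a secret key; the relay in the middle never sees it.
message = b"meet at noon"
key = secrets.token_bytes(len(message))

ciphertext = xor_bytes(key, message)    # all the untrusted relay ever carries
recovered = xor_bytes(key, ciphertext)  # Bob, holding the key, decrypts

print(recovered)  # -> b'meet at noon'
```

The relay carries only random-looking bytes; everything it would need to read the message lives at the two end points. That is why the end points, the code and the machines at each end, are the part you have to be able to audit.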
And this really matters when it comes to the Internet of Things, because as our world is made of computers, as our computers are inside of our bodies, we're starting to get an inkling of what it means not to be able to trust those computers, because they are black boxes, designed by vendors who treat the fact that you can't see inside of them as an existential matter. If you could, you might be able to do something that would affect their bottom line, and so they go to enormous lengths to stop you from seeing how your computers work. And going back to those videos about the Internet of Things, where everything looks like Rollerball, if you've ever noticed, the first thing that happens when someone walks into their video Internet of Things house is they wave their hands and all the lights come on, and then they speak to the house. They say, uh, house: tea, Earl Grey, hot, and the machine just sort of hops to life in the kitchen, and they get their tea. Well, what is the implication of a house that you can wave your hands at and talk to anywhere you happen to be? That's a house where everywhere you are, you're on camera and you're on mic, and if you can't trust that networked camera-microphone panopticon that you share living quarters with, you are in potentially a lot of trouble. And there are plenty of companies, and there are plenty of governments, who have what they see as good reasons to stop you from seeing how those computers work, and they have powerful tools to accomplish this end, tools that have their origins in the old days of the copyright wars. So in the early days of the copyright wars, the American government adopted a statute called the Digital Millennium Copyright Act, and the U.S. Trade Representative encouraged countries all over the world that trade with the U.S. to adopt their own versions. The DMCA has got lots of complicated provisions. 
One of the most pernicious, though, is Section 1201, which says that if you put a lock on a device, it's against the law for the person who owns that device to unlock it, or to tell people information that they might be able to use to unlock it, or to point out vulnerabilities that can be used to bootstrap a jailbreak that might unlock it, or to publish the keys, if you can recover them from the device, so that other people can unlock theirs. And this was used in the early days of the copyright wars to accomplish all kinds of business objectives. Like, for example, you may have noticed that if you stick a CD, which doesn't have a digital lock, in your computer, your computer will automatically wake up a piece of software whose slogan was rip, mix, and burn, and it'll offer to rip that CD for you so you can get more value out of it. You can rip it, you can mash it up, you can put it on your mobile player, you can put it in your car, you can stream it, you can turn it into an alarm tone or a ring tone. All that value is unlocked from your CD because there is no digital lock, and so vendors can make interoperable products. And all the things that are legal to do with the music on a CD are legal to do with the video on a DVD, but the DVD has a digital lock, and because it's against the law to remove the digital lock, you can't accomplish the legal things that you could get out of the DVD if you unlocked it. So if you want to watch the movies on your DVD on your phone, you have to buy them again from a store like the iTunes Store or the Amazon Video Store, and you can see why this is a great deal for the people who made the CD and the DVD, but as the owner of that product, that latent value was confiscated from you, and you were never able to realize it. 
So firms really liked this, because they realized that instead of having an ecosystem where their competitors and other companies came in and unlocked new value in the things that they sold you, they could capture 100% of that value and decide what of that value you can have. I think of it sometimes as the urinary tract infection business model. With a CD, all the value flows in a kind of healthy gush, but with a DVD, if you want to add a new feature to your video, it comes in a kind of burning, painful trickle, and it costs you a little extra every time. But the problem isn't just that we have this consumer rights story where some of the stuff that we own can't do as much as it should be able to do. It's that this model has proven to be a very valuable tool for other people who have other objectives. So I live in the United Kingdom, as you can tell by my accent, and we've just re-elected a guy named David Cameron, and one of his election promises was that if he was elected, he would ensure that nobody would have crypto that the government couldn't listen in on, that the government couldn't break. And that's not just an Anglo madness. That's something that FBI Director James Comey has called for, and many other people in other countries around the world in positions of power have said that this is one of their goals. Now, we don't know how to make a computer that can run all the programs except for the ones that execute good crypto, but what we do know how to make is computers that only run signed code, and we can imagine governments saying, well, we're only going to sign the code if it has a back door in it that we can use, and because we already have this legal structure that says if there's a digital lock, like a thing that checks to see whether your code has been signed before it can run, that law will protect our technology, even though the technology isn't hard to break. 
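The "only runs signed code" check that would make such a mandate enforceable is mechanically simple. Here is a sketch, using HMAC with a device-embedded key as a stand-in for the public-key signatures that real secure-boot chains use; the key, firmware strings, and function names are all hypothetical:

```python
import hashlib
import hmac

# Hypothetical key burned into the device at the factory. A real secure-boot
# chain would hold only a public verification key here, not a signing key.
DEVICE_KEY = bytes(32)

def sign(code: bytes) -> bytes:
    """Produce the vendor's signature over a firmware blob."""
    return hmac.new(DEVICE_KEY, code, hashlib.sha256).digest()

def load_firmware(code: bytes, signature: bytes) -> str:
    """Refuse any code the vendor hasn't blessed -- the 'I can't let you
    do that, Dave' check. A real loader would now jump into the code."""
    if not hmac.compare_digest(sign(code), signature):
        return "refused: unsigned code"
    return "running"

blessed = b"vendor-approved firmware"
print(load_firmware(blessed, sign(blessed)))            # -> running
print(load_firmware(b"jailbreak", sign(blessed)))       # -> refused: unsigned code
```

Note that nothing in the check cares whether the signed code is trustworthy, only whether the key holder blessed it; whoever controls the key, vendor or government, controls what the machine will run.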
The combination of that technology and felony convictions for people who unlock their devices will allow us to start putting back doors in everything. Of course, there's the walled garden world, which got really, really big on this digital lock stuff. Apple's digital locks don't stop you from pirating iOS apps, if you want to do that, you just download an illicit jailbreak, but they do stop you from running a commercial store that competes with the App Store and offers ISVs a better deal than the 30% vig Apple takes on every piece of software sold through the App Store. Maybe you want to start one that takes a 15% vig, offers ISVs a better deal, and starts to eat Apple's lunch. The inability to lawfully break a digital lock means that Apple can prevent that kind of competition in the market, and so can Sony and Nintendo and everyone who makes those game consoles, as well as the burgeoning ecosystems for things like Nest. They all rely on being able to extract rent from customers and from suppliers by using this law to stop people from knowing how their devices work. After all, an iPhone isn't a phone that is incapable of running software that hasn't been blessed by Apple. An iPhone is a phone that's been designed to refuse to run software that hasn't been blessed by Apple. It's the "I can't let you do that, Dave" phone. In the Internet of Things, we're seeing that investors really want to know what your ecosystem play is before they'll give you any capital. They want to know that your smart thing isn't just a single purchase that your customers make, that it's locked into an ecosystem that requires your customers to buy their interoperable products from you or from firms that you have a commercial relationship with, partly because that lets you extract rent from them, you can charge them for the license to run inside of your Internet of Things, and partly so that you can make covenants, right? 
Like, you may want to promise a carrier that none of the apps in your mobile ecosystem allow the kind of tethering that they can't detect in the middle and shut down. And so that deal with a carrier is worth more if you can make that promise, and you can only make that promise if you can lock third parties out of your store when they don't play nice with you. And although nobody wants this in their devices, it's a thing that keeps recurring in our new devices. You may have seen that Keurig, the people who make the K-Cup, brought out a new model with digital rights management in it for their coffee, so that you can't run third-party pods through their coffee machines. And the CEO just did an investor call where he blamed the 25% drop in their sales on DRM, but he didn't say that they were going to get rid of the DRM. He said that they needed to educate their customers, and the fact is there's never been a customer who woke up and said, you know what I want? A coffee machine that lets me do less. And car companies have really latched on to this business model, not because they are worried that their competitors will steal their ABS code or their other informatics, but because if the only way to fix a car is to do software diagnostics on it, and the only way to do software diagnostics is to license a tool from them, because otherwise you would have to break a digital lock on the car's engine, then they can say to the mechanics, if you want a license from us to read the diagnostics off of the car's informatics, you have to covenant to us that you're only going to buy parts from us, and that you're not going to buy parts from third parties who might sell you cheaper and/or better parts for these cars. This model has been so powerful that now you have venerable automotive companies and industrial companies arguing that you don't and can't own your car. So every three years, the Copyright Office asks if we should make any exemptions to this rule. 
This year was one of those three years, and one of the petitions was to let mechanics create diagnostic tools that will let them read cars even if they don't have a license arrangement with the manufacturer. And John Deere filed reply comments saying, oh, you own the engine, but because you can't own the software, you're really a licensee of the whole package, right? Like, you are that sucker who clicked the EULA that said, by buying this tractor, you agree that we're allowed to come over to your house and punch your grandmother and wear your underwear and make long distance calls and eat all the food in your fridge. And the car companies aren't just looking at the high end, the people who've bought really expensive cars, but also at the low end, the people who don't have any money at all. So there's this exciting new subprime car loan industry. There's a million cars on the road in America that were bought with subprime car loans, which are just what they sound like: loans given to people who don't have enough money to qualify for the loan, or enough credit to qualify for the loan, but are given a loan anyway. Those loans are securitized and packaged up, so you want to make sure that the coupon returns well and that the thing is worth as much money as possible. The way that they get there is by fitting the ignition with an override that's networked and location-aware, so that if you miss a day's payment, first of all, it's got its own speaker. How obnoxious is this, right? It's got its own speaker. If you miss a payment, the car starts saying, you're late on your payments. You're late on your payments. You're late on your payments. But also, if you drive out of the area where your car is allowed to drive according to the terms of your lease, the ignition won't start again. So when the New York Times profiled the use of this technology, they talked to one woman who had crossed a county line without realizing it when she took her family out for a walk in the woods. 
And at the end of the day, alone in the woods with no one else around and out of cell phone range, her car refused to start and get her home again. It didn't end well, although there weren't wolves or anything. So there's a guy named Hugh Herr who runs the MIT Media Lab's prosthetics lab, and he gives a much better presentation than I do, because I'm a words guy and he's got pictures. So instead of, you know, a picture of him on the screen, there's pictures of all the awesome ways that they figured out to put computers in people's bodies while he walks through them: legs, arms, feet, neural prostheses for people with untreatable depression, and on and on and on. And then the last slide, it's a killer, is him climbing a mountain, all in Gore-Tex, stuck to the mountain, super ripped, and he has no legs below the knees. He's got robotic prostheses to help him climb the mountain. He said, yeah, I'm a mountaineer and I lost my legs in a mountain climbing accident, to frostbite, and he's been walking around like this the whole time, and he pulls up his pant legs, and there he is, robot from the knees down. He starts charging around and leaping like a mountain goat. So the first question anyone asked when I saw him do this was, how much did your legs cost? And he named a price like you could buy a nice townhouse in Noe Valley or a brownstone in New York or, like, a Mayfair row house. And then the second question anyone asks is, well, who's going to buy those legs? And he said, well, everybody, because if it's a 60-year mortgage on your house or a 60-year mortgage on legs, you'll take the legs. But imagine, in a world where owners aren't allowed to know and interrogate and change the software on their devices, imagine what subprime leg loans look like, right? Imagine what happens when you miss a payment. Your legs walk themselves back, right? And imagine what lawful interception overrides look like on parts of your body, right? 
Where the cops can pull your legs over, and where anyone who the cops have leaked the authentication keys to can also pull your legs over. And where the cops in countries that we think of as being less democratic than our own, like Syria or Russia or Iran, can also exercise the same facility. Moreover, the whole model rests not just on a prohibition on telling people how the software works, but also on telling people what its vulnerabilities are, right? If you know what mistakes the programmers made in making your device, then you know how to jailbreak the device and trick it into installing code of your own design that overrides these anti-features. And when you're not allowed to tell people about mistakes that they've made in their security model, we are back to alchemy. We are drinking mercury. We are designing things that we keep our bodies in, and that we put inside of our bodies, whose vulnerabilities we are literally not allowed to tell one another about. So I've gone back to the Electronic Frontier Foundation after a 10-year absence to work on a project called the Apollo 1201 Project. 1201 is the section that prohibits changing code if there's a digital lock, and the Apollo Project, obviously, is the project that put a man on the moon in 10 years. We, within 10 years, are planning to end all digital lock laws everywhere in the world. That's the project we've just kicked off. Thank you. Now, thank you very much. Now, none of us is pure in our technology decisions, right? We want to watch our movies, even though Netflix inveigled the W3C into putting DRM into our browsers, so that our browsers are now long-lived reservoirs of digital pathogens that can attack us in every conceivable way. We need to get our internet from companies like Comcast and AT&T and Time Warner, even though they are enemies of network neutrality and want to have the right to decide what you can and can't watch on the internet, and to turn the internet into a glorified cable TV system. 
We may have to work for firms that, as part of their remit, supply the government with weaponized vulnerabilities that they use to attack people they don't like, which gives those firms an incentive to keep those vulnerabilities from being discovered and patched, and therefore puts everybody at risk. So none of us can be perfect, and I'm not asking you to be perfect. You know, every smug vegetarian eventually meets a vegan. It doesn't matter how perfect you think you are. But I am going to ask you to think about all of the organizations that are working to mitigate the harms that I've talked about today and the harms that are coming down the pipe, and it's not just the Electronic Frontier Foundation, although that's one of them. There's the Free Software Foundation. There's Creative Commons. There's Public Knowledge. There's Fight for the Future, and so on and so on. There are dozens of organizations of every political stripe that work on these issues of technological freedom, and that is itself an amazing thing, because 10 years ago there weren't dozens of these organizations. And I want you to add up all the money you spend every month on firms that are trying to destroy the nervous system of the 21st century and turn it into a system of control instead of a system of liberation, and then decide what percentage of that you think you want to use to hedge your bet on the future, on giving your kids a world where the computers say "yes, master," not "I can't let you do that, Dave," and make that decision and follow your conscience. Thank you all very much for coming.