So, without any more waiting, I'm just going to give Bruce a chance to get started. So, please welcome Bruce Schneier.

Good morning. Thanks for waking up for me. I appreciate it. I always worry about starting at the early hour of 10. And I would have done the hand-raise as to who has two or more empty seats next to them and has taken a shower this morning, because there might be a reason.

I'm going to talk about the going dark debate for a second. This is a 25-year-old issue. Government has wanted access to encrypted communications since the mid-'90s Clipper Chip; that was the first example of it. Remember access to iPhones a few years ago. Very recently, Attorney General Barr renewed demands that companies make communications systems available to law enforcement. A lot of politics here; the Five Eyes law enforcement arms released a statement a couple of weeks ago.

So there's actually some real technology to talk about in this problem. We can talk about key escrow technologies, ways to make keys available to a third party under cryptographic rules. We can talk about obfuscation technologies, how to write code that can't be reverse engineered. Vulnerability finding is relevant here. And we can actually do research in building better or worse backdoors, different kinds of backdoors. What's the mechanism? How does it work?

There's more stuff in this debate: the underpinnings of surveillance capitalism, how companies are already spying on us. Gmail is already backdoored, for a reason. We want the cloud provider to do work on our email, therefore they must have the plaintext. We have other systems backdoored because of surveillance; companies want to spy on us to sell us stuff. The underlying security needs of society matter here. These systems that are being backdoored are increasingly being used for very sensitive personal, government, military, national security applications. And there are some real policy decisions to make.
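To make the key escrow idea concrete, here is a minimal toy sketch, not real cryptography. The "stream cipher" is a hash-based stand-in, and real proposals like Clipper used tamper-resistant hardware and real ciphers. The core idea it illustrates is genuine, though: a message is encrypted under a session key, and the key is split into shares held by separate escrow agents, so no single agent can decrypt on its own.

```python
import secrets
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    # Toy stream cipher: SHA-256 in counter mode. Illustrative only.
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR with the keystream; the same function decrypts.
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

# Sender encrypts a message under a random session key.
session_key = secrets.token_bytes(32)
ciphertext = encrypt(session_key, b"meet at noon")

# Key escrow: split the session key into two XOR shares, one per escrow
# agent, so neither agent alone learns anything about the key.
share_a = secrets.token_bytes(32)
share_b = bytes(x ^ y for x, y in zip(session_key, share_a))

# Under lawful authorization, combining both shares recovers the key,
# and hence the plaintext.
recovered = bytes(x ^ y for x, y in zip(share_a, share_b))
assert recovered == session_key
assert encrypt(recovered, ciphertext) == b"meet at noon"
```

The split-custody design is why the policy debate is hard: the escrow shares are exactly the kind of high-value target that attracts attackers, which is the "makes that data available to others" trade-off discussed next.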
There is a security benefit to making data available to the FBI to solve crimes, even though that necessarily makes that data available to others. And there's a security benefit to making sure our data is secure from everybody, even if that also secures it from law enforcement. Which is more important? To what degree is it more important? There are also questions about consumer acceptance, international alternatives, how this technology would be exported to other countries who would use the same systems for different reasons.

And here's the issue: almost no policymakers discussing this issue have the technological chops to understand the tech part. Going dark is a scare term. It's an effective one.

So 60 years ago, there was a British scientist named C.P. Snow, and he wrote an essay called The Two Cultures. In that essay, he lamented the lack of dialogue between what he called the scientific-technical culture and the humanist culture. He bemoaned that neither culture understood the other; it was like they were in two completely different worlds.

Today we still have those two separate worlds. We have the world of the technologists, who build really cool tools, often without regard to how they affect society. And then there's the world of policy, which can criticize technology and propose solutions without actually understanding how the technology works. That split between the two cultures was largely okay in 1959. For the most part, tech and policy didn't interact with each other very much. There were exceptions, like large government programs, nuclear weapons and the space program, but you could see the two as separate. Today it's really different. Tech and policy are deeply intertwined. Tech makes de facto policy, law is forever catching up with what tech does, and it's no longer sustainable for tech and policy to be in different worlds.
Specifically, what I'm calling for, what I write about, is the case for public interest technologists: important in all aspects of technology, and especially in security.

All right, so let's start by talking about how we got here. Most people know this story. The internet was never designed with security in mind. Absolutely crazy when I say it today. Go back to the early '80s; two things were true about the internet. One, it wasn't used for anything important, ever. And two, you had to be a member of a research institution to get access to it in the first place. Those constraints were sufficient, and the designers decided to ignore security, push it to the endpoints.

The internet develops, and you had a very specific ethos: a very American-centric, male-dominated, profit-motivated IT industry. It didn't spend a lot of time thinking about the social effects of what it was building. Software was deliberately excluded from normal liability laws. Policymakers didn't want to touch it because it was very much an engine of economic growth. Couple that with the libertarian ethos in Silicon Valley, and you have an internet which is not really touched by regulation, by societal concerns. And internet tech is different from 1960s tech: more democratic, more distributed, more diffuse, more commercial, and it moves a lot faster.

The internet became critical at all levels of our society really by accident, without any planning, without any forethought. Today it's embedded in every aspect of our lives. Pretty much every form of communication uses the internet, including communications between really important people. Add the internet of things, add critical infrastructure, add, in the coming years, automation, autonomy, physical agency. Suddenly these systems are affecting life and property. And tech has become de facto policy. There are companies that have effective control over free speech, over censorship, regardless of the laws.
Companies can set limits on personal freedoms, regardless of the laws. A lot of it is because there's still a belief that these things are a personal choice to use, as if you could be a fully functioning member of society in our century without a cell phone, without an email address. So now we hear terms like algorithmic discrimination, the digital divide, information attacks on democracies, surveillance capitalism. And the internet is no longer a separate thing. It's part of everything. It's part of consumer policy. Just go through the villages here: it's part of automobile policy, airplane policy, medical device policy, et cetera, et cetera. It affects discrimination, equal protection, fairness, liberty, power, democracy. It's part of national security. It's part of everything.

So now when you think of going dark, it's suddenly a bigger question. As internet security becomes everything security, internet security technology becomes more important to overall security policy. And we'll never get the policy right if policymakers get the tech wrong. This is why we need public interest technologists.

Actually, fixing this has two parts. The first is that policymakers need to understand tech. What we want is for all policy discussions to be informed by the relevant technologies. The reality is more that policymakers ignore the tech if it doesn't conform to their politics. Maybe they don't know enough to do anything useful. I think that stifling innovation is still a big fear. Lobbyists will easily provide whatever information matches the political beliefs. You saw this, I think, on full display in the Facebook hearings. Remember the question? How do you make money? We laugh, but the fact that a sitting senator doesn't think it's an idiotic question to ask means we have some serious problems. And I don't need policymakers to be technologists. We have a government where the people governing things don't understand them. They have staffers that do.
They have staffers that know to ask the right questions. They have good bullshit detectors. They believe there are truths in technology. This is no different from any other area of society. That's the first part.

The second part is that we need technologists to get involved in public policy. We need more public interest technologists. So let me define that term. There are a bunch of different definitions. I'm going to read the Ford Foundation's definition: technical practitioners who focus on social justice, the common good, and/or the public interest. A little bit issue-focused. Tim Berners-Lee has a great term: philosophical engineers. Another definition I read is people who study the application of technology expertise to advance the public interest, generate public benefits, or promote the public good.

All right, so it's not one thing, it's a lot of things. I think of public interest technologists as people who combine their tech expertise with a public interest focus: by working on tech policy, by working on a tech project with a public benefit, by working for a more traditional organization in an IT role that has a public benefit, or by working on technology inside government. It's still kind of a developing term. Not everyone likes it, but I think it's a decent umbrella term for what we all do. A large tent, a lot of job descriptions. So we need these people who can weigh in on public interest, on public policy debates.

So, a second example where policy desperately needs some tech focus: supply chain security. In the news a lot this year and last year. China, Huawei: should we trust networking equipment built by a company that resides in a country we don't trust? A reasonable question to ask; a couple of years ago, the same question was asked about Kaspersky. And sure, yes, companies are subject to pressures by the governments of their home countries, U.S. companies included. But supply chain security is a lot more complicated than that. This is not made in the United States.
These chips are not fabbed in the United States. Its programmers carry, what, 100 different passports? And we all know that the security of this device can be subverted at any of those points. We all know you have to trust the distribution mechanism; we have fake apps in the Google Play Store. We have to trust the update mechanism; remember, NotPetya was distributed through a malicious update of a Ukrainian accounting package. We know you have to trust the shipping mechanism, because we all remember that photograph of NSA employees opening up a Cisco box that was intended for the Syrian telephone company.

Supply chain security is a much harder problem. You have to trust everyone, yet you can't trust anyone. And the solutions are equally problematic. We could build a U.S.-only version of an iPhone. It'll cost a lot, ten times as much. No one will buy it. The policy discussions should take this all into account. And I think there actually is a major research effort we should undertake. Just like the internet was built around the question, can we create a reliable network out of unreliable parts, can we create a secure system out of insecure parts? That's another one.

There are more policy debates in security that technologists need to get involved in. The vulnerability equities debate, talked about a lot here: offense versus defense, how bug bounties work, the international aspects of it, the cyberweapons arms manufacturers. The debates on election security. Blockchain: what it does, what it doesn't do, how to regulate it. Internet of things safety and security. I'm kind of listing the villages we have here. 5G security versus 5G surveillance, critical infrastructure, data privacy and big data, algorithmic security, algorithmic fairness, AI, robotics. These are all going to be major policy issues that need to be informed by technology. And there are a lot more once you broaden the definition of internet security.
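The update-mechanism point above can be illustrated with a minimal sketch. Real update systems rely on vendor signatures verified with public-key cryptography; this stdlib-only toy instead pins a known-good SHA-256 digest of the update, which just moves the trust to however that digest was obtained. The function names and sample payloads here are invented for illustration.

```python
import hashlib
import hmac

# Digest the vendor would publish out of band (e.g., in signed release notes).
KNOWN_GOOD_SHA256 = hashlib.sha256(b"legitimate update v1.2").hexdigest()

def verify_update(blob: bytes, expected_hex: str) -> bool:
    """Refuse to apply an update whose digest doesn't match the pinned value."""
    digest = hashlib.sha256(blob).hexdigest()
    # compare_digest avoids leaking, via timing, where the comparison fails.
    return hmac.compare_digest(digest, expected_hex)

# The genuine update verifies; a trojaned one (the NotPetya scenario) does not.
assert verify_update(b"legitimate update v1.2", KNOWN_GOOD_SHA256)
assert not verify_update(b"trojaned update", KNOWN_GOOD_SHA256)
```

Note what the sketch does not solve: if the attacker controls the vendor's build or signing infrastructure, the "legitimate" update is already malicious, which is exactly the trust-everyone-yet-trust-no-one problem described above.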
I wrote a series of papers on influence operations against democracies, looking at a democracy as an information system. What can we learn by bringing our way of thinking about security to this very non-traditional question?

So we need this, and we need it now. There's a report written about this that called this a pivotal moment. I want to read a sentence from the report: "While we cite individual instances of visionary leadership and successful deployment of technology skill for the public interest, there was a consensus that the stubborn cycle of inadequate supply, misarticulated demand, and an inefficient marketplace stymies progress." That quote speaks to how we can intervene to start fixing this problem. It's three things.

First is the supply side. I think, in the end, this is our biggest problem. There isn't enough raw talent to tap for the public interest. It's especially acute in cybersecurity, because there isn't actually enough raw talent to tap for regular corporate needs. The cybersecurity skills gap is a big deal. And when you look at public interest technology today, it's a very diverse group of people, a very multidisciplinary group of people. Backgrounds come from tech, from policy, from law. A lot of people without a computer science degree are doing this work. We need to make all of this work. We need a lot of different ways for people to engage in this sphere. It's not just taking it as your job. How can we do it on the side? How can people take a couple of years between regular jobs and do this? Or sabbatical years, if you work for a company that has those. We need clinics at universities where people can get a taste of this kind of public interest tech work. And we need to really foster diversity. What we've learned, I think very graphically, in the past decade or so is that if the populations using tech are not represented in the groups that shape the tech, you get really lousy tech.
Second is the demand side. And right now, at this moment, as bad as supply is, demand is worse. I get more people asking me after talks like this: I want to do this, where do I go? And there are few places to go. So we need jobs funded at a variety of NGOs, and inside government at all levels; more organizations doing this kind of work.

And the third intervention is the marketplace. Here, we just need things that reduce the friction, where people who want to do this can find people who need this done. Right now, it's a little haphazard. Here, maybe; RightsCon, the Internet Freedom Festival; those are places where public interest tech happens. There's something called the Nonprofit Technology Conference. Anyone heard of it? I sure hadn't. But places like that.

There are organizations doing this. We can list the Electronic Frontier Foundation, the Electronic Privacy Information Center, Access Now, lots more. There are now academic programs. I teach at Harvard, but Carnegie Mellon, Georgetown, Stanford, everywhere. New America last year formed the Public Interest Technology University Network: about 21 universities that are going to be starting up different programs. And there are technologists inside government. Some of our colleagues have taken senior positions inside the Federal Trade Commission for a year, for two years. There's an organization called TechCongress that puts technologists like us on congressional staffs for a year. The Aspen Institute has a tech policy hub with fellows. And there are even programs and initiatives inside corporations; the big one you've probably heard of is Jigsaw, inside Google/Alphabet. Something more near and dear to our community is public interest technology itself: building technology to benefit the public interest. So Tor, Signal, and all the others, Tails, Qubes, et cetera, et cetera. Or apps that track public policy issues.
Something I don't think we give enough credit to is the people doing IT security work inside public interest organizations, like Amnesty International, Human Rights Watch, Greenpeace. It's a hard job. You make about half what you'd make elsewhere. And honestly, the government of China versus Human Rights Watch is not a fair fight. But our colleagues are fighting that fight. Lastly, there are public interest technologists doing training. Look at Tactical Tech or the Digital Security Exchange: matching expertise with people that need it. And there are foundations funding this space: Ford, MacArthur, Hewlett in particular; there are others.

This all might seem like a lot, but it's really not. These are examples, but they're still largely exceptions, still largely around the edges. We have to scale this. We know about these examples because we're paying attention. Right now, there aren't enough people doing it, and there aren't enough people who know it needs to be done.

So I want to create a world where all of this is normal and all of this is common, where there's a viable career path for a public interest technologist. And to do that, we need a cultural shift. We need all the pieces working together. And I think we also need to recognize that what's in the best interest of corporations is not necessarily in the best interest of society, and that that's okay. That's not a failure of the market. That's normal. And it needs to start from the top. A lot of public interest talent came from the eight years of the Obama administration embracing tech change and building tech organizations inside government.

There's an interesting parallel here to public interest law. So let me tell the story of public interest law. In the 1970s, there was no such thing. It didn't exist. The field was created deliberately by organizations like the Ford Foundation.
And they would fund law clinics at universities, so law students would get a taste of housing law, discrimination law, immigration law. They funded fellowships at places like the ACLU or the NAACP, so there were places for these new attorneys to go and do this work. They created a world where public interest law is a valued career, where if you tell your parents you're doing it, they're impressed, where every partner at a major law firm is expected to have done pro bono work and is expected to continue to do pro bono work throughout their career. Today, the ACLU advertises a position for a staff attorney that pays between one-third and one-tenth of what an attorney would make out in the corporate world, and they get hundreds of applications. Today, 20% of Harvard Law School graduates don't go to work for a major law firm or for a corporation. They go work in the public interest. And a couple of years ago, that university had a soul-searching seminar because that percentage was so low. The number of computer science grads from Harvard who go into the public interest? It's probably zero. Not their fault; the ecosystem doesn't exist.

So, more generally, we technologists need to understand the policy ramifications of our work. There's a pervasive myth in Silicon Valley that tech is politically neutral. It's not. We here all know that, but it is a widely held belief. Our work is deeply embedded in policy. The things we do affect the world we live in. And we all need to decide what tools we're willing to build. Do we build technologies of surveillance and control? Do we build technologies of liberty and autonomy? This matters when we work on spyware, on censorship and control tools. Historically, we have created a world where programmers had an inherent right to code the world as they saw fit. We did that because, historically, it didn't matter. Tech was tools. Now it does matter. And in a lot of ways, that special privilege needs to end.
Everything we build is a complex socio-technical system. It is not just a tool.

So, our third example: 5G, IoT, big data. The next disruption in technology is going to be about things and not about people. 5G is not being built so you can watch Netflix faster. It is being built so things can talk to other things behind your back. The number of things will dwarf the number of people on the internet. These will be semi-autonomous things, and they'll be generating data about us and using data about us. And right now we're building that world. When we build these systems, we can prioritize different aspects of society. We can prioritize corporate profits. We can prioritize individual autonomy. We can prioritize privacy. We can prioritize the group benefit of information, or government control. We can prioritize human rights. All of these are possible. The question is: which future will we collectively build?

And I like some of the talk I'm hearing the past couple of years about the decentralized internet, the decentralized web: movements that try to pull back from the centralized control we've seen since the mid-'90s. And as much as I deride blockchain pretty much every chance I get, the politics of it is heartening, because it's a politics of reducing centralized control.

It's not just that. Everything we do has a moral dimension, and we need to engage with that. A lot of times that's really hard in security, because so much of what we do is dual use. The same tool has positive and negative effects depending on who is using it and how it's being used. That makes it hard. And of course we are not responsible for every different use of something we build. But we are responsible for the world we create with the technologies we build. And we have a surprising amount of power. Now, as consumers, we kind of don't. A lot of these things are monopolies. A lot of these things are sold with sort of deep psychological manipulation.
And consumer choice doesn't really work the way it's supposed to in an effective market. But as employees, we do. Because even if the big companies don't have to compete with each other on products, they all have to compete for our talent. And we've seen this in the past couple of years, as employees take a stand against what their companies are doing. And I assure you, this terrifies companies. Google already has problems recruiting enough people. A ten percent walkout is a freaking disaster. If the employees demand that they not work on something, they don't work on it.

And tech and policy have to work together. Either they work together, or they don't work at all. I think this is actually the fundamental lesson of Edward Snowden. We all knew that we could build tech to subvert policy. He showed us you can build policy to subvert tech. And if they're not working together, they're failing.

Again, this is bigger than computer security. Nearly all the major policy debates of this century will have a strong tech component. Robotics, climate change, food safety, drones, AI, bioengineering: these all have deep tech components. And there are places where we as hackers can get involved. I actually think this is where the core issues of society lie. In the 20th century, the question that organized society was basically this: how much of our lives should be governed by the state, and how much by the market? That's the Cold War, in a sense. That's most countries' politics, in a sense. The defining question of this century, at least the first half of this century, I think, will look like this: how much of our lives should be governed by technology, and under what terms? Now, the 20th century's question was really an economic one, and that's why economists were basically the ones who made public policy. This century's question is technological. And we are the people who need to make policy.

So this future is coming.
I think it's coming faster than we think. I think it's coming faster than today's policy tools can deal with. I think the only way to fix this is to develop a new set of policy tools that work for the environment we live in. And we need technologists in all aspects of public policy, in all aspects of public interest work: informing policy, creating tools, and building the future.

And you know, in this field, we do not need permission to do this. Our ethos is that we can do this without permission. When you hack a public system and make the information available, that is public interest work. When you build tools of security and tools to counter surveillance, that is public interest work. When you decide that the world you want to live in is not the world you're living in, and you move to create that world, that's public interest work. We need it. We need more of it. I need your help. Thank you.

All right, so I left a bunch of time for questions. I see people are escaping through the correct door. You all listened to the announcement. All right, there are no microphones, so you either have to walk to the front or be loud. Run. Yes.

The open borders internet? Yeah, so the question is about internet bifurcation, balkanization. I don't know. I see a split in three different ways; we'll see how it lasts: sort of US-centric, Europe-centric, and China-centric. And right now, that's a split in the way policy works. I don't know if it'll turn into hard splits. China is doing a hard split inside its own country. But we'll see. It really depends on how incompatible the laws are and how much the different spheres don't want each other in. I do worry about it. I think it is not a big worry right now, but it easily can turn into a big worry, because it's based on perception and policy. I mean, you can imagine Europe saying: look, Facebook, you're not following our laws, you're no longer welcome, we're just done with you.
And that would be a big deal, but it certainly could happen. So, a question further up. Yes?

A question about government involvement and election security. The US is very unique in that we don't have a professional election organization the way many other countries do. It's because of the way the US is organized. We don't have one election; we have 52 separate elections that are autonomous. That was a great idea in the mid-1800s. I think it's less good today, because, again, Russia versus North Carolina is not a fair fight. But there's a lot of politics in the way of free and fair elections in the United States. This isn't a debate you can have purely technically, although it would be nice if we could. And that is the problem. Other countries, I think, are doing better because they can nationalize it, because they can bring nation-level defenses in a way that you can't in the United States. So I think our uniqueness makes it harder. And this is very much a political issue, because after an election, the winning side wants the results to stand no matter what happened. So you need a professional class in charge of fairness and not outcome. We don't have that, and I'm not sure we'll get that.

So, a hand there, and then I'll go this way.

I think abuse is hard, because a lot of tools can be abused. A lot of it is having the right people in the room when you're building them. I mean, we build them as tools, and us techies just create them. You need the people using them. You need the different groups. Understanding how an affected, disempowered group is using a tool makes a huge difference. And I think even in big things, the right people aren't in the room early enough to understand it. I don't know the answers, but I think we have to learn how to ask the questions at a point where we can make the design trade-offs, and not when it's too late. So that's really what I want right now.
That is, think of these as complex socio-technical systems from the beginning: getting the sociologists in the room, getting the activists in the room, getting the users in the room, getting the soft-science people in with the programmers at the beginning. Google invented a job classification a few years ago called staff attorney. The idea was, and this is a really good idea, that what they build is going to have legal implications down the road. Wouldn't it be great to have an attorney on staff at the beginning of the design? And now they do that. Let's do that for the public interest as well. And I think some companies are doing this. It just needs to be part of our ethos.

So, a hand down there. Yes. You only get one.

You know, I think the problem of activists getting so powerful and pushing their agendas is so far from reality, I'm totally not worried about it. I'm way more worried about the money pushing its agenda. I mean, if the disempowered get a little more powerful, I think we'll do good. I'm not really worried about that. We've got a long way to go before marginal agendas are at the top.

All right, I'm gonna go down there.

Commenting on regulations is a huge issue. And by comment, we don't mean write a bot that sends a million similar comments, because we're kind of on to you for that, and that messes up the whole process. But yes, more tech comments on regulatory matters are really useful. That did a lot of good in the open internet debate. Why am I blanking on the term for it? Not SOPA, not PIPA; the one after that, the FCC wanting to allow companies to throttle. Net neutrality. All right, it's early. Yes, the public comments had a huge effect on that. And we think they don't matter, but they do.

I see a hand over there.

A question about the arms race and chaos, and where we end up. And that's just, we don't know.
A lot of our debate on the insecurity of backdoors is undercut by the fact that corporations backdoor their stuff all the time for corporate reasons. I mean, it's hard to argue, look, you must make iMessage secure or bad things will happen, when your email is all escrowed by whatever service provider you have. Now, my belief is that, in the long game, these systems become so critical for society that there's no choice but to make them secure. That defense wins and offense loses, because defense becomes much more important. Now, there's a lot of short term between here and there. But I think once the internet starts killing people, all these debates change. And this was the subject of, actually, hold it up, my latest book, which you might be able to see in the front row, which has the great title of Click Here to Kill Everybody, where I talk about the world of physically capable computers and how that changes the debate. I think that changes the chaos debate to an enormous degree.

But short term: we just recently heard that the FBI can't get into the phone of one of the two mass shooters from a couple of weeks ago, and that's gonna become a talking point. Now, why do we need to get into his phone? I mean, the question of digital forensics is a much bigger question, and one that I think we also could help with. The reason the FBI wants into your phone is because they don't know how to do anything else. Get them better at actual forensics. Because we know it's the golden age of surveillance. We know a lot of data is out there. We know that you can do an enormous amount even if you can't get into the phone or get into WhatsApp. But that data, that information, is not being transmitted to the FBI. Part of public interest tech is going to work for the Justice Department and making them smarter at fighting digital crime.

All right, take one more question. Oh, way in the side.
You know, there's a lot of danger in us parachuting in and helping other industries. That almost never works out well, whether we do it in corporate ways or in public interest ways. If you want to help, and there's an issue you care about, find an organization that is doing that work and ask them what they need. Ask them how you can help. And it might be "our network keeps going down." It might be "we need this tool that we can use," or it might be "we need data analysis expertise." The people doing the work know what they need. If you want to help, whatever the issue is, find those people and ask them.

All right, I'm gonna get off stage. I have one more thing to say. I keep a webpage on public interest tech: public-interest-tech.com, with the hyphens. It has a whole list of resources, organizations, documents, talks, people doing this. I will be around to say hi to people afterwards. I have to get off stage, I can't stay here. So I'm gonna go out that door, which is totally illegal; don't you do it. I'll go around and I'll meet you all at the back there, and then we can chat later. Thank you for coming. Enjoy your DEF CON. I've been coming since, I think, DEF CON 4, and it's really always neat to be here. And thank you again.