So this is Slouching Towards Utopia: the state of the internet dream. Please welcome Jennifer S. Granick. Thank you. Really? New College? Awesome. My alma mater. I should talk a little bit about New College in this talk. I want to thank everybody for coming. I think I got a really awesome room to speak in this year with these totally cool computers behind me. And I want to also thank my parents, who are here. They're over here. They've gotten to come see me talk a number of times here at DEFCON. My dad's hacker name is The Eagle, so you guys can look him up. And I want to talk today about something that I think a lot of us here at DEFCON hold really dear. I spoke about this last year at Black Hat when I gave the keynote. Can I just see, just a show of hands, how many people saw my Black Hat keynote? OK, so some but not all. So the beginning, I apologize, might be a little bit repetitive for you guys, but I think it's important to set the stage. Because in that speech, I talked about something that I called the dream of internet freedom. And it's something that I think means a lot to many of us. I said that the dream of internet freedom was endangered, and that we as a community needed to start taking that seriously if we wanted to ensure that we could have some of those things that maybe animated us politically and technologically in the early days of the internet. So today, I want to revisit that idea of the dream of internet freedom, take a look at what's happened over the past year, and see whether we're getting closer to the dream or further away. And my goal is to end my talk early, before all the time is up, so that we can start to have a conversation and talk together about what it is we might do if we want to take the dream seriously and try to see some of these things come to fruition.
So before we get there, though, I want to take a little time, for the people who didn't see the speech, to talk about what this dream of internet freedom is. And I'm going to talk about it from my experience and how I came to believe in this utopian vision. For me it started when I read Steven Levy's book Hackers: Heroes of the Computer Revolution. In this book I learned that information wants to be free. I learned that computers would help us understand authority better and be more individualistic, make our own decisions about what was right and what was wrong. I learned that computers could connect people so that we could talk to each other and share information. Then I learned that individual rights were something that could be built into the technological world around us. That decentralization was not just a principle of computing but a principle of political freedom as well. And so this came to be what I started to understand as the dream of internet freedom: the dream of a free, open, interoperable, reliable internet where people can speak their minds and anyone who wants to hear it can listen. Not that long after, I also learned that hackers were a thing, and that hackers were really intimately tied to this dream. Steven Levy tells the story of old-school hackers from back in the 1950s at MIT. But I learned that there were people now, today, who were trying to make this dream true. Because I discovered the Hacker Manifesto, written in 1986, which I guess makes this an anniversary year, and published in Phrack Magazine. And there I learned that hackers were people who wanted free access to information and were willing to take the time to build the tools to make it so. That hackers wanted a world where curiosity was its own reward and people could explore the world around them and find their own truths, not just accept the conventional wisdom.
So as The Mentor explained it, the future could be a place where people would just meet mind to mind and exist without skin color, without nationality, without religious bias. And I wanted this to be true. But when I started to read The Hacker Crackdown by Bruce Sterling, I learned that it was in danger, and that there were law enforcement agents who were using a law called the Computer Fraud and Abuse Act, my personal enemy, to go after people who were trying to explore this network, interpreting this law in these huge ways that basically criminalized curiosity. So I was going to law school, and I was like, this is going to be the thing I really try to fight against. And around about the same time, we learned about the horrible global scourge of online pornography. And some people here are pro-porn. I understand. I think I was pro-porn too. The idea was that this internet, which should be this nice place for people to hang out, was polluted with all of this terrible stuff, and that we needed to get rid of it. That was a very common point of view: there shouldn't be anything dirty on the internet. And another sort of alternative view was that, well, if there's going to be dirty stuff for those few people who are interested in it, it should be zoned. Like in a city, how there's a red light district that's kind of slummy and run down and you've got to go there. But everything else should be policed and be clean. And this idea was terrible for those of us who wanted the internet to be a place for the free exchange of ideas. It was terrible because our dream was that the internet would be like a library, but better than a library, where every book that had ever been written was on the shelf and available. And here these people were coming in and saying, prudishly, no, it's not going to be like a library. It's going to be like TV or radio. And we didn't want that.
So this galvanized a generation of activists who wanted to fight against this vision of the internet as TV or radio, in favor of a vision of the internet that was freer, freer even than a library. Into this mix comes John Perry Barlow, lyricist for the Grateful Dead, founder of the Electronic Frontier Foundation, lovely man, very poetic writer. And he drafted A Declaration of the Independence of Cyberspace. And in it, he set forth this vision. He said: Governments of the industrial world, you weary giants of flesh and steel, I come from cyberspace, the new home of mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather. Now this vision wasn't just a reaction against the whole cyberporn hysteria, with Marty Rimm and the passage by Congress of laws saying it's a crime to put porn on the internet. It was reacting to those legislative proposals, but it was also a reaction to government business as usual. And it's a real expression of this dream of internet freedom: that the people together would govern themselves and would be able to take care of and provide for each other this intellectual stimulation. So this galvanized me, motivated me in my legal career. And I think that it did the same for many, if not most of you. That this was the thing that got you excited about computers in the first place and made you want to be a part of it. So the dream of internet freedom: that we would be able to overcome age, race, class, and gender. That we would be able to communicate with anyone, anywhere. That we would have free access to information wherever in the world it was generated. The hands-on imperative, that we wouldn't have to take the way things work for granted, that we would be able to investigate and explore it for ourselves and find our own reality.
And ultimately, the idea that computers would liberate us. So how are we doing? Well, I think that this utopian vision is less and less true every day. And it's less and less true because of some dynamics that maybe we couldn't have predicted at the time. Today, technology is generating more information about us than ever before, increasingly making a map of what we do for governments and for companies to learn about us and to attempt to manipulate us or otherwise regulate us. It is a golden age for surveillance. Today, there's a real appetite for censorship. Racism and sexism have proven more than resilient enough to thrive online. And people who use the internet don't want to hear this garbage. They want someone to take care of it for them. And so there's an appetite for censorship, not just in the United States, the home of the First Amendment, but in countries around the world that have an even less robust free speech tradition than we do. And there's a real appetite for corporate control. A lot of that has to do with spam, malware, user interface. People want their technology to work. They are attracted to a safe, closed ecosystem where spam and malware are taken care of, where software updates happen automatically. There's just not this appetite for individualized control of technology. There's no market for it. And so these are the things that are happening today: surveillance, censorship, and control. And they're enabled by three trends, three things that are happening that make this possible. One is centralization, the idea that each of our services is now one thing: Google for search, Gmail for email, Facebook for social networking. It's enabled by regulation. Governments are getting involved here and are regulating for these things that people want. And ultimately, globalization. Internet companies are global from the get-go.
And so it's not just the United States that's involved, or the United States and Europe. It's countries from around the world who are getting involved in the regulation game. So that's my sad state of affairs. Now I want to talk about what's happened over the last year or so, and see if we can learn from recent experience about these things. So let's take freedom of expression. All right. We have moved very far from the idea that people get to see whatever information they want to see. It's actually particularly scary because sometimes we don't even know, maybe most of the time we don't even know, what information is being removed from our purview. So here's an example. Facebook has been sued multiple times by families whose loved ones were killed by ISIS. And these families have said that it's, at least partially, Facebook's responsibility, because Facebook has videos from jihadists online that are trying to foment allegiance to the group and foment jihadist activities. So Facebook is getting all this pressure to take things down. And it's not just from a few civil litigants either. United States government officials go to these social networks and they say, we know we can't make you do this under the First Amendment. But surely you can exercise some corporate responsibility and take down these beheading videos and stuff. Surely you don't need to carry this kind of terrible inflammatory crap on your network. And companies start to take this stuff down. But then we have something like the Philando Castile video, a video of an African-American man who was shot and killed by police officers for no good reason. And initially, Facebook took the Castile video down. And then there was this outcry. Why are you taking the video down? You're censoring this evidence of police brutality. People need to see how this man died. These are the same thing. They're not different. They're the same thing.
But we're putting this pressure on the intermediary, on a private company, to make these decisions for us about what we're going to get to see and what we're not. It's not just Facebook. When on Twitter, some guy basically harassed an African-American actress, Leslie Jones, the woman who appeared in Ghostbusters, to the point where she was like, screw this. I'm not going to be on Twitter anymore. It's too filled with hate speech. I don't want to put up with this. And so Twitter blocked the guy. And he claimed, OK, it's a big violation of my free speech. He can go on the web. And he can still say whatever he wants. People want to find it. Maybe it'll be a little bit harder. I don't know. But the web only really works as open if you can find the stuff you want when you search for it. And that means Google. Google is the dominant search engine basically around the world. Some countries have their own. But basically, in almost every country of the world, Google's the dominant search engine. Well, what does Google carry? They have a huge responsibility because that's the venue. That's the path through which most people are going to find the information that they want. So Google is under an immense amount of pressure to change its search results in order to accommodate certain political demands. So Google is supposed to demote torrents because torrents are copyright infringing. Google is now subject to orders in Europe about the right to be forgotten. Now, the right to be forgotten is the idea that even true information about you at some point becomes outdated or outmoded. And if you don't want people to be able to find that thing about you and have it be the main thing they know, you can go and be delisted from the search engine so that if they search for you, they don't see your DUI or your terrible high school debate performance or whatever it is that you don't want people to see. So it's this privacy right that is developed in the European Union. 
But the result of it is that when people search for you in Google, they're not getting truthful information. And France and Germany were the countries that have been really advocating this, saying, well, it's not really even a conflict between privacy and free speech because you don't have a free speech right to have this private information about people. A very different vision than we here in the United States have. So ultimately, the question is, Google's got to comply with French law. They have to. They operate there. They have to. What does it mean, though, to comply with French law? Does it mean that if I'm a French citizen and I have information de-indexed about me, then nobody in France can get it? Or nobody in Europe can get it? Or nobody in the entire world can get it? And what France is doing now is France has pushed its case against Google, that Google is legally obligated to remove these results, not just for France, not just for Europe, which has the right to be forgotten, although it's different in different countries, but for everybody, even here in the United States, where we would have a right to access that information. Sorry, technical difficulty. Why should France be able to tell us what we can and can't see? But because these are global companies being regulated by global governments, that's what's happening. OK. In another example, people are suing in Europe. They're suing Twitter and YouTube and Facebook for a deficient response in their opinion to hate speech. But hate speech is legal in the United States. There was a news story this year where the German chancellor, Angela Merkel, was overheard pressuring Mark Zuckerberg, asking him, what is he doing to prevent criticism of her open door immigration policies? In the law, we call this a slippery slope. It starts with stuff that people agree on, like maybe copyright law and terrorist content. 
And then it goes to information about police brutality or truthful facts about people that we're interested in or government policies. Oh, and I want to show you just one other thing about this. My point is that these decisions about freedom of expression are inherently discriminatory. They're inherently discriminatory because it's all about the government's point of view or the majority's point of view of what's legitimate speech and what's not. We don't see Google and Twitter and Facebook under huge amounts of pressure to erase their sites of every iteration of the Confederate flag. But we do see them under that pressure to erase their sites of everything about ISIS. It's just a political question. Okay, surveillance. How are we doing on surveillance? Just gonna say, so you guys know, not very good, okay? Technology proliferates all kinds of information about us, online and increasingly offline. As we use health devices, we have the internet of things, we have sensors everywhere in our TVs and on the streets, it's a golden age for surveillance. And when technology has done what it's done and made information collection about us so cheap, so easy and so ubiquitous, then the law has a role to play. This is where the law should step up its game and get involved and say, we're gonna provide people with protection from suspicionless surveillance, okay? But the law's actually done the opposite. It's been really pretty pathetic. Basically, the United States Department of Justice argues, and in many cases successfully, that the law doesn't require a warrant for vast categories of data about you. Not for your opened emails, not for your Dropbox files, not for information about your physical location, not for your information about who you call or who you email, not for your health data, none of that. The Department of Justice says no warrant. 
And I wanna stress for people who are not lawyers the importance of the warrant requirement, because the warrant does a couple of things. One thing it does is require probable cause, and that's really important because it takes the suspicionlessness out of it. It's not like, well, we're just gonna spy on you for no good reason. It also gets a judge involved, somebody from the government whose job it is to do more than just catch criminals, somebody who's got another job, so that we can balance. And when you put these two things together, basically what it means is that the warrant requirement is the enemy of mass surveillance. The warrant requirement is about going after only individual people for good reasons. But we're at odds. We, the people, are at odds with our Department of Justice on this, which basically wants to be able to go after this information whenever it's reasonable to get it, which is in the eye of the beholder. In other countries, the legal protections for this kind of data are even lower than they are in the United States. As a general rule, there are examples to the contrary, but as a general rule, people in other countries are doing even worse. Now we're gonna be fighting this battle this year and next year. The law that underlies the PRISM program that Edward Snowden revealed is gonna expire in December of 2017. It's called Section 702. And that law basically says that if the government is looking at a foreigner, if their intention is to spy on a foreigner, then they don't need to get a warrant to do it, even when that foreigner, or maybe especially when that foreigner, is talking with Americans. So it's bad for American privacy, and for foreigners it means that if your data is stored in the United States, it's bad for you. There's no need for a warrant. So as this law starts to expire, those of us in the civil liberties community are gonna be fighting to reform it.
And so you guys will be hearing from me and others about basically trying to make the law step up and protect people's data that's stored in the United States. This year, there's been another move to make data even easier for law enforcement to get. So one big issue is the idea of borders and internet jurisdiction. It's really complicated. If you have law of one jurisdiction, what if another country wants to do things a different way? Similarly, if the data's here in the United States, can other governments get it? That friction, that uncertainty, used to be kind of a protection for people from around the world. But now there's a legislative proposal that is being considered, which would basically say that so long as the United States government certifies, does sort of a handshake with another government to say that they meet certain requirements, that they're fair and that they are, you know, they have adequate safeguards in place and those sorts of things, then United States companies will be able to turn over their customers' data to these other governments without needing to go through a U.S. legal process or without needing a warrant. Again, it's another way to make sure or to set up a system where people's data will get turned over to government investigators and intelligence agents without judicial review and without a warrant requirement. Okay, so this is another legal battle that we're fighting this year. So ultimately, you know, we think about the internet as this very chaotic thing where it's like all these beautiful individual drops of water coming together to make this wonderful cloud that we can investigate, but increasingly the cloud is getting locked down, right? And it's becoming something that is very surveilled, very controlled and very knowable. The best thing that's happened in surveillance all year is encryption. 
Encryption adoption's been huge, and it's been that successful because companies have been able to implement it pretty much unilaterally. They've had to struggle against governments making arguments that they shouldn't encrypt, but they have been able to roll out encryption. So Gmail to Yahoo Mail is now encrypted, WhatsApp is now encrypted. We're doing a lot better: if you compare us to three or four years ago, a lot more of our stuff is encrypted. And encryption is good because even when it's not end-to-end, it frees us from mass surveillance. It means you have to go to somebody who holds the data and at least have legal process, right? So even encryption that's not end-to-end is really valuable for defeating suspicionless surveillance. But there's all this pressure against that. The pressure comes from our government, which wants to try to pass rules or get voluntary cooperation with systems that ensure wiretappability. And it comes from other governments like, for example, Brazil. Brazil has jailed Facebook executives because WhatsApp is encrypted and they're unable to turn over information. And now they're fining Facebook for that. So what are we going to do about that? And then ultimately, there have been reports, one by Rapid7 and some others out there, that while we're doing better with encryption, we're doing far less than we actually would need to do. There have always been unpoliceable spaces, right? There have always been things that the government didn't know, whether it's our thoughts, our behavior in the bedroom, what we do and say when we're in church, our ephemeral movements as we moved through space and time during the day. And there are, yes, risks from having these things be private, but there are vast rewards from the fact that we take those risks. The fact that people can break the law is necessary for the evolution of our society.
It is part of the natural growth of things, even when it includes crimes. Because over time, we've changed our minds about some things that were crimes, because people were able to do them and eventually we saw that they were good. Homosexuality, marijuana, sedition, and more. So this idea that there should never be anything that you can hide, that we should live in this closed-down, cartoon cloud world, to me is really quite terrifying, because if governments are effective at it, it's basically a hindrance to our social and political development as different communities. So finally, I want to talk about the freedom to tinker. Many of you understand why this is important. It sounds like a hobby, but it's really not. It's a phrase that's meant to capture our ability to study, modify, and ultimately to understand the world around us. And interference with the freedom to tinker is part of this centralization movement, right? The Digital Millennium Copyright Act, section 1201, is a great example. It basically gives legal protection to digital rights management software so that people can't tamper with it, and so it also controls the way people use the underlying copyrighted work. But of course, the big enemy in my mind of the freedom to tinker is the CFAA, the Computer Fraud and Abuse Act, my personal least favorite statute. And this year, we had a ruling from the Ninth Circuit in a case called Facebook versus Vachani. It had been in the lower courts as Facebook versus Power, and it just came out in July. Power doesn't exist anymore. It was a social network aggregator. It allowed people to take all their social networks, Facebook and, you guys may remember some of these, MySpace, your LinkedIn, your... what was the music one? Anyway, you could take all your social networks, some of which don't even exist anymore, and maybe this is why, and aggregate them into a single place so you could see them all at one time.
And Facebook didn't like that, right? They didn't want that to happen. So initially, Power said, well, we can do this even if Facebook doesn't want us to, even if it's against Facebook's terms of service, because the Facebook users want us to do this. They want us to pull their information out and show it to them this way. So Facebook wrote a cease and desist letter to Power and said, you've got to stop doing this. And the court case was over whether Power had violated the CFAA, a criminal law, in continuing to provide its customers with this social network aggregator. Now, those of us who hate the CFAA celebrated years ago when the court said a mere violation of the terms of service is not a crime, not a violation of the law. And we were like, yay, that's great, because terms of service say all sorts of crazy things. That's the law in the Ninth Circuit, where California and Nevada are, but not necessarily the law in other parts of the country. But we were like, this is leadership, we'll lead the way. Well, now the Ninth Circuit has said, okay, a terms of service violation may not be a crime, but if you get a cease and desist letter, and in the letter the company says, stop doing this, and then you keep doing it, well, then that is a crime. Then you don't have authorization, and that is a crime. How can we let companies that write letters tell us what is and isn't a crime? Why are we letting companies that run computers say what people should and shouldn't be allowed to do with their own data? Yet that is the ruling from the Ninth Circuit, which I previously praised. This is a terrible decision for the freedom to tinker. It pushes us into living in a permission-based world where, if we don't have permission, then we can't act. And if we act without permission, we can be sued or we can be incarcerated. What's at stake?
Over the next 20 years, my fear is that if things keep going this way, things will happen and people really won't know why. Companies will make decisions; you'll see something or you won't; there'll be an algorithm that does something with your data or shows you an ad or otherwise tries to sell you something; maybe you'll get a loan, maybe you won't get a loan; and you won't be allowed to investigate the software that helped make those decisions. Increasingly, we'll see less and less controversial content on the internet as companies either exercise their corporate responsibility or are pressured by governments around the world to take things off. There'll be security haves: the governments that are allowed to break the Computer Fraud and Abuse Act will continue to go ahead and do so. And there'll be security have-nots: the people who cannot, whose ability to tinker, whose ability to explore is controlled, will just have to go along. And we're headed towards a world that's less like the utopian dream that I described and more like a world where surveillance, censorship, and centralized control by companies and governments is the norm. That's my fear. I proposed last year in my Black Hat talk a number of things that I thought would start to help us avoid going too far into the horrible cartoon cloud world and try to bring us back more closely to the dream of internet freedom. And I think that we're starting to do some of these pretty well. But we don't actually have the support, necessarily, of everybody who uses the internet, because the reason why it's headed this way isn't that people are stupid. The reason why it's headed this way is that people have other values at stake. There are other things that people care about. It's not going to just be, oh, people are prejudiced against minorities or whatever; it's because of fear. It's because fear will start to drive our decision making.
We're afraid of terrorists, we're afraid of pedophiles, we're afraid of drug dealers, we're afraid of crime. We don't like hate speech. We don't like malware, we don't like spam. So people will start to embrace this centralized control. They'll go for the walled garden. They'll cheer for pressure on social networks. What do we need to do to fix it? How do we make the dream of internet freedom possibly become more real? This year I went to a decentralization camp at the Internet Archive, which is operated by Brewster Kahle. Also a really wonderful man. And the decentralization camp was Brewster's effort to jumpstart this conversation that I wanna have here, about what can be done to try to retain, capture, enshrine the architecture of the internet that can lead to ensuring these things that I've described as part of the dream. And Brewster's camp had a lot of builders there, people who are currently in the process of building these decentralized technologies. And that was really awesome to see. And I learned a lot of stuff and I met a lot of really cool people who wanted technology to be this force for political freedom, this force for individual freedom, this force for free speech and free expression. But ultimately I think, and maybe you guys are going to be able to convince me otherwise, but ultimately I think that the problem isn't really technology alone. Because I don't think the problem is that we don't have the tools for a decentralized network. I think the problem is that people maybe don't understand or value enough what a decentralized network can bring us, what openness can bring us, what this kind of technological freedom can bring us, and they want the bad things to be taken care of.
And so from my perspective, I feel like it's not just technology but norms, the values that people have, that we need to work on. We need to talk about it, and we need to basically explain to people, and help people understand, that when we get rid of this, we're also getting rid of that. That there are good things that go together with it. So there's a saying we have in the law that there are four regulators of human behavior: technology, markets, law, and norms. And I think that if we want to maintain what we have today of the dream of internet freedom, if there are things that we want to get better, or even if we just want to keep the status quo, then we need to start talking about bringing those levers to bear on this problem, and start building the system for this revolutionary technology to either stay, or for us to think about what liberating technology is going to be the thing that replaces the internet we will have in 20 years. So thank you for your attention. I promised I would leave time for questions and comments, and I think we actually have about 10 minutes, so I did what I wanted to do, and I'd like to invite people who have thoughts or questions to come to the microphone. You need to come to the microphone so that people can hear you. But I do want to let you know: even though I got to give a speech for a really long time, you don't. So please keep your comment pretty short so that other people who want to say something have an opportunity. Okay, I'm going to go with this. Oh, thank you. I'm going to start with this brave gentleman in the front here in the blue shirt. So in the spirit of keeping it short, I got two questions. One, what's your prediction on encryption becoming illegal at some point in the US? And two, is there any role for the dark web to save us? Okay, great question. So I know I'm going to be quoted on these things, but I'm still going to tell you what I think.
I don't think encryption's going to become illegal in the United States. I don't think that's going to happen. What I like to say about that is you have law enforcement interests, not even intelligence, because the intelligence agencies, the NSA, they've developed the skills and the techniques that they need to circumvent encryption. We've seen it in the Snowden slides. They have these great hackers. They've got lots of great tools. They can get around encryption in lots of different areas. But the people who really want to circumvent encryption are law enforcement, the FBI and ultimately local cops. And usually I say that law enforcement wins, but law enforcement is up against all the money in the world. And when you put law enforcement, usually a very successful political player, up against Apple, Facebook, Twitter, Google, business interests that don't want to be hacked, et cetera, then I think they're not going to win. But these American companies that are really dominant here are not so dominant in other countries. And just look at what's happening to WhatsApp in Brazil. The guy who's like the vice president of sales spent a couple of days in prison. Now Facebook's being fined. And I don't know, maybe this fine isn't a serious penalty, but governments can bring very heavy fines to bear. And ultimately, then the question is, well, what are you gonna do? Are you gonna abandon that country as a company? You're just not gonna operate there anymore? Or are you gonna try to make it work? And we've seen both. We've seen Google get out of China, and we've seen BlackBerry, RIM, make special devices for particular markets. So that's where I think the risk is really gonna come from. What do I think about the dark web? I think it's possible that the dark web is the future web. We just don't know. Every technology gets its start in crime. Pagers used to be a sign that you were a drug dealer or a prostitute. You know, I can go on. 
So the early adopters of technologies are usually criminals, but then, you know, God bless them, they develop the technology, and then it finds these legitimate uses, and then people around the world fall in love with them. Not pagers anymore, obviously, but for a while, those were useful. Yes? Hey, as an old internet fart, I mean, I'm gonna play a little devil's advocate here and hopefully you can provide a little bit of incentive. You just did with your last comment. I've always run my own servers. I run over other people's pipes, X.25, Bisync, all this stuff. I don't really care. I don't have Facebook or any of this stuff. So why would I care about this? And I've worked on both sides, by the way. I've worked on the government side and the outside, and they both have their points. Yeah. I mean, I think that, you know, other than altruism, I think the reason for people who are very technologically astute to care is because you wanna talk to other people, not just yourselves. You want to be able to access information from people around the world who don't have that same technological expertise. You know, if it's going to really be a global marketplace for ideas, there can't be an expertise price to pay to get in the door. Can you help me clarify, or just help clarify, what seems to be an apparent contradiction with freedom of information, freedom of access and those kinds of things? At the same time you push for end-to-end encryption, so how do you compare the needs of the freedom of information with individual privacy rights and intellectual property rights, protection of national security information, those kinds of things, and what's your view on how that all fits together? Yeah, that's a fantastic question. So I'm just gonna come clean and admit that there is a conflict there. If you're talking about free flow of information, then what's privacy? 
If you wanna talk about incentives to create, and those are part of intellectual property, then how does that impact the free flow of information? And I guess what I would say is there are specific policy issues we could discuss. But I think that we don't value the internet freedom utopia view enough as we have those debates. So we have very strong intellectual property protection, like in the DMCA, and we have a lot of things that are prohibitive there. I think we go too far. In the surveillance versus freedom of information thing, I think that's a really hard issue. You can see it in the right to be forgotten. If I'm really worried about personal privacy... I'm an American, so for me, it's a little bit of an easier question. I feel like truthful information should not be suppressed, period. But that does mean that the privacy of some people, or their ability to keep information about themselves secret, is gonna be a little bit less. To me, I value the freedom part more. But when we do policy, a lot of times it's a negotiation, right? You're not really picking one or the other. You have these options where you're trying to sort of optimize the two things, and you have to balance. But in order to do it correctly, you have to value both sides. And what I'm saying is I think there's a side here that we're losing track of, that we're not valuing enough. Now I wanna say something else about that, which is related to encryption. I think one of the policy problems there is that we have policy people, lawyers, who are used to on the one hand, on the other hand, let's try to find a way to make it work for everybody, nobody's gonna be entirely happy. And then we have: it's either encrypted or it's hackable. And I think the problem there is that policymakers have a very hard time understanding how difficult security is, and that efforts to undermine security, given our current state of knowledge, mean it's gonna be insecure. 
And so we're sort of taking a nuanced way of looking at the world, and they're trying to apply that nuanced way to something where it's a much more binary, on or off kind of thing. So I think that we don't have that nuance there. Thank you. Hi. Pertaining to your comment about the Facebook lawsuit in regards to their cease and desist, how does that apply to the First Amendment in that forum? And if I say fuck Facebook, does that mean I'm gonna get arrested when I walk outside, because fuck Facebook, how about that? You know, Facebook benefits politically from being able to control the way that people use Facebook data. And, you know, Facebook was like, well, if we have aggregators... I don't know what they thought, but certainly an aggregator allows you to use different social networks more, but if you have to pick one, then somebody's going to end up being the victor in the marketplace, and that's what ended up happening. We have this ongoing debate about what the CFAA can prohibit if you, you know, let it be up to a company that either puts it in terms of service or writes a letter, and this irrationality is not something I think courts are really understanding very well right now. They're considering it in a narrow case, but they're not thinking large enough about what the risks to free expression are. Yes, next. Can you remind me of what your speech was called? Slouching Towards Utopia, I believe it was. And I recognize and respect that you're an intelligent woman. So I would like to believe that you understand that the principle of utopia is completely impossible. Are you asking me if I think utopia is possible? No, what I'm trying to ask is, how can you say that there should be no regulations whatsoever when the past has shown that regulations are necessary to form the utopia, and without them it leads to anarchy and chaos? Okay, I could totally... thank you. So there's two views, I think, of what I'm saying. 
One view might be that we should have less regulation. I don't think no regulation is possible, but one view might be that we should have less regulation. Another view might be that we should have more regulation: that we regulate what people can do, we regulate what companies do, we regulate what the government can do, in order to further these values. So the reason why I call it the dream of internet freedom, and I say that it's utopian, is because I don't think that we're gonna live like that, but I think it's a vision, it's like a set of values that I think we're losing sight of, that drew me and a lot of people to the internet, and that I think we're slowly losing. And I think if we wanna preserve some of those aspects, if we wanna have some of that be true, then we need to start thinking about what to do. Now, whether we do it through regulation of government or we have less regulation of speech, these are gonna be case by case things. One more minute, so I think this is gonna have to be the... Yeah, this is gonna be quick. Basically, this is about the end-to-end encryption that we're talking about, and how end-to-end encryption can be misused by organizations like ISIS and others, and that is something that I'm... So how do we segregate organizations like ISIS from the good community? How do you differentiate, and how do you make sure that those communities do not abuse the others? That they don't, yeah. So I think, I mean, my view of this is that, you know, I don't think that... I have an opinion about this, but I don't think that more information about ISIS will make more people join ISIS. I think it's just as likely, if not more likely, that more information about ISIS is going to make people feel resolved against it, that it will make people, if they hear their friends or relatives talking about how ISIS is attractive in some way, resist it or say something to their friends and fight against it. 
So I don't think that the regulation of the information about these social or political movements, about abusive governments, about terrorists, about these issues... I don't think controlling the information about it actually helps make the world a safer, better place. I think that more information overall makes the world a safer, better place. I'm looking for my boss, who's supposed to tell me when I'm not allowed to take any more questions and have to stop. Okay. All right, so I've always seen the US Fourth Amendment as making a space where people can explore the world, and as long as what they do does not reach the realm of probable cause, they'd be left alone. That it would be a so-called lawless space, but it would still be a constitutional space. And I always saw it as not engendering anarchy and chaos: that as long as what you're doing, while you're left to your own devices, does not reach the point where you're harming someone else and there's suddenly probable cause against you, you'd have the freedom to do that kind of stuff. Separately, real quick, you say that people want companies to essentially take care of everything for them, that they just want the bad things to go away. And from my perspective, that doesn't really seem to be too much of why there's so much centralization, or why the decentralized parts of the internet are shrinking. It's the loss of the ephemerality of communications online, where information is stored indefinitely, where you can't just have casual conversations. There's a lot more consideration that has to be made towards everything you say online now, and the thought of exploring some random topic, especially the extremes, now has a lot more concern to be had because of the loss of that ephemerality. That's also compounded by the fears of surveillance, where, you know, some random chatter online is more likely to be seen if you're talking about something that may be extreme for your current culture. 
And what's also lost is trust that Google is a capable steward of your privacy, not necessarily because of any purposeful action on their part, but because they still have to abide by a lot of legal processes. A lot of people are aware they have to abide by a lot of legal processes, so that unfortunately your search queries are going to be seen. So let me ask you a very quick question, okay? What do you think is the answer to that? Is technology the answer? Is law the answer? Do people have to change the way they think about the internet? All of those? I would say make it safer for the people who typically explore the frontiers of society to do so. Yeah, so keep fighting against surveillance. Keep pushing for more ephemeral communications. Make it safe to keep exploring. Thank you. So thanks to all of you. I really appreciate your coming and your attention. And thank you for having me.